Being a Mastodon Moderator
People ask me what it’s like to be a moderator. Those conversations reveal that a lot of what we do is a mystery. So I’m gonna lay it out for you: the fediverse’s unique moderation model, mutual aid, and the effects on mental health.
What Fediverse Moderation Looks Like
Centralized networks like Instagram, Facebook, and Twitter create rules that please executives and shareholders. Those decision makers don’t have to be ethical—and they rarely are.
They obey money. Big money. And big money is never on the side of the people. Big money gets rid of fact checkers, amplifies malicious bigotry, and platforms nazis.
The fediverse is decentralized. That means there isn’t one server for all accounts, and it’s not run by one company. It’s a network of servers (or instances), each self-owned and self-operated. Sometimes that’s one person, but most often it’s a small team. Each team makes its own rules for moderation, and all the servers communicate symbiotically to create a single social network.
There’s a common theme to the rules. While not universal, the majority of instances focus on ethics and inclusivity. Instances that flout those values get blocked through a process called “defederation.” This is a global block between instances: no one on the blocked server can interact with anyone on the server that blocked it.
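To make “defederation” concrete: under the hood, it’s a domain block. Here’s a rough sketch of applying one through Mastodon’s admin API, assuming Mastodon 4.x and an admin token with the admin:write:domain_blocks scope. The instance URL, token, and target domain below are placeholders:

```python
import requests

# Placeholder values: your instance and an admin access token carrying
# the admin:write:domain_blocks scope.
INSTANCE = "https://mastodon.example"
TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"

# Defederate a whole domain. "suspend" severity means no account there
# can interact with anyone on this server.
resp = requests.post(
    f"{INSTANCE}/api/v1/admin/domain_blocks",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "domain": "hate.example",
        "severity": "suspend",  # "silence" would merely limit them instead
        "public_comment": "Platforming hate speech; no local moderation.",
    },
)
resp.raise_for_status()
print(resp.json())
```

In practice most admins do this from the web interface; the API call is just the same action spelled out.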
This makes the entire fediverse community stronger. At a local level, on our server, we have a process that seems to serve us well.
What Mas.to Moderation Looks Like
There are three moderators on our team and about 185,000 accounts. About 12,000 of those are active monthly. But they’re able to interact with an estimated 12 million total fediverse accounts. So…we have our hands full. But it’s manageable. Our queue is never very long.
The baseline process works like this:
- Someone reports a post or an account
- The report shows up in our queue
- One of us reviews the report and either resolves (dismisses) or moderates
- If we moderate, the account receives an email notification with an appeal option
- If they appeal, we see that in our queue with their plea
- One of us reviews the appeal and either accepts or rejects
We also have a house rule that no one can act on an appeal for their own moderation. It’s a good way to keep each other in check. Especially because sometimes we don’t agree. This is okay. We just talk it out together and try to make the best call.
There’s an array of moderation options we use (sketched in code after this list):
- Delete one or more posts
- Limit the account (only their followers will see their account)
- Suspend the account
- Send a warning
- Freeze the account (lock them out)
- Force posts to be flagged as sensitive (blurred)
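To tie the queue flow and these options together, here’s a toy model in Python. It’s purely a conceptual sketch of our process (Mastodon’s real tooling looks nothing like this), and every name in it is made up:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    """The moderation options from the list above."""
    NONE = auto()             # report resolved (dismissed)
    DELETE_POSTS = auto()
    LIMIT = auto()            # only their followers see the account
    SUSPEND = auto()
    WARN = auto()
    FREEZE = auto()           # lock them out
    FORCE_SENSITIVE = auto()  # posts flagged as sensitive (blurred)

@dataclass
class Report:
    target_account: str
    reviewed_by: str | None = None   # moderator who handled the report
    action: Action = Action.NONE
    appeal_upheld: bool | None = None

def moderate(report: Report, moderator: str, action: Action) -> None:
    """Resolve or moderate; any action triggers an email with an appeal option."""
    report.reviewed_by = moderator
    report.action = action

def review_appeal(report: Report, moderator: str, uphold: bool) -> None:
    """House rule: no one rules on an appeal of their own moderation."""
    if moderator == report.reviewed_by:
        raise PermissionError("a different moderator must handle this appeal")
    report.appeal_upheld = uphold

# Example run-through:
r = Report(target_account="@spammer@example.social")
moderate(r, moderator="mod_a", action=Action.SUSPEND)
review_appeal(r, moderator="mod_b", uphold=False)  # a second mod decides
```

The PermissionError is the whole point of the sketch: the house rule is enforced structurally, not by memory or goodwill.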
There are five key things to know about the process…
It’s Hard
No decision is easy, and many decisions aren’t binary. There’s lots of gray area, even with explicit rules. We often receive reports that don’t violate those rules. It’s simply someone saying they don’t like what another person said. We aren’t argument referees, but people expect us to be. If that argument becomes harassment or discrimination, it’s time for intervention. But even that isn’t always straightforward.
We Get Some Things Wrong
We’ll sometimes debate for days on a single report. We always do our best to make the right call. Even so, we sometimes make mistakes. We’re willing to accept appeals when that happens.
We Get Most Things Right
We get a lot of things right. You can’t even imagine some of the traumatic things we see. Things we can’t ever unsee.
We’re Targets of Both Love and Hate
We get lots of love, but we get spit on a lot. Some people lose their minds when they’re moderated. They’ll publicly shame us or call for uprisings against us. Quite literally. I’ve even been known to vent about it because it gets exhausting.
Moderation Doesn’t End in the Queue
Some moderation is urgent, or the severity requires additional action. We report all CSAM to law enforcement in the relevant countries. The same goes for posts containing violent threats: we do our best to track the offender’s location, then report their IPs and emails to domain registrars and web hosts.
It’s also important to note that we don’t moderate reports against ourselves, which actually happen. As of this writing, I’ve been reported five times. All were (objectively) resolved without moderation.
One challenging topic is mutual aid (community economic support).
Mutual Aid
First off, I believe in mutual aid. I don’t trust wealthy people or government systems to inherently take care of people. We need to take care of each other. Good, healthy mutual aid is a beautiful thing. This is the way.
But moderating mutual aid is a conundrum. The posts are from either people in need or scammers, and it’s not always easy to tell which one it is. We want to protect our Mastodon community from the latter. So we have to make judgment calls.
Sometimes it’s obvious, but most of the time we have to do some sleuthing. We analyze the account and related IP addresses. We conduct reverse image searches and research online profiles. It’s detective work.
I’d like to say we always get this right. We don’t. And it breaks our hearts if we make a mistake on this one. But we try to get it right. We really do.
Because we care, all of this has an impact on our mental health. Some negative, some positive.
Negative Impact on Mental Health
Content warning: I don’t share details about traumatic content. But I do mention some topics that may be triggering.
Much of what we do is routine. But things can get bad, mentally. We take breaks when needed, knowing the others have our back. I’m grateful for this. Because I have definitely needed it.
There are two usual suspects that push us to the edge:
- Traumatic content
- Being personally targeted by people we’ve moderated
Traumatic Content
This section is difficult to discuss. I’ve tried to represent moderators as a whole in the rest of this article. But I’m going to personalize this section. Even though other moderators have shared similar sentiments, I don’t want to speak for anyone else on this.
I’ve seen some things. Really, really, really unimaginable things. Things I never wanted to see in my life that are now in my brain. And I’ve brought almost all of them to therapy.
People get reported for obvious things: slurs, misogyny, homophobia, xenophobia, etc. It’s painful to repeatedly see those words. Even when they’re not directed at me, I hate seeing them. It reminds me of the hate that exists in the world. It muddies the waters when I’m trying to find the strength to hold on to even a sliver of optimism about the state of the world. I find myself saying “fuck you, asshole” while moderating these posts.
But there are other reports that stand out. You know these people exist in the world, but when you see them firsthand it’s horrifying. CSAM in particular. But also violent threats, especially when paired with discrimination.
The words and sentiments alone are hard to moderate. But often there are images. And try as I might, I am unable to shake them.
Targeted Attacks on Moderators
We’re volunteers trying to keep everyone safe, not monsters. And, yet, we get targeted. Sometimes personally. That’s not okay.
I’m certain every moderator has a story. Mine is that some people targeted my family. I discovered that some nazis found a photo of my kids from one of my lectures. They posted it to some repugnant forum, engaging in a racist, homophobic, bigoted hate fest.
They got my kids involved. My kids! Why? Because I dared delete a post filled with the same types of attacks against someone of color on Mastodon.
Positive Impact on Mental Health
The upside is that moderation gives me some control.
Watching a fascist autocracy unfold before your eyes is terrifying. Everything is collapsing on a massive scale in myriad ways. It’s enough to make a person feel helpless.
While it’s not monumental, I do get to curb some of that fascism through moderation. I get to push buttons with labels that read “delete post” and “suspend.” When some ignorant sociopath is harassing people, I get to wave a magic wand and make them disappear. At least from our corner of Mastodon.
That’s empowering. And meaningful. It does make a difference because I get to silence them. I can’t begin to describe how good that feels. Just a little tiny bit of justice.
We also once helped someone who posted a cry for help. He was talking about ending his life. Someone reported his post to see if we could help, and we did. Every once in a while I check on his account. He’s still there, posting cool stuff. I like to think we played a role in that.
Conclusion
So that’s a firehose of information, but it’s only part of the story. If you read this far, you deserve a cookie. If I could invite you over for one straight out of the oven (I make a mean vegan cookie), I would. For now, keep reporting the bad people and we’ll do the rest.
Thanks for taking the time to go on this journey with me.
Resources
If you’re new to Mastodon and the fediverse, free yourself from invasive tech and big social. Here are some resources to help you:
- Join me on Mas.to. It’s way better than Twitter, I promise. Create an account, flesh out your profile, write an introduction, then say hello to me. I’ll be happy to share your introduction and help you get acclimated.
- Learn everything you wanted to know (and more) about the fediverse from Fedi.tips.
- Mastodon search is pretty powerful. I wrote a guide for getting the most out of it.
- The fediverse has maybe the most powerful hashtags in social networking. But they should be accessible. I wrote a guide for that, too.
There’s a massive list of fediverse platforms, which can be overwhelming. Other than Mastodon, here are the ones that will look most familiar to you:
