This is a big problem for Mastodon.

Canadian journalist Erica Ifill just had her mastodon.online account suspended without explanation. She's been sharing stuff critical of Mastodon re: intersectional issues.

I get the decentralized structure and the idea that each server has its own rules. But this Reddit-style moderation, where moderators with god complexes make mysterious and arbitrary decisions, is going to make people migrating to Mastodon flee in droves.

https://www.presscheck.org/journalists/erica-ifill


I also don't think it is a serious solution to tell people, especially those from vulnerable communities, to just go find a server that respects them / allows them to criticize racism, etc.

People are joining Mastodon in good faith, but if they encounter these problems once, twice or more, it starts to send them the message they're not welcome here. It becomes a real brand problem and platform integrity issue.

There are a lot of people with good intentions putting work and time into building this platform who are probably frazzled and burnt out by how quickly it's growing (first among them, Mastodon's founder @Gargron, no doubt).

That said, I think it needs to continually ask itself whether its structure and rules are advancing its mission and values or getting in the way of them. TBH, this kind of stuff seems at odds with what I like about Mastodon.

Update: Good to see @Mastodon being responsive and I'm sure it was an innocent mistake, but these mistakes keep happening and they point to deeper issues with the approach to moderation over here that really need to be worked out. A lot of responsibility is left on the shoulders of volunteer mods. Things could be more transparent and less arbitrary.

I hope people take some of these concerns that have been surfacing seriously, rather than just tell people to go find a different server or whatever.

And here is @Gargron addressing the moderation issue outlined above.

Some bigger questions about how these systems are set-up, but at least someone is stepping forward and showing accountability:

https://mastodon.social/@Gargron/109383947978442853

@llebrun Thanks @Gargron for acting in a timely manner. This is the only way to maintain the integrity of the "system"

@llebrun @Gargron I think there are a few questions about the viability and sustainability of volunteer moderators that the Mastodon community needs to question, evaluate, learn from, and adapt to, and by all accounts they appear to be the type to do that well.

Two items I’d like to (re) raise:

1/x

@llebrun @Gargron
1) The health and safety of the mods and admins. Are they aware, trained, compensated, and protected from harassment, doxxing, and threats?
Are they prepared to handle what might be extremely traumatic images while sifting through reported content? Are there supports for their mental health in this line of work?

Low-paid Facebook moderators have dealt with long-term health issues due to the trauma of viewing content as part of their work.
2/x

@llebrun @Gargron
2) Is it possible they will need to be employed to ensure quality, experience, and judgement over a longer period of time? Not to mention benefits, breaks, etc.?

3/x

@llebrun @Gargron
Lastly, has the community underestimated the need for more revenue to support adequate resources for moderation? If so, will they need to reconsider the staunch donationware-only model in favour of other possible revenue streams for instances?

4/x

@llebrun @Gargron

Mastodon looks *a lot* like email, which at this point is basically a ubiquitous utility. Email offers users a range of ways to pay for the service: ad-supported, pay-for-use, community-run, etc.

Users then choose the course of action based on their values, without (for the most part), any loss of interoperability.

You OK with weird ad surveillance? Go Gmail. Want to pay to ensure your privacy? Protonmail. Etc.
5/5

@llebrun @Gargron

The *really* big instances are going to have major problems with this. No income stream and 100,000s of users to moderate? No way in heck that's sustainable.

Switching to a smaller, better-managed instance really is the reasonable solution. It reduces the chance of this happening to you while the systems scale and settle.

also happened here:
https://mastodon.online/@DevinCow/109385269930065020

DevinCow (@[email protected]): "I don’t know what I did @[email protected] I am sorry"
@llebrun @Mastodon Thank goodness that was resolved, and an explanation was given for the error. Hopefully this news is spread beyond Mastodon, so that there is a better understanding of the platform. Thanks for updating. 🙂

@llebrun @Mastodon Obviously you have never experienced the complete arbitrariness of Twitter moderation. At least on that instance there is a human response and explanation instead of algorithmic suspensions (temporary and permanent) with no explanation or recourse. That's the gold standard for opaque and arbitrary, and Facebook is not much better (at least FB will offer a reason).

And that's before a discussion of Twitter burying posts based on content ("shadow banning").

@llebrun What's the solution though, on a decentralized platform? We can't police all of the moderators when there is no central authority. If I don't like the rules of my instance I'm free to change instances or even start my own... is there a better solution?

@klewlis @llebrun I think that this is right: you need to choose an instance that meets your needs, or create your own instance.
You need to be able to trust your admins. It is one of the reasons I chose #hachyderm; I trust that @nova and co. will enforce their rules fairly, and I believe that the rules and values align well with mine.

@llebrun @Gargron but it's not really a platform in the sense that you're using it here, right? It's a protocol and open source software. There is no central gatekeeper, that being a feature and not a bug.

@llebrun

#Mastodon is just a client interface for a federated messaging protocol.

If we start getting into ideologically-based "defederation wars", we'll end up balkanizing the network into clumps that don't talk to each other. In the end, more restrictive nodes will likely be outnumbered by the open ones.

We need to de-emphasize server admins and moderators and give users the tools they need to shape their own experiences, or to adopt experiences curated by trusted parties.

@gabrielbauman @llebrun it already is Balkanized tho, there is no use in letting Nazis and spammers be part of your space, just look at the replies on this https://prodromou.pub/@evan/109371763133184283
Evan Prodromou (@[email protected])

If we can fit about 10-100K people on a Mastodon server, and we've got 38M Canadians, then we're going to need 400 to 4000 Canadian servers. Let's start building them. #cosocialca http://cosocial.ca/


@llebrun @Gargron

Not an ideal solution, but as Erica Ifill is a journalist, I wonder if she might consider migrating to newsie.social, which is smaller and I think less likely to make bad moderation mistakes. I think she will need access to her account back to migrate properly, though. @jeff

A number of well-established smaller instances are concerned that mastodon.online and mastodon.social are growing too fast to moderate well, and mistakes like this are part of the problem.

@llebrun
What does "it" refer to here? What needs to ask itself about its structure etc?
@llebrun Agreed all around. And yet I’m still going to find myself another instance/server. Also, the massive influx of people has likely not made anything, including moderation, any easier — which isn’t in any way meant to excuse, just to partially explain. I’m looking forward to seeing Erica back and more active here.

@llebrun I think these are good, valid criticisms that #Mastodon will have to deal with in one way or another. I just urge everyone, reporters especially, to take into consideration the MASSIVE, MASSIVE workload these moderators are being put under with Twitter's collapse. Especially when you consider the immense resources deployed by established, corporate networks, like Facebook and (at one point) Twitter.

Some good perspective: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

The secret lives of Facebook moderators in America

In a damning new report, Casey Newton gives an unprecedented look at the day-to-day lives of Facebook moderators in America. His interviews with twelve current and former employees of Cognizant in Arizona reveal a workplace perpetually teetering on the brink of chaos.

The Verge
@llebrun @Gargron can it be just an error? The instances are being overwhelmed, so a mistake is inevitable. Was she able to get it resolved?
@llebrun Gargron is the owner / manager and only admin / mod for both .social and .online
@llebrun @Gargron @dznz looks like other admins on other servers are doing the same thing
@llebrun @Gargron everything is relative comparing Mastodon’s “brand” problems to Twitter’s “brand” problems, isn’t it?
@llebrun Mastodon has rules against discrimination. The owners of those servers cannot always see the violations. It is up to us to flag these #Horrible actors and help kick them out.

@goddessvault @llebrun Well, the main body of servers that federate to each other, the "non-nazi instances" if you like, have a _culture_. I wouldn't say rules – unless you can link to where they are written down?

Each instance has its *own* rules, sure. But Mastodon as a whole? Not so much.

@llebrun There needs to be more visibility, so that ppl can make an easier choice in selecting servers. Like a Yelp rating for servers in terms of moderation, stability, etc.
@llebrun Maybe in the interim, users who have felt themselves discriminated against can post using the hashtag #AdminAlert and share screenshots. Granted, this is extra labor for victims, but given the decentralized nature of open-source Mastodon, the more data ppl have to decide how best to self-sort, the better. It will also shine a light so rogue admins can clean up their act when they are in the wrong, or risk their server being defederated by servers that won't abide their actions.
@llebrun Can also give admins the opportunity to share their own evidence to make their own cases for suspension in case of bad faith reports.
@llebrun Best would be for servers to have clear rules and TOS defined, and to suspend users only for clear violations. I suspect some would benefit from DEI education and reflection.
@caffeneko @llebrun my understanding is the point of the fediverse is that there may be communities that reject DEI. Other communities will need to decide whether to exclude DEI-rejecting servers from their view or accept that as "diversity." If people want more or less sanitized content, they need to find servers with governance that matches their expectations.
@caffeneko @llebrun - I agree on TOS. There should be a clear UI for why individuals, content, or servers are removed from each server so transparency and appeals processes can be built into the platform. Good servers (IMO) will foster communities with clear TOS, governance bylaws, and humane processes that protect us from descending into hateful screaming matches like Birdsite.
@caffeneko @llebrun I've never yet seen a server without rules of behaviour spelled out on the 'about' page.

@caffeneko @llebrun *ahem* you mean like FB/Twitter/…

These instances are still private servers that you are using.

In some jurisdictions (e.g. EU) you still might have privacy expectations, but you do not have an expectation of service.

@caffeneko @llebrun (Just mentioning: while I do not run a Mastodon instance, I did run private internet-connected servers for decades. The entitlement of users is fascinating in the context of what is, at least for some admins, basically a hobby.)
@yacc143 @llebrun that's fine if it's a hobby - then they should have no issue w/ users providing transparency on their experience so that users can make data-informed decisions. Would be good for users to know from which servers they cannot expect equitable service.
@caffeneko @yacc143 @llebrun With the exception of Eugen's two servers, I don't know of any that are NOT a "hobby". No one makes money from doing this.
@fishidwardrobe @yacc143 @llebrun Then maybe such overburdened mods should not be so eager to suspend someone from a marginalized community, if they don't want to provide clear justification. They should expect that that user's community would have questions.
@fishidwardrobe @yacc143 @llebrun If an admin has a very strict CW policy on their server and gives clear warning that to remain on the instance you should use CW for (insert list of problematic topics: rape, police brutality, murder, uspol, ukpol) - that should be sufficient heads-up that the server is probably not the right fit for someone inclined to post political topics w/o CW.

@caffeneko @yacc143 @llebrun Yes – but if a mod misses banning someone problematic then it's likely their whole instance will be defederated by other servers. Boom, whole server. So *of course* they tend to be trigger happy.

There are reasons for this – blocking the entire instance is the only way you can really make your users safe from malicious users on the other instance. But. But!

@caffeneko @llebrun
@TheKinrar

What about incorporating some kind of review/rating system with instances.social?

@ian @llebrun @TheKinrar I think that's a terrific idea - I don't know who maintains that site or how users could enter review metadata per instance. Someone would also have to define metrics by which someone could rate a server.

@llebrun Yes, I heard the same thing from someone posting about racialized issues on .social who got their post moderated for no reason.

I think it's really important to find a good instance (but then, you never know who will change their views).

@llebrun

if Eugen Rochko (@Gargron) does not make Mastodon an affirmatively anti-racist space, he makes it a de facto white supremacist space.

Eugen Rochko oversees the day-to-day operations of Mastodon.Social and Mastodon.Online, the flagship instances of the Mastodon project, which he heads.

Positive justice is preferable to "peaceful" injustice.

This is a lesson I had to teach Jeff of Discourse forum project fame the hard way.

@notyoursweetbab @llebrun @Gargron Pre-emptive suspension is commonplace in social media as a protective measure. Getting to clear up any misunderstanding about it with real, accountable people is rare, and the difference that makes Mastodon a more human place.

@dudleysaunders @llebrun @Gargron

Eugen's explanation and apology lays out what happened and what steps are being taken to address this going forward. These incidents were upsetting, but I am glad that Eugen has taken accountability and action.

https://mastodon.lol/@Gargron@mastodon.social/109383948564712812

Eugen Rochko (@[email protected])

There were two notable moderation incidents on mastodon.social and mastodon.online in the past 24 hours that I would like to address. In the first, a post was wrongfully removed due to a report claiming it contained a dogwhistle, and in the second, a person was wrongfully suspended due to a report claiming it's an impersonator. Both were undone and apologies issued.

@notyoursweetbab @llebrun @Gargron Sort of: one was a suspected impersonator, the other was a report of a dog whistle that turned out to be overstated & not objectionable. Absent an active team that can moderate in real time, a pre-emptive suspension is the only way to prevent ongoing harm, i.e. Nazi posts. The difference I see is that Facebook will not engage about their decisions, whereas here you can make your case to a real human. Love that!
@llebrun
Looks to be a big problem. You'd think they'd have a central set of rules around moderating.

@mvlasic @llebrun
Obviously not, that's not what the Fediverse tries to be.

Gmail, Hotmail, and my private email server have one central set of rules about what kind of emails are okay, right?

Reality is, if you don't like your instance and its administration, choose another, or run your own instance. It's basically the same choice you have for getting email service.

Mastodon is just a service running over ActivityPub in the Fediverse.

Just like Email runs over ESMTP.
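The analogy can be made concrete. Just as any mail server can hand a message to any other over SMTP, any fediverse server can deliver a post to any other as an ActivityPub activity over HTTP. A minimal Python sketch of such a payload (the instance names and actor URLs are invented for illustration):

```python
import json

# A minimal, illustrative sketch of the kind of ActivityPub payload one
# server delivers to another's inbox -- the federated analogue of handing
# an email to SMTP. All URLs here are hypothetical.
def make_create_note(actor: str, recipient: str, text: str) -> dict:
    """Wrap a Note in a Create activity, as ActivityPub requires for new posts."""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Create",
        "actor": actor,
        "to": [recipient],
        "object": {
            "type": "Note",
            "attributedTo": actor,
            "to": [recipient],
            "content": text,
        },
    }

activity = make_create_note(
    "https://example.social/users/alice",
    "https://other.example/users/bob",
    "Hello from another instance!",
)

# A real server would sign the request (HTTP Signatures) and POST this
# JSON to the recipient's inbox endpoint; here we only build and show it.
print(json.dumps(activity, indent=2))
```

Seen this way, defederation is just a server refusing delivery from another domain, much like an email blocklist.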

@mvlasic @llebrun There literally cannot be one common set of rules, because instances vary in size, in the professionalism of their admins, and in the legal jurisdictions they run in.

Try posting Swastikas on a German/Austrian based instance, and look how long your freedom of speech lasts.

@yacc143 @mvlasic @llebrun right. The issue is in misunderstanding from newcomers that want a replacement for Twitter and this isn’t quite that. Mastodon feels to me like if Discord wanted to look like Twitter. Does a great job at being what it is, but it isn’t a global community by design.

@llebrun i suggest fediscience.org @admin

A great server!

@llebrun Isn't that mostly a big problem for that particular instance?

Good instances, especially the big ones, should have a strict and transparent moderation scheme. We should learn from such cases and improve common practices in the fediverse.

@llebrun I’m not commenting on this situation because I don’t know the details, but it got me wondering how easy it is to move to a different part of the #fediverse? How easy is it to change servers or whatever?
#noob