Looks like Meta is about to contract Majorel to do moderation for Facebook.

Majorel already does moderation for TikTok.

Apparently, working conditions at Majorel are appalling!

https://www.wired.com/story/metas-new-moderation-contractor-may-be-worse-than-its-last-one/

Meta Eyes a Moderation Partner With ‘Traumatizing’ Working Conditions

Employees of outsourcing company Majorel have accused it of underpaying moderators and failing to support them.

WIRED

People often ask me, "What is the Fediverse's advantage over Big Social?"

I believe it is moderation.

It is easier for one person to moderate an instance of 100 people than it is for 50,000 people to moderate a social network of 2 billion people.

As well, since my instances are hobbyist projects, I am bound to be more selective about who is allowed to join my instance.

Unlike Meta, I don't tolerate jackasses!

Yes, I moderate for free.

However, I'm also not paid a monthly salary of $281 to watch beheadings, mutilations, and suicides.

If you have to ask, I'd rather moderate an instance of chill people -- and do it free of charge -- than watch the very worst of humanity for a pittance.

All of this comes on the heels of Meta facing allegations of forced labor, human trafficking, and union busting in Kenya.

Meta had previously hired "ethical A.I." company Sama for moderation.

Sama has been accused of running a "digital sweatshop".

https://time.com/6175026/facebook-sama-kenya-lawsuit/

Facebook Faces New Lawsuit Alleging Human Trafficking and Union-Busting in Kenya

Facebook’s parent company and Sama, its largest outsourcing partner in Africa, are facing new allegations of forced labor, human trafficking, and union busting in Kenya.

Time

Working conditions at Facebook moderation farms are so horrible that so-called "contractors" actually describe it as modern slavery.

https://time.com/6147458/facebook-africa-content-moderation-employee-treatment/

Inside Facebook’s African Sweatshop

At an external Facebook content moderation facility in Kenya, employees are paid as little as $1.50 per hour for traumatizing work

Time

Let's call a spade a spade: moderation on Big Social is unsustainable.

Right now, they're trying to outsource moderation to developing countries.

Which is really just exporting trauma to some of the most economically deprived people on the planet.

And in places like Kenya and the Philippines, few people want to work as a Facebook content moderator.

Again, who wants to be traumatized for life while working for $2/hour?

@atomicpoet Won’t Federated Social potentially have a similar moderation problem? And without the deep pockets of a huge corporation to deal with it?

@atomicpoet The distributed nature can provide some benefits for moderation, but it also means that the moderators are far less likely to be trained, or even paid, and may not be prepared for the truly vile material they will have to deal with. They may also end up with legal exposure they wouldn’t have as employees.

@michaelgemar If you're moderating a community of 100 people, you don't need the resources of 4 billion people.

Further, if you're relentless -- and thorough -- regarding who and what is allowed on your instance, it is unlikely you will deal with vile material or legal threats on a daily basis.

As well, if you're boring in your approach to moderation, most people will move on.

I run 5 Fediverse instances. I've yet to deal with anything vile in the communities that I manage.

@atomicpoet It’s certainly a benefit of the Fediverse model that running an instance lets you vet users somewhat. I just don’t want folks to get complacent, and presume that the problematic behaviours seen in other large social platforms won’t show up here as the Fediverse grows.

@michaelgemar Problematic behaviour definitely shows up on the Fediverse.

The way to confront that problem is to keep your instance on a short leash, and to block instances that misbehave.

On that note, there are lots of trolls using the Fediverse. You probably don't see them because #Fediblock does its job.

@atomicpoet Fediblock is a good example of the kind of distributed tools that take the load off of individual moderators. I really do hope generally that what the Fediverse has is up to the challenge of an increasing tide of bad actors attempting to muck things up for the rest of us. I’m glad to hear that actual instance moderators don’t seem too worried.

@michaelgemar @atomicpoet

Yes, those would be concerns I would have. As I understand it, the physical location (country) of the server/instance determines the law that applies, and that is where control/responsibility exists, by design. The instance I am on has 2 paid moderators, but that is the exception.

@PBruce @michaelgemar You're also on the largest instance on the Fediverse -- so it *should* have paid moderators.

However, the majority of instances are incredibly small. Some of them even operate as single-user instances.

Moderation is a lot easier when you know everyone by name on your instance.

@atomicpoet @michaelgemar

Totally agree. I ended up on this instance by chance, but took the time to check out how things worked.

@michaelgemar Moderation is much easier at a human scale. When the community is small enough, moderators actually interact with the community and are part of it. They understand the culture, the context, and the members better than AI, algorithms, or employees thousands of miles away. A lot of misguided censorship comes from outsiders who don't understand the community: coming in, taking comments out of context, and imposing their own values on a community they are not part of.

If you want to keep community moderation fair, then you must keep instances small, and keep them moderated by their own community.

If an entire community doesn't meet your standards, users and administrators can block it.

You don't need a lot of money to moderate a community. You need people who are already part of the community, who care enough to make it a safe space, and who are willing to take the time to do so.

@scott The big moderation concerns are not community members being mean to each other, but horrific content as described in the original article: beheadings, suicide, even illegal material like child sexual abuse. (And how one deals with such material may even open a moderator to legal jeopardy with regard to deletion and reporting.) The Fediverse has been a cozy place, but in part it has had the advantage of relative obscurity. Knotty problems may be coming.