Looks like Meta is about to contract Majorel to do moderation for Facebook.

Majorel already does moderation for TikTok.

Apparently, working conditions at Majorel are appalling!

https://www.wired.com/story/metas-new-moderation-contractor-may-be-worse-than-its-last-one/

Meta Eyes a Moderation Partner With ‘Traumatizing’ Working Conditions

Employees of outsourcing company Majorel have accused it of underpaying moderators and failing to support them.

WIRED

People often ask me, "What is the Fediverse's advantage over Big Social?"

I believe it is moderation.

It is easier for one person to moderate an instance of 100 people than it is for 50,000 people to moderate a social network of 2 billion people.

As well, since my instances are hobbyist projects, I am bound to be more selective about who is allowed to join my instance.

Unlike Meta, I don't tolerate jackasses!

Yes, I moderate for free.

However, I also don't have to watch beheadings, mutilations, and suicides for a monthly salary of $281.

Put simply, I'd rather moderate an instance of chill people -- and do it free of charge -- than watch the very worst of humanity for a measly pittance.

All of this comes on the heels of Meta facing allegations of forced labor, human trafficking, and union busting in Kenya.

Meta had previously hired "ethical A.I." company Sama for moderation.

Sama has been accused of running a "digital sweatshop".

https://time.com/6175026/facebook-sama-kenya-lawsuit/

Facebook Faces New Lawsuit Alleging Human Trafficking and Union-Busting in Kenya

Facebook’s parent company and Sama, its largest outsourcing partner in Africa, are facing new allegations of forced labor, human trafficking, and union busting in Kenya.

Time

Working conditions at Facebook moderation farms are so horrible that so-called "contractors" actually describe it as modern slavery.

https://time.com/6147458/facebook-africa-content-moderation-employee-treatment/

Inside Facebook’s African Sweatshop

At an external Facebook content moderation facility in Kenya, employees are paid as little as $1.50 per hour for traumatizing work

Time

Let's call a spade a spade: moderation on Big Social is unsustainable.

Right now, they're trying to outsource moderation to developing countries.

That really just exports trauma to some of the most economically deprived people on the planet.

And in places like Kenya and the Philippines, few people want to work as a Facebook content moderator.

Again, who wants to be traumatized for life while working for $2/hour?

@atomicpoet Won’t Federated Social potentially have a similar moderation problem? And without the deep pockets of a huge corporation to deal with it?
@Michael Gemar Moderation is much easier at a human scale. When the community is small enough, moderators actually interact with the community and are part of it. They understand the culture, the context, and the members better than AI, algorithms, or employees thousands of miles away.

A lot of misguided censorship comes from outsiders who don't understand the community: they come in, take comments out of context, and apply their own values onto a community they are not part of.

If you want to keep community moderation fair, then you must keep instances small, and keep them moderated by their own community.

If an entire community doesn't meet your standards, users and administrators can block it.

You don't need a lot of money to moderate a community. You need people who are already part of the community, who care enough to make it a safe space, and who are willing to take the time to do so.
@scott The big moderation concerns are not community members being mean to each other, but horrific content as described in the original article: beheadings, suicide, even illegal material like child sexual abuse. (And how one deals with such material may even open a moderator to legal jeopardy with regards to deletion and reporting.) The Fediverse has been a cozy place, but in part it has had the advantage of relative obscurity. Knotty problems may be coming.