There's A LOT of discussion about content moderation right now, and very little of it touches on the fact that we've all lived on the big social sites for the last decade-plus thanks to the massively exploited labor of mostly-invisible moderation workers. The social web at scale wouldn't have happened without these laborers, who, in addition to shit wages, have been exposed to literally every imaginable horror.

If we're remaking this world, let's do better on that front.

Anyway, as you hear about fuckups on moderation and poor decisions by a handful of volunteer mods on servers that have grown by hundreds of thousands of users in a week, keep that in mind.

Finally, the very best coverage of this issue has been from @caseynewton at The Verge, whose haunting stories have stuck with me for years now.

They are worth reading, and worth really thinking about what *waves hands in all directions* means in terms of the actual human lives at the bottom of all this.

https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

https://www.theverge.com/2019/6/19/18681845/facebook-moderator-interviews-video-trauma-ptsd-cognizant-tampa

The secret lives of Facebook moderators in America

In a damning new report, Casey Newton gives an unprecedented look at the day-to-day lives of Facebook moderators in America. His interviews with twelve current and former employees of Cognizant in Arizona reveal a workplace perpetually teetering on the brink of chaos.

The Verge
@dansinker @caseynewton The seminal work in this area is of course @ubiquity75's book Behind the Screen.
@dansinker @caseynewton I also remember the first long piece I ever read about it, by Adrian Chen: https://www.wired.com/2014/10/content-moderation/
@dansinker @caseynewton god, the things I saw as a _product designer_ at a risk analysis company will haunt me forever. I can't even imagine what these folks are put through.

@dansinker @caseynewton That was disturbing... great articles.

There must be a better solution than more policing. But I guess that's an age-old question: how does a society enforce its moral norms?

This platform, besides being distributed, by design seeks to remove the profit motive. Does that help? It's not enough, certainly.

And is there a name for this social phenomenon of knowing what's behind the scenes? The old "you don't want to watch the sausage being made" situations?

Lots to think about.

@billrehm removing the profit motive certainly removes some exploitative incentives from the equation, but I think there's still a very real unanswered question about how moderation scales and how we don't grind the people doing it into dust along the way.
@dansinker @billrehm it's not always expedient or conclusive, but I continue to believe the moderation system developed and operated by Wikipedia's community of moderator/contributors stands alone in terms of sustainability. I have proposed that these individuals should form a professional guild to license and accredit their services and processes to other social web platforms.
@pinsk @dansinker @billrehm it grinds people down…

@georgewherbert @pinsk @dansinker @billrehm

@molly0xfff, this feels like a “scaling volunteers” problem similar to Wikipedia, which has its faults but has been overall amazingly successful at creating and maintaining the largest store of human knowledge in history—all with volunteers.

Do you think that’s right, and do you think that a model like that is likely to emerge in the Fediverse?

@pinsk @dansinker @billrehm

Wikipedia's organizational culture is pretty damn white, male, abled, and US-centric, though. For example, a stub I wrote on the European Nuclear Safety Regulators Group (ENSREG) was voted "not notable enough" and deleted in the 2010s. Yes I am still salty, no I haven't edited Wikipedia since, and ENSREG still doesn't have an English Wikipedia article.

Not notable???
https://www.ensreg.eu/members-glance

ENSREG at a glance | ENSREG

ENSREG is an independent expert advisory group, in which all EU Member States are represented by senior officials from their national regulatory authorities or nuclear safety authorities. Senior representatives of the European Commission are also part of the group.

@pinsk @dansinker @billrehm
Love this open problem discussion re how to scale moderation without grinding people doing it into dust along the way.
I love Wikipedia & have done my bits in fixing, updating, and creating entries over the years.
I like Mastodon having zero profit motive (like Wikipedia), so moderation by volunteers has to be manageable & not take too much of our time.
We are building this plane as we fly it. Will see how it goes. Thx4 getting me to think about this open problem.
@dansinker @caseynewton this actually makes me wonder if I really should open up my own instance to others.
Besides suddenly having to moderate content on your server, owners and operators of instances probably also become liable for that content.
Maybe the smartest thing (albeit expensive in terms of cost per user) would be to stay on your private instance or join a well-moderated one (with all risks on the owner).
We’re probably about to become mini Twitters 🤔🤔🤔

@sven

How about an option in between? Allow up to 10–20 well-trusted people on your server, and then block any further registrations. It's small enough that DMs or email threads should be efficient for any needed moderation, especially once you get to know each other well enough.

@dansinker @caseynewton

@boud @dansinker @caseynewton This would probably be a good recipe to get an instance started, i.e., with a reasonably sized group of people whom you already know or who were invited by people who share similar values with you.
@dansinker @sven @boud @caseynewton yeah I would say if you're wondering whether you should open your instance up to a wider group of people, the answer is probably no
@sven I have for sure been thinking about making an instance of one since things have gotten a lot busier over here.
@sven @dansinker @caseynewton As for liability, do we not think Section 230 would apply? (I'm assuming you're in the U.S. If not, please disregard.)
Arrgh (he/him/his) @[email protected] 🐀 on Twitter

“@jproulx The legal and technical barriers to T&S adequacy in a federated social universe are already insurmountable, now let's throw in the ethical one where you need an army of humans to manually review horrific shit that the algorithms missed. It's a PTSD toxic waste plant.”

Twitter
@caseynewton @dansinker I feel bad I never thought about this.