There's A LOT of discussion about content moderation right now and very little of it touches on the fact that we've all lived on the big social sites for the last decade-plus thanks to the massively exploited labor of mostly-invisible moderation workers. The social web at scale wouldn't have happened without these laborers, who, in addition to earning shit wages, have been exposed to literally every imaginable horror.

If we're remaking this world, let's do better on that front.

Anyway, as you hear about fuckups on moderation and poor decisions by a handful of volunteer mods on servers that have grown by hundreds of thousands of users in a week, keep that in mind.

Finally, the very best coverage of this issue has been from @caseynewton at The Verge whose very haunting stories have stuck with me for years now.

They are worth reading, and worth really thinking about what *waves hands in all directions* means in terms of the actual human lives at the bottom of all this.

https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

https://www.theverge.com/2019/6/19/18681845/facebook-moderator-interviews-video-trauma-ptsd-cognizant-tampa


@dansinker @caseynewton That was disturbing... great articles.

There must be a better solution than more policing. But I guess that's age-old: how does a society enforce its moral norms?

This platform, besides being distributed, by design seeks to remove the profit motive. Does that help? It's not enough, certainly.

And is there a name for this social phenomenon of knowing what's behind the scenes? The old "you don't want to watch the sausage being made" situations?

Lots to think about.

@billrehm removing the profit motive certainly removes some exploitative incentives from the equation, but I think there's still a very real unanswered question about how moderation scales and how we don't grind the people doing it into dust along the way.
@dansinker @billrehm it's not always expedient or conclusive, but I continue to believe the moderation system developed and operated by Wikipedia's community of moderator-contributors stands alone in sustainability. I have proposed that these individuals should form a professional guild to license and accredit their services and processes to other social web platforms.
@pinsk @dansinker @billrehm
Love this open problem discussion re how to scale moderation without grinding people doing it into dust along the way.
I love Wikipedia & have done my bits in fixing, updating, and creating entries over the years.
I like that Mastodon, like Wikipedia, has zero profit motive, so moderation by volunteers has to be manageable & not take too much of our time.
We are building this plane as we fly it. Will see how it goes. Thx4 getting me to think about this open problem.