Sarah T. Roberts

@ubiquity75
25 Followers
386 Following
393 Posts

Professor, researcher, writer, teacher. I care about content moderation, tech, digital labor, the state of the world. I like animals and synthesizers and games. On the internet since 1993. Los Angeles/Tovangaar-based. Gay lady.

I wrote an entire book on content moderation called Behind the Screen. Now might be an interesting time to read it.

@hrheingold Indeed.

Interestingly, they don’t always have seniority but many are men. Like Elon’s goon squad, which is also all men — save for the nanny for little Aexebodyspray127625-2.

One former colleague who was “spared” told me they are one of their manager’s 75 direct reports. Lol. What.

Another thing I’ve been meaning to share: every single researcher was fired save one. There is no one left who does research of any kind to inform any actions the company takes. Worth noting that those spared are usually one of xxx people, and so they are effectively headless — not fired outright, but being kept on due to WARN Act matters.
They are called “agents” at Twitter. There are at least four third-party companies that sourced and managed them. As with most companies in the space, Twitter’s official FTE number of around 7,500 employees did NOT count the contractors. This is where ALL OF THESE FIRMS stash their moderators. Facebook easily has 20-25,000 of them at any given time. Twitter had at least 3,000.
Around 3,000+ contractor employees of Twitter were canned last night (totally normal thing to do, btw). How does Twitter have so many contractors? This is where the CONTENT MODERATOR numbers are hidden. From an ex-colleague I’ll not name: “of the 3,000+ contractors let go last night, I believe that it included a SIGNIFICANT portion of the content moderation workforce.”
@mmasnick @riskybusiness @josh It seems assured that they’ll try this sort of thing.

I don’t know if I actually shared anything but the drawing, so here’s the interview from Harvard Business Review on content moderation and related things.

https://hbr.org/2022/11/content-moderation-is-terrible-by-design

Content Moderation Is Terrible by Design

Social media companies couldn’t exist in their current form without content moderation. But while these jobs are essential, they’re often low-paid, emotionally taxing, and extremely stressful — they require exposure to horrific violence, disturbing sexual content, and generally the worst of what we see (or don’t see) online. Do they have to be? Sarah T. Roberts, faculty director of the Center for Critical Internet Inquiry and associate professor of gender studies, information studies, and labor studies at UCLA, details the evolution of this work, from patchwork approaches to in-house moderators and contractors to the current prevailing model, where generalist contractors work in call center–like offices. There are steps companies could take to improve this work, including providing better technology for moderators as well as better pay and more psychological support. But improvement, at present, is more likely to come from worker organizing and collective demand for better conditions than from the firms that employ the workers or the companies that need the moderation.

Harvard Business Review

@josh @hrheingold @tarleton @ruha9 @inquiline thanks, Josh. Howard, I used to hang out a bit on your online community, along with my pal Molly Wright Steenson. My book is this one:

https://yalebooks.yale.edu/9780300261479/behind-the-screen

Behind the Screen

An eye-opening look at the invisible workers who protect us from seeing humanity’s worst on today’s commercial internet   Social media on the internet c...

Yale University Press
I have a bunch of buddies I want to drag onto Mastodon. How hard do I lean on ‘em?
@tyler @krisnelson It should likely be a fear now, as systems break and the unscrupulous can take data more easily. I intend to delete all my stuff that resides in my account, knowing that that is hardly a complete removal.