Carmakers fail privacy test, give owners little or no control on personal data they collect
https://qz.com/carmakers-fail-privacy-test-give-owners-little-or-no-c-1850807171
Over the last week or so, I keep hearing about a big push among activists and lawmakers to try to get the Kids Online Safety Act (KOSA) into the year-end “must pass” omnibus bill. Earli…
“I’ve talked to moderators who were probably some of the first people outside the U.S. State Department to know about certain conflict zones, with nowhere to go with that information.”
Interview of @ubiquity75
https://hbr.org/2022/11/content-moderation-is-terrible-by-design
Social media companies couldn’t exist in their current form without content moderation. But while these jobs are essential, they’re often low-paid, emotionally taxing, and extremely stressful — they require exposure to horrific violence, disturbing sexual content, and generally the worst of what we see (or don’t see) online. Do they have to be? Sarah T. Roberts, faculty director of the Center for Critical Internet Inquiry and associate professor of gender studies, information studies, and labor studies at UCLA, details the evolution of this work, from patchwork approaches to in-house moderators and contractors to the current prevailing model, where generalist contractors work in call center–like offices. There are steps companies could take to improve this work, including providing better technology for moderators as well as better pay and more psychological support. But improvement, at present, is more likely to come from worker organizing and collective demand for better conditions than from the firms that employ the workers or the companies that need the moderation.
I don’t know if I actually shared anything but the drawing, so here’s the interview from Harvard Business Review on content moderation and related things.
https://hbr.org/2022/11/content-moderation-is-terrible-by-design
One of the things that I think is sad about the decimation of Twitter eng is that Twitter was doing a lot of interesting (and high ROI) eng work that, at younger companies, is mostly outsourced at great cost.
A few examples off the top of my head:
The now-gutted HWENG group was so good at server design that, in a meeting with Intel, the Intel folks couldn't believe the power envelope Twitter achieved, and during cloud price negotiations Google thought we were lying about our costs.