In what is hopefully my last child safety report for a while: a report on how our previous reports on CSAM issues intersect with the Fediverse.
https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media
@det
I think that, along with helping to fund the servers hosting our accounts, we may need an organization or two to step forward and become safety "police" for the #Fediverse, funded through donations.
I hope this issue is addressed soon.
I would never trust Meta to create or maintain tooling for something as important and necessary as policing CSAM. It's an appalling shame that the ActivityPub specification did not account for moderation tools or CSAM detection, but Meta would never give you those tools for free. They would rather use them as leverage to bend the entire ActivityPub spec to their whim, playing out the "Extend" phase of Embrace, Extend, Extinguish.
It's a Faustian bargain. Those tools absolutely need to be developed, but by the open source community, not by a profit-driven, amoral company.