The Kolektiva raid reminds us that it's important to stand with your users in the #fediverse too.
https://www.eff.org/deeplinks/2023/07/fbi-seizure-mastodon-server-wakeup-call-fediverse-users-and-hosts-protect-their
FBI Seizure of Mastodon Server Data is a Wakeup Call to Fediverse Users and Hosts to Protect their Users

We're in an exciting time for users who want to take back control from major platforms like Twitter and Facebook. However, this new environment comes with challenges and risks for user privacy, so we need to get it right and make sure networks like the Fediverse and Bluesky are mindful of past...

Electronic Frontier Foundation
@eff Do not underestimate this CSAM matter. As the recent attention from Facebook/Threads illustrates, big tech is starting to wake up to the danger the Fediverse represents to their regime of surveillance capitalism. The ostensible justification for attacking the Fediverse in the name of "rooting out child pornography" is a threat we need to take seriously.
@mastodonmigration @eff Yes, we need to take this seriously, but not in the way the big corporations would like to push on us - that is, connecting to their API, which will scan all our images.
Since such APIs for scanning #CSAM will never be open and free (so that criminals cannot "test" material before publication), the only option is decent #moderation on the #fediverse. But decent means actually manually reviewing all photo/video material published on the servers. And that, in turn, means instances should grow no larger than their real moderation capacity allows. Such manual moderation does not seem realistic on instances with tens or hundreds of thousands of accounts.
@miklo @eff You have nicely summarized the problem. One of the most often proffered "solutions" is to hook up to Microsoft PhotoDNA. Which is... from Microsoft.