🧵 1/5 If you've seen a headline about Mastodon & child abuse material (yikes!), some comments:

1. Mastodon is just software, not a company or a place, so this is like saying 'lots of child abusers use MS Office'
2. These sites appear in many blocklists and the admins of many Mastodon servers block them, so that crap never travels to your neighbourhood.
3. This is primarily in Japan.
4. Still bad, moderation and blocking are important areas to build up.

Link to article: https://www.washingtonpost.com/politics/2023/07/24/twitter-rival-mastodon-rife-with-child-abuse-material-study-finds/

Twitter rival Mastodon rife with child-abuse material, study finds

The report raises safety questions about alternative social media sites.

The Washington Post
@c_9
Lazy click-bait
#journalism They tout it as a report, but there's no information about methodology or detail of observations.
@Daily_Twerk Yup, but the actual report is linked in my third post in the thread, from one of the researchers describing it in detail: https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media
Addressing Child Exploitation on Federated Social Media

@c_9
Thanks for that.

That is a little different from the news article.
Certainly more thoughtful.

If someone creates an instance for such stuff but doesn't federate then there's little anyone could do.

In the 8 months I've been on the Fediverse I've never seen such content, and I have no desire to go looking for it or to play amateur detective.

It does raise a question about how such content can be found and removed or blocked once it federates.
Assuming an admin is not complicit, should the admin contact local law enforcement?

My gut feeling is yes.

It would be interesting to know if this is on the radar of admins. They may not say anything out loud in case it alerts these people to what is going on.