This is bad. Mastodon instances need tooling to automatically detect and ban abuse (or flag it for manual review), and to automatically defederate from instances that don't ban abuse: https://www.washingtonpost.com/politics/2023/07/24/twitter-rival-mastodon-rife-with-child-abuse-material-study-finds/
Twitter rival Mastodon rife with child-abuse material, study finds

The report raises safety questions about alternative social media sites.

The Washington Post
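One way the defederation half of that tooling could work is a shared blocklist that each instance checks its peers against. The sketch below is purely illustrative (the function name and inputs are hypothetical, not part of Mastodon's actual admin API), showing the matching logic: a listed domain and all of its subdomains get flagged for defederation unless they are already blocked locally.

```python
# Hypothetical helper for shared-blocklist-driven defederation.
# None of these names come from Mastodon itself; this is only a sketch
# of the matching step an admin tool might perform.
def domains_to_defederate(current_peers, shared_blocklist, local_blocks):
    """Return peer domains that appear on the shared blocklist (either
    exactly or as a subdomain of a listed domain) and are not yet
    blocked locally."""
    flagged = set()
    for peer in current_peers:
        for bad_domain in shared_blocklist:
            # Match the domain itself and any subdomain of it.
            if peer == bad_domain or peer.endswith("." + bad_domain):
                flagged.add(peer)
    return sorted(flagged - set(local_blocks))
```

An admin-side script could feed this the instance's peer list and then apply the result through Mastodon's domain-block moderation interface; the hard part in practice is governing who curates the shared list, not the matching itself.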

@micahflee the report correctly noted that these problematic instances aren't part of the main Fediverse that we are on: "in the case of child safety, Japan has significantly more lax laws related to CSAM which has resulted in a cultural divide where most users in Japan are segregated from the rest of the Fediverse"

I'm sure there are some servers that haven't properly blocked these instances yet, which is a problem! But PhotoDNA is far from a magic bullet; cf. https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html

A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.

Google has an automated tool to detect abusive images of children. But the system can get it wrong, and the consequences are serious.

The New York Times
@micahflee what a clickbait article title. Didn't read; it's paywalled.
@micahflee I'm confused why that's bad or surprising; that's always been a given with all social platforms. Humans can suck. Remember when the internet was just dorks in their underwear playing D&D on BBSes? Those days are long gone.

@micahflee If you mean a badly handled study, you’re right.

They included in their report instances that have been defederated from the core of Mastodon.

It's possible that there is a problem with mainstream Mastodon, but this study doesn't show that at all.

@micahflee this is bad, yes, but I don't believe it's something that can be solved at the Mastodon level

It must be solved at the protocol (ActivityPub) level, since it's an issue that affects other Fediverse services such as Pixelfed or PeerTube

@micahflee also, not all countries are making the same efforts on the same things

Case in point: Japan and drawings/fictional characters (the loli problem)

Everyone knows this, and as long as laws and regulations don't change, it won't change.

I'd bet this accounts for the main share of CSAM issues on Mastodon instances...

@micahflee this is a bad path; we are human-to-human here #openweb
@micahflee That's a massive yikes! I have seen dodgy stuff here and it was always acted on when I reported it, but still...
@micahflee Why isn't this scoop paywalled? Also, have the researchers proposed moderation techniques or tools?