@micahflee the report correctly noted that these problematic instances aren't part of the main Fediverse that we are on: "in the case of child safety, Japan has significantly more lax laws related to CSAM which has resulted in a cultural divide where most users in Japan are segregated from the rest of the Fediverse"
I'm sure there are some instances that haven't properly blocked these instances yet, which is a problem! But PhotoDNA is far from a magic bullet, cf. https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html
@micahflee If you mean a badly handled study, you’re right.
Their report included instances that have been defederated from the core of Mastodon.
It's possible that there is a problem with mainstream Mastodon, but this study doesn't show that at all.
@micahflee this is bad, yes, but I don't believe it's something that can be solved at the Mastodon level.
It must be solved at the protocol level, in ActivityPub, since the same issue affects other services on the fediverse such as Pixelfed or PeerTube.
@micahflee also, not all countries are making the same efforts on the same things.
Case in point: Japan and drawings/fictional characters (the loli problem).
Everyone knows this, and as long as laws and regulations don't change, neither will the situation.
I'd bet this accounts for the bulk of the CSAM issues on Mastodon instances...