This Firefish server, bostonsocial.online, and Mastodon server, hear-me.social, now have #CSAM (Child Sexual Abuse Material) scanning active.

If the hash of any image matches the hash of a known CSAM image in the NCMEC.ORG database, a report will be automatically filed and the image will be blocked. It will soon also be deleted from the storage bucket.

While I realize that nobody on these two servers is involved in sending or receiving CSAM images, this server relays with over 1,500 other servers, so the scans are needed: these images can easily find their way in via the relay.

For clarification, the images are never viewed by any person. A mathematical hash is calculated from the image binary and matched against the hashes stored in the database.
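The matching step described above can be sketched in a few lines. This is a toy illustration using a cryptographic hash and a hypothetical in-memory hash set; the real NCMEC workflow uses perceptual hashes and a reporting API, not SHA-256 against a local set:

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of the raw image bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical set of known-bad hashes; a real scanner queries the
# NCMEC database rather than holding hashes locally.
known_bad_hashes = {image_hash(b"example-flagged-image-bytes")}

def should_block(image_bytes: bytes) -> bool:
    """Block the upload if its hash matches a known entry."""
    return image_hash(image_bytes) in known_bad_hashes
```

The key point is that only the digest is compared; the image content itself never needs to be inspected by a human.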

#Mastodon #MastoAdmin
@admin This is a huge step forward, congratulations!
@admin is that something supported by firefish or did you have to implement something yourself?
@admin Thank you for setting this up. Automating this is vital
@admin Is this proprietary to these servers or is there an open CSAM tool that we can implement? If so are there notes on setup?

@mike

He's using Cloudflare, so my money is on CF's CSAM scanner.

Cloudflare has a pretty good CSAM solution, better than anything I've been able to implement so far.

@jeff Thanks for pointing me in the right direction. For a small instance like mine that's already using Cloudflare, this seems like a no-brainer.

@jeff @mike I'm curious how you know CF's CSAM scanner is good?
Have you used it before? Have you tested it to see if it has false negatives/positives?

Do you know of any alternatives?
Cloudflare is a very untrustworthy company.

@iampytest1 @mike

1: I've used it, and use it now.

2: There is always a chance of false negatives/positives.

3: There are several tools available for CSAM scanning, including one that I built, CF's, and Safer, among others.

4: I don't consider CF to be "very untrustworthy"; quite the opposite.

@jeff @mike Ok, thanks. I am curious how it works. Is there a lookup API, or do you get a whole database of hashes?

Re Cloudflare: https://0xacab.org/dCF/deCloudflare/-/blob/master/readme/en.md

Thanks.

@admin
Hate to talk about the semantics of this topic, but does searching by hash make sense for this application? A simple PNG-to-JPG conversion will change the hash, or even a screenshot. Definitely a good idea to be monitoring this stuff, though. I admit I don't know a lot about this topic, so maybe I'm missing context.
@max @admin The NCMEC database uses perceptual hashing algorithms designed for image comparison that are (somewhat) resistant to image manipulation. It's not a naive md5sum or similar.
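To illustrate the idea of a perceptual hash, here is a toy "difference hash" (dHash) sketch. This is not the NCMEC algorithm (which uses PhotoDNA-style hashing); it assumes the image has already been shrunk to a 9×8 grayscale grid, which is the step that makes the hash survive format conversion and resizing:

```python
def dhash(pixels):
    """Difference hash of a 9x8 grayscale grid: each bit records whether
    a pixel is brighter than its right-hand neighbour (8 rows x 8 bits)."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits; a small distance means visually similar images."""
    return bin(a ^ b).count("1")
```

Because the hash is built from relative brightness rather than raw bytes, re-encoding PNG to JPG only perturbs a few bits, and a match is declared when the Hamming distance falls below a threshold.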
@penllawen @max @admin Then (dumb question) are they as prone to false positives as some previous attempts (e.g. Apple's)?
@penllawen @admin Thanks so much for this info! That's awesome, I didn't realize that was possible!
@admin Keep in mind, this won't stop novel generative AI CSAM.
Congratulations! Please share more details on how this works.
@admin could you share how you implemented this on your Mastodon server?
@admin
I would be interested to know how high the false positive rate will be.
@admin can you explain how it was installed?
@Tealk I use Cloudflare's free CSAM scanning, included with their caching services. I first had to get an NCMEC.ORG reporting login.
@admin
Cloudflare 😱🤮
@admin this is not a standard feature of Firefish or Mastodon?