This Firefish server, bostonsocial.online, and Mastodon server, hear-me.social, now have #CSAM (Child Sexual Abuse Material) scanning active.

If the hash of any image matches the hash of a known CSAM image in the NCMEC.ORG database, a report will be automatically filed and the image will be blocked. It will soon also be deleted from the storage bucket.

While I realize that nobody on these two servers is involved in sending or receiving CSAM images, this server relays with over 1,500 other servers, so the scans are needed: these images can easily find their way in via the relay.

For clarification, the images are never viewed by any person. A mathematical hash is calculated from the image binary and matched against the hashes stored in the database.

#Mastodon #MastoAdmin
@admin
Hate to talk about the semantics of this topic, but does searching by hash make sense for this application? A simple PNG-to-JPG conversion will change the hash, as will even a screenshot. Definitely a good idea to be monitoring this stuff, though. I admit I don't know a lot about this topic, so maybe I'm missing context.
@max @admin The NCMEC database uses perceptual hashing algorithms designed for image comparison that are (somewhat) resistant to image manipulation. It's not a naive md5sum or similar.
@penllawen @max @admin Then (dumb question) are they as prone to false positives as some previous attempts (e.g. Apple's)?
@penllawen @admin Thanks so much for this info!!! That's awesome, I didn't realize that was possible!