This Firefish server, bostonsocial.online, and this Mastodon server, hear-me.social, now have #CSAM (Child Sexual Abuse Material) scanning active.

If the hash of any image matches the hash of a known CSAM image in the NCMEC.ORG database, a report will be automatically filed and the image will be blocked. It will soon also be deleted from the storage bucket.

While I realize that nobody on these two servers is involved in sending or receiving CSAM images, these servers relay with over 1,500 other servers, so the scans are needed: such images can easily find their way in via the relays.

For clarification, no one ever views the images, and no software analyzes their content: a mathematical hash is calculated from the image binary and compared against the hashes stored in the database.
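For anyone curious what that hash comparison looks like, here is a minimal Python sketch. It uses an exact-match cryptographic hash (SHA-256) purely for illustration; real CSAM scanners typically use perceptual hashes such as PhotoDNA so that resized or re-encoded copies still match, and this post doesn't specify which tool these servers run. The hash set, file paths, and function names below are all hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-bad hex digests. In a real deployment these
# would be loaded from a vetted hash list (e.g., NCMEC's), never hard-coded.
KNOWN_BAD_HASHES: set[str] = set()

def file_sha256(path: Path) -> str:
    """Compute a SHA-256 digest of the raw image bytes, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_image(path: Path) -> bool:
    """Return True if the image's hash matches a known entry.
    The image content is never rendered or visually inspected."""
    return file_sha256(path) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    suspect = Path("uploads/example.png")  # hypothetical upload path
    if suspect.exists() and scan_image(suspect):
        # A real pipeline would file the report and block/delete here.
        print(f"Match for {suspect}: filing report and blocking.")
```

Note the trade-off in this simplification: an exact cryptographic hash only catches byte-identical files, which is why production systems favor perceptual hashing for modified copies.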

#Mastodon #MastoAdmin
@admin could you share how you implemented this on your Mastodon server?