Apple's statement is the death knell for the idea that it's possible to scan everyone's comms AND preserve privacy.

Apple has many of the best cryptographers + software engineers on earth + infinite $.

If they can't, no one can. (They can't. No one can.)

https://www.wired.com/story/apple-csam-scanning-heat-initiative-letter/

Apple's Decision to Kill Its CSAM Photo-Scanning Tool Sparks Fresh Controversy

Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting. The company issued a rare, detailed response on Thursday.

WIRED
@Mer__edith @joe_no_body anyone know what these on-device features are, tho?
@rabcyr basically opt-in automated nudity detection https://support.apple.com/en-us/HT212850
About Communication Safety on your child's Apple device

If your child receives or attempts to send photos or videos that might contain nudity, Communication Safety warns them, gives them options to stay safe, and provides helpful resources.

Apple Support
@joe_no_body oh huh that’s kinda neat and not actually cringe
@rabcyr yeah, it seems incredibly reasonable in its design. I like that it prompts the kid and doesn't just tattle on them to their parents or something