Apple has finally killed its ill-conceived plan to scan photos for CSAM. This is a direct result of work by experts and activists. Speaking up is important and sometimes we win.

https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

Apple Kills Its Plan to Scan Your Photos for CSAM. Here’s What’s Next

The company plans to expand its Communication Safety features, which aim to disrupt the sharing of child sexual abuse material at the source.

WIRED
@evacide This is good news. I can see Apple's good intentions with this originally, but it was too problematic in practice. Glad they're realizing this and walking it back.