Apple has finally killed its ill-conceived plan to scan photos for CSAM. This is a direct result of work by experts and activists. Speaking up is important and sometimes we win.

https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

Apple Kills Its Plan to Scan Your Photos for CSAM. Here’s What’s Next

The company plans to expand its Communication Safety features, which aim to disrupt the sharing of child sexual abuse material at the source.

WIRED
@evacide
I had a deep feeling that Apple was not gonna get away with this one (regardless of any good intentions it may have had).