Apple has finally killed its ill-conceived plan to scan photos for CSAM. This is a direct result of work by experts and activists. Speaking up is important and sometimes we win.

https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

Apple Kills Its Plan to Scan Your Photos for CSAM. Here’s What’s Next

The company plans to expand its Communication Safety features, which aim to disrupt the sharing of child sexual abuse material at the source.

WIRED

@evacide
Wow.

I was going to jump in and reflexively criticize Apple for their usual draconian nonsense, but I don't know where I come down on this issue.

Normally, I'm a big stickler for data privacy, and this would result in strangers looking at the most intimate possible photos. But the purpose of the tech is something I have a hard time disagreeing with.

I honestly don't know where I stand.

@tofugolem @evacide

I think it is misdirection.

I do not see Apple disabling the client-side access method.