Apple has finally killed its ill-conceived plan to scan photos for CSAM. This is a direct result of work by experts and activists. Speaking up is important, and sometimes we win.

https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

Apple Kills Its Plan to Scan Your Photos for CSAM. Here’s What’s Next

The company plans to expand its Communication Safety features, which aim to disrupt the sharing of child sexual abuse material at the source.

WIRED
@evacide Time to place yer bets on whether Apple's soon-to-come opt-in nudity detection algorithm is racist or sexist.
Poll results:
- show me the $: 0%
- $ on neither: 0%
- $ on racist: 44.4%
- $ on sexist: 55.6%
@gpshead The damned machine won't take my vote, which is "both, of course".
@gpshead Oh it does, nevermind 😂 (TIL about checkboxes in votes on Masto.)