Apple has finally killed its ill-conceived plan to scan photos for CSAM. This is a direct result of work by experts and activists. Speaking up is important and sometimes we win.

https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

Apple Kills Its Plan to Scan Your Photos for CSAM. Here’s What’s Next

The company plans to expand its Communication Safety features, which aim to disrupt the sharing of child sexual abuse material at the source.

WIRED
@evacide would be cool if Microsoft/Dropbox/Google followed suit. Really don't like the closed-off Apple ecosystem and their proprietary bs, but this is definitely a big win.
@shipp @evacide Yes. Best way is to stop using that stuff. And look for alternatives.
@shipp Credit where it's due. @evacide
@evacide Too bad it's proprietary software and nobody will be able to check whether they really refrained from implementing this.
@evacide
I had a deep feeling that Apple was not gonna get away with this one (regardless of any good intentions it may have had).
@evacide just because they published the E2EE feature without it doesn't mean it's off the table; governments like the EU are currently debating making this stuff mandatory, and there is no doubt that Apple would rather comply than drop the European market.
@evacide I think people should be able to own computers and smartphones without worrying that they're being spied on. So, to that end, I'm thankful this didn't go through.

@evacide
Wow.

I was going to jump in and reflexively criticize Apple for their usual draconian nonsense, but I don't know where I come down on this issue.

Normally, I'm a big stickler for data privacy, and this would result in strangers looking at the most intimate possible photos, but the purpose of the tech is something that I have a hard time disagreeing with.

I honestly don't know where I stand.

@tofugolem @evacide

I think it is misdirection.

I do not see Apple disabling the Client Side Access Method.

@evacide and at about the same time they've introduced E2EE on lots of iCloud stuff. Breakthrough. New respect. Watch the state actors come for them.

@evacide I wonder whether this recent incident made them come to their senses?

https://twitter.com/msmelchen/status/1597807914395500545

Melissa Chen on Twitter

“Chinese social media users report Huawei phones automatically deleting* videos of the protests that took place in China, without notifying the owners. *Not sure if it’s from the cloud or device level Our sci-fi movies have not even imagined this level of dystopia…”

@evacide This is good news. I can see good intentions by Apple with this originally, but it was too problematic in practice. Glad they're realizing this and walking it back.
@evacide I'm excited by the precedent they're setting. This feels glacial in pace, but I see motion in the right direction.
@evacide Time to place yer bets on whether Apple's soon-to-come opt-in nudity detection algorithm is racist or sexist.
show me the $: 0%
$ on neither: 0%
$ on racist: 44.4%
$ on sexist: 55.6%
Poll ended.
@gpshead The damned machine won't take my vote, which is "both, of course".
@gpshead Oh it does, nevermind 😂 (TIL about checkboxes in votes on Masto.)
@evacide the real question is whether they will stop with their abusive practice of scanning photos before upload, as it undermines quite a lot of the security.