Apple is going to scan your pictures for harmful content. Here we explain how this affects your privacy online: https://tutanota.com/blog/posts/rotten-apples
Rotten Apples: iOS 15, personal privacy, and the ongoing fight against CSAM.

This fall, Apple is planning to release iOS 15, which will begin scanning your devices and iCloud for known images depicting the sexual abuse of minors. Apple is attempting to assuage the public, claiming that these features will scan personal devices and cloud storage “while designing for user privacy.” Is this yet another case of invasive surveillance hidden under the guise of protecting the children, or is Apple attempting to walk the fine line between preventing the spread of abusive material and protecting user privacy?
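The core mechanism described here is matching image hashes against a database of known material. A minimal sketch of that idea, with a placeholder blocklist and an ordinary cryptographic hash standing in for Apple's actual perceptual NeuralHash and their threshold-based safety-voucher protocol:

```python
import hashlib

# Hypothetical blocklist. In Apple's system this would be a blinded
# database of NeuralHash values of known CSAM, not SHA-256 digests;
# SHA-256 is used here only to keep the sketch self-contained.
KNOWN_BAD_HASHES = {hashlib.sha256(b"example-flagged-image").hexdigest()}

def scan(photo_bytes: bytes) -> bool:
    """Return True if the photo's hash appears in the blocklist."""
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_BAD_HASHES

print(scan(b"example-flagged-image"))  # True
print(scan(b"holiday-photo"))          # False
```

Note that a cryptographic hash only matches byte-identical files; the whole point of a perceptual hash like NeuralHash is to also match visually similar images, which is what opens the door to the collision attacks discussed below in the thread.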

@Tutanota always knew /etc was a backdoor...
@Tutanota @schratze
Additionally, people are already exploiting Apple's NeuralHash model https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issues/1
Working Collision? · Issue #1 · AsuharietYgvar/AppleNeuralHash2ONNX

Can you verify that these two images collide? Here's what I see from following your directions: $ python3 nnhash.py NeuralHash/model.onnx neuralhash_128x96_seed1.dat beagle360.png 59a34eabe31910abf...

@Tutanota That's exactly what @linuzifer described in the latest Logbuch Netzpolitik episode, but unfortunately Pritlove disagrees.