🇪🇺 The EU Commission seriously claims: “There is no such thing as #ChatControl.”

Their fantasy: A magic, 100% perfect algorithm that only finds CSAM.

The reality: To find anything, you must scan everything.

Stop the mass surveillance lie! #StopScanningMe

https://ec.social-network.europa.eu/@EUCommission/115180569539039179

European Commission (@[email protected])

Hello @[email protected]! Let us be clear: under this proposal, there is no general monitoring of online communications. There will be no such thing as “chat control”. Only material that is clearly child sexual abuse will be searched for and can be detected. Detection orders can only be issued by judicial or independent administrative authorities at the end of a thorough process to assess necessity and proportionality, balancing all the fundamental rights at stake.

@echo_pbreyer
Seems like not only AI is hallucinating these days. 

@echo_pbreyer "Only material that is clearly child sexual abuse will be searched for and can be detected."

This commission is just as incompetent and dangerous as its national counterparts...

What a load of bullshit. How can they claim that only content that is "clearly" child abuse material will be checked, without checking every other kind of content???

@echo_pbreyer These morons don't understand tech one bit ...

@echo_pbreyer You can't find things in a house without searching the whole house. Doesn't matter what you're looking for, you'll need to search the whole house.

It doesn't matter what kind of content you want to detect, you'll have to search through literally everything somebody has to find it. If you mark any spot as "will not be searched" you can be 100% sure that that'll be where they hide it, whether it needs to be called "personal finances" or "home videos" or "system32" or whatever.
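The "search the whole house" point can be made concrete with a toy sketch of blocklist matching, the technique usually floated for client-side scanning. Everything here (the blocklist entry, the sample messages) is a made-up placeholder, but the structural point holds: even to match against a fixed list of known items, the scanner must read and hash every single message.

```python
import hashlib

# Hypothetical blocklist of known-bad content digests (placeholder value).
BLOCKLIST = {hashlib.sha256(b"known-bad-sample").hexdigest()}

def scan(messages):
    """Flag messages whose digest appears on the blocklist.

    Note the unavoidable property: every message is read and hashed,
    regardless of whether it matches anything.
    """
    flagged = []
    for msg in messages:  # touches ALL content, not just the "bad" content
        digest = hashlib.sha256(msg).hexdigest()
        if digest in BLOCKLIST:
            flagged.append(msg)
    return flagged

inbox = [b"dinner at 8?", b"known-bad-sample", b"tax documents"]
print(scan(inbox))  # prints [b'known-bad-sample'] -- but all three were scanned
```

Only one message matches, yet the scanner necessarily processed the dinner plans and the tax documents too; there is no way to know a message is harmless without looking at it.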

@echo_pbreyer we're looking for needles, but we won't see the haystacks
@echo_pbreyer I mean.... how .......
I'm generally very pro-EU, but for something like this
I can't figure out whether they're trying to sound dumb to fool people, or they actually are this dumb and believe it.
Either way - that's scary.

@echo_pbreyer It's impossible to search for something without scanning the device's storage or all content that is being uploaded.

Of course this may be possible in fantasy books or sci-fi, but not in the real world.

@echo_pbreyer
Searching ‘only’ for child abuse…? How dumb and naïve are these people? If you build a tool like that, how long before the wrong government gets hold of it?

@echo_pbreyer

lol the @EUCommission doesn't even know the implications of their own proposal. How else except with a general monitoring of online communication would you ever implement and comply with their shitty proposal?

@echo_pbreyer what rubbish are they stating in that message 👀

"Only material that is clearly child sexual abuse will be searched for and can be detected"

Since private messages between people are not "clearly child sexual abuse", does that mean nothing will be scanned? In that case, why are the EU politicians so scared about that law being applied to themselves?

@echo_pbreyer
If you think through how the #chatcontrol proposal would have to be implemented in open-standard federated chat networks like XMPP and Matrix, you see that it turns the whole internet infrastructure (servers, network and clients) into a state-controlled infrastructure.
@echo_pbreyer On that basis nothing could ever amount to a general monitoring obligation. A general monitoring obligation *always* involves looking for a specific kind of content.
@echo_pbreyer Thus in SABAM v Netlog, the generality of the proposed filtering system lay in its application to substantially all files of all users; it didn’t matter that the system was intended to detect specific works contained in SABAM’s repertoire.