Ignoring expert consensus, and showing no shame even after exposés revealed tech lobbyists shaping these EU surveillance proposals, EU politicians are at it again

So, we'll reiterate: Signal would rather leave the EU market than subject our users to mass gov surveillance. FULL STOP

https://netzpolitik.org/2024/internes-protokoll-belgien-will-nutzer-verpflichten-chatkontrolle-zuzustimmen/

Internal minutes: Belgium wants to oblige users to consent to chat control

Internet users would have to consent to chat control, or else they would not be allowed to upload images and videos. That is what the Belgian Council Presidency proposes. This brings new movement into the previously deadlocked negotiations among the EU member states. We are publishing a classified negotiation protocol.

netzpolitik.org
On tech lobbyists' involvement (and the corruption that enabled it), see: https://balkaninsight.com/2023/09/25/who-benefits-inside-the-eus-fight-over-scanning-for-child-sex-content/
‘Who Benefits?’ Inside the EU’s Fight over Scanning for Child Sex Content

An investigation uncovers a web of influence in the powerful coalition aligned behind the European Commission’s proposal to scan for child sexual abuse material online, a proposal leading experts say puts rights at risk and will introduce new vulnerabilities by undermining encryption.

Balkan Insight

The latest example of robust, longstanding expert consensus tirelessly pushing back against politically motivated magical thinking:

https://x.com/carmelatroncoso/status/1785980270664929481

Carmela Troncoso (@carmelatroncoso) on X

Statement, signed by 250+ researchers, warning that the modifications to the Regulation to detect CSAM proposed by the EU presidency don't solve the issues pointed out by experts. It still introduces societal risks without solving the CSA problem. https://t.co/Jmu3dZ2YXL

X (formerly Twitter)
@Mer__edith Any non-xitter link for this seemingly interesting statement?
@richlv @Mer__edith


Statement, signed by 250+ researchers, warning that the modifications to the Regulation to detect CSAM proposed by the EU presidency don't solve the issues pointed out by experts. It still introduces societal risks without solving the CSA problem.
http://www.csa-scientist-open-letter.org/

I summarized the issues with the regulation before:
https://twitter.com/carmelatroncoso/status/1676192115414519808
Given the state of detection technology (too many errors, and easy to evade), it can't be effective
Introducing detection technologies on users' devices undermines the protection of end-to-end encryption
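The error-rate point can be made concrete with a back-of-envelope Bayes calculation. All numbers below are illustrative assumptions, not measured rates for any real detector:

```python
# Illustrative base-rate sketch (assumed numbers, not real detector stats):
# even a scanner with a seemingly low false-positive rate flags mostly
# innocent content when illicit material is a tiny fraction of all uploads.
prevalence = 1e-6   # assumed fraction of uploads that are actually illicit
tpr = 0.90          # assumed true-positive rate of the detector
fpr = 1e-3          # assumed false-positive rate of the detector

# Bayes' rule: probability that a flagged upload is actually illicit
p_flag = tpr * prevalence + fpr * (1 - prevalence)
precision = tpr * prevalence / p_flag
print(f"share of flags that are true positives: {precision:.4%}")
```

With these assumed numbers, well under 1% of flags would point at genuinely illicit content; the rest would be innocent users reported in error.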


The @EU2024BE presidency proposes two modifications to address these issues:
1) require more than one suspicious detection to trigger a report
2) only do mandatory detection in high-risk services
They also say that all of it will be done while respecting the protection provided by encryption

The first modification does not address the poor quality of detectors. False positives will still be numerous even with two required detections: the assumption that such false alarms are independent does not hold in practice, e.g. a user sharing several beach pictures will trigger correlated misfires
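The correlation point can be illustrated with a toy simulation. All rates below are assumptions chosen for illustration: most innocent users rarely trigger the detector, but a small share upload correlated content (e.g. beach photos) that it consistently misfires on, so a "two detections" threshold filters out far fewer innocent users than independence would predict:

```python
import random

random.seed(0)

N_USERS, PHOTOS = 50_000, 20
BASE_FPR = 1e-3     # assumed per-photo false-positive rate for typical users
CORR_SHARE = 0.01   # assumed share of innocent users whose albums the
CORR_FPR = 0.5      # detector consistently misfires on (e.g. beach photos)

def flags(p):
    """Number of photos (out of PHOTOS) flagged at per-photo rate p."""
    return sum(random.random() < p for _ in range(PHOTOS))

reported = 0
for u in range(N_USERS):
    p = CORR_FPR if u < N_USERS * CORR_SHARE else BASE_FPR
    if flags(p) >= 2:  # the proposed "require two detections" rule
        reported += 1

# Under the independence assumption, roughly C(20, 2) * p^2 of users
# would be reported; correlation makes the real number far larger.
indep_prediction = 190 * BASE_FPR**2 * N_USERS
print(f"simulated innocent users reported: {reported}")
print(f"independence-assumption prediction: {indep_prediction:.1f}")
```

In this sketch, nearly every user with correlated content ends up reported, so the simulated count exceeds the independence-based prediction by orders of magnitude.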

Such a modification also does not address the fact that detectors are easy to evade: those who want to share illicit material will do so, while innocent users will be caught again and again by false positives.

The second modification does not help with the issue of proportionality either. End-to-end encrypted services (such as the most-used messaging systems Signal/WhatsApp/Threema/Telegram) are classified as high risk. Thus their billions of users will still be subject to mandatory detection

Also, as we have pointed out before, if these platforms become a problem for those sharing CSAM, they will move to other platforms -- proprietary ones, or ones without detection -- while legitimate users will stay and be subjected to continuous screening for no reason.

More importantly, adding detection capabilities to end-to-end encrypted services BREAKS the protection given by encryption. Once messages can be read by anyone other than the sender or receiver, encryption is useless...

... just as an envelope provides no privacy if someone can read the letter before it is put inside, and walls provide no privacy if a camera can be placed within them -- even if this camera/letter-reading only results in reports when suspicious activity is detected


We are also worried that the regulation pushes for further uses of technology, such as age verification, when it is not clear that such technology (like CSAM detectors) is ready, as the EU Parliament itself recognizes:
https://www.europarl.europa.eu/RegData/etudes/ATAG/2023/739350/EPRS_ATA(2023)739350_EN.pdf

(1/2)

@richlv @Mer__edith

Finally, we stress our disappointment that despite the many critiques of the process, it continues to evolve without consulting academic experts, and without any transparency about who is consulted or about the methods that will be used to implement the regulation

Protecting children from online abuse while preserving their right to secure communications is critical. Eradicating CSAM relies on eradicating abuse, not only abuse material. Technocentric approaches focused on shared material don't tackle the core of the problem

We recommend substantial increases in investment and effort to support existing, proven approaches to eradicating abuse -- as indicated in the letter -- and, with it, abusive material. Our communications don't need to be threatened to achieve this.

(2/2)