RE: https://eupolicy.social/@ella/115463747493432363

Positive update on #ChatControl. The EU's member states are set to officially agree to delete forced mass scanning orders (which could undermine encryption) this week!

This puts the protection of civil liberties in a much stronger position for the final negotiations with the EU Parliament.

However, it doesn't mean that all the proposal's issues have been solved - we're especially worried about the threat of forcing messaging apps, email services and app stores to perform risky age verification.

Half-good new Danish Chat Control proposal

Denmark, currently presiding over the EU Council, proposes a major change to the much-criticised EU chat control proposal to search all private chats for suspicious content, even at the cost of destroying secure end-to-end encryption: Instead of mandating the general monitoring of private chats ("de…

Patrick Breyer
@nemeciii The Council fixed some of the issues in a new updated text! It's not perfect, but it's a big step in the right direction...

@ella There are so many ways in which even voluntary scanning of the contents of a device or backups is problematic.

You are a therapist counselling patients who have experienced abuse as a child.

You are a child chatting to a support line to try to get out of an abusive family situation.

Both are safe with an E2E encrypted app.

Both are no longer safe with client side scanning.

Patient-counsellor confidentiality is thrown out of the window even if chat control is only "voluntary".

@the_wub This could certainly be an issue if you're using a non-encrypted service. Voluntary scanning still comes with potentially severe limitations to people's rights, and AI-based scanning can be really unreliable and risky. So I'm certainly not saying that the Council text is perfect or fully rights compliant.
But I really cannot see any encrypted service provider voluntarily implementing CSS, given the massive security risks it creates, so I don't think that's a risk right now. Of course...
@the_wub ...this is only a negotiating position, and who knows what will be in the final law. A lot of things worry me.
But I guess I'm counting on the fact that if you care enough to implement E2EE, it would be very strange to start deploying CSS given that it's voluntary - and the Council and Parliament texts both contain protections for E2EE.

@ella I understood that client side scanning allows access to *anything* on the device.

When you are, say, using WhatsApp or Signal, then CSS can record and analyse all of the chats on that device.

See this video about the "SignalGate"/TeleMessage scandal from about 36:55.

There is also a blog post by Micah Lee on the same subject.

The whole video is worth watching if you haven't seen it already.

https://youtu.be/KFYyfrTIPQY

"We are currently clean on OPSEC": The Signalgate Saga (DEFCON 33)

@ella From the WhatsApp website

"That's because the encryption and decryption of messages sent and received on WhatsApp occurs entirely on your device. Before a message ever leaves your device, it's secured with a cryptographic lock, and only the recipient has the keys."

Note: "Before a message ever leaves your device", not "at all times, even when a message is still on your device waiting to be sent".

A message arrives on your device and is then unlocked so you (or CSS) can read it.

@ella

As far as I understand it, CSS is like having someone reading your chats over your shoulder all the time.

I am happy to be corrected should I have misunderstood something about how encrypted apps and CSS work.

@the_wub I'm a bit confused by your replies - we are in agreement that CSS is really dangerous! That's why I've spent the last 3 years campaigning against providers being forced to implement it :)
But, I believe that the new Council text sufficiently rules out the risk of CSS. It even contains provisions about not weakening encryption and upholding cybersecurity.
There will for sure be other threats to encryption in EU laws, but for this law, I don't think CSS will be an issue any more.

@ella My replies related to this comment:

"This could certainly be an issue if you're using a non-encrypted service"

The point about CSS is that even if you are using WhatsApp or Signal, which are end-to-end encrypted, it will be an issue.

Client-side scanning happens on your device, built into the apps themselves.

Even if your outgoing message is encrypted on its way out to the recipient, CSS will still be able to read through it and check it before it is sent.

The same holds true for incoming messages.
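The point above can be sketched in a few lines of code. This is a toy illustration only (the XOR "cipher" and all function names here are made up for demonstration, not real cryptography or any real app's implementation): because a hypothetical scanning hook runs on the device before encryption, it sees the full plaintext no matter how strong the encryption applied afterwards is.

```python
# Toy sketch: why client-side scanning defeats E2EE's confidentiality.
# The XOR cipher and the function names are illustrative stand-ins,
# NOT real cryptography and NOT how any actual messenger is built.
import secrets

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR, standing in for real on-device E2EE.
    return bytes(d ^ k for d, k in zip(data, key))

scanned_messages = []

def client_side_scan(plaintext: bytes) -> None:
    # Hypothetical scanner hook: it inspects the message in the clear,
    # because it runs before the message is ever encrypted.
    scanned_messages.append(plaintext)

def send_message(plaintext: bytes, key: bytes) -> bytes:
    client_side_scan(plaintext)         # CSS runs on-device, pre-encryption
    return xor_encrypt(plaintext, key)  # only now is the message "locked"

key = secrets.token_bytes(32)
ciphertext = send_message(b"confidential chat", key)

# The wire only ever carries ciphertext...
assert ciphertext != b"confidential chat"
# ...but the scanner already saw the whole message.
assert scanned_messages == [b"confidential chat"]
```

The same applies on the receiving side: an incoming message has to be decrypted on the device for you to read it, and a scanner hooked in at that point sees the plaintext too.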

@ella Happy to be corrected if I have misunderstood how it all would work.
@the_wub WhatsApp and Signal won't be using CSS. The law won't require them to do scanning, so they won't need to use CSS.

@ella

Despite being American, I'm afraid it'll still pass regardless... unless something's done to really show them who serves whom.

@ella I fervently hope this is true, however, I'll believe it when I see it. I don't trust anything coming from the people who have already allowed this abomination to go this far.
@ella wait so age verification is only for app stores and messages apps but this won't affect videogames right?
@Kurosetii it would be mandatory for any service considered "high risk" for grooming - and the current Council text considers that any services that allow private communications are high risk (!). So if the video game has any sort of personal message function, they probably would be required to implement age verification.
The Parliament's text is different, though, and we are hoping that this big issue can be resolved in final negotiations.
@ella oh well, good to hear - the games I play don't have any private messages, so that's fine. And I still hope there's facial scanning as an option, because I won't give my ID just to chat with my friends.
@64kb yes - this is a significant improvement on Patrick's point about forced scanning hidden as a "risk mitigation" measure. The Council have added a clause to make sure that there cannot be a loophole that allows this.
It's only a negotiating mandate, and several other issues remain, but it's still an important step forward!
@ella
The document says the age verification "shall be privacy preserving, proportionate, transparent, effective, accurate, non-discriminatory, accessible and take as a primary consideration the best interest of the child".