The EU’s “chat control” legislation is the most alarming proposal I’ve ever read. Taken in context, it is essentially a design for the most powerful text and image-based mass surveillance system the free world has ever seen.
This legislation, initially targeted at child abuse material, creates the infrastructure for mandatory automated scanning tools that will search for *known* media, *unknown* media matching certain descriptions, and textual conversations.
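For context (my own illustration, not something specified in the legislation): detecting *known* media generally means matching a fingerprint of each file against a database of previously identified material. Real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding; the sketch below uses a plain cryptographic hash as the simplest stand-in, with a placeholder database.

```python
import hashlib

# Hypothetical sketch of "known media" detection via exact hash matching.
# Deployed scanners use perceptual hashes robust to re-encoding; an exact
# SHA-256 match is the simplest analogue of the same architecture.

KNOWN_HASHES = {
    # placeholder "database" entry: this is just sha256(b"foo")
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Fingerprint a media file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_media(data: bytes) -> bool:
    """Check the fingerprint against the known-media database."""
    return fingerprint(data) in KNOWN_HASHES

print(is_known_media(b"foo"))  # True: matches the placeholder entry
print(is_known_media(b"bar"))  # False: not in the database
```

Note that an exact hash is trivially evaded by changing one byte, which is precisely why production systems move to fuzzier perceptual matching and why the false-positive question becomes so important.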
The legislation is vague about how this will be accomplished, but the “impact assessment” it cites is not. The assessment makes clear that mandatory scanning of images & text, especially in encrypted data, is the only solution the Commission will consider.
The legislation calls for detecting “grooming behavior”. If you’re wondering what that means: roughly, it means developing new AI tools that can understand the content of textual conversations and automatically report you to the police based on them.
You might ask how the EU, famous for its focus on privacy, justifies the development of automated text-analysis tools that scan your private chats. The Impact Assessment offers an analysis. To say that this analysis is deficient would be far too kind.

@matthew_d_green The rough rule “Americans don’t trust the government, Europeans don’t trust companies” explains a lot of what we’re seeing. I suspect this would be illegal if offered as a parental-control feature for kids, but apparently it’s not a problem if it’s just the government…

This is just beyond.