The European Parliament has taken a decisive stance against Chat Control 1.0, voting that the temporary derogation allowing tech companies to voluntarily scan private messages for child sexual abuse material (CSAM) must come to an end. The regulation, formally known as the temporary eCSAM derogation (EU 2021/1232), has been in effect since 2021 and was originally set to expire but kept getting extended. Parliament has now said: no more.
The vote represents the culmination of years of lobbying by privacy advocates, encrypted communications providers, and civil liberties organizations. Tuta (formerly Tutanota), the German encrypted email provider, was among the most vocal campaigners, framing the issue as a fundamental question about whether EU citizens have a right to private digital communication. The Hacker News thread on the topic drew a score of 351, reflecting the developer community's deep engagement with surveillance and encryption policy.
Chat Control 1.0 allowed — but did not require — platforms like Meta, Google, and Microsoft to deploy automated scanning tools on unencrypted messages. In practice, this meant services like Gmail and Facebook Messenger could hash-match images and use classifiers to flag suspected CSAM, then report it to authorities. The voluntary nature was always the fig leaf: platforms scanned billions of messages without a legal mandate, operating in a gray zone that privacy advocates argued violated the ePrivacy Directive.
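The hash-matching side of that pipeline can be sketched in a few lines. This is a minimal illustration only: real deployments use perceptual hashes such as PhotoDNA that survive resizing and re-encoding, whereas plain SHA-256 below only matches byte-identical files, and the blocklist entry is a made-up example.

```python
import hashlib

# Hypothetical blocklist of known-image digests. Real systems (e.g.
# PhotoDNA against the NCMEC database) use perceptual hashes; SHA-256
# here is a stand-in that only catches exact byte-for-byte copies.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def flag_attachment(data: bytes) -> bool:
    """Return True if the attachment's digest is on the blocklist."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# Under Chat Control 1.0 a provider could run this server-side on
# unencrypted traffic and report matches to authorities.
print(flag_attachment(b"foo"))  # → True (its digest is on the list above)
print(flag_attachment(b"bar"))  # → False
```

The classifier path works the same way structurally: a model replaces the set lookup, which is exactly where the false-positive problem discussed below comes from.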
### The encryption battle isn't over
While Parliament's vote against extending Chat Control 1.0 is a clear signal, it does not kill the broader EU push to scan private messages. Chat Control 2.0 — the proposed Child Sexual Abuse Regulation (CSAR) — would make scanning mandatory, including for end-to-end encrypted platforms, through a mechanism euphemistically dubbed "upload moderation." That proposal has been stuck in the Council of the EU after multiple member states (led by Germany, Austria, and Poland) blocked it, but it has not been formally withdrawn.
The technical implications of Chat Control 2.0 are what keep cryptographers and security engineers up at night. The proposal would effectively require client-side scanning — analyzing message content on the user's device before encryption. Apple briefly explored this approach in 2021 with its NeuralHash CSAM detection system, then abandoned it after security researchers demonstrated that the hash-matching system could be manipulated and that it created a surveillance infrastructure exploitable by authoritarian governments.
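The control flow that worries cryptographers can be made concrete. The sketch below is illustrative, not any real protocol: every name is hypothetical, and the placeholder lambdas stand in for real encryption and matching. The objection it demonstrates is structural — `scan()` sees the plaintext, and `report()` exfiltrates it, before end-to-end encryption is ever applied.

```python
# Sketch of the client-side scanning flow Chat Control 2.0 implies.
# The ciphertext itself is never "broken" -- but the plaintext is no
# longer private from the on-device scanner.

def client_side_send(plaintext, encrypt, scan, report):
    if scan(plaintext):        # on-device matcher (e.g. a NeuralHash-style model)
        report(plaintext)      # side channel to the provider -- the "backdoor"
    return encrypt(plaintext)  # E2E encryption happens afterwards either way

# Toy demo with stand-in primitives:
reports = []
ciphertext = client_side_send(
    b"hello",
    encrypt=lambda p: p[::-1],     # placeholder for real E2EE
    scan=lambda p: b"hello" in p,  # placeholder matcher
    report=reports.append,
)
print(ciphertext, reports)  # b'olleh' [b'hello']
```

Anything that can replace the matcher — a broader hash list, a different model — repurposes the whole mechanism, which is the researchers' core objection.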
### What the security community has said
The opposition to client-side scanning is not a fringe position. In 2023, over 300 academics and researchers signed an open letter warning that the technology required by Chat Control 2.0 is "deeply flawed." The core argument: you cannot build a scanning system that only detects CSAM. Any client-side scanning mechanism is, by definition, a general-purpose surveillance backdoor that can be repurposed — by governments, by hackers, by insiders. Signal president Meredith Whittaker has repeatedly stated that Signal would rather exit the EU market than comply with mandatory scanning.
The European Data Protection Supervisor (EDPS) and the EU's own legal service have both raised serious concerns about the proportionality and legality of mandatory scanning under EU fundamental rights law. Even Europol's own technical advisors have acknowledged the detection accuracy problem: current scanning technology produces false positive rates that, at the scale of billions of messages per day, would generate millions of false reports — overwhelming law enforcement and flagging innocent people.
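The scale problem is simple arithmetic. The rates below are illustrative assumptions, not measured figures, but they show why even a seemingly tiny false-positive rate produces an enormous absolute number of false reports.

```python
# Back-of-the-envelope: assumed volume and false-positive rate.
messages_per_day = 10_000_000_000   # ~10 billion scanned messages/day (assumed)
false_positive_rate = 0.001         # 0.1% (assumed, for illustration)

false_reports = messages_per_day * false_positive_rate
print(f"{false_reports:,.0f} false reports per day")  # 10,000,000 false reports per day
```

Ten million daily reports to triage — each one a flagged private message from an innocent person — is the "overwhelming law enforcement" scenario in concrete terms.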
### The developer angle
For anyone building communications infrastructure, the regulatory signal from Parliament is mixed. On one hand, the end of Chat Control 1.0 removes the legal cover that platforms used for voluntary scanning, which could paradoxically reduce the amount of scanning happening today. On the other hand, the Commission has shown no signs of abandoning Chat Control 2.0, meaning any messaging app, email service, or collaboration tool serving EU users needs to architect for the possibility that client-side scanning could become law.
This is not an abstract concern. If you're building on Matrix, XMPP, or any federated protocol, or if you're shipping a product with end-to-end encryption, the architectural decision of where content moderation happens — server-side, client-side, or not at all — has compliance implications that could change with a single Council vote.
If you maintain or build encrypted messaging, email, or collaboration tools:
Short term: The end of Chat Control 1.0 is a green light for end-to-end encryption without voluntary scanning. If your platform was scanning under the 1.0 derogation, you'll need to evaluate whether you have another legal basis to continue (spoiler: it's complicated under ePrivacy).
Medium term: Watch the Council. Chat Control 2.0 is not dead — it's in legislative limbo. The Hungarian presidency pushed hard for it in late 2024, and future presidencies may revive it. If you're making architectural decisions about encryption today, design for the possibility that client-side content moderation could become a legal requirement in the EU within 2-3 years. This means thinking about modular scanning hooks that can be enabled per-jurisdiction without breaking your E2E guarantees for users in other regions.
Long term: The EU's approach will likely influence regulation globally. The UK's Online Safety Act already contains powers to require scanning of encrypted messages (though Ofcom hasn't activated them). Australia's proposed reforms go in a similar direction. If you're building for a global user base, jurisdiction-aware encryption architecture isn't optional — it's table stakes.
One practical note: if you're using detection APIs from services like PhotoDNA (Microsoft) or the NCMEC hash database, understand that the legal basis for using these in the EU just got murkier with the end of Chat Control 1.0. Consult your legal team before your next sprint planning.
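The "modular scanning hooks" idea from the medium-term note can be sketched as a pluggable, jurisdiction-gated step in the send path, so that enabling moderation where law requires it never touches the encryption path for users elsewhere. Every interface below is hypothetical — a design sketch, not a recommendation of any real library.

```python
from typing import Callable, Dict, List, Tuple

ScanHook = Callable[[bytes], bool]  # returns True if content is flagged

class SendPipeline:
    """Send path with optional per-jurisdiction moderation hooks."""

    def __init__(self, encrypt: Callable[[bytes], bytes]) -> None:
        self.encrypt = encrypt
        self.hooks: Dict[str, ScanHook] = {}          # jurisdiction -> hook
        self.flagged: List[Tuple[str, bytes]] = []    # stand-in for reporting

    def enable_hook(self, jurisdiction: str, hook: ScanHook) -> None:
        self.hooks[jurisdiction] = hook

    def send(self, plaintext: bytes, jurisdiction: str) -> bytes:
        hook = self.hooks.get(jurisdiction)           # absent hook => no scanning
        if hook is not None and hook(plaintext):
            self.flagged.append((jurisdiction, plaintext))
        return self.encrypt(plaintext)                # identical E2E path everywhere

pipeline = SendPipeline(encrypt=lambda p: p[::-1])    # placeholder for real E2EE
pipeline.enable_hook("EU", lambda p: b"match" in p)   # hypothetical EU-only hook
pipeline.send(b"match me", "EU")   # flagged, then encrypted
pipeline.send(b"match me", "US")   # no hook registered: encrypted only
print(pipeline.flagged)  # [('EU', b'match me')]
```

Whether such a hook preserves "E2E guarantees" in any meaningful sense is exactly the contested question — the sketch shows the modularity, not a resolution of the debate.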
Parliament's vote is a battle won, not a war ended. The European Commission has made clear that it views CSAM detection as a political priority, and the technical debate about whether scanning and encryption can coexist will intensify. For developers, the takeaway is pragmatic: end-to-end encryption is currently safe in the EU, but the regulatory ground is shifting. Build modular, stay informed, and don't assume today's legal framework will survive the next legislative cycle. The best defense for the open internet has always been the same — ship tools so good, so widely adopted, that breaking them becomes politically untenable.
It's time to start trying to push Chat Control 2.0. With enough money and infinite retries, eventually every bad regulation with a powerful group behind it will end up being approved.
They will try again. And again. It's for the children, don't you know? The only way to really stop this would be to pass legislation that permanently strengthens privacy rights.
I'm confused by:

> This means on April 6, 2026, Gmail, LinkedIn, Microsoft and other Big Techs must stop scanning your private messages in the EU

It had already passed and started?
Here is the EPP's plea to get this passed earlier. They even used a teddy bear image. https://www.eppgroup.eu/newsroom/epp-urges-support-for-last-...

"Protecting children is not optional," said Lena Düpont MEP, EPP Group spokeswoman on Legal and Home Affairs. "We
The linked tweet is a bit misleading. There were two votes: one for amending the existing proposal re: "unknown messages", and the other for the whole proposal itself. The screenshot in the tweet is about the amendment, which was less important than the fact that the whole proposal was