The EU's Chat Control Won't Die — Here's What Developers Need to Know

5 min read
├── "Chat Control is technically unsound — client-side scanning inevitably creates general-purpose surveillance infrastructure"
│  ├── fightchatcontrol.eu → read

The campaign site argues the regulation would force end-to-end encrypted services like Signal and WhatsApp to implement client-side scanning or exit the EU market. They frame this as a fundamental threat to digital privacy and encryption, urging citizens to contact representatives to oppose it.

│  └── @x775 (Hacker News) → view

As the creator of Fight Chat Control, x775 notes that the Parliament surprisingly voted on March 11 to replace blanket mass surveillance with targeted monitoring of suspects following judicial involvement — but the Council continues to push broader scanning. They describe the recent legislative maneuvers as 'dumbfounding' and emphasize the campaign is 'once again, needed.'

├── "The EU should enshrine a positive right to private communications rather than just fighting bad proposals one at a time"
│  └── @derefr (Hacker News) → view

derefr argues that repeatedly shooting down surveillance proposals is insufficient — the EU needs proactive legislation enshrining a right to private communications that would make bills like Chat Control impossible to even table. They lament the absence of a 'privacy lobby' with the institutional power to push such affirmative legislation.

├── "The proposal is a deliberate power grab by an unaccountable supranational government, not a good-faith child safety measure"
│  ├── @afh1 (Hacker News) → view

afh1 frames the EU itself as an 'ultra-national government whose sole objective is ever increased control over your life and euros,' dismissing the child-safety rationale entirely. They challenge EU citizens who defend the institution to reckon with its surveillance ambitions.

│  ├── @AnonyMD (Hacker News) → view

AnonyMD argues that when you consider who controls the monitoring infrastructure, it becomes obvious the regulation serves those in power rather than children. They view the child-safety framing as a pretext for political surveillance capabilities.

│  └── @kleiba (Hacker News) → view

kleiba uses Hungary's support for the proposal as a heuristic litmus test — arguing that if Hungary backs a regulation, it's almost certainly bad for civil liberties. This frames the proposal as aligned with the interests of the EU's most authoritarian-leaning member state.

├── "The EU Council is using legislative attrition — reintroducing the proposal repeatedly until opposition fatigue sets in"
│  ├── @leugim (Hacker News) → view

leugim pointedly asks whether the Council will simply keep bringing the proposal back until it passes, highlighting a perception that the legislative process is being abused through sheer persistence rather than democratic persuasion.

│  └── @elzbardico (Hacker News) → view

elzbardico argues the proponents deliberately waited for the Russia-Ukraine war to dominate the news cycle so they could advance the proposal 'under the radar.' They characterize this as a calculated strategy — 'they never quit' — exploiting crisis-driven attention scarcity to push through surveillance measures.

└── "The alarm is overblown — the current vote is merely extending an existing voluntary scanning regime, not mandating new surveillance"
  └── @Stagnant (Hacker News) → view

Stagnant researched the actual legislative text and found the immediate vote concerns extending the temporary Regulation (EU) 2021/1232, which has been in effect since 2021 and covers voluntary scanning of private communications. They criticize the campaign site for not clearly explaining this distinction, suggesting the framing conflates the existing voluntary regime with the more extreme mandatory proposal.

## What happened

The EU's proposed Child Sexual Abuse Regulation (CSAR) — widely known as "Chat Control" — is back on the legislative agenda. Despite repeated pushback from the European Parliament, privacy advocates, and the cryptography research community, the Council of the EU continues to advance versions of the regulation that would require platforms to scan private messages, photos, and files for child sexual abuse material (CSAM) before or during transmission.

The latest push, tracked extensively by the campaign site fightchatcontrol.eu, shows the proposal gaining renewed momentum in 2026. The core mechanism remains unchanged from earlier drafts: platforms offering messaging, email, or cloud storage would receive "detection orders" from a new EU Centre, compelling them to scan user content against databases of known CSAM and, more controversially, to use AI classifiers to detect previously unknown material and grooming conversations. The regulation as drafted makes no exception for end-to-end encrypted services — meaning Signal, WhatsApp, and any E2EE platform serving EU users would need to implement client-side scanning or exit the market.

The Hacker News discussion (1,372 points, 367 comments) reflects the developer community's sustained alarm. This isn't a fringe policy debate — it's a concrete compliance requirement that would reshape how messaging infrastructure is built.

## Why it matters

### The technical problem is unsolvable as specified

The fundamental issue hasn't changed since Apple abandoned its own client-side scanning plans in late 2022 after intense backlash: there is no known way to scan content on-device for targeted material without creating infrastructure that can be repurposed for broader surveillance. This isn't a slippery-slope argument — it's a statement about how the technology works. A client-side scanner is a programmable content filter. The hash databases and AI models it checks against are controlled server-side. Today's CSAM detector is tomorrow's political speech detector, and the client can't distinguish between the two.
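The point is easy to see in code. Here is a minimal, purely illustrative sketch (all names hypothetical) of what a client-side scanner reduces to: the client fingerprints outgoing content and matches it against a target list pushed from a server it doesn't control. Nothing on the client knows *what* the targets represent.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    # Real deployments use perceptual hashes that match near-duplicate
    # images; a cryptographic hash stands in here for simplicity.
    return hashlib.sha256(content).hexdigest()

def scan_before_send(content: bytes, server_supplied_targets: set[str]) -> bool:
    """Return True if the content should be reported.

    The target set is opaque to the client: CSAM hashes today, any other
    category of content tomorrow -- the matching logic is identical.
    """
    return fingerprint(content) in server_supplied_targets

# The authority controls this set; the client just receives updates.
targets = {fingerprint(b"known-bad-image-bytes")}

print(scan_before_send(b"known-bad-image-bytes", targets))  # True
print(scan_before_send(b"holiday photo", targets))          # False
```

Swapping the target set changes what the filter hunts for without touching a single line of client code — which is exactly why the researchers' letter treats the infrastructure itself, not its initial configuration, as the threat.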

Over 300 cryptography and security researchers signed an open letter to the EU making exactly this point. The letter didn't mince words: the proposed "upload moderation" framing is technically identical to breaking end-to-end encryption. Whether you scan before encryption (client-side) or break the encryption to scan in transit, the privacy guarantee — that only sender and recipient can read the message — is destroyed.

Signal president Meredith Whittaker has been unequivocal: Signal would rather leave the EU than implement client-side scanning. Threema and Tutanota have made similar statements. If the regulation passes in its current form, EU residents could lose access to the most trusted encrypted messaging platforms, while the scanning mandate would be trivially circumvented by anyone using a VPN or a non-EU service.

### The "upload moderation" rebrand

The Council's strategic evolution deserves attention. Earlier drafts explicitly called for message scanning, which was politically toxic. The current framing — "upload moderation" — positions the scanning as happening at the moment content is uploaded or sent, technically before the encryption envelope is applied. This is a distinction without a difference. The content is still scanned on the user's device, and the results are still reported to authorities without the user's knowledge or consent.
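The "distinction without a difference" is visible in the send path itself. This is a toy sketch (hypothetical names, XOR standing in for a real E2EE envelope): under "upload moderation" the scan runs on the plaintext before encryption, so the privacy guarantee never applies to the scanner.

```python
reports: list[bytes] = []

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy XOR cipher standing in for a real E2EE envelope -- illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def scan(plaintext: bytes) -> None:
    # Stand-in matcher; in the proposal this would be a hash or AI
    # classifier check against an authority-controlled database.
    if b"flagged-term" in plaintext:
        reports.append(plaintext)  # forwarded without the user's knowledge

def send(plaintext: bytes, key: bytes) -> bytes:
    scan(plaintext)                 # runs on the full plaintext, pre-encryption
    return encrypt(plaintext, key)  # the "E2EE" envelope is applied afterwards

ciphertext = send(b"message with flagged-term inside", b"k3y")
# The wire carries only ciphertext -- but the scanner already saw everything.
```

Reordering the two calls is the entire difference between "upload moderation" and scanning in transit; either way, a third party reads the message before the recipient does.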

The European Parliament's position, adopted in late 2023, explicitly rejected scanning of encrypted communications and favored targeted approaches — scanning only when there's a court order or specific suspicion, not blanket detection orders. But the trilogue negotiations between Parliament, Council, and Commission have not resolved this gap. The Council, driven by a rotating presidency and heavy lobbying from law enforcement agencies, keeps pushing for broader scanning powers.

### The false positive math

Even a 99.9% accurate scanner, applied to the billions of messages sent daily in the EU, would generate millions of false positives — each one a private photo, conversation, or file incorrectly flagged and potentially reviewed by human moderators or law enforcement. The Swiss Federal Police, which already processes reports from US-based platforms under existing voluntary scanning programs, has reported that over 80% of flagged content turns out to be legal. Scale that to mandatory scanning of all EU messaging traffic and you have a system that is simultaneously a mass surveillance apparatus and an ineffective law enforcement tool.
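The base-rate arithmetic behind that claim can be run directly. The figures below are illustrative assumptions, not official statistics: ten billion messages a day, a 0.1% false positive rate, and one in 100,000 messages actually containing targeted material.

```python
# Back-of-the-envelope false positive math (all inputs are assumptions).
daily_messages = 10_000_000_000
fp_rate = 0.001        # 99.9% "accurate" on innocent content
prevalence = 1e-5      # fraction of messages that truly contain target material
tp_rate = 0.999        # detection rate on actual targeted material

innocent = daily_messages * (1 - prevalence)
false_positives = innocent * fp_rate                     # ~10 million per day
true_positives = daily_messages * prevalence * tp_rate   # ~100 thousand per day

# Precision: of everything flagged, what fraction is actually targeted?
precision = true_positives / (true_positives + false_positives)

print(f"{false_positives:,.0f} innocent messages flagged per day")
print(f"precision: {precision:.1%}")  # roughly 1% -- ~99 false flags per real hit
```

Low prevalence dominates: even a scanner that is wrong only one time in a thousand flags about a hundred innocent messages for every genuine one, and every false flag is a private conversation handed to a reviewer.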

## What this means for your stack

If you build or maintain any application that transmits user-generated content between users in the EU — messaging, email, file sharing, cloud storage, social features, even collaborative documents — you need to be tracking this regulation.

Compliance architecture decisions are being forced now. The regulation, if passed, would come with an implementation timeline. But the architectural decisions — whether to implement client-side content scanning, how to handle detection orders, whether to maintain E2EE — need lead time. If your threat model includes EU regulatory compliance, you should be scenario-planning for a world where content scanning is mandatory.

The detection order mechanism is the key detail. The regulation wouldn't require all platforms to scan all the time. Instead, a new EU Centre would issue detection orders to specific platforms based on risk assessments. But the risk assessment criteria are broad enough that any platform with user-to-user messaging could receive an order. If you operate a platform with private messaging in the EU, the practical question isn't whether you'll get a detection order — it's when, and whether your architecture can comply without a ground-up rebuild.

Open-source and self-hosted platforms face an existential question. How do you compel a self-hosted Matrix server or a federated XMPP deployment to implement client-side scanning? The regulation's enforcement mechanism assumes centralized platforms with a legal entity to serve orders on. Federated and self-hosted communication tools may fall into a legal gray zone — or become the only viable option for private communication in the EU.

US and non-EU developers aren't immune. If your SaaS serves EU users, you're in scope. The GDPR playbook applies: if you process data of EU residents, EU law reaches you. The irony of a regulation that simultaneously requires you to scan private content (CSAR) and prohibits you from processing personal data without consent (GDPR) is not lost on compliance teams.

## Looking ahead

The EU Chat Control saga has been running since 2022, and its persistence is the point. Each time the proposal is defeated or watered down, it returns in a slightly different form. The strategy is attrition: keep pushing until defenders are exhausted or distracted. For developers, the actionable move is to treat this as a slow-moving but serious architectural constraint. Monitor the trilogue outcomes, engage with organizations like EDRi and the EFF who are tracking the legislative text, and build your systems so that the encryption guarantees you offer your users don't have a regulatory off switch hidden in the architecture. The fight over Chat Control isn't about one regulation — it's about whether private communication remains technically possible in the EU.

Hacker News 1372 pts 367 comments

The EU still wants to scan your private messages and photos

→ read on Hacker News
x775 · Hacker News

I am the creator of Fight Chat Control. Thank you for sharing. It is unfortunately, once again, needed. The recent events have been rather dumbfounding. On March 11, the Parliament surprisingly voted to replace blanket mass surveillance with targeted monitoring of suspects following judicial involvement…

derefr · Hacker News

So... if we all care so much about shooting down the bad idea, why is nobody proposing opposite legislation: a bill enshrining a right to private communications, such that bills like this one would become impossible to even table? Is it just that there's no "privacy lobby" interested in…

Stagnant · Hacker News

Okay so I had to look in to it because the site is not really doing a good job explaining it at all. Turns out[0] that they are voting for the extension of the temporary regulation thats been in effect since 2021 (Regulation (EU) 2021/1232). So this is about the "voluntary scanning of private…

kleiba · Hacker News

If you're ever unsure about whether a proposed EU regulation may be good or bad, just look at whether Hungary supports it: if so, it's bad; if not, it might be good. Egészségére!

afh1 · Hacker News

Where are all those "as an EU citizen" commenters? You are but a subject of an ultra-national government whose sole objective is ever increased control over your life and euros.
