The numbers don't lie: Germany's own data proves the EU's proposed "Chat Control" surveillance system would flood police with false reports while decimating digital privacy.
Germany just handed the European Union an inconvenient truth that undermines the entire foundation of its controversial "Chat Control" proposal. In 2024 alone, nearly half of all tips Germany received through the existing voluntary scanning system were false alarms. According to the Federal Criminal Police Office (BKA), 99,375 of the 205,728 reports forwarded by the US-based National Center for Missing and Exploited Children (NCMEC) were not criminally relevant: an error rate of 48.3%.
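The cited error rate follows directly from the BKA's own figures; a quick sanity check using only the numbers reported above:

```python
# BKA figures for 2024, as cited above.
total_reports = 205_728   # reports forwarded by NCMEC to Germany
not_relevant = 99_375     # reports found not criminally relevant

error_rate = not_relevant / total_reports
print(f"Error rate: {error_rate:.1%}")  # → Error rate: 48.3%
```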
These aren't just statistics: they represent 99,375 instances of innocent personal content being falsely flagged and forwarded to authorities in a single year. Family beach photos. Medical images shared with doctors. Artistic photography. All mistakenly classified as potential child sexual abuse material (CSAM) and sent to law enforcement for investigation.
But here's the kicker: this catastrophic failure rate comes from the current voluntary system that doesn't even scan encrypted messages. And now the EU wants to make scanning mandatory for everyone, everywhere, all the time, including your private WhatsApp and Signal conversations.
The Current System Is Already Broken
Under the current "Chat Control 1.0" framework, scanning is voluntary and does not apply to end-to-end encrypted services. Only some platforms, such as Gmail, Facebook Messenger, and Skype, participate. Many of these reports are generated by private tech companies such as Meta, Microsoft, and Google, which voluntarily scan users' communications for possible child sexual abuse material (CSAM) and pass reports to NCMEC.
The false positive problem isn't unique to Germany. In Ireland, only 852 of the 4,192 reports (20.3%) received by Irish police turned out to contain actual exploitation material, while 471 (11.2%) were explicitly marked as false positives. High false-positive rates, reaching up to 80% in countries like Switzerland, risk wrongful accusations.
LinkedIn flagged 75 accounts to EU authorities in the second half of 2021 over files that matched known CSAM. But upon manual review, only 31 of those cases involved confirmed CSAM. That's nearly 60% false positives, and LinkedIn uses PhotoDNA, the same technology specifically recommended by US lawmakers.
The EU's Plan Makes Everything Worse
Despite this damning evidence, Denmark, which holds the rotating presidency of the Council of the EU from July 1 until the end of 2025, has made Chat Control a top priority. The next Council debate and vote on Chat Control is reportedly scheduled for October 14, where Danish leaders will attempt to secure adoption.
The proposed "Chat Control 2.0" would be far more invasive than the current broken system:
- Mass Scanning: Providers would be required to scan chats, messages, emails, and cloud storage, breaking end-to-end encryption on platforms like WhatsApp and Signal.
- Client-Side Surveillance: This scanning would happen on users' devices through "client-side scanning" technology, effectively creating backdoors in encrypted communications.
- AI Guesswork: The Commission's original proposal requires platforms, once served with a detection order, to scan people's messages not only for known CSAM but also for previously unknown CSAM, which sharply raises the technical challenge of detecting illegal content accurately with few false positives.
- Text Analysis: A further component of the Commission's proposal requires platforms to identify grooming activity in real time. In addition to scanning image uploads for CSAM, apps would need to parse the contents of users' communications to try to detect when an adult might be attempting to lure a minor into sexual activity.
The Math Doesn't Add Up
Security experts have warned that expanding this broken system would create millions of false positives per day. "Given the large amount of messages sent in these platforms (in the order of billions), one can expect a very large amount of false alarms (in the order of millions)," warns a letter signed by 270 academics and security researchers.
To get the false positives down to the hundreds, statistically one would have to identify at least 5 repetitions using different, statistically independent images or detectors. And this is only for WhatsApp; if we consider other messaging platforms, including email, the number of necessary repetitions would grow significantly.
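The base-rate problem the researchers describe can be sketched with back-of-envelope arithmetic. The daily message volume and false-positive rates below are illustrative assumptions for the sake of the calculation, not figures taken from the letter itself:

```python
# Back-of-envelope arithmetic for mass scanning at platform scale.
# DAILY_MESSAGES and the FPR values are illustrative assumptions,
# not figures from the researchers' letter.

DAILY_MESSAGES = 100_000_000_000  # ~100 billion: a large platform's daily volume

for fpr in (1e-4, 1e-5, 1e-6):
    expected_false_alarms = DAILY_MESSAGES * fpr
    print(f"FPR {fpr:.0e}: ~{expected_false_alarms:,.0f} false alarms/day")

# Even a classifier wrong on only 1 in 100,000 messages (FPR 1e-5)
# produces roughly a million false alarms per day at this volume,
# which is why multiple independent detections would be needed
# before any single report became credible.
```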
This isn't just a technical problem; it's a human rights disaster waiting to happen. Millions of innocent people could be erroneously implicated in suspicious activity, burdening law enforcement with a pipeline of false reports.
The Legal Reality Check
According to the Council Legal Service, the Danish Chat Control proposal still violates human rights, and "the core problems of access to communication for potentially all users remained unchanged."
In February 2024, the European Court of Human Rights ruled, in an unrelated case, that requiring degraded end-to-end encryption "cannot be regarded as necessary in a democratic society". The writing is on the wall: mass surveillance of private communications violates fundamental human rights.
The EU's legal advisors have concluded that European Chat Control proposals which would require tech companies to scan private and encrypted messages for child abuse material (CSAM) are in breach of EU law.
Source: Global Government Affairs (@GlobalAffairs), September 4, 2025 (https://t.co/ahakJAV3aT)
Major Platforms Are Ready to Leave
The tech industry isn't bluffing about the consequences. Signal and other privacy-focused platforms have threatened to shut down their European operations rather than implement scanning technology that would undermine their security promises. Telegram pledges to exit the market rather than "undermine encryption with backdoors".
When the most privacy-conscious platforms abandon Europe, users won't switch to surveilled alternatives; they'll find workarounds. Criminals will simply use encrypted ZIP files, foreign platforms, or custom encryption. Meanwhile, ordinary citizens lose access to secure communication tools that protect journalists, activists, abuse victims, and anyone who values privacy.
The Political Chess Game
The Danish presidency, which began in July 2025, has reignited momentum, reportedly securing the support of 19 of the 27 Member States. But the political landscape is shifting rapidly.
Many countries within the EU Council that helped block Chat Control in 2024 are now wavering. Belgium, the Czech Republic, Estonia, Finland, Germany, Greece, Slovenia, Luxembourg, and Romania are undecided or unclear, while several formerly opposed governments, such as France, have already given up their opposition.
According to former MEP Patrick Breyer, Denmark crucially needs to win Germany over to its proposed text. The new German government has not yet taken a position at the time of writing. Germany's decision could determine whether Europe embraces mass surveillance or maintains its commitment to digital rights.
Timeline: October Deadline Approaching
The critical dates are fast approaching:
- September 12, 2025: Member States are expected to finalise their positions in Council working groups
- October 14, 2025: Earliest possible Council vote
If passed without robust safeguards, the law could establish a precedent for scanning private communications on a large scale, thereby reshaping the balance between online safety and privacy not just in Europe, but globally.
The Bottom Line
Germany's 2024 data exposes the fundamental flaw in Chat Control: it doesn't work. A system that generates nearly 100,000 false reports annually while scanning only voluntary, unencrypted platforms would become a privacy-destroying nightmare if expanded to mandatory, universal surveillance.
Mass surveillance is being pursued in the name of combating abuse even though the detection mechanisms are inaccurate and people's personal content is already being falsely flagged and forwarded to authorities.
The EU faces a clear choice: abandon this technically broken, legally dubious, and fundamentally authoritarian approach, or transform Europe into a surveillance state where every private message, photo, and conversation is subject to algorithmic judgment and potential police investigation.
As X's Global Government Affairs team correctly notes, "CSAM can be addressed through less-intrusive means, including beefed up law enforcement and enhanced cooperation" rather than mass surveillance that "threatens Europeans' human rights, their right to privacy and freedom of expression" and "could expose people's personal data to a host of actors and degrade cybersecurity for individuals, businesses, and governments globally."
The assumption that every user is a potential suspect leads to widespread, suspicionless scanning, raising concerns that go well beyond the fight against abuse.
With Germany's own law enforcement data proving the system's failure, there's no excuse for moving forward. The October vote will determine whether Europe remains a beacon of digital rights or becomes another surveillance state where privacy is sacrificed for the illusion of safety.
The October 14, 2025 vote represents a turning point for digital privacy in Europe. Contact your representatives now, before it's too late to preserve encrypted communication for everyone.