An in-depth investigation into the ASOG amendment and the dangerous precedent of state-sponsored spyware
Executive Summary
On December 4, 2025, Berlin's parliament quietly crossed a threshold that privacy advocates had defended for decades. The amendments to the General Security and Order Act (ASOG) grant police unprecedented surveillance powers: covert home entry to install state spyware, mass geodata collection from telecom operators, biometric facial recognition scanning of social media, and the use of police data to train AI systems. This 700-page legislative overhaul represents the most comprehensive expansion of digital surveillance powers in Berlin's history, and security experts warn it creates vulnerabilities that could expose citizens to risks far beyond government oversight.
The Parliamentary Fast Track
The legislation sailed through Berlin's House of Representatives with support from the ruling CDU-SPD coalition and, controversially, the far-right Alternative for Germany (AfD) party. Interior Senator Iris Spranger (SPD) justified the reform as essential modernization for combating encrypted communications, terrorism, and cybercrime in the digital age.
Critics from Die Linke and the Greens denounced what Niklas Schrader called "a black day for civil liberties." The Alliance NoASOG has characterized the reform as an assault on civil society, while the Society for Civil Rights (GFF) announced plans to file a constitutional complaint.
Berlin's data protection commissioner, Meike Kamp, was particularly blunt in her assessment: the legalization of state trojans amounts to "a frontal attack on the IT security of all citizens," creating a "constitutionally highly questionable density of surveillance."
The Technical Arsenal: What ASOG Actually Authorizes
Covert Home Entry & State Trojan Deployment (Paragraphs 26a, 26b, 26)
The most controversial provisions authorize "source telecommunications surveillance" (Quellen-TKÜ) through state trojans: sophisticated spyware designed to intercept communications on the device itself, before encryption or after decryption. When remote installation proves technically infeasible, Paragraph 26 explicitly permits police to conduct secret physical entry into private residences to install the malware directly on devices via USB or other means.
This isn't theoretical. The software would typically:

- Intercept communications from encrypted messaging apps (Signal, WhatsApp, Telegram)
- Capture keystrokes, including passwords and credentials
- Take screenshots of active windows
- Access camera and microphone capabilities
- Establish command-and-control connections to police servers
- Download and execute additional code remotely
Mass Geodata Collection (Paragraph 26e)
Police can now demand traffic data from all mobile devices connected to specific cell towers during defined timeframes. This "cell tower query" capability enables the creation of movement profiles for thousands of individuals who happened to be in an area, transforming every protest, demonstration, or public gathering into a mass surveillance opportunity.
The implications are staggering: attend a climate protest, walk past a crime scene, or simply be in the wrong neighborhood at the wrong time, and your location data becomes a permanent police record.
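The mechanics of such a query are simple to sketch. The snippet below (all device identifiers invented for illustration) shows how intersecting bulk tower dumps from a few times and places shrinks thousands of records to a shortlist, while every other device in the dumps is swept up as collateral data:

```python
# Hypothetical sketch of a cell tower query ("Funkzellenabfrage"):
# intersect bulk dumps from several cells/timeframes to shortlist the
# devices present at every location. All IDs below are invented.

def intersect_dumps(dumps: list[set[str]]) -> set[str]:
    """Devices seen in *every* queried cell: the investigative shortlist."""
    result = set(dumps[0])
    for d in dumps[1:]:
        result &= d
    return result

# Dumps from three locations, e.g. around three demonstrations.
dump_a = {"imsi-001", "imsi-002", "imsi-003", "imsi-104"}
dump_b = {"imsi-002", "imsi-003", "imsi-205"}
dump_c = {"imsi-003", "imsi-306"}

shortlist = intersect_dumps([dump_a, dump_b, dump_c])
print(shortlist)  # {'imsi-003'}

# Everyone else in the dumps is a bystander whose presence was recorded.
collateral = (dump_a | dump_b | dump_c) - shortlist
print(len(collateral))  # 5
```

The asymmetry is the point: one investigative "hit" is produced from records about every device in range, and those records persist whether or not their owners were ever suspects.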
Biometric Surveillance & Social Media Scanning (Paragraph 28a)
Automated biometric matching of faces, voices, and other identifiers against "publicly accessible data from the internet," including social networks, allows police to identify individuals through facial recognition AI. This crosses a critical line from targeted investigation to dragnet surveillance.
Combined with the existing video surveillance infrastructure in Berlin, this creates a persistent identification system that tracks individuals across time and space.
AI Training on Police Data (Paragraph 42d)
Perhaps most concerning from a security perspective is the authorization to use personal data collected during investigations to train artificial intelligence systems. Data protection experts warn this provision:
- Vastly expands the original purpose limitation of collected data
- Creates AI models that may allow reverse-engineering to expose training data
- Establishes precedent for repurposing sensitive information without consent
- Lacks safeguards against algorithmic bias and false-positive propagation
Additional Powers
- Bodycam deployment in private homes (Paragraph 24c): Officers can activate body cameras inside residences when they perceive threats to life or limb
- Automatic license plate recognition (Paragraph 24d): Continuous scanning and database matching of all vehicles
- Drone countermeasures (Paragraph 24h): Authority to neutralize or commandeer civilian drones
- Extended preventive detention: Up to five days (seven for terrorism cases) without charge
The Cybersecurity Catastrophe: Learning from R2D2
Germany's previous experiment with state trojans offers a cautionary tale. In 2011, the Chaos Computer Club (CCC) discovered and analyzed "R2D2," a state trojan allegedly used by German law enforcement. The findings were damning:
Critical Security Failures
Fixed Encryption Keys: The trojan used AES encryption in Electronic Codebook (ECB) mode with a hardcoded key shared across all installations. This is an undergraduate-level cryptographic failure: any attacker who obtained one sample could decrypt communications from all installations.
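The failure mode is easy to demonstrate. The sketch below does not use AES; it uses a deliberately toy keyed construction (truncated SHA-256, which is not a real cipher) purely to illustrate ECB's defining weakness: with a fixed key, identical plaintext blocks always produce identical ciphertext blocks, across every "installation":

```python
# Toy illustration of ECB-mode determinism with a hardcoded key.
# NOT real encryption: a keyed-hash stand-in mimics the property that
# matters here, i.e. same key + same block -> same ciphertext block.
import hashlib

KEY = b"hardcoded-key"  # one key baked into every deployed sample

def toy_ecb(plaintext: bytes, key: bytes, block: int = 8) -> bytes:
    out = b""
    for i in range(0, len(plaintext), block):
        chunk = plaintext[i:i + block].ljust(block, b"\x00")
        out += hashlib.sha256(key + chunk).digest()[:block]
    return out

ct1 = toy_ecb(b"ATTACK AT DAWN. ATTACK AT DAWN. ", KEY)
ct2 = toy_ecb(b"ATTACK AT DAWN. ", KEY)

# Repeated plaintext blocks are visible in the ciphertext, and two
# separate "sessions" under the shared key produce matching output.
print(ct1[:8] == ct1[16:24])  # True
print(ct1[:16] == ct2)        # True
```

A proper design would use a unique key per installation and an authenticated mode (e.g., AES-GCM) with fresh nonces, so no such patterns survive.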
No Authentication: The command-and-control infrastructure lacked authentication mechanisms, meaning anyone could spoof commands or inject false data into investigations. As security researcher Bruce Schneier noted, it would be "easy to fool the Trojan into installing anything."
Unencrypted Data Transmission: Screenshots and intercepted audio were transmitted in unencrypted JPEG and audio formats, exposing sensitive investigative data to any network observer.
Remote Code Execution: The trojan included functionality to download and execute arbitrary code without verification, creating a backdoor that any attacker could exploit once they identified an infected system.
Third-Party Server Infrastructure: Data was transmitted to command-and-control servers in the United States, violating German data sovereignty laws.
The Paradox of State-Sponsored Vulnerabilities
The CCC's analysis revealed what cybersecurity professionals immediately recognized: "The security level this trojan leaves the infected systems in is comparable to it setting all passwords to '1234'."
The German government paid €2 million for software that made citizens less secure. Once installed, R2D2 didn't just create a surveillance channel for police; it created exploitable vulnerabilities that any competent attacker could leverage.
Anti-virus vendors including F-Secure, Sophos, and Symantec made a principled decision: they would detect and remove the state trojan as malware, prioritizing customer security over law enforcement convenience. As Graham Cluley of Sophos explained: "What's to stop a cybercriminal commandeering a law enforcement Trojan and using it against an innocent party? Our customers' protection comes first."
Why State Trojans Are Inherently Dangerous
The Architecture of Vulnerability
State trojans face an impossible contradiction: they must be sophisticated enough to evade detection by security software, yet simultaneously maintainable and controllable by government agencies. This creates several unavoidable problems:
Detection Evasion = Persistence: To avoid detection, state trojans must use techniques indistinguishable from criminal malware: rootkit functionality, process injection, kernel-mode drivers, and anti-analysis methods. These same capabilities make the software persistent and difficult to remove.
Remote Control = Attack Surface: Command-and-control infrastructure required for state trojans creates network endpoints that must be secured against both unauthorized access and discovery. History shows governments consistently fail at this operational security.
Update Mechanisms = Entry Points: The ability to remotely update state trojans (a requirement for maintaining effectiveness as systems and encryption evolve) creates precisely the kind of remote code execution vulnerability that cybersecurity professionals spend their careers preventing.
Fixed Targets = Reverse Engineering: Unlike criminal malware that can be quickly abandoned when discovered, state trojans must remain operational across many installations. This makes them valuable targets for reverse engineering by foreign intelligence services, criminal organizations, and security researchers.
The Supply Chain Problem
State trojans are typically developed by private contractors. The German R2D2 trojan was created by DigiTask, a company based in Haiger, Germany. This creates additional risks:
- Contractor Vulnerabilities: Private development firms become high-value targets for espionage
- Unaccountable Code: Without open-source auditing, no independent verification of security practices exists
- Commercial Incentives: Contractors may prioritize functionality over security to meet contract requirements
- Knowledge Dispersion: Multiple individuals across development, deployment, and management have access to exploits and infrastructure details
The Proliferation Risk
Once a state trojan exists, it becomes a template. Criminal organizations can reverse-engineer capabilities, foreign intelligence services can repurpose techniques, and authoritarian regimes can adopt surveillance architectures without the pretense of democratic oversight.
As security researchers have repeatedly demonstrated, the distinction between "lawful intercept" tools and criminal malware is purely one of authorization, not technical capability.
The Cascade Effect: How One Suspect Becomes Hundreds of Targets
The infographic provided illustrates the exponential privacy violation inherent in modern surveillance tools:
Step 1 - Physical Installation: Police obtain warrant to surveil Person X, conduct covert entry to install trojan on their devices.
Step 2 - Digital Collection: The trojan transmits Person X's contacts, correspondence, and location data.
Step 3 - Mass Geodata Query: Based on X's location history, police request cell tower data on all devices in those locations during those times, potentially covering thousands of individuals.
Step 4 - Biometric Cross-Reference: Photos of identified contacts are automatically scanned using facial recognition against social media platforms to identify connections.
Outcome: Surveillance of one person creates movement profiles and social network maps for dozens or hundreds of innocent individuals who were never subjects of investigation.
This isn't hypothetical. The ASOG amendments explicitly authorize each step of this process.
The Constitutional Question
Germany's Federal Constitutional Court has historically imposed strict limitations on surveillance, stemming from the nation's experience with both the Nazi Gestapo and the East German Stasi surveillance states. Key precedents include:
- 2008 "Online Search" Ruling: Restricted online surveillance to cases involving concrete dangers to human life or state security, requiring judicial authorization
- Purpose Limitation: Collected data must be used only for specified purposes
- Proportionality: Surveillance methods must be proportionate to threats
- Separation of Powers: Intelligence agencies and police must remain institutionally separate
The ASOG amendments arguably violate each of these principles:
Expanded Scope: Source telecommunications surveillance isn't limited to life-threatening cases but extends to "serious crimes," a much broader category.
Purpose Creep: AI training on police data (Paragraph 42d) explicitly violates purpose limitation by repurposing investigative data.
Disproportionate Tools: Mass geodata queries and biometric scanning create surveillance dragnet effects far beyond targeted investigation.
Institutional Merger: Planned coordination with intelligence services through the pending reform of Berlin's constitutional protection law (VSG Bln) erodes the separation principle established after Nazi-era abuses.
Constitutional challenges are virtually certain. The GFF has announced plans to file complaints, and legal experts broadly agree the amendments will face scrutiny from Germany's highest court.
International Context: The Global Surveillance Expansion
Berlin's move is part of a broader authoritarian shift in democratic nations:
Comparative Surveillance Regimes
United Kingdom: The Investigatory Powers Act (2016) mandates that communications companies maintain the capability to remove encryption when served with warrants, creating systemic backdoor requirements.
France: Following terrorist attacks, France expanded administrative surveillance powers that allow warrantless wiretapping and device searches based on algorithmic threat assessments.
Australia: The Assistance and Access Act (2018) compels technology companies to create capabilities to access encrypted communications, with criminal penalties for disclosure.
Netherlands: The Intelligence and Security Services Act (2018) authorizes bulk collection of internet traffic and mandatory backdoor access.
United States: While domestic surveillance faces Fourth Amendment constraints, the FISA Section 702 program enables warrantless surveillance of foreign targets that often captures U.S. persons' communications.
The Five Eyes Precedent
Intelligence-sharing arrangements among anglophone nations (US, UK, Canada, Australia, New Zealand) have effectively created a supranational surveillance architecture that circumvents domestic legal protections. By sharing intercepted data, each nation gains access to its own citizens' communications collected by foreign partners, a legal loophole that privacy advocates call "laundering surveillance."
Germany's participation in similar European intelligence-sharing arrangements creates parallel risks that state trojan data could flow beyond national judicial oversight.
The Economics of Surveillance Capitalism
State surveillance expansion operates in symbiosis with commercial surveillance infrastructure:
The Data Broker Pipeline
- Social media platforms have already collected comprehensive biometric, behavioral, and relationship data
- Paragraph 28a's "publicly accessible" language creates a legal fiction: data shared on "public" social networks was never intended for law enforcement biometric scanning
- Commercial data brokers aggregate and sell location data, purchase histories, and behavioral profiles
- Police access to this commercial surveillance infrastructure creates a public-private surveillance merger
The AI Training Market
Paragraph 42d's authorization to train AI on police data creates potential commercial applications. While ostensibly for improving law enforcement algorithms, this data could be:
- Shared with researchers developing commercial surveillance tools
- Used to train facial recognition systems later sold to private security firms
- Leveraged to develop behavioral prediction algorithms with dual-use potential
The line between state security and commercial surveillance continues to dissolve.
Operational Implications: How This Affects Security Professionals
For cybersecurity consultants, CISOs, and security teams operating in Germany, the ASOG amendments create significant complications:
Client Advisory Responsibilities
Disclosure Obligations: Security professionals must inform clients that state-sponsored malware may be deliberately introduced to their systems, potentially creating vulnerabilities beyond their control.
Risk Assessment Updates: Threat models must now include the possibility that "authorized" backdoors could be exploited by unauthorized actors. The R2D2 case demonstrates this isn't theoretical.
Incident Response Complications: When detecting suspicious software, teams face the impossible question: is this criminal malware or state surveillance? The distinction may be legally consequential.
Technical Countermeasures and Legal Risk
Anti-Malware Configuration: Security software that detects state trojans creates legal ambiguity. Are you obstructing justice or protecting client systems?
Network Monitoring: Deep packet inspection and traffic analysis that might reveal state surveillance operations could theoretically constitute interference with investigations.
Employee Training: Teaching staff to recognize trojan indicators (unusual processes, unauthorized network connections) becomes legally fraught when some trojans are "authorized."
The Compliance Paradox
GDPR requires organizations to implement appropriate technical and organizational measures to protect personal data. But how can compliance officers fulfill this obligation when government-installed malware deliberately circumvents security controls?
The ASOG amendments create a direct conflict between data protection law and law enforcement powers, a conflict that will likely require resolution by the European Court of Justice.
Threat Actor Perspective: A Goldmine for APTs
Advanced Persistent Threat (APT) groups, sophisticated attackers typically sponsored by nation-states or organized crime, view state trojan infrastructure as a high-value target:
Intelligence Value
Operational Intelligence: Compromising state trojan command-and-control infrastructure reveals active investigations, targets, and methods.
Technical Intelligence: Reverse-engineering state trojans provides insights into government capabilities, zero-day exploits in use, and detection evasion techniques.
Strategic Intelligence: Access to law enforcement surveillance data creates intelligence advantages for foreign governments and criminal organizations.
Attack Scenarios
Trojan Hijacking: As R2D2 demonstrated, poorly secured state trojans can be commandeered by attackers to inject false evidence, frame innocent parties, or steal intercepted data.
Infrastructure Compromise: C2 servers, update mechanisms, and data repositories become single points of failure. Successful compromise grants access to all infected systems.
Supply Chain Infiltration: Contractors developing state trojans become targets for espionage, potentially resulting in backdoored surveillance tools that serve multiple masters.
Capability Proliferation: Captured state trojans can be repurposed by criminal organizations or sold on dark web markets, turning government surveillance tools into criminal infrastructure.
The Zero-Day Exploitation Concern
State trojans may leverage zero-day exploits, previously unknown vulnerabilities in software, to achieve covert installation. These exploits:
- Remain unpatched while in government use, leaving all users of the affected software vulnerable
- Eventually leak or are independently discovered, creating security crises
- Incentivize governments to hoard vulnerabilities rather than report them to vendors
- Create a perverse incentive structure in which security is deliberately weakened
The Vulnerabilities Equities Process in the United States supposedly balances law enforcement needs against public cybersecurity, but critics note the process is opaque and appears to favor retention of exploits over disclosure.
The Protest Surveillance Pipeline
Environmental groups like "Last Generation" were explicitly mentioned during parliamentary debates about extended detention powers. The ASOG amendments create a comprehensive architecture for surveillance of political dissent:
Pre-Event Intelligence
- Social media monitoring with facial recognition identifies protest organizers and participants
- Cell tower queries around previous demonstrations create attendee databases
- Automated license plate recognition tracks vehicles to and from political events
Real-Time Surveillance
- Video surveillance with AI-powered crowd analysis identifies individuals
- Drone deployment provides aerial tracking
- Body cameras on officers create comprehensive documentation
Post-Event Investigation
- Mass geodata requests identify everyone present in protest areas
- Biometric cross-referencing links attendees across multiple events
- Movement profile creation enables prediction of future protest participation
The Chilling Effect
This surveillance pipeline doesn't need to result in prosecutions to be effective. The mere knowledge that attendance at protests creates permanent records in law enforcement databases deters political participation: precisely the chilling effect that free-expression protections under Germany's Basic Law and the European Convention on Human Rights aim to prevent.
Berlin saw 7,665 demonstrations in 2024 alone (more than 20 per day); the ASOG amendments create infrastructure for comprehensive political surveillance at population scale.
Technical Recommendations for Organizations
Given the new surveillance environment, cybersecurity professionals should consider:
Network Security Posture
Enhanced Monitoring: Implement comprehensive network traffic analysis to detect unusual C2 communications, regardless of authorization.
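One simple monitoring heuristic is beaconing detection: implants often phone home at near-fixed intervals, while human-driven traffic is irregular. The sketch below (timestamps invented; real pipelines such as Zeek-based tooling use far richer features) scores a host by the jitter of its connection inter-arrival times:

```python
# Hypothetical beaconing heuristic: score outbound connections by the
# coefficient of variation of inter-arrival times. Near-zero scores
# suggest metronomic "phone home" behavior worth investigating.
from statistics import mean, pstdev

def beacon_score(timestamps: list[float]) -> float:
    """CV of inter-arrival gaps; lower = more regular (beacon-like)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(gaps) < 2 or mean(gaps) == 0:
        return float("inf")  # too little data to judge
    return pstdev(gaps) / mean(gaps)

# An implant beaconing roughly every 60 s vs. irregular human browsing.
implant = [0, 60.1, 120.0, 179.9, 240.2, 300.0]
human = [0, 4, 95, 110, 400, 408]

print(beacon_score(implant) < 0.05)  # True: suspiciously regular
print(beacon_score(human) < 0.05)    # False: normal irregularity
```

Real implants add random jitter and sleep windows to defeat exactly this check, which is why such scores are one signal among many, not a verdict.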
Certificate Pinning: Prevent man-in-the-middle attacks by pinning certificates for critical applications, making trojan interception more detectable.
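The core of pinning is a fingerprint comparison. This minimal sketch (the certificate bytes and deployment details are invented; in practice the presented certificate would come from something like `ssl.SSLSocket.getpeercert(binary_form=True)`) shows why an interception proxy becomes visible:

```python
# Minimal certificate-pinning check: compare the SHA-256 fingerprint of
# the certificate a server actually presented against a pin recorded
# out-of-band. An interceptor must present a different certificate.
import hashlib

def cert_fingerprint(der_bytes: bytes) -> str:
    """SHA-256 hex digest of the (DER-encoded) certificate bytes."""
    return hashlib.sha256(der_bytes).hexdigest()

def pin_matches(der_bytes: bytes, pinned: str) -> bool:
    """True only if the presented certificate matches the stored pin."""
    return cert_fingerprint(der_bytes) == pinned

# Stand-in byte strings; real code would use actual DER certificates.
legit_cert = b"---DER bytes of the genuine server certificate---"
PINNED = cert_fingerprint(legit_cert)  # recorded at deployment time

mitm_cert = b"---certificate injected by an interception proxy---"
print(pin_matches(legit_cert, PINNED))  # True
print(pin_matches(mitm_cert, PINNED))   # False: interception detected
```

Production deployments usually pin the public-key (SPKI) hash rather than the whole certificate so that routine certificate renewal does not break the pin.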
Behavioral Analysis: Deploy UEBA (User and Entity Behavior Analytics) to identify anomalous system behavior that may indicate compromise.
Segmentation: Network segmentation limits trojan lateral movement and contains potential compromise.
Endpoint Protection
Application Whitelisting: Only allow execution of approved applications, making unauthorized code execution (including from state trojans) more difficult.
Integrity Monitoring: File integrity monitoring detects unauthorized modifications to system files and binaries.
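A minimal integrity monitor is just a hashed baseline plus a diff. The sketch below (directory layout invented; real FIM tools such as AIDE or Tripwire also track permissions, owners, and attributes) detects the two events that matter for trojan installation, modified binaries and dropped files:

```python
# Hedged sketch of file integrity monitoring: hash a baseline snapshot,
# re-scan later, and report anything added, removed, or modified.
import hashlib
import tempfile
from pathlib import Path

def snapshot(root: Path) -> dict[str, str]:
    """Map every file under root to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def diff(baseline: dict[str, str], current: dict[str, str]) -> dict[str, list[str]]:
    """Classify changes between two snapshots."""
    return {
        "added":    sorted(p for p in current if p not in baseline),
        "removed":  sorted(p for p in baseline if p not in current),
        "modified": sorted(p for p in current
                           if p in baseline and current[p] != baseline[p]),
    }

# Demo on a throwaway directory with simulated tampering.
root = Path(tempfile.mkdtemp())
(root / "app.bin").write_bytes(b"original binary")
baseline = snapshot(root)

(root / "app.bin").write_bytes(b"trojanized binary")   # modified in place
(root / "implant.so").write_bytes(b"dropped payload")  # new file dropped

print(diff(baseline, snapshot(root)))
# {'added': ['implant.so'], 'removed': [], 'modified': ['app.bin']}
```

The baseline itself must live somewhere the monitored host cannot silently rewrite (offline media, a separate append-only store), or a sufficiently privileged trojan simply updates the hashes along with the files.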
Privilege Management: Least-privilege access controls limit the impact of compromised user accounts.
Encrypted Storage: Full-disk encryption makes physical device compromise more difficult, though trojans installed by authorities may have encryption keys.
Organizational Policies
Transparent Risk Communication: Inform executives and board members that state surveillance may introduce security vulnerabilities beyond organizational control.
Incident Response Planning: Develop protocols for scenarios where detected malware may be government-authorized, including criteria for engaging legal counsel.
Data Minimization: Reduce data retention to minimize exposure in surveillance scenarios; data that doesn't exist can't be compromised.
Geographically Distributed Infrastructure: For multinational organizations, consider data sovereignty implications of processing in jurisdictions with different surveillance regimes.
Personal Operational Security
For high-risk individuals (journalists, activists, lawyers, executives):
Device Separation: Maintain separate devices for high-risk and routine communications.
Operating System Diversity: Use diverse operating systems (Linux distributions, GrapheneOS) that may be less targeted by state trojan development.
Physical Security: Given covert entry authorization, implement tamper-evident seals and forensic detection measures.
Communications Security: Assume endpoint compromise; use technologies with forward secrecy (also called perfect forward secrecy), while recognizing that a trojan on the device reads messages after decryption regardless of transport encryption.
Airgapping: For truly sensitive work, maintain completely offline systems with no network connectivity.
The Path Forward: Legal and Political Resistance
Constitutional Challenges
The GFF and other civil rights organizations are preparing comprehensive constitutional complaints. Key arguments will likely include:
Proportionality Violation: Mass surveillance tools like geodata queries and biometric scanning are disproportionate to stated law enforcement needs.
Purpose Limitation Breach: AI training on police data violates the fundamental principle that data may only be used for the purposes that justified its collection.
Dignity Infringement: Covert home entry and device compromise violate constitutional protections of human dignity and private sphere.
Separation of Powers: Coordination with intelligence services erodes post-Nazi safeguards against police state development.
Political Pressure
Opposition parties, civil society organizations, and affected communities can mobilize:
Public Education: Most citizens remain unaware of ASOG's implications; information campaigns can build political pressure for repeal.
Electoral Accountability: The CDU-SPD coalition's support for surveillance expansion creates electoral vulnerability, particularly among younger, tech-savvy voters.
International Pressure: European Parliament and Council of Europe mechanisms provide additional venues for challenging surveillance overreach.
Technological Resistance
Security Tool Development: Open-source security tools that detect and document state surveillance attempts create public accountability.
Encrypted Infrastructure: Widespread adoption of end-to-end encryption raises costs and technical barriers for surveillance.
Decentralized Systems: Peer-to-peer and decentralized architectures resist centralized surveillance infrastructure.
Transparency Tools: Technologies that document surveillance encounters (encrypted video/audio recording, automated backup) create evidence trails.
Conclusion: The Canary in the Democratic Coal Mine
Berlin's ASOG amendments represent more than a local law enforcement policy change; they are a test case for how far democratic societies will tolerate surveillance expansion in the name of security.
The technical reality is clear: state trojans create exploitable vulnerabilities that make all citizens less secure. The R2D2 case definitively demonstrated that governments cannot simultaneously compromise system security for surveillance purposes while maintaining that security against other threats.
The constitutional question is straightforward: can a society claim to protect privacy, dignity, and freedom of expression while authorizing secret home entries, mass location tracking, and biometric surveillance of political activities?
The historical lesson is unambiguous: surveillance infrastructure, once created, expands beyond its original justification and persists long after threats that justified it have passed.
Germany, of all nations, understands these lessons viscerally. The Gestapo and Stasi cast long shadows. The question facing Berlin, and increasingly democracies worldwide, is whether that historical memory is sufficient to resist the authoritarian temptation of total surveillance.
For cybersecurity professionals, the message is clear: we must be honest with our clients, organizations, and communities about the security implications of state-sponsored vulnerabilities. We must detect all malware regardless of claimed authorization. And we must resist the normalization of surveillance architecture that makes everyone less safe while claiming to protect us.
The walls of privacy are indeed thinning. Whether they collapse entirely depends on the vigilance, technical competence, and moral courage of those who understand what's at stake.
References and Further Reading
- Chaos Computer Club: Analysis of the R2D2 state trojan
- Berlin Data Protection Commissioner: Statement on ASOG
- Society for Civil Rights (GFF): Constitutional complaint documentation
- F-Secure: R2D2 technical analysis
- Bruce Schneier: "Official Malware from the German Police"
- Sophos: German Government R2D2 Trojan FAQ
- Electronic Frontier Foundation: International Surveillance Law Database
- Privacy International: State surveillance technology reports
Attribution: Research synthesized from multiple sources including reporting by heise online, Reclaim The Net, World Socialist Web Site, Slashdot, and historical analysis of German state surveillance by CCC, F-Secure, Sophos, and independent security researchers.