A deep dive into the alarming technical and legal implications of Schedule 3 and how encryption developers could face national security prosecution
In a report that should concern every developer working on secure communications, the UK's Independent Reviewer of State Threats Legislation has revealed that creating apps with end-to-end encryption could legally qualify as "hostile activity" under current national security legislation. The implications extend far beyond encryption: even targeting a football player could theoretically count as a threat to national security.
The Technical Reality: How Encryption Became "Hostile"
The UK's National Security Act 2023 and Schedule 3 to the Counter-Terrorism and Border Security Act 2019 create an expansive framework for identifying "hostile activity" that Jonathan Hall KC, the Independent Reviewer, describes in his December 2025 report as having "incredibly broad scope."
Download: E03512978_-_Un-Act_The_National_Security_Act_in_2024_ELAY.pdf (1002 KB)
The Encryption Problem
Under Schedule 3, a person can be examined for "hostile activity" even if they're completely unaware their actions qualify as such. The report explicitly states:
"Under Schedule 3 a person may be engaged in hostile activity even though unaware that their activity is hostile activity. So a person could be examined on account of their wholly inadvertent and morally blameless conduct."
Hall provides a chilling example directly related to encryption technology:
"The developer of an app, whose selling point is end-to-end encryption which would make it more difficult for UK security and intelligence agencies to monitor communications. It is a reasonable assumption that this would be in the interests of a foreign state even though the foreign state has never contemplated this potential advantage."
This means developers of apps like Signal, WhatsApp, or any secure messaging platform could technically fall within the legal definition of hostile activity simply because their encryption makes surveillance harder, regardless of intent.
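To see why end-to-end encryption frustrates intermediary surveillance at all, consider a deliberately simplified sketch. This is a toy one-time pad, not a real cipher (production messengers use vetted constructions such as AES-GCM or XSalsa20-Poly1305); the point it illustrates is that a relaying server only ever handles ciphertext and, without the key, cannot recover the message.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time pad: XOR each byte of the data with the
    # corresponding key byte. Applying it twice with the same
    # key returns the original data.
    return bytes(d ^ k for d, k in zip(data, key))

# Sender and recipient share a key; the relay server never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(message, key)   # this is all the server relays
recovered = xor_cipher(ciphertext, key) # only the key holder can do this
assert recovered == message
```

Nothing in the relay path carries the plaintext, which is exactly the property the report's "hostile activity" reasoning treats as being "in the interests of a foreign state."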
The Double-Ignorance Doctrine
The legislation introduces what the report calls a "double-ignorance" scenario:
"Since hostile activity does not require any knowledge or tasking by a foreign state, the phenomenon of double-ignorance could arise. A person may be engaged in hostile activity if they do something which, unknown to them, threatens national security and which is in the interests of another State, also entirely in the dark."
From a technical perspective, this creates an impossible situation. Every cryptographic library, every security protocol, every privacy-enhancing technology could theoretically benefit a foreign state by making surveillance more difficult. The code itself becomes suspect.
The Football Player Paradox
In perhaps the most absurd example illustrating the law's overbreadth, the report reveals that even sports could fall under national security legislation:
"Finally, there is no requirement that the hostile act is in the interests of a foreign state in a way that is relevant to its national security. For example, it is in the interests of Country A to advance to the Quarter Finals of the World Cup. If Country A were to send an agent to break the leg of England's leading goal scorer, that would therefore count as involvement in hostile activity (under the serious crime limb) even if it could not sensibly be said that getting a quarter final place was a matter of Country A's national security."
This example demonstrates how the absence of proportionality requirements in the legislation means virtually any action benefiting a foreign state, from encryption development to industrial espionage to sports sabotage, falls under the same legal framework.
Technical Definitions That Sweep Too Broadly
The Foreign Power Condition
The legislation's "Foreign Power Condition" operates in two modes, both problematic for developers:
Tasked Mode: Conduct carried out for or on behalf of a foreign power through:
- Direction or control
- Financial or other assistance
- Collaboration or agreement
Untasked Mode: Conduct intended to benefit a foreign power, even without their knowledge.
The report explains:
"Foreign 'patriots' are brought into scope. In 2024 US intelligence agencies referred to individuals not under the direct supervision of the Chinese Communist Party who 'may attempt election influence activities they perceive are in line with Beijing's goals'. The point is that some individuals act on their own initiative and do not need to be tasked."
For encryption developers, this means creating privacy tools with the general intent of protecting people from surveillance, including from authoritarian regimes, could satisfy the "untasked" mode if prosecutors argue the developer intended to benefit foreign states.
Foreign Intelligence Service: A Dangerously Vague Definition
The Act defines a Foreign Intelligence Service (FIS) as:
"…any person whose functions include carrying out intelligence activities for or on behalf of a foreign power"
The report acknowledges this captures far more than traditional spy agencies:
"The FIS definition is lexically short but conceptually broad because there are no universal standards for organising the intelligence activities of foreign powers of concern to the United Kingdom… the function-orientated definition is capable of capturing certain diplomats, police, foreign ministries, private companies, and student bodies tasked with reporting back to the mother country, as well as conventional intelligence bureaux."
This creates severe uncertainty for developers. If you build encryption tools, who might use them? Could research collaborations with international institutions qualify as assisting a FIS? The boundaries are deliberately unclear.
Real-World Technical Scenarios at Risk
Scenario 1: The AI Researcher
The report provides this example:
"An artificial intelligence researcher who puts his novel code on a public-access platform, believing strongly in the importance of open collaboration, where this code is likely to assist a FIS in hacking UK-based computers."
This is standard practice in modern software development. GitHub, the world's largest code hosting platform, exists precisely to enable this kind of collaboration. Yet under Schedule 3, this could constitute hostile activity if:
1. The code could be used for offensive security purposes
2. A foreign intelligence service might benefit from it
3. The researcher "ought reasonably to know" this is likely
The "constructive knowledge" standard is particularly problematic. As the report notes:
"Constructive knowledge is therefore distinct from wilful blindness" and brings in "an element of objectivity ('ought to know') which is not necessarily present in actual knowledge."
Scenario 2: The Lobbyist for Electronic Chip Manufacturing
Another example from the report:
"The lobbyist for a foreign firm, who seeks to persuade an electronic chip manufacturer to build its factory in France rather than the UK. This would engage the UK's economic well-being in a way relevant to national security."
This illustrates how economic competition becomes securitized. For tech companies, this means:
- Business development with international partners could be hostile activity
- Advocating for technology policy that benefits foreign competitors could be hostile activity
- Even routine market competition could theoretically fall under national security law
Scenario 3: The Journalist with Leaked Information
"Examples could include a journalist carrying confidential information whose significance to national security he did not understand, or the victim of planted material. The examining officer could act if there was no possibility that the person was aware that its dissemination might be in the interests of a foreign state, or even that they were carrying the material."
For security researchers and journalists working in cybersecurity, this creates a chilling effect. Publishing vulnerability research, handling leaked documents, or reporting on surveillance capabilities all potentially qualify as hostile activity under these provisions.
The Broader Assault on Encryption
This isn't happening in isolation. The report contextualizes these provisions within the UK's wider legislative framework attacking encryption:
The Online Safety Act Connection
The report notes potential synergies with the Online Safety Act 2023:
It flags "the potential impact of the Foreign Interference offence on foreign policy work by think tanks and journalists" and devotes a section, "The Online Safety Act Dimension", to how that Act interacts with these national security provisions.
The Online Safety Act already contains controversial provisions requiring platforms to implement "accredited technology" to scan for child abuse material, technology that doesn't exist without breaking encryption. Combined with the NSA's hostile activity provisions, this creates a pincer movement against privacy technology.
The Apple iCloud Precedent
The report doesn't mention it, but context matters: Apple was served with a technical capability notice under the Investigatory Powers Act demanding it weaken encryption for UK users. Rather than create a backdoor, Apple disabled its Advanced Data Protection feature in the UK.
This demonstrates that UK authorities are willing to use legal pressure to undermine encryption, and the NSA provides another legal pathway to achieve the same goal.
Schedule 3 Powers: Detention Without Suspicion
The Schedule 3 examination powers are particularly concerning from a technical perspective because they:
- Require no suspicion: officers can stop and examine anyone to determine if they're engaged in hostile activity
- Allow device searches: including accessing encrypted devices and their contents
- Permit detention: up to 6 hours for examination, extendable further
- Retain data: copied information can be retained indefinitely under certain conditions
The report explains the examination standard:
"Examining officers must take particular care to ensure that 'protected characteristics' (whether separately or together) are not used as criteria for selection except to the extent that they are used in association with considerations that relate to the threat from hostile activity."
For developers traveling with encrypted devices containing proprietary code, security research, or confidential client information, this creates substantial risk. The "constructive knowledge" standard means you could be detained and examined simply because an officer determines you "ought to have known" your encryption work might benefit a foreign state.
Protected Information: When Is Your Code a National Secret?
The protected information offence (Section 1) creates additional technical problems. Information qualifies as "protected" if:
"access is restricted 'in any way' for the purposes of protecting the safety or interests of the United Kingdom; or access would reasonably be expected to be restricted"
The report notes the problematic breadth:
âThe information, document or article does not have to be held at a government location or by a government official, and the restriction can be applied by anyone.â
This means:
- Your private cryptographic research, if password-protected, could be "protected information"
- Security vulnerabilities you've discovered and disclosed privately could qualify
- Even information that should have been restricted but wasn't still falls under the Act
The Prosecutorial Discretion Illusion
Throughout the report, Hall repeatedly notes that the government's answer to concerns about overbreadth is "prosecutorial discretion": the idea that authorities will exercise judgment in deciding what to prosecute.
He's not convinced:
"Entrusting discretion to a Law Officer can never be a complete answer to the potential reach of the new offences, applicable as they are in zones of precious human activity such as journalism, protest and politics. It is limited comfort to an individual who is worried about the legality of the conduct that they will never, in practice, be prosecuted for it."
For developers, this uncertainty is corrosive. The law technically criminalizes normal security engineering practices, with only assurances that prosecutors won't misuse their power. This creates:
1. Chilling effects: developers may avoid certain legitimate projects
2. Selective enforcement risk: authorities could target specific individuals or projects
3. Discriminatory application: as seen with the US "China Initiative," national security law can enable racial profiling
Recommendations and Next Steps
Hall makes only three formal recommendations in his report, acknowledging the legislation is too new to fully evaluate. However, he identifies several "areas which will require particular vigilance":
1. The power to stop and detain unwitting travelers at borders
2. The Foreign Power Condition's operation without requiring contact with foreign powers
3. Innocent interactions with Foreign Intelligence Services
4. The potential impact of foreign interference offences on think tanks and journalists
5. Avoiding excessively harsh measures against "weak, foolish, and inadequate individuals"
For the encryption development community, the report makes clear that current UK law:
- Lacks proportionality requirements distinguishing serious threats from routine privacy engineering
- Uses vague definitions that sweep in legitimate security research
- Relies too heavily on prosecutorial discretion rather than clear legal boundaries
- Creates liability based on "constructive knowledge" that may not reflect technical reality
What Developers Should Know
If you develop encryption tools, privacy-enhancing technologies, or security software:
1. Your work could technically qualify as hostile activity under Schedule 3, even without any malicious intent
2. International collaboration is particularly risky given the broad FIS definition
3. Publishing code publicly could satisfy the "material assistance" standard if it might help any foreign intelligence service
4. Traveling to/from the UK with devices containing security research exposes you to detention and device searches
5. Open source security research falls into the same legal category as espionage
The law makes no distinction between:
- Building encryption to protect dissidents from oppression
- Building encryption to hide criminal activity
- Building encryption as a competitive feature
- Building encryption because it's technically interesting
All are theoretically "hostile activity" if they might benefit a foreign state by impeding UK surveillance capabilities.
Technical Comparisons to Other Jurisdictions
The UKâs approach is increasingly out of step with technical reality and international norms:
European Union: GDPR mandates encryption for data protection; the EU is moving toward stronger encryption requirements, not weaker ones.
United States: Despite pressure from law enforcement, major platforms continue offering end-to-end encryption. The US government hasn't criminalized encryption development itself.
Australia: While the Assistance and Access Act allows technical capability notices, it explicitly doesn't permit creating systemic weaknesses. It also has proportionality requirements.
The UKâs framework is unique in treating encryption development itself as potentially hostile activity, rather than focusing on specific misuse of technology.
The Economic Impact
Beyond civil liberties concerns, this legal framework threatens the UKâs technology sector:
- Talent flight: security researchers and cryptographers will avoid UK-based work
- Investment deterrence: why invest in UK privacy tech companies facing legal uncertainty?
- Competitive disadvantage: UK companies can't credibly offer end-to-end encryption if the government labels it hostile
- Innovation suppression: cutting-edge security research becomes legally risky
The report itself notes:
"The UK has no hope of being a leader in AI and advanced technologies if its regulations get in the way of creating basic communications systems."
Conclusion: A Framework Too Broad for Technical Reality
Jonathan Hall's review makes clear that the UK's national security legislation has created a legal framework where:
- Normal software development could be hostile activity
- International research collaboration could be assisting a Foreign Intelligence Service
- Publishing security research could be prejudicial to UK interests
- Building privacy tools could benefit foreign states
The football player example isn't just absurd; it's instructive. It shows that without proportionality requirements or clear limits, anything that might benefit another country becomes a potential national security threat.
For encryption developers and security researchers, the message is stark: the UK has created a legal environment where the tools that protect everyone's privacy and security are treated with the same suspicion as espionage.
As Hall notes in his conclusion:
"Some of the powers and offences extend well into the zone of political activity, journalism, protest and day-to-day human activity. However useful, they must be tested against misuse and overreach."
The encryption development community is now in that testing phase, operating under laws that could criminalize the very work that makes the internet secure.
For more information:
- Full Report: State Threats Legislation in 2024
- Jonathan Hall KC's analysis represents independent oversight of these laws, but Parliament has not yet acted to narrow their scope
- The Foreign Influence Registration Scheme (Part 4 of the NSA) was not yet in force in 2024 but will add additional compliance burdens for organizations with foreign connections
This analysis is based on the December 2025 report by Independent Reviewer Jonathan Hall KC and publicly available legislation. It does not constitute legal advice.