A deep dive into the alarming technical and legal implications of Schedule 3 and how encryption developers could face national security prosecution

In a report that should concern every developer working on secure communications, the UK’s Independent Reviewer of State Threats Legislation has revealed that creating apps with end-to-end encryption could legally qualify as “hostile activity” under current national security legislation. The implications extend far beyond encryption – even targeting a football player could theoretically count as a threat to national security.

The Technical Reality: How Encryption Became “Hostile”

The UK’s National Security Act 2023 and Schedule 3 to the Counter-Terrorism and Border Security Act 2019 create an expansive framework for identifying “hostile activity” that Jonathan Hall KC, the Independent Reviewer, describes in his December 2025 report as having “incredibly broad scope.”

Download: E03512978_-_Un-Act_The_National_Security_Act_in_2024_ELAY.pdf (1002 KB)

The Encryption Problem

Under Schedule 3, a person can be examined for “hostile activity” even if they’re completely unaware their actions qualify as such. The report explicitly states:

“Under Schedule 3 a person may be engaged in hostile activity even though unaware that their activity is hostile activity. So a person could be examined on account of their wholly inadvertent and morally blameless conduct.”

Hall provides a chilling example directly related to encryption technology:

“The developer of an app, whose selling point is end-to-end encryption which would make it more difficult for UK security and intelligence agencies to monitor communications. It is a reasonable assumption that this would be in the interests of a foreign state even though the foreign state has never contemplated this potential advantage.”

This means developers of apps like Signal, WhatsApp, or any secure messaging platform could technically fall within the legal definition of hostile activity simply because their encryption makes surveillance harder – regardless of intent.

The Double-Ignorance Doctrine

The legislation introduces what the report calls a “double-ignorance” scenario:

“Since hostile activity does not require any knowledge or tasking by a foreign state, the phenomenon of double-ignorance could arise. A person may be engaged in hostile activity if they do something which, unknown to them, threatens national security and which is in the interests of another State, also entirely in the dark.”

From a technical perspective, this creates an impossible situation. Every cryptographic library, every security protocol, every privacy-enhancing technology could theoretically benefit a foreign state by making surveillance more difficult. The code itself becomes suspect.
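To make the point concrete, here is a deliberately minimal sketch of the kind of key-agreement code at the heart of every end-to-end encrypted messenger (classic finite-field Diffie-Hellman). The parameters below are toy values chosen for readability, not security; real applications use vetted curves such as X25519 from audited libraries:

```python
import secrets

# Toy finite-field Diffie-Hellman key agreement -- the primitive that lets
# two parties derive a shared secret no eavesdropper can compute.
# NOT secure: the modulus is deliberately tiny for illustration.
P = 0xFFFFFFFB  # small prime modulus (2**32 - 5), toy value only
G = 5           # generator

def keypair():
    private = secrets.randbelow(P - 2) + 1   # random private exponent
    public = pow(G, private, P)              # g^private mod p
    return private, public

# Alice and Bob each generate a keypair and exchange only the public halves.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Each side combines its own private key with the other's public key.
alice_shared = pow(b_pub, a_priv, P)
bob_shared = pow(a_pub, b_priv, P)

assert alice_shared == bob_shared  # identical secret, never transmitted
```

A few dozen lines like these, published openly, are indistinguishable from ordinary engineering practice, yet they squarely "make surveillance more difficult" in the sense the report describes.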

The Football Player Paradox

In perhaps the most absurd example illustrating the law’s overbreadth, the report reveals that even sports could fall under national security legislation:

“Finally, there is no requirement that the hostile act is in the interests of a foreign state in a way that is relevant to its national security. For example, it is in the interests of Country A to advance to the Quarter Finals of the World Cup. If Country A were to send an agent to break the leg of England’s leading goal scorer, that would therefore count as involvement in hostile activity (under the serious crime limb) even if it could not sensibly be said that getting a quarter final place was a matter of Country A’s national security.”

This example demonstrates how the absence of proportionality requirements in the legislation means virtually any action benefiting a foreign state – from encryption development to industrial espionage to sports sabotage – falls under the same legal framework.

Technical Definitions That Sweep Too Broadly

The Foreign Power Condition

The legislation’s “Foreign Power Condition” operates in two modes, both problematic for developers:

Tasked Mode: Conduct carried out for or on behalf of a foreign power through:

  • Direction or control
  • Financial or other assistance
  • Collaboration or agreement

Untasked Mode: Conduct intended to benefit a foreign power, even without their knowledge.

The report explains:

“Foreign ‘patriots’ are brought into scope. In 2024 US intelligence agencies referred to individuals not under the direct supervision of the Chinese Communist Party who ‘may attempt election influence activities they perceive are in line with Beijing’s goals’. The point is that some individuals act on their own initiative and do not need to be tasked.”

For encryption developers, this means creating privacy tools with the general intent of protecting people from surveillance – including from authoritarian regimes – could satisfy the “untasked” mode if prosecutors argue the developer intended to benefit foreign states.

Foreign Intelligence Service: A Dangerously Vague Definition

The Act defines a Foreign Intelligence Service (FIS) as:

“…any person whose functions include carrying out intelligence activities for or on behalf of a foreign power”

The report acknowledges this captures far more than traditional spy agencies:

“The FIS definition is lexically short but conceptually broad because there are no universal standards for organising the intelligence activities of foreign powers of concern to the United Kingdom… the function-orientated definition is capable of capturing certain diplomats, police, foreign ministries, private companies, and student bodies tasked with reporting back to the mother country, as well as conventional intelligence bureaux.”

This creates severe uncertainty for developers. If you build encryption tools, who might use them? Could research collaborations with international institutions qualify as assisting a FIS? The boundaries are deliberately unclear.

Real-World Technical Scenarios at Risk

Scenario 1: The AI Researcher

The report provides this example:

“An artificial intelligence researcher who puts his novel code on a public-access platform, believing strongly in the importance of open collaboration, where this code is likely to assist a FIS in hacking UK-based computers.”

This is standard practice in modern software development. GitHub, the world’s largest code hosting platform, exists precisely to enable this kind of collaboration. Yet under Schedule 3, this could constitute hostile activity if:

  1. The code could be used for offensive security purposes
  2. A foreign intelligence service might benefit from it
  3. The researcher “ought reasonably to know” this is likely

The “constructive knowledge” standard is particularly problematic. As the report notes:

“Constructive knowledge is therefore distinct from wilful blindness” and brings in “an element of objectivity (‘ought to know’) which is not necessarily present in actual knowledge.”

Scenario 2: The Lobbyist for Electronic Chip Manufacturing

Another example from the report:

“The lobbyist for a foreign firm, who seeks to persuade an electronic chip manufacturer to build its factory in France rather than the UK. This would engage the UK’s economic well-being in a way relevant to national security.”

This illustrates how economic competition becomes securitized. For tech companies, this means:

  • Business development with international partners could be hostile activity
  • Advocating for technology policy that benefits foreign competitors could be hostile activity
  • Even routine market competition could theoretically fall under national security law

Scenario 3: The Journalist with Leaked Information

“Examples could include a journalist carrying confidential information whose significance to national security he did not understand, or the victim of planted material. The examining officer could act even if there was no possibility that the person was aware that its dissemination might be in the interests of a foreign state, or even that they were carrying the material.”

For security researchers and journalists working in cybersecurity, this creates a chilling effect. Publishing vulnerability research, handling leaked documents, or reporting on surveillance capabilities all potentially qualify as hostile activity under these provisions.

The Broader Assault on Encryption

This isn’t happening in isolation. The report contextualizes these provisions within the UK’s wider legislative framework attacking encryption:

The Online Safety Act Connection

The report notes potential synergies with the Online Safety Act 2023:

“The potential impact of the Foreign Interference offence on foreign policy work by think tanks and journalists” and how “The Online Safety Act Dimension” interacts with these national security provisions.

The Online Safety Act already contains controversial provisions requiring platforms to implement “accredited technology” to scan for child abuse material – technology that doesn’t exist without breaking encryption. Combined with the National Security Act’s hostile activity provisions, this creates a pincer movement against privacy technology.

The Apple iCloud Precedent

The report doesn’t mention it, but context matters: Apple was reportedly served with a technical capability notice under the Investigatory Powers Act demanding it weaken encryption for UK users. Rather than create a backdoor, Apple disabled its Advanced Data Protection feature in the UK.

This demonstrates that UK authorities are willing to use legal pressure to undermine encryption, and the National Security Act provides another legal pathway to achieve the same goal.

Schedule 3 Powers: Detention Without Suspicion

The Schedule 3 examination powers are particularly concerning from a technical perspective because they:

  • Require no suspicion: Officers can stop and examine anyone to determine if they’re engaged in hostile activity
  • Allow device searches: Including accessing encrypted devices and their contents
  • Permit detention: Up to 6 hours for examination, extendable further
  • Retain data: Copied information can be retained indefinitely under certain conditions

The report explains the examination standard:

“Examining officers must take particular care to ensure that ‘protected characteristics’ (whether separately or together) are not used as criteria for selection except to the extent that they are used in association with considerations that relate to the threat from hostile activity.”

For developers traveling with encrypted devices containing proprietary code, security research, or confidential client information, this creates substantial risk. The “constructive knowledge” standard means you could be detained and examined simply because an officer determines you “ought to have known” your encryption work might benefit a foreign state.

Protected Information: When Is Your Code a National Secret?

The protected information offence (Section 1) creates additional technical problems. Information qualifies as “protected” if:

“access is restricted ‘in any way’ for the purposes of protecting the safety or interests of the United Kingdom; or access would reasonably be expected to be restricted”

The report notes the problematic breadth:

“The information, document or article does not have to be held at a government location or by a government official, and the restriction can be applied by anyone.”

This means:

  • Your private cryptographic research, if password-protected, could be “protected information”
  • Security vulnerabilities you’ve discovered and disclosed privately could qualify
  • Even information that should have been restricted but wasn’t still falls under the Act
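To see how low the bar sits, consider a hypothetical sketch (Python standard library only, all names invented for illustration) of routine password-gating for private research notes. Under the Act’s “restricted in any way” wording, even this commonplace step could arguably render the notes “protected information”:

```python
import hashlib
import hmac
import secrets

# Hypothetical example: gating access to private research notes behind a
# password -- routine practice, yet a "restriction" in the Act's broad sense.

def protect(password: str) -> tuple[bytes, bytes]:
    """Derive a salted PBKDF2 verifier for the password (stdlib only)."""
    salt = secrets.token_bytes(16)
    verifier = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, verifier

def check(password: str, salt: bytes, verifier: bytes) -> bool:
    """Check a password attempt against the stored verifier."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, verifier)  # constant-time compare

salt, verifier = protect("correct horse battery staple")
assert check("correct horse battery staple", salt, verifier)
assert not check("wrong password", salt, verifier)
```

Nothing here is exotic – the same pattern appears in virtually every application that stores credentials, which is precisely why the report flags the definition’s breadth.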

The Prosecutorial Discretion Illusion

Throughout the report, Hall repeatedly notes that the government’s answer to concerns about overbreadth is “prosecutorial discretion” – the idea that authorities will exercise judgment in deciding what to prosecute.

He’s not convinced:

“Entrusting discretion to a Law Officer can never be a complete answer to the potential reach of the new offences, applicable as they are in zones of precious human activity such as journalism, protest and politics. It is limited comfort to an individual who is worried about the legality of the conduct that they will never, in practice, be prosecuted for it.”

For developers, this uncertainty is corrosive. The law technically criminalizes normal security engineering practices, with only assurances that prosecutors won’t misuse their power. This creates:

  1. Chilling effects: Developers may avoid certain legitimate projects
  2. Selective enforcement risk: Authorities could target specific individuals or projects
  3. Discriminatory application: As seen with the US “China Initiative,” national security law can enable racial profiling

Recommendations and Next Steps

Hall makes only three formal recommendations in his report, acknowledging the legislation is too new to fully evaluate. However, he identifies several “areas which will require particular vigilance”:

  1. The power to stop and detain unwitting travelers at borders
  2. The Foreign Power Condition’s operation without requiring contact with foreign powers
  3. Innocent interactions with Foreign Intelligence Services
  4. The potential impact of foreign interference offences on think tanks and journalists
  5. Avoiding excessively harsh measures against “weak, foolish, and inadequate individuals”

For the encryption development community, the report makes clear that current UK law:

  • Lacks proportionality requirements distinguishing serious threats from routine privacy engineering
  • Uses vague definitions that sweep in legitimate security research
  • Relies too heavily on prosecutorial discretion rather than clear legal boundaries
  • Creates liability based on “constructive knowledge” that may not reflect technical reality

What Developers Should Know

If you develop encryption tools, privacy-enhancing technologies, or security software:

  1. Your work could technically qualify as hostile activity under Schedule 3, even without any malicious intent
  2. International collaboration is particularly risky given the broad FIS definition
  3. Publishing code publicly could satisfy the “material assistance” standard if it might help any foreign intelligence service
  4. Traveling to/from the UK with devices containing security research exposes you to detention and device searches
  5. Open source security research falls into the same legal category as espionage

The law makes no distinction between:

  • Building encryption to protect dissidents from oppression
  • Building encryption to hide criminal activity
  • Building encryption as a competitive feature
  • Building encryption because it’s technically interesting

All are theoretically “hostile activity” if they might benefit a foreign state by impeding UK surveillance capabilities.

Technical Comparisons to Other Jurisdictions

The UK’s approach is increasingly out of step with technical reality and international norms:

European Union: GDPR mandates encryption for data protection; the EU is moving toward stronger encryption requirements, not weaker ones.

United States: Despite pressure from law enforcement, major platforms continue offering end-to-end encryption. The US government hasn’t criminalized encryption development itself.

Australia: While the Assistance and Access Act allows technical capability notices, it explicitly doesn’t permit creating systemic weaknesses. It also has proportionality requirements.

The UK’s framework is unique in treating encryption development itself as potentially hostile activity, rather than focusing on specific misuse of technology.

The Economic Impact

Beyond civil liberties concerns, this legal framework threatens the UK’s technology sector:

  1. Talent flight: Security researchers and cryptographers will avoid UK-based work
  2. Investment deterrence: Why invest in UK privacy tech companies facing legal uncertainty?
  3. Competitive disadvantage: UK companies can’t credibly offer end-to-end encryption if the government labels it hostile
  4. Innovation suppression: Cutting-edge security research becomes legally risky

The report itself notes:

“The UK has no hope of being a leader in AI and advanced technologies if its regulations get in the way of creating basic communications systems.”

Conclusion: A Framework Too Broad for Technical Reality

Jonathan Hall’s review makes clear that the UK’s national security legislation has created a legal framework where:

  • Normal software development could be hostile activity
  • International research collaboration could be assisting a Foreign Intelligence Service
  • Publishing security research could be prejudicial to UK interests
  • Building privacy tools could benefit foreign states

The football player example isn’t just absurd – it’s instructive. It shows that without proportionality requirements or clear limits, anything that might benefit another country becomes a potential national security threat.

For encryption developers and security researchers, the message is stark: the UK has created a legal environment where the tools that protect everyone’s privacy and security are treated with the same suspicion as espionage.

As Hall notes in his conclusion:

“Some of the powers and offences extend well into the zone of political activity, journalism, protest and day-to-day human activity. However useful, they must be tested against misuse and overreach.”

The encryption development community is now in that testing phase, operating under laws that could criminalize the very work that makes the internet secure.


For more information:

  • Full Report: State Threats Legislation in 2024
  • Jonathan Hall KC’s analysis represents independent oversight of these laws, but Parliament has not yet acted to narrow their scope
  • The Foreign Influence Registration Scheme (Part 4 of the National Security Act) was not yet in force in 2024 but will add additional compliance burdens for organizations with foreign connections

This analysis is based on the December 2025 report by Independent Reviewer Jonathan Hall KC and publicly available legislation. It does not constitute legal advice.