New Zealand is embarking on a critical journey to redefine privacy in our increasingly digital world. As technology advances at an unprecedented pace, so too does the public’s concern about the impact on their personal information. The good news? New Zealand’s privacy regulations are actively adapting, driven by a clear mandate to balance innovation with robust safeguards for every individual’s privacy.

Public sentiment is loud and clear: privacy is a major concern for New Zealanders. A recent survey found that two-thirds of respondents view protecting personal information as a top priority in their lives, with over 80% wanting greater control and choice over how their data is collected. There’s particular unease around children’s privacy, how social media companies manage personal data, and the growing use of Artificial Intelligence (AI) by both government and businesses in decision-making. This collective concern is not just a sentiment; it’s a powerful signal that organisations must embrace a ā€œPrivacy on Purposeā€ approach.

The Biometric Processing Privacy Code 2025: A New Era for Sensitive Data

One of the most significant recent developments is the Biometric Processing Privacy Code 2025, which officially takes effect on November 3, 2025 for new biometric processing activities and on August 3, 2026 for existing ones. This landmark Code establishes specific privacy rules for any organisation collecting and using individuals’ biometric information, such as facial recognition data, for identification or to infer personal details.

The Code directly confronts a range of critical privacy risks, including:

  • Over-collection and over-retention of sensitive biometric data.
  • Inaccuracy or security vulnerabilities affecting biometric information.
  • A fundamental lack of transparency about how biometric data is processed.
  • The risk of misidentification or misclassification, particularly based on attributes like race, ethnicity, gender, age, or disability, which can perpetuate bias.
  • The chilling effect of surveillance, monitoring, or profiling that could deter individuals from exercising their protected rights.
  • ā€œScope creep,ā€ where collected information is used for expanded purposes without explicit knowledge or authorisation.
  • A diminished ability for individuals to avoid monitoring in spaces where they reasonably expect privacy.

Crucially, the Code places significant limits on ā€œbiometric categorisationā€ā€”automated processes that analyse biometric information to infer health information, personal characteristics (like personality, mood, or mental state), or to assign individuals to demographic categories. Such categorisation is generally prohibited unless strict conditions are met, such as assisting with accessibility, preventing serious threats to public health or safety, or for ethical statistical or research purposes.

Organisations collecting biometric data must adhere to comprehensive rules on purpose, collection manner, storage and security, individual access and correction rights, retention limits, and restrictions on its use and disclosure, including cross-border transfers. This demonstrates a clear commitment to protecting some of our most sensitive personal information.

AI, Digital Identity, and Empowering Individuals

Beyond biometrics, New Zealand’s broader digital strategy reinforces its commitment to privacy. The country launched its first national Artificial Intelligence (AI) strategy in July 2025, setting principles for responsible AI use and emphasising the need for public trust and ethical data practices. This is supported by initiatives like the Ministry of Business, Innovation and Employment (MBIE)’s ā€œResponsible AI Guidance for Businessesā€ toolkit and a Public Service AI Framework.

The push for a modern digital identity system also integrates privacy principles. The Digital Identity Services Trust Framework (DISTF), which took effect in late July 2025, regulates providers of verified digital identity services. The Minister for Digitising Government emphasised that modern digital identity systems using biometrics, verifiable credentials, and digital wallets are designed to be more user-friendly, secure, and private.

Furthermore, recent legislative updates are designed to empower individuals with greater control over their data:

  • The Privacy Amendment Bill, which received royal assent in June 2025, introduces new transparency enhancements to privacy law.
  • The Customer and Product Data Act (CPDA) 2025 establishes a customer data right, giving individuals greater access and control over their data and facilitating secure, standardised data sharing. Breaches of its data storage and security requirements are even considered breaches of the Privacy Act, investigable by the Privacy Commissioner.

A key concept championed by the Privacy Commissioner is ā€œdata minimisationā€ā€”collecting only the essential information needed for a specific purpose. This isn’t just good practice; it’s a proactive defence, because ā€œif it’s not there, it can’t be stolen by cybercriminalsā€.

Privacy as Our First Line of Cybersecurity Defence

The importance of strong privacy practices is underscored by New Zealand’s challenging cybersecurity landscape. Cyber threats are becoming increasingly sophisticated, leveraging AI and cryptocurrency to disguise attacks and make them harder to trace. New Zealanders experienced an estimated $1.6 billion in financial losses from online threats in 2024, with over half of these losses impacting businesses. The first quarter of 2025 alone saw NZ$7.8 million lost to cyber incidents, a 14.7% increase on the previous quarter. Ransomware, in particular, is evolving with ā€œdoubleā€ and ā€œtriple extortionā€ tactics, which involve not just encrypting data but also stealing and threatening to leak sensitive personal information, directly harming individuals whose data was compromised.

Despite this, nearly half (44%) of people who experience cyber attacks do not report them, often due to apathy or a belief that reporting won’t make a difference. This underreporting hinders a full understanding of the threat landscape and effective responses.

This highlights a critical point: strong privacy measures are not just about compliance; they are fundamental to cybersecurity. By collecting less data, ensuring its accuracy, implementing robust security safeguards, and being transparent about its use, organisations significantly reduce their vulnerability to cyber attacks and the severe harm they inflict.

Building Trust Through ā€œPrivacy on Purposeā€

The message to organisations is clear: proactive and intentional privacy practices are non-negotiable. New Zealanders are not passive observers; two-thirds would consider changing service providers due to poor privacy practices. This makes ā€œPrivacy on Purposeā€ not just an ethical stance, but a strategic imperative.

For organisations handling any personal information, especially sensitive biometric data, it’s vital to:

  • Understand and implement the Biometric Processing Privacy Code 2025, paying close attention to the limits on biometric categorisation and the extensive safeguards required.
  • Embrace ā€œprivacy by designā€ and ā€œsecurity by designā€ in all technological developments, particularly with AI systems.
  • Prioritise data minimisation to reduce both privacy and cybersecurity risks.
  • Ensure transparency with individuals about what data is collected, why, how it’s used, and what alternatives are available.
  • Empower individuals with clear processes for accessing, correcting, and controlling their personal information.

By doing so, organisations not only comply with New Zealand’s evolving regulations but also foster the public trust essential for a thriving, secure, and privacy-respecting digital Aotearoa.