Age Verification and Child Protection Online: A Legal Perspective Based on the AEPD’s Guidance

As online interactions and digital services increasingly integrate into the everyday lives of children, concerns over the protection of their personal data and exposure to harmful content have surged. To address these concerns, regulatory frameworks across Europe have evolved to impose stringent requirements on digital platforms, ensuring that children’s rights are protected and that age verification systems are properly implemented. The Spanish Data Protection Agency (AEPD) has taken a leading role by publishing comprehensive guidance on how to protect minors online while adhering to data protection regulations. In this article, we will explore the legal basis underpinning these measures, drawing from key pieces of European and Spanish legislation.

1. The GDPR: A Foundation for Child Data Protection

The General Data Protection Regulation (GDPR), widely regarded as the most comprehensive data protection law globally, places a particular emphasis on protecting the privacy and personal data of children. This is specifically addressed in Recital 38 and Article 8, which call for additional safeguards when handling children’s data, recognizing that minors may be less aware of the risks and implications of personal data processing.

Recital 38 of the GDPR:

This section emphasizes the vulnerability of children in digital spaces, mandating special protection for their data, particularly in relation to:

  • Marketing practices,
  • User profiling, and
  • The collection of personal data when services are offered directly to a child.

The GDPR requires that any data collection practices targeting minors must ensure that children's rights are prioritized and that appropriate safeguards are in place. For instance, companies must obtain parental consent before processing the personal data of children under the age of 16 (or lower, depending on national laws). This highlights the importance of age verification systems, which are necessary to ensure that only parents or guardians can consent to data processing on behalf of children.

Article 8 of the GDPR:

Article 8 further enforces the need for parental consent by stipulating the conditions under which consent is deemed lawful when children’s data is involved. The legal basis for data processing requires companies to:

  • Implement mechanisms to verify the age of users,
  • Obtain verifiable parental consent for children under the specified age threshold, and
  • Ensure that data processing activities are transparent, with clear communication about the handling of children's data.
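
As a rough illustration of how these Article 8 obligations translate into a sign-up flow, the sketch below gates data processing on either the age of digital consent or a record of verifiable parental consent. The constant, the dataclass fields, and the consent token are illustrative assumptions, not a prescribed implementation; demonstrating "reasonable efforts" to verify consent under Article 8(2) is far more involved in practice.

```python
from dataclasses import dataclass
from typing import Optional

DIGITAL_CONSENT_AGE = 16  # GDPR default; member states may lower it to 13

@dataclass
class SignupRequest:
    declared_age: int
    # Hypothetical token issued after a verified guardian approval step.
    parental_consent_token: Optional[str] = None

def may_process_data(request: SignupRequest) -> bool:
    """Return True only when processing this user's personal data is permissible."""
    if request.declared_age >= DIGITAL_CONSENT_AGE:
        return True
    # Below the threshold, verifiable parental consent is required (Art. 8(2)).
    return request.parental_consent_token is not None

print(may_process_data(SignupRequest(declared_age=17)))  # True: at or above threshold
print(may_process_data(SignupRequest(declared_age=12)))  # False: no parental consent
```

The point of the sketch is the shape of the decision, not its details: the service never branches on anything beyond the minimum needed to establish whether consent is lawful.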

2. The Digital Services Act (DSA): Safeguarding Minors in Digital Spaces

The Digital Services Act (DSA), passed by the European Union in 2022, establishes a broad regulatory framework for digital services, from small platforms to the largest tech giants. A key aspect of the DSA is the protection of children’s rights in the digital environment, specifically through age verification and the creation of safe online spaces for minors.

Recital 71 of the DSA:

This section underscores the responsibility of online platforms to ensure that services accessible to minors meet high standards of privacy, security, and protection by default. This means that digital platforms must implement age-appropriate designs and ensure that minors are not exposed to harmful content or contacted by malicious actors.

The DSA also prohibits platforms from presenting advertisements to minors based on profiling of their personal data (Article 28). This effectively mandates that companies operating within the EU must:

  • Implement age verification systems that do not infringe on children’s privacy rights,
  • Avoid using data-driven profiling techniques for advertising to minors, and
  • Ensure that children’s data is not unnecessarily collected or stored.

The DSA recognizes that enforcing age verification is essential for complying with data protection regulations, but it also warns against over-collecting data under the guise of verification. Recital 71 explicitly notes that platforms must not be incentivized to gather more personal data than necessary to determine the age of users, ensuring a balance between child protection and privacy.
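
One way to honour that warning is for a trusted verifier to reduce everything it knows to a single boolean claim, so the platform never sees a birthdate or an identity document. The sketch below is a hypothetical illustration of that data-minimisation pattern (the function name and claim format are assumptions, and the reference date is fixed for reproducibility):

```python
from datetime import date

def issue_age_attribute(birthdate: date, threshold_years: int = 18) -> dict:
    """Run by the verifier: reduces a full birthdate to a single boolean claim."""
    today = date(2024, 1, 1)  # fixed reference date for this sketch
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    # Only the minimal claim leaves the verifier; the birthdate does not.
    return {"over_threshold": age >= threshold_years}

claim = issue_age_attribute(date(2010, 6, 15))
print(claim)  # {'over_threshold': False}
```

The platform receiving `claim` can gate content on it while holding no more personal data than the yes/no answer itself, which is the balance Recital 71 asks for.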

3. Spanish Audiovisual Communication Law: Age Verification for Harmful Content

In addition to the GDPR and DSA, Spain has implemented its own national legislation to further protect minors in the audiovisual sector. The Spanish Audiovisual Communication Law (Law 13/2022) outlines specific provisions for protecting children from harmful content, such as violent or pornographic material. The law mandates the use of age verification systems to ensure that minors cannot access such content, making this a central requirement for platforms operating in Spain.

Article 88 of the Audiovisual Communication Law:

This article requires platforms that offer video-sharing services to implement measures that protect minors from harmful content, including:

  • Age verification systems for users attempting to access violent or pornographic content,
  • Parental control systems that allow guardians to restrict access to inappropriate content,
  • User-friendly mechanisms for reporting content that may harm minors’ physical, mental, or moral development.

Platforms that fail to implement these safeguards are at risk of non-compliance, which can lead to significant penalties under Spanish law. The law recognizes that children deserve enhanced protection in digital spaces and that platforms must take proactive steps to prevent minors from accessing inappropriate content.

Article 89 of the Audiovisual Communication Law:

This section builds on Article 88 by providing additional details about the enforcement of age verification systems and the responsibilities of service providers. It obligates video-sharing platforms to:

  • Establish transparent and user-friendly procedures for notifying platform providers about harmful content,
  • Implement effective age verification systems that block minors from accessing content like pornography or gratuitous violence, and
  • Provide end-user parental controls to safeguard children.
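
A minimal sketch of how such end-user parental controls might compose with the legal minimum: the guardian can only add restrictions on top of the categories the platform must block for minors anyway. The category labels are illustrative assumptions, not taken from the law itself.

```python
# Categories the platform must always block for minors under Arts. 88-89.
RESTRICTED_BY_LAW = {"pornography", "gratuitous_violence"}

def guardian_policy(extra_blocked=frozenset()) -> set:
    """Categories blocked for a child account: legal minimum plus guardian choices."""
    return RESTRICTED_BY_LAW | set(extra_blocked)

def can_view(content_category: str, policy: set) -> bool:
    return content_category not in policy

policy = guardian_policy({"gambling"})
print(can_view("cartoons", policy))   # True: not restricted
print(can_view("gambling", policy))   # False: blocked by the guardian
```

The union-only design means a guardian can tighten, but never loosen, the statutory baseline.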

The law places a clear responsibility on platforms to prevent harm to minors by verifying the age of users before allowing them to access harmful content, reinforcing the principle that protection should be by design and by default.

4. The Future of Child Protection in Digital Spaces

As regulatory frameworks like the GDPR, DSA, and the Audiovisual Communication Law evolve, the focus remains firmly on protecting minors in online environments. However, as these legal frameworks highlight, it is not enough to simply implement age verification systems—these systems must be designed in a way that minimizes data collection, respects children’s privacy, and provides a safe digital environment by default.

Moving forward, the challenge for digital platforms is to balance privacy, security, and compliance with the legal requirements of child protection. Platforms must consider how to:

  • Implement privacy-preserving age verification that avoids over-collection of personal data,
  • Ensure transparency in how children's data is processed and protected, and
  • Collaborate with regulators, parents, and educators to create a safer internet for all users, particularly minors.

The AEPD’s guidance serves as a crucial roadmap for how companies can align their practices with existing legal frameworks while addressing the critical need to protect children online. By integrating these legal principles into their systems and services, platforms can not only ensure compliance but also foster trust among users and contribute to a safer digital future.

Conclusion

The intersection of child protection, age verification, and data privacy remains one of the most critical issues for digital platforms. The legal frameworks provided by the GDPR, Digital Services Act, and Spanish Audiovisual Communication Law form the backbone of the protection measures, ensuring that children are safe in the digital world. Platforms must now focus on implementing these legal mandates effectively while minimizing risks to children’s privacy. As age verification becomes increasingly essential, these legal standards will continue to shape how platforms engage with and protect their youngest users.

The GDPR, Digital Services Act (DSA), and Spanish Audiovisual Communication Law are primarily European regulations, and while they share similar objectives with laws like COPPA (Children’s Online Privacy Protection Act) in the United States, they are distinct and not directly tied to COPPA or the Children’s Advertising Review Unit’s (CARU’s) Self-Regulatory Program for Children’s Advertising.

Let’s briefly break down the differences and relationships between these laws:

1. COPPA (Children’s Online Privacy Protection Act)

  • Jurisdiction: United States
  • Focus: COPPA is a U.S. law that was enacted in 1998, and it specifically regulates the online collection of personal information from children under the age of 13. Websites, apps, and online services that are directed towards children or knowingly collect information from children must comply with COPPA.
  • Requirements:
    • Obtain verifiable parental consent before collecting personal data from children under 13.
    • Provide notice of data collection practices.
    • Allow parents to review and delete the data collected from their children.
    • Implement reasonable data security practices.

Relation to GDPR/DSA:

  • While COPPA focuses specifically on the protection of children's privacy in the U.S., GDPR is broader and covers all individuals’ data across the European Union, with Recital 38 and Article 8 providing specific protections for children's data.
  • Both COPPA and GDPR aim to protect children’s personal data, but GDPR applies to children under 16 (though member states can lower the age threshold to 13), and COPPA applies to children under 13. Both also require parental consent for data processing.

Key Difference: COPPA targets a narrower group (children under 13) and covers only their online privacy, while the GDPR is a general data protection law covering all individuals, with additional safeguards for minors.

2. CARU’s Self-Regulatory Program for Children’s Advertising

  • Jurisdiction: United States (industry self-regulation)
  • Focus: This is a self-regulatory program that ensures that advertising aimed at children under 12 follows certain guidelines to avoid deceptive practices. It is run by the Children’s Advertising Review Unit (CARU), which is part of BBB National Programs (formerly affiliated with the Better Business Bureau).
  • Requirements:
    • Ads should not mislead children about products, pricing, or results.
    • Advertising should be sensitive to the developmental stages of children.
    • Data collection must comply with COPPA regulations, ensuring that ads do not collect data from children under 13 without parental consent.

Relation to GDPR/DSA:

  • CARU’s program focuses on advertising practices and ensuring that ads do not deceive or exploit children. The GDPR and DSA contain provisions that prevent data-driven advertising to minors, but these are broader regulations not focused specifically on advertising practices.
  • The GDPR prohibits the profiling of children for marketing purposes (Recital 38), whereas CARU’s guidelines deal more directly with advertising content and its appropriateness for children.

Key Difference: CARU’s program addresses the ethical aspects of advertising content directed at children, while the GDPR and the DSA govern the collection, processing, and protection of data (with implications for advertising through restrictions on profiling and targeted ads to children).

3. How GDPR, DSA, and the Spanish Audiovisual Law Relate to COPPA and CARU

  • Privacy and Parental Consent: All these regulations emphasize the importance of protecting children's privacy and ensuring parental consent. The GDPR and COPPA are directly aligned in this respect, as both require parental involvement when processing children's data.
  • Age Thresholds: While COPPA applies to children under 13, GDPR applies to children under 16 (with some countries setting the threshold as low as 13). The Spanish Audiovisual Communication Law doesn't define a universal age for all contexts, but similar to GDPR, it generally targets content that could harm minors under 18.
  • Advertising Restrictions: CASM's regulations focus on ensuring ethical advertising practices towards children, while the Digital Services Act (DSA) and GDPR prohibit the profiling of children and restrict targeted advertising. The Audiovisual Communication Law also mandates that platforms block minors from accessing harmful content, such as violent or explicit material.

While COPPA and CARU’s program share similar goals with the GDPR, DSA, and Spanish Audiovisual Communication Law, they operate under different legal frameworks and jurisdictions. COPPA is specific to U.S. privacy law and focuses on protecting children under 13, while the European laws are more expansive, offering privacy protections to all individuals, including special provisions for children under 16.

The European regulations are not directly tied to COPPA or CARU’s program, but they align in principle, particularly in protecting children’s privacy and restricting harmful content. The main distinction lies in scope (U.S. vs. EU) and the age thresholds each law enforces.

Several other global privacy regulations specifically address the protection of children’s privacy and data, similar to COPPA, GDPR, DSA, and the Spanish Audiovisual Communication Law. Different countries have recognized the importance of protecting minors in digital spaces, and have either adopted or are in the process of developing laws to safeguard children's data. Here are some key regulations worldwide:

1. Children’s Personal Information Protection Law (China)

  • Jurisdiction: China
  • Focus: This law, which came into effect in October 2019, is one of the most stringent child privacy regulations globally. It governs the collection, storage, and usage of personal information for children under the age of 14.
  • Key Provisions:
    • Platforms must obtain verifiable parental consent before collecting any personal data from children.
    • Companies must ensure transparency in how they process children’s data and must notify parents regarding the types of data collected and how it will be used.
    • Data minimization: Only the necessary amount of data should be collected, and companies are responsible for protecting this data from breaches or misuse.
  • Comparison: Similar to COPPA and GDPR, the law emphasizes parental consent and protecting minors' personal data but applies a stricter age limit (under 14) and imposes more rigorous obligations on companies to minimize the amount of data collected.

2. Personal Information Protection Act (PIPA) – South Korea

  • Jurisdiction: South Korea
  • Focus: South Korea’s PIPA contains specific provisions for the protection of minors under the age of 14. It aligns with general data protection standards while placing additional restrictions on processing children’s data.
  • Key Provisions:
    • Websites and platforms must obtain parental consent for processing the personal data of children under 14.
    • Children’s data should not be used for direct marketing or profiling without explicit consent from parents or guardians.
    • Parents can request access to the data collected and can demand that it be corrected or deleted if necessary.
  • Comparison: South Korea’s PIPA is stricter than COPPA in terms of data usage for marketing and profiling but has similar requirements regarding parental consent. It also shares GDPR’s commitment to data minimization and transparency for children's data.

3. Online Safety Act (Australia)

  • Jurisdiction: Australia
  • Focus: The Online Safety Act, enacted in 2021, focuses on child safety in digital spaces, particularly in combating online harm like cyberbullying and exposure to inappropriate content. While it doesn’t specifically cover data protection in the same way as COPPA or GDPR, it plays a significant role in children’s overall digital safety.
  • Key Provisions:
    • The eSafety Commissioner can require platforms to remove harmful or inappropriate content aimed at children within 24 hours.
    • There are age verification requirements for accessing certain content, such as pornography or violent material.
    • Platforms must develop robust reporting systems for children and parents to flag harmful or abusive behavior.
  • Comparison: This law focuses more on content safety and platform responsibility rather than data collection, but it shares similarities with the Spanish Audiovisual Communication Law in its emphasis on age verification for harmful content. It also aligns with GDPR in protecting minors’ online rights.

4. Child Rights Protection Law (United Arab Emirates)

  • Jurisdiction: United Arab Emirates
  • Focus: This law, known as Wadeema’s Law, primarily focuses on protecting the rights of children, including their right to privacy. It emphasizes child welfare, online safety, and protection from exploitation, including in digital contexts.
  • Key Provisions:
    • Websites that provide services to minors must obtain parental consent for any personal data collection.
    • The law ensures that children are not exploited or exposed to harmful content online.
    • There are requirements for service providers to ensure the safety of children's personal information.
  • Comparison: While Wadeema’s Law focuses more on overall child protection, including online privacy, its data protection measures mirror COPPA and GDPR in terms of requiring parental consent and protecting children from online risks.

5. Brazilian General Data Protection Law (LGPD)

  • Jurisdiction: Brazil
  • Focus: Brazil’s LGPD, which came into effect in 2020, is a comprehensive data protection law similar to GDPR. It includes specific provisions for protecting the personal data of children.
  • Key Provisions:
    • Processing of children’s and adolescents’ data (under 18) must be carried out in their best interest; for children in particular, it requires specific and highlighted consent from a parent or legal guardian.
    • Companies must provide clear information about how children’s data is collected, processed, and used.
    • Similar to GDPR, data minimization is required: only data necessary for specific purposes should be collected.
  • Comparison: The LGPD closely mirrors GDPR in its data protection framework, including requirements for parental consent, transparency, and data minimization when it comes to children’s data. By extending special treatment to all minors under 18, its scope is broader than the consent thresholds in both GDPR and COPPA.

6. Personal Data Protection Act (PDPA) – Singapore

  • Jurisdiction: Singapore
  • Focus: While the PDPA does not have provisions dedicated solely to children’s data, its general data protection rules extend to minors. A minor may consent on their own behalf only where they can understand the nature and consequences of consent; otherwise, consent should be obtained from a parent or guardian.
  • Key Provisions:
    • Organizations must ensure appropriate safeguards when collecting personal data from minors.
    • Parental consent is encouraged, especially for younger children.
  • Comparison: The PDPA is less strict than COPPA or GDPR regarding children's data, as it doesn’t mandate parental consent explicitly. However, it aligns with the general principles of privacy protection, emphasizing data security and minimization.

7. New Zealand’s Privacy Act 2020

  • Jurisdiction: New Zealand
  • Focus: This law doesn’t provide a specific child-focused privacy framework, but it governs the collection, usage, and disclosure of personal information, including that of children. It emphasizes ensuring that individuals (including minors) understand the implications of their personal data being collected and processed.
  • Key Provisions:
    • Data controllers must ensure that any personal data collected from children is used in a manner that the child or their guardian can understand and consent to.
    • Strong protections exist against the collection of unnecessary data from vulnerable individuals, including minors.
  • Comparison: New Zealand’s Privacy Act lacks the explicit child-focused clauses seen in COPPA or GDPR but aligns with general data protection principles, particularly in promoting transparency and minimizing the collection of personal data from vulnerable groups like children.

8. Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA)

  • Jurisdiction: Canada
  • Focus: PIPEDA governs the collection, use, and disclosure of personal information in the private sector. While it doesn’t have specific provisions solely for children, it does offer protections for the personal information of vulnerable individuals, including minors.
  • Key Provisions:
    • Organizations must ensure meaningful consent, particularly when processing the data of minors who may not fully understand the implications.
    • Parents or guardians can provide consent for younger children’s data.
    • Like GDPR, PIPEDA mandates data minimization and requires that data collection practices be transparent and limited to necessary purposes.
  • Comparison: PIPEDA is less focused on age-specific protections like COPPA but aligns closely with GDPR in its emphasis on obtaining informed consent, data minimization, and transparency, making it a broad privacy protection law applicable to all individuals, including children.

Countries around the world have recognized the importance of protecting children in digital spaces, with several laws that align with the GDPR, COPPA, and other frameworks mentioned earlier. Whether focused on data privacy (such as LGPD and PIPA) or online safety (such as Australia’s Online Safety Act), these regulations emphasize parental consent, data minimization, and transparency as key components of protecting minors online. The variety of global approaches reflects the universal challenge of ensuring that children are protected in the rapidly evolving digital world.

The Role of Age Verification in Child Protection

Age verification is critical to child protection online, particularly around exposure to inappropriate content, unsafe environments, consent to online data processing, and the overall design of internet services that should be appropriate for children. Historically, many approaches have focused on identifying which users are children and applying reactive strategies only after harm or risks have been detected. The AEPD’s technical note advocates a proactive approach, in which the digital environment is designed with safeguards that prevent exposure to risk by default.

Key Points of the AEPD’s Technical Note

  1. Privacy by Design and Default:
    • The AEPD emphasizes the need to design online environments that protect children’s data by default, without the necessity for intrusive monitoring or surveillance that exposes them to new risks. This protection aligns with the principles of the General Data Protection Regulation (GDPR) and the notion of "data minimization"—where only the essential data for age verification is processed, and children’s personal data is not unnecessarily collected or stored.
  2. Age Verification as an Enabler:
    • One significant shift proposed by the AEPD is the concept of age verification as an “enabler” for adults, rather than a restrictive tool for children. In this model, adults verify their age to access content, features, or services that are deemed risky for children. This removes the burden from children to prove their age, which would otherwise expose them to privacy risks. This proactive model allows guardians or parents to control access, thus protecting minors from potentially harmful content without requiring the collection of detailed personal data.
  3. Addressing Misconceptions:
    • Common misconceptions around age verification and safe environments are clarified in the note. For instance, the idea that age verification must require the system to identify specific users as children is debunked. The aim should be to create systems that protect children by default, without having to collect sensitive information such as precise age or identity.
    • Another misconception is that child-friendly design necessitates separate “child accounts” or extensive customization for minors. In reality, the focus should be on ensuring that environments are safe by default for all users, particularly for those under a certain age threshold.
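
The "enabler" model above can be sketched in a few lines: every session starts with child-safe defaults, and a completed adult verification only ever unlocks capabilities, so no child is ever asked to prove anything. The capability names are assumptions for illustration.

```python
# Sketch of the AEPD's "age verification as an enabler" idea:
# safe by default for everyone; verification unlocks, never restricts.

SAFE_DEFAULTS = {"show_adult_content": False, "allow_unknown_contacts": False}

def session_capabilities(adult_verified: bool) -> dict:
    caps = dict(SAFE_DEFAULTS)  # every session starts child-safe by default
    if adult_verified:
        # Only a verified adult unlocks the risky capabilities.
        caps = {key: True for key in caps}
    return caps

print(session_capabilities(False))  # unverified users simply keep safe defaults
print(session_capabilities(True))
```

Because the unverified path is the default path, the system never needs to single out or profile children, which is precisely the misconception the note debunks.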

The AEPD's Framework for Age Verification

The AEPD proposes four use cases for age verification:

  1. Protection Against Inappropriate Content:
    • Age verification can prevent minors from accessing harmful content such as violent or pornographic material. The note emphasizes that service providers should implement age verification systems before minors can access such content.
  2. Safe Environments for Children:
    • Beyond content, age verification should help create environments where children are not exposed to harmful behaviors, contacts, or illegal activities. The AEPD encourages a holistic approach, integrating age verification into tools such as parental controls and restricting communications for minors.
  3. Consent for Online Data Processing:
    • Age verification is crucial when obtaining consent from children for data processing, ensuring that only those with the legal authority to do so (such as guardians) can give consent on behalf of minors.
  4. Age-Appropriate Design:
    • The AEPD highlights the need for internet services and platforms to be designed in an age-appropriate manner. Age verification mechanisms should be integrated into this design to ensure that minors are protected from risks while using these services.

Implications for Internet Providers and Companies

For companies and service providers, this technical note presents both a challenge and an opportunity. Implementing age verification systems that balance child protection with user privacy is no small task. Providers must ensure that age verification is conducted in compliance with GDPR principles, particularly those of data minimization and accuracy, to avoid excessive or unnecessary data collection.

The AEPD also notes the risks associated with improper age verification practices, which could lead to profiling, surveillance, or the creation of identity schemes that could be abused. For instance, a flawed implementation of age verification could result in a monopoly or excessive control over user data by third-party providers of verification services, creating systemic risks for both individuals and the wider internet ecosystem.

Conclusion

The AEPD’s technical note underscores a significant shift in how we approach online child protection. Age verification is a powerful tool, but it must be used thoughtfully, with the goal of creating a safe internet by design—one that prioritizes children’s privacy, minimizes data processing, and empowers adults to take responsibility for the risks posed by digital services. Companies that adopt this proactive, privacy-first approach to child protection will not only comply with legal standards but will also foster greater trust among users, particularly families and guardians concerned about the safety of children online.

This framework calls for the internet ecosystem, including service providers, governments, educators, and regulators, to work together to implement solutions that genuinely protect children without infringing on the rights of all users.

This shift towards proactive measures is necessary to create a more secure digital world for children, balancing innovation and user protection in the ever-evolving landscape of the internet.
