Disney’s mislabeling of YouTube videos highlights growing regulatory pressure on content creators and signals the evolution of age assurance technologies in child safety.


The Bottom Line

Disney will pay $10 million to settle Federal Trade Commission allegations that the company allowed personal data to be collected from children who viewed kid-directed videos on YouTube without notifying parents or obtaining their consent as required by the Children’s Online Privacy Protection Rule (COPPA Rule). This landmark case marks the first time a major content creator has been held accountable for COPPA violations on a third-party platform since YouTube’s own 2019 settlement, establishing important precedents for the entertainment industry.


What Happened: The Details Behind Disney’s Violation

The complaint says the mislabeling allowed Disney, through YouTube, to collect personal data from children under 13 viewing child-directed videos and use that data for targeted advertising to children. The violations occurred between 2020 and 2022, primarily during the pandemic when Disney was uploading substantial amounts of content to YouTube.

According to the complaint, when Disney uploaded videos to YouTube, its policy was to set the audience at the channel level, rather than checking the audience for each video. As a result, some child-directed videos were incorrectly designated as “not made for kids.” The videos in question included content from movies like Frozen, Inside Out, Finding Dory, and Encanto.

The consequences extended beyond simple data collection. Kids viewing these mis-designated videos were also exposed to YouTube features not meant for children: autoplay into other “not made for kids” videos and access to unrestricted public comments. This created what privacy advocates call “rabbit holes” that could expose children to inappropriate content.

The Regulatory Context: Building on the 2019 YouTube Settlement

Disney’s case stems directly from YouTube’s record $170 million settlement with the FTC in 2019, which was the largest amount the FTC has ever obtained in a COPPA case since Congress enacted the law in 1998. That settlement fundamentally changed how content creators must operate on YouTube.

Following a 2019 settlement with the FTC over allegations it violated COPPA, YouTube began requiring content creators, including Disney, to indicate if the videos they upload to YouTube are “Made for Kids” (MFK) or “Not Made for Kids” (NMFK) in order to comply with COPPA.

Importantly, as part of that settlement, the FTC said at the time that it would follow up with investigations of content providers on YouTube. Disney’s settlement represents the first major result of that promised follow-up enforcement.


Why This Settlement Matters for Privacy

Expanding Platform Liability

The fine targets Disney for content that wasn’t uploaded to its own platforms, likely opening the door to penalties against other content providers that distribute their work on other sites and apps. This creates new compliance obligations for any company that creates content for children, regardless of where that content is hosted.

Parental Rights Focus

“This case underscores the FTC’s commitment to enforcing COPPA, which was enacted by Congress to ensure that parents, not companies like Disney, make decisions about the collection and use of their children’s personal information online,” said FTC Chairman Andrew Ferguson in a statement. The settlement emphasizes that COPPA violations fundamentally undermine parental authority over their children’s digital lives.

Industry-Wide Impact

Other big entertainment companies may also face legal exposure over the mislabeling of videos targeted at youngsters on YouTube. Major studios, toy manufacturers, and other children’s content creators likely face heightened scrutiny of their YouTube practices.

The Role of Age Assurance Technologies

Disney’s settlement includes a forward-looking provision that reflects the growing importance of age verification technologies. Under the proposed settlement, Disney will be required to establish and implement a program to review whether videos posted to YouTube should be designated as MFK—unless YouTube implements age assurance technologies that can determine the age, age range, or age category of all YouTube users or no longer allows content creators to label videos as MFK.


This provision acknowledges the technological evolution happening across major platforms:

YouTube’s AI-Powered Approach

Google announced that it will be deploying machine learning algorithms to estimate the age of YouTube users. “That’s why we’ll use machine learning in 2025 to help us estimate a user’s age – distinguishing between younger viewers and adults – to help provide the best and most age appropriate experiences and protections.”

Roblox’s Facial Age Estimation

Roblox is rolling out age-estimation technology for users who want to chat more freely on the social-gaming platform as part of its new “trusted connections” feature. The system uses a video selfie matched against Persona’s datasets to estimate age.

The Privacy Trade-off

Age estimation methods are little more than educated guesses, which means they carry an inherent error rate. What age assurance providers often leave unsaid is what happens when an estimate is wrong: despite promises to the contrary, a failed estimate can easily push users into more invasive secondary verification.
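To make the error-rate point concrete: if an estimator’s output is roughly normally distributed around a user’s true age, even a modest error spread misclassifies a large share of children near the age-13 boundary. The ±2-year standard deviation below is an illustrative assumption, not any vendor’s published accuracy figure.

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_misclassified_as_adult(true_age: float, threshold: float = 13.0, sigma: float = 2.0) -> float:
    """P(estimate >= threshold) when the estimate ~ Normal(true_age, sigma)."""
    return 1.0 - normal_cdf((threshold - true_age) / sigma)

for age in (10, 11, 12):
    print(f"age {age}: {p_misclassified_as_adult(age):.1%} chance of being treated as 13+")
```

Under this assumption a 12-year-old is waved through as 13+ roughly three times in ten, which is why error handling, not headline accuracy, is the part of an age assurance scheme that deserves scrutiny.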


Lessons for Organizations

Content Classification Must Be Granular

Disney’s blanket approach of marking entire channels as “not made for kids” proved insufficient. The FTC’s complaint noted that the Pixar channel was marked as “not made for kids,” while similar videos on the Pixar Cars channel were designated as “made for kids.” “This difference in the setting illustrates Disney’s failure to mark child-directed content as MFK when such content is uploaded to NMFK channels,” according to the complaint.

Administrative Errors Have Real Consequences

In settling the matter, Disney concedes that it made an administrative error in the way it characterized videos it uploaded to YouTube, mostly during the pandemic. Even unintentional mistakes can result in significant penalties and regulatory action.

Proactive Compliance Is Essential

YouTube told Disney it had changed the settings on 300 Disney videos from “Not Made for Kids” to “Made for Kids,” but Disney did not change its policies after that notice. Organizations must respond promptly to platform notifications about content classification issues.


The Broader Privacy Landscape

Disney’s settlement fits within a broader pattern of COPPA enforcement. Fortnite maker Epic Games agreed to pay a $275 million penalty over COPPA violations in 2022. TikTok, Microsoft and many others have all been subject to COPPA fines over the past few years.

This enforcement trend reflects evolving expectations around children’s digital rights and corporate responsibility. Any company interacting with children online should be aware that the FTC is closely watching how their data practices affect kids.

What’s Next: Age Assurance as the Future

Age assurance is a broad term for methods to determine the age or age range of an individual (adults, teens, or kids) online. Effective age assurance technologies that reliably identify users’ ages can ease the burden on parents, allow kids to have an age-appropriate experience online, and protect kids from harmful content online.

The Disney settlement suggests that regulatory agencies view age assurance technologies as a potential solution to the complex challenges of protecting children online while preserving the open nature of platforms like YouTube. However, the technology remains imperfect, and implementation raises significant privacy and accuracy concerns.


Key Takeaways for Privacy Professionals

  1. Content creators face new liability for how their material is classified on third-party platforms, not just their own properties.
  2. Granular content review is now essential; blanket channel-level settings are insufficient for COPPA compliance.
  3. Age assurance technologies are becoming central to regulatory expectations, but their limitations must be understood.
  4. Prompt response to platform notifications about content classification can prevent regulatory issues.
  5. COPPA enforcement is expanding beyond traditional website operators to encompass the broader content ecosystem.

Disney’s $10 million settlement represents more than just a penalty—it’s a signal of how children’s privacy protection is evolving in an age of complex, multi-platform content distribution. As age assurance technologies mature and regulatory expectations solidify, organizations across the entertainment industry must reassess their approach to children’s content and data protection.
