The Take It Down Act has been federal law since May 19, 2025. For the first year of its existence, enforcement was largely theoretical. This month, the Federal Trade Commission changed that.
The FTC sent compliance letters this week to Meta, Amazon, Apple, and more than a dozen other tech companies, warning them of an approaching deadline to demonstrate they have systems in place to swiftly remove non-consensual intimate imagery (NCII) — including AI-generated deepfakes — from their platforms. FTC Chairman Andrew Ferguson called the issue a “top priority” and signaled the agency is prepared to begin enforcement quickly.
What the Law Requires
The Take It Down Act — formally the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act — does two things:
It criminalizes publishing NCII. Knowingly publishing non-consensual intimate imagery, including deepfakes created with AI tools, is a federal offense under the Act. This applies to both real images and synthetic content created to depict a real person.
It requires platforms to remove it within 48 hours. Covered platforms — social media companies, image hosting services, search engines, and similar web services — must establish procedures to receive removal requests from victims and take down the content within 48 hours of a valid request. Platforms that fail to comply face FTC enforcement action.
The 48-hour window is the part that matters most in practice. Victims of NCII have long described the experience of reporting content as a Kafkaesque process of appeals, form submissions, and ignored requests while the content continues to circulate. The Take It Down Act is designed to make non-removal a legal liability, not just a policy question.
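To make the engineering side of that requirement concrete, here is a minimal sketch of what an SLA-tracked takedown queue might look like inside a platform's trust-and-safety tooling. Everything in it — the RemovalRequest record, the compliance_report summary — is hypothetical and illustrative; the Act mandates the 48-hour outcome, not any particular implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of 48-hour SLA tracking for NCII removal requests.
# Names and structure are illustrative, not drawn from any company's
# actual trust-and-safety systems.

REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class RemovalRequest:
    request_id: str
    content_url: str
    received_at: datetime          # timezone-aware timestamp
    resolved_at: datetime | None = None

    @property
    def deadline(self) -> datetime:
        # The statutory clock starts when a valid request is received.
        return self.received_at + REMOVAL_WINDOW

    def is_overdue(self, now: datetime | None = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return self.resolved_at is None and now > self.deadline

def compliance_report(requests: list[RemovalRequest]) -> dict:
    """Summarize the kind of data the FTC letters reportedly ask for:
    request volume and whether resolutions met the 48-hour window."""
    resolved = [r for r in requests if r.resolved_at is not None]
    on_time = [r for r in resolved if r.resolved_at <= r.deadline]
    return {
        "total_requests": len(requests),
        "resolved": len(resolved),
        "resolved_within_48h": len(on_time),
        "currently_overdue": sum(r.is_overdue() for r in requests),
    }
```

The bookkeeping itself is trivial; the hard problems sit upstream of it — validating that a request is legitimate and finding every copy of the content — which is exactly the tension the Act leaves to platforms, as discussed below.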
Why the FTC Had to Send Letters
If the law has been in effect since May 2025, why is the FTC sending compliance letters now?
Because having a law and having compliance systems are two different things. Large platforms typically take months to operationalize a new legal requirement: establishing internal processes, training trust-and-safety teams, building reporting interfaces, and testing workflows. Whether they moved quickly enough — and whether their current systems actually meet the 48-hour requirement — is what the FTC letters are designed to assess.
The letters ask companies to document their NCII removal procedures, provide data on how many removal requests they’ve received, and demonstrate that they can meet the statutory timeline. Platforms that cannot produce this documentation face enforcement action.
The FTC declined to identify all recipients of the letters, but confirmed that Meta (Facebook, Instagram), Amazon (which hosts significant cloud-based content distribution), and Apple (App Store and related services) are among them.
The Deepfake Problem the Law Is Trying to Solve
AI image generation has made the creation of non-consensual deepfake intimate imagery dramatically easier and cheaper. Tools that previously required technical expertise and significant compute time can now produce realistic synthetic nude imagery from a single clothed photograph in minutes, using consumer hardware.
The victims are disproportionately women and girls. Research from organizations tracking NCII trends has consistently found that the targets of deepfake intimate imagery are overwhelmingly female: often public figures, but also private individuals — ex-partners, classmates, coworkers — who never sought public attention.
The harm is concrete: professional damage, psychological trauma, relationship destruction, and in documented cases, suicide. The perpetrators are often difficult to identify and, until the Take It Down Act, faced limited federal legal exposure.
The Act attempts to address both creation and distribution. Criminal penalties target perpetrators. Platform liability for failure to remove targets the infrastructure that makes distribution at scale possible.
The Limits of the Law
The Take It Down Act is meaningful, but it has real limitations that advocates have pointed out since before it passed:
It’s reactive, not proactive. The 48-hour removal requirement kicks in after a victim reports content. It doesn’t require platforms to proactively detect and remove NCII — only to respond to reports. Victims who don’t know their images exist, don’t know how to report, or are too traumatized to engage with a bureaucratic process may receive no protection.
Verification is a challenge. The law requires platforms to act on “valid” requests, but doesn’t fully specify how platforms verify that a request is legitimate without potentially exposing victims to additional identification requirements. There’s a real tension between preventing fraudulent takedown requests and making the process accessible to actual victims.
Enforcement capacity is limited. The FTC has finite resources and a long list of enforcement priorities. Meaningful monitoring of whether platforms are meeting the 48-hour requirement across billions of pieces of content will be difficult to sustain.
International scope is unclear. Content hosted abroad by non-U.S. platforms occupies murky jurisdictional territory, and the Act's reach over foreign-hosted content is not fully settled.
What the FTC’s Enforcement Push Means
The compliance letter campaign is significant because it signals that the FTC under Chairman Ferguson intends to treat NCII enforcement as an active priority rather than a background statute. Companies that don’t have documented, functional removal systems face real regulatory exposure.
For victims, the practical change is that major platforms now have legal and regulatory incentive to take removal requests seriously and to process them within the 48-hour window. Whether the systems platforms have built are adequate will become clearer as the FTC’s review proceeds.
For the broader AI industry, the Take It Down Act is a preview of how regulators will approach AI-enabled harms going forward: establish a legal baseline, give platforms time to comply, then enforce. The deepfake NCII problem won’t be solved by the Act alone — but the combination of criminal liability for perpetrators and platform responsibility for distribution changes the calculus for everyone involved.
If You’re Affected
If you’re a victim of non-consensual intimate imagery — real or AI-generated — you have rights under the Take It Down Act:
- Report directly to platforms. Use the platform’s reporting tools to flag content. Under the Act, platforms must respond within 48 hours.
- Contact the Cyber Civil Rights Initiative (CCRI) at cybercivilrights.org — they provide crisis counseling, legal referrals, and guidance for navigating removal processes.
- Preserve evidence. Before content is removed, document it with screenshots, URLs, and timestamps. This evidence may be necessary for any legal action.
- Contact law enforcement. The criminal provision of the Take It Down Act means perpetrators can now face federal charges. Your local FBI field office can take reports.
The law exists. The FTC is now checking whether the platforms do too.