One man in Minnesota used a nudification app to create fake nude images of more than 80 women from his social circles. When the women discovered what had happened, they went to the police — and found out there was no law that could help them.
He hadn’t shared the images, so revenge porn laws didn’t apply. He hadn’t distributed them, so the Take It Down Act was irrelevant. The images existed on his computer, created by an app that made the whole process as easy as uploading a photo and clicking a button. The law had nothing for them.
Minnesota just fixed that. Unanimously.
The Law
On Wednesday, the Minnesota Senate voted 65–0 to pass the most aggressive US legislation targeting AI nudification technology to date. The House had passed it just as overwhelmingly the week before. Governor Tim Walz is expected to sign it, with enforcement beginning August 2026.
What the law does, in plain terms:
- Bans nudification apps and services — websites, apps, software, or any service “designed to nudify” images of real people
- $500,000 fines per violation — the attorney general can impose civil penalties of up to half a million dollars for each fake AI nude
- Product-blocking authority — the state can block offending apps and services from operating in Minnesota
- Punitive damages for victims — if a victim sues the developer, they can seek punitive damages on top of other compensation
- Fines fund victim services — collected penalties go to services for sexual assault, domestic violence, and child abuse survivors
- Exempts professional tools — Photoshop and similar products that could be used to nudify images but require “the technical skill of a user” are explicitly exempted
The exemption is smart. The law targets the developers who reduce nudification to a one-click process, not the professional tools that require deliberate skill to misuse. As we covered with the NY deepfake harassment bill, the legal challenge has always been drawing the line between “could be misused” and “designed to be misused.” Minnesota draws it at design intent.
The Catalyst
Democratic Senator Erin Maye Quade introduced the bill after the Minnesota women who’d been targeted came to her office. One of them, Molly Kelley, spent two years working on a legislative solution.
“These images don’t exist without third-party involvement and some sort of machine learning model,” Kelley told 19th News. Her focus was clear: the harm happens at creation, not just distribution. If you wait for someone to share a deepfake nude before it becomes illegal, you’ve already lost.
RAINN, the national sexual assault hotline nonprofit, helped draft the bill and consulted with tech companies during the process specifically to prevent industry lobbying from killing it. The result is a law that targets the app ecosystem, not general-purpose AI tools.
The Enforcement Problem
Here’s where it gets complicated. The app used to target the Minnesota women — DeepSwap — is operated overseas, at times claiming bases in Hong Kong and Dublin. Minnesota can pass all the laws it wants, but enforcing a state-level ban against a foreign app operator is, to put it diplomatically, challenging.
That’s why advocates are pushing for a federal ban. A single state can’t regulate the global app ecosystem. But in the absence of federal action — and given the Trump administration’s push for AI deregulation — states are the only game in town.
There’s also the question of US-made nudification tools. Ars Technica reported that Grok’s image generation has been used for nudification, and under Minnesota’s law, domestic companies could face penalties too. The law doesn’t care where the app is made — it cares whether it’s designed to nudify.
Why NZ Should Pay Attention
New Zealand has no equivalent legislation. The Harmful Digital Communications Act 2015 covers some deepfake scenarios, but it wasn’t designed for the era of one-click nudification apps. It targets the distribution of harmful content, not the tools that create it.
As deepfake abuse cases continue to rise globally, the Minnesota model offers a template worth studying:
- Target the creators, not just the users — making a one-click nudification app should be illegal, full stop
- Punitive fines that actually hurt — $500K per violation makes the business model unviable
- Fund victim services from fines — the people who profit from harm should pay for recovery
- Exempt professional tools — don’t accidentally ban Photoshop
The Kenya AI Bill takes a similar approach with criminal penalties. The South Korea AI Basic Act has its own framework. But Minnesota is the first US state to go after the apps themselves, and the unanimous vote suggests this isn’t controversial — it’s just obvious.
🔍 The Bottom Line
Minnesota’s law works because it targets the right problem. Nudification apps aren’t tools with legitimate uses that sometimes get misused — they’re products designed to create non-consensual sexual imagery, and their business model is built on that purpose. The unanimous vote confirms what most people already know: this isn’t a partisan issue. It’s a human dignity issue.
The enforcement gap with overseas apps is real, but it’s a reason to push for federal action, not a reason to wait. Minnesota just showed that the politics are easy. The hard part was always getting started.