MG was a normal twentysomething in Scottsdale, Arizona. She worked as a personal assistant, waited tables on weekends, and posted the occasional photo on Instagram — matcha runs, Pilates, pool hangs with friends. About 9,000 followers. Nothing wild.
Then a follower DM’d her a link. Multiple Instagram Reels were circulating featuring a woman who looked exactly like MG. Same face. Same tattoos. Different body. Scantily clad.
“If you didn’t know me well, you could very well think they were images of me,” MG told Ars Technica. “It was kind of like this reality check that I don’t have any control over my own image.”
The images weren’t just being shared. They were being used to advertise a business called AI ModelForge — a platform that teaches men how to generate their own AI influencers using photos of unsuspecting women.
The business model is the horror
This isn’t some anonymous creep in a basement. It’s a business with a business model.
Three Phoenix men — Jackson Webb, Lucas Webb, and Beau Schultz — are named in a lawsuit filed in January in Arizona, along with 50 John Does. According to the complaint, the men:
- Scoured the internet for photos of young women who wouldn’t be able to fight back
- Used AI to generate pornographic images and videos of fictional models who look exactly like the real women
- Sold the content on the subscription platform Fanvue
- Sold courses for $24.95/month on Whop, teaching other men how to do the same thing
The courses included “Blueprints” — step-by-step instructions on how to scrape images from women’s social media accounts, feed them into a generative AI model on CreatorCore, and use a separate app to remove the women’s clothes and generate explicit content.
They even advised targeting women with fewer than 50,000 followers to avoid “legal issues.” In 2025, the CreatorCore platform had more than 8,000 subscribers who generated more than 500,000 images and videos.
As attorney Nick Brand puts it: “These boys aren’t just using generative AI to disrobe women — they’re selling the ability to do so to other men and boys, who are then going to use other women’s images to do the same thing.”
MG and the other plaintiffs aren’t just victims of image theft. They’re the faces of a product that’s being used to harm other women. Brand compares it to “making somebody the face of ICE who has had their parents deported.”
The law isn’t ready
Technically, there is a federal law. The Take It Down Act, signed by Trump in May 2025, makes publishing nonconsensual sexualized AI-generated content a federal crime and requires platforms to remove it within 48 hours of being flagged.
But the Act’s platform takedown requirements don’t take effect until May 2026, a full year after signing. State laws vary, and Arizona State Representative Nick Kupper describes them as “reactive rather than proactive.” He’s introduced a bill that would require websites to deploy safeguards such as automated detection tools, age verification, and consent forms to prevent nonconsensual AI content from being uploaded in the first place.
Meanwhile, MG has been lobbying Instagram to take the images down. Many are still up because they don’t technically violate Instagram’s guidelines on AI-generated content. The AI-generated images are just different enough from MG’s real photos that she can’t claim impersonation. “It’s my face, my tattoos, on a different outfit on a slightly different body,” she says. “These are real women being transformed, not just a random AI-generated person.”
When Ars Technica reached out, an Instagram spokesperson said the accounts were “under review.” TikTok, for what it’s worth, had already taken down the associated accounts for violating community guidelines.
The defendants? They’ve rebranded. AI ModelForge’s Linktree now points to “TaviraLabs,” a Telegram group with more than 18,000 members advertising itself as “the #1 AI Influencer coaching community.” The Instagram accounts promoting the business? Most are still active, posting photos of nubile women, fast cars, and expensive watches with captions like “She’s not my girlfriend, she’s my best paid employee.”
The scale is staggering
This is one lawsuit. One set of defendants. One platform with 8,000 subscribers. And it’s a fraction of the market: the AI influencer gold rush on X and Telegram is full of self-styled entrepreneurs boasting about earning hundreds of thousands of dollars from AI models.
The economics are simple: scrape photos for free, generate content with cheap AI tools, sell on subscription platforms. The marginal cost of generating another fake image is effectively zero. The harm to the real woman whose face was used? Incalculable.
As MG says: “It’s not about being cautious with your image online because everyone posts on social media now. Everyone is on LinkedIn. Everyone is on Instagram. And I want people to realize that this could also happen to them.”
Why this matters
This lawsuit is one of the first major civil actions of its kind, testing whether existing legal frameworks can address AI-generated harm without waiting for legislation to catch up. The criminal laws are coming — the Take It Down Act, state deepfake bans — but they’re months or years from effective enforcement. This lawsuit is happening now.
For New Zealand, the implications are direct. We don’t have a Take It Down Act. Our Harmful Digital Communications Act predates generative AI entirely. The question of whether AI-generated intimate imagery of real people falls under existing law is essentially untested. As AI tools become more accessible and the incentive to monetise them grows, cases like MG’s are going to show up here too.
The deeper issue is consent architecture. Social media platforms weren’t designed with the assumption that your photos could be fed into an AI model to generate pornography. The implicit contract — “I’m sharing this with my followers” — didn’t include “and also with 8,000 men who want to create a fake porn version of me.” The platforms need consent mechanisms that account for AI extraction, not just human viewing. They don’t have them yet.
MG is fighting back because she wants other women to know they can. “We were put in this place where our backs were against the wall,” she says, “and I want other women to know you can’t stop living your life.”
But no one should have to fight this battle alone, and no one should have to wait for the law to catch up with technology that’s already here.
SOURCES
- Ars Technica — “Women sue the men who used their Instagram feeds to create AI porn influencers”
- Wired — Original reporting