Grok Undressing Goes Viral: Why Everyone Is Asking Grok to Add Bikinis

If you blinked for a second on X this week, you probably missed it. If you didn’t blink, you saw it everywhere...
"Hey Grok, replace her outfit with a bikini."
"Grok, put me in a bikini in this angle."
Christ, I can't even perform my typical Monday morning doomscroll without bumping into at least a dozen of these bikini requests.
Exact figures are hard to come by, but at the height of the trend Grok reportedly produced over 2 million bikini images in the space of 42 hours.
The pattern is always the same. Someone posts a perfectly normal photo. Street clothes, mirror selfie, beach-adjacent vibes but fully dressed. Then comes the follow-up. Same image. Same pose. Same lighting. Except now the outfit has been swapped for a suspiciously well-fitted bikini that absolutely was not there five seconds ago.
To the surprise of literally nobody, this trend was started by adult influencers on X.
It's not clear who tried it first, but we can easily find a gajillion copycats like this:

13 million views...?!
As half of X has discovered in the last few days, it doesn't take Grok long to get his perv on.
Moments later, the undressing cometh:

Now, to be clear, there’s nothing particularly sinister about adult creators using faux-porn bait to farm engagement. That’s been part of the X economy for years, right? Swap “AI bikini” for “strategic crop” or “accidental nip slip” and the mechanics are exactly the same.
What has people spiralling is the realisation that Grok isn’t just doing this for consenting creators chasing clicks. Users quickly worked out that the same logic can be applied to pretty much any image on the platform. Random photos. Old tweets.
And people who never asked to be part of an AI experiment.
While conventional undressing and face-swap tools operate privately, on the sly, away from the eyes of their intended targets... Grok posts these images out in the open, often with the subject tagged in the output.
It's a recipe for disaster.
Even the Trumpster himself has been Bikinified and shared to the masses.
(I'm not going to share those examples, just in case you prefer your cornflakes in your mouth instead of sprayed up the wall...)
Understandably, many X users expressed strong disapproval, viewing the trend as a form of digital harassment and objectification:
"Despite the Centre taking the matter seriously, the “Grok Bikini” trend on X has not stopped.Compared to earlier, users are now seeking even more obscenity and deriving perverse pleasure from it. They are crossing limits by instructing @grok to increase nudity and to spread legs." - @Gaddambhaskarr1
"This is getting abusive and pornographic… I’m seeing too many ppl commenting on women pics asking @grok to change clothes to a bikini, or remove this remove that … how sick r ppl! Strong enforcement & governance thru @X and govts is much needed #bikinigate." - @greenviper308
British journalist Samantha Smith, who was targeted directly, told the BBC she felt "dehumanised and reduced into a sexual stereotype": "While it wasn’t me that was in states of undress, it looked like me and it felt like me and it felt as violating as if someone had actually posted a nude or a bikini picture of me."
Of course, The Atlantic has jumped in, calling it part of Elon Musk's Pornography Machine (lol).
How Has Elon Musk Responded?
We all know that Elon Musk gives zero fucks about offending the left-wing press.
So far, he has largely downplayed or mocked the controversy, posting laugh-cry emojis in response to AI-generated bikini images (including of famous figures and himself).
But there are signs of the flat-bat treatment disintegrating...

Sure enough, I'm seeing signs of Grok refusing to cater to some X users' more outlandish demands. And that's probably a good thing...
This is the part of the cycle people forget. Every time a mainstream platform stumbles into accidental horniness, it gets yanked back by policy teams, PR staff, and whoever draws the short straw to explain things to journalists.
Features get nerfed. Guardrails get bolted on... it's hard to avoid.
What Grok accidentally did was shine a very bright light on something that has been happening for years, just off-platform. Clothing swaps. Undressing. Face swaps. Roleplay. AI-generated thirst traps. These aren’t exactly new behaviours.
They’ve simply been happening inside dedicated NSFW tools (which readers here will know all too well!)... and the only real difference is visibility.
So yes, it’s probably good that Grok is learning to say no more often.
Not because AI undressing is uniquely evil, but because half-baked NSFW features bolted onto mainstream platforms are always going to cause chaos.
You either build for adult use properly, or you don’t build it at all.
How Grok Flies Close To The Sun...
What makes this whole episode interesting isn’t the bikini edits themselves. It’s the position Grok has put itself in by operating with much more lenient policy guidelines than OpenAI, Google, or any of its competitors.
From day one, Grok has been marketed as the “less uptight” AI... no doubt driven by Elon's religious endorsement of free speech (and all that it entails).
In many ways, the bikini trend is a perfect stress test of that positioning. Not because it’s extreme, but because it sits right on the boundary between “allowed but awkward” and “woah woah woah, absolutely not”.
Harmless enough to go viral. Sexual enough to make people nervous. Public enough to cause reputational damage if it spirals.
That’s the danger of flying too close to the sun on AI and adult content.
What we’re seeing now feels like the first real test of whether Grok can sustain its identity. Can it remain meaningfully more permissive without becoming a harassment engine? Can it allow adult expression without turning non-consenting users into collateral damage?
Can it keep the “fun” without triggering platform-wide backlash?
Many parties will be interested in how it responds. Further afield, Sam Altman has just teased the arrival of ChatGPT erotica "for verified adults".
I'm sure he, and his investors, will be watching on closely...