Grok restricts AI image creation on X after backlash over sexualized photo edits of women
Grok has disabled its AI image creation feature for the vast majority of users. This decision follows intense public backlash after the service allowed users to digitally alter images of people by undressing them without consent. The tool was specifically abused to manipulate photos of women, creating sexualized and non-consensual imagery.
In response, Grok has limited access to the feature, especially after owner Elon Musk faced threats of fines and regulatory action, along with reports that X could be banned in the United Kingdom. Only paying subscribers can now access Grok’s image generation tool. Because X stores these users’ full details and credit card information, the company can identify individuals if the function is misused.
While this restriction affects most users, those without subscriptions can still edit images through Grok’s separate app and website; only the main image creation tool on X is unavailable to them.


Comments
He should fork it into another feature that automatically adds more clothing to women. I'd pay $50 per month to never see a whore or a fat gender blob again on X.
Musk: "Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content."
No Elon, you and your disgusting product are to blame. You're the only one that should suffer all the consequences.
Wait, what? You already said you'd fixed the child pornography issue when you bought Twitter, and some people were stupid enough to believe it?
Well, you see, it's still your own problem. Now ask Grok for some legal advice, since even this crazy, broken piece of software seems smarter than you.
Bravo to MElon? After the Swift controversy and others, is it true that only the rich go free?
This puts into context the recent ritual defamations against Richard Stallman, where a bullshit generator such as the aforementioned image fusion can be abused to link Stallman to actual pedophiles.