From now on, Grok will undress people in photos only for money.
The AI tool Grok, developed with Elon Musk's involvement, has disabled the image creation feature for the vast majority of users. The action was prompted by widespread complaints about the service being used to generate explicit and violent images.
The decision was made after Musk was threatened with fines, regulatory measures, and a possible ban of the platform in the UK. It was discovered that the tool was used to manipulate women's images, removing clothing and placing figures in sexualized poses.
The image generation feature is now available only to paid subscribers. On Musk's social media platform, the Grok account confirmed: "Image generation and editing are currently available only to paid subscribers."
Because the system retains full account data for paid subscribers, including credit card information, users can be identified if they misuse the feature.
Although image generation through the public @Grok account has been sharply restricted, a separate Grok application exists where users without a paid subscription report still being able to create explicit images of women and children. Content generated in that application is not published publicly.
Over the past two weeks, thousands of sexualized images of women were created without their consent, following an update to Grok's image generation feature in late December. Despite numerous public calls to disable or limit the feature, the platform took no action for an extended period.