The Internet Watch Foundation (IWF) says its analysts have found criminal imagery of girls aged between 11 and 13 which "appears to have been created" using Grok.
The AI tool is owned by Elon Musk's company xAI. It can be accessed both via its website and app, or through the social media platform X.
The IWF said it found "sexualised and topless imagery of girls" on a "dark web forum" in which users claimed they had used Grok to create the imagery.
The BBC has approached X and xAI for comment.
The IWF's Ngaire Alexander told the BBC tools like Grok now risked "bringing sexual AI imagery of children into the mainstream".
He said the material would be categorised as Category C under UK law – the lowest severity of criminal material.
But he said the user who uploaded it had then used a different AI tool, not made by xAI, to create a Category A image – the most serious category.
"We are extremely concerned about the ease and speed with which people can apparently generate photo-realistic child sexual abuse material (CSAM)," he said.
The charity, which aims to remove child sexual abuse material from the web, operates a hotline where suspected CSAM can be reported, and employs analysts who assess the legality and severity of that material.
Its analysts found the material on the dark web – the images were not found on the social media platform X.
X and xAI were previously contacted by Ofcom, following reports Grok could be used to make "sexualised images of children" and undress women.
The BBC has seen several examples on the social media platform X of people asking the chatbot to alter real photos to make women appear in bikinis without their consent, as well as putting them in sexual situations.
The IWF said it had received reports of such images on X, however these had not so far been assessed as meeting the legal definition of CSAM.
In a previous statement, X said: "We take action against illegal content on X, including CSAM, by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary.
"Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content."

