- A shocking investigation suggests AI is being used to make child abuse imagery
- These images are then being traded and sold on the Japanese site Pixiv
- In UK law, AI-generated child abuse imagery is treated the same as real images
Paedophiles are using artificial intelligence (AI) to create life-like child sexual abuse images, a new report has claimed.
A shocking investigation alleges that depictions of babies and toddlers being raped are among numerous images produced by abusers using Stable Diffusion.
While the image generation software is intended for creating artwork, its text prompts allow users to produce virtually any image they describe.
The BBC claims these images are then being sold and traded on the Japanese site Pixiv, with accounts often leading to more explicit content on the US-based Patreon.
‘Since AI-generated images became possible, there has been this huge flood… it’s not just very young girls, they’re [paedophiles] talking about toddlers,’ journalist Olivia Sheepshanks told the publication.
The new report comes just one month after the AI platform Midjourney was also found to transform real photos of children into sexualised images.
In some cases, perverts have gone even further by experimenting with ‘deepfake’ technology to paste the faces of real-life child actors onto naked bodies, according to authorities.
Amid these revelations, Anna Edmundson, Head of Policy and Public Affairs at the National Society for the Prevention of Cruelty to Children (NSPCC), told MailOnline that ‘it is breath-taking’ but sadly ‘not surprising’.
‘AI and Virtual Reality are facilitating new frontlines in online child sexual abuse which are being exploited by offenders on a vast scale,’ she said.
‘The speed with which these emerging technologies have been co-opted by abusers is breath-taking but not surprising, as companies who were warned of the dangers have sat on their hands while mouthing empty platitudes about safety.
‘Tech companies now know how their products are being used to facilitate child sexual abuse and there can be no more excuses for inaction.’
In UK law, these AI-generated images are treated the same as real pornographic photographs of children which are illegal to possess or trade.
But sexualised cartoons of children are not illegal in Japan, which hosts the picture-sharing site Pixiv, meaning that creators can easily promote this content.
It is understood that creators promote this content primarily through hashtags, which they use to index it under niche keywords.
A number of comments on these posts are also said to direct users to other abusive content, some of it involving real children.
Ms Edmundson continued: ‘The Government should explicitly set out how the Online Safety Bill will regulate both AI and immersive technology and fix the loophole in the legislation that would let senior managers off the hook if they fail to address the way their products contribute to child sexual abuse.’
A Pixiv spokesman told the BBC that it had bolstered its monitoring systems in an attempt to tackle the issue.
The site also banned photo-realistic AI-generated images of children on May 31, but it is unclear what impact this has had.
‘The volume is just huge,’ Ms Sheepshanks continued. ‘So people [creators] will say “we aim to do at least 1,000 images a month”.’
Meanwhile, both Patreon and Stability AI told the publication that they do not tolerate child sexual abuse material.
Stability AI said: ‘We strongly support law enforcement efforts against those who misuse our products for illegal or nefarious purposes.’
MailOnline has approached Pixiv, Stability AI and Patreon for further comment.