Paedophiles using AI to produce ‘astoundingly realistic’ child sex images
Paedophiles are using artificial intelligence to create ‘astoundingly realistic’ images of children being sexually abused, a charity has warned.
The AI revolution risks normalising exploitation, and efforts to determine whether images are genuine or fake could distract analysts from helping real victims, the Internet Watch Foundation (IWF) said.
Analysts said while the number of confirmed AI images is still small, ‘the potential exists for criminals to produce unprecedented quantities of life-like child sexual abuse imagery’.
Of the 29 web addresses reported to the IWF between May 24 and June 30, seven were confirmed to contain AI-generated imagery.
They contained Category A and B material – among the most severe classifications – depicting children as young as three years old, the organisation warned.
This is the first data on AI-generated child sexual abuse imagery that the IWF has published.
Susie Hargreaves, chief executive of the IWF, said fit-for-purpose legislation needs to be brought in ‘to get ahead’ of this threat.
She said: ‘AI is getting more sophisticated all the time. We are sounding the alarm and saying the prime minister needs to treat the serious threat it poses as the top priority when he hosts the first global AI summit later this year.
‘We are not currently seeing these images in huge numbers, but it is clear to us the potential exists for criminals to produce unprecedented quantities of life-like child sexual abuse imagery.
‘This would be potentially devastating for internet safety and for the safety of children online.’
Ms Hargreaves stressed that for members of the public some of this material would be ‘indistinguishable’ from a real image of a child being sexually abused.
An online ‘manual’ written by offenders with the aim of helping others train the AI and refine their prompts to return more realistic results has also been discovered.
Although the images do not feature real children, the charity said this is not a victimless crime, warning that such material can normalise exploitation and make it harder to spot when real children might be in danger.
Ms Hargreaves said: ‘Depictions of child sexual abuse, even artificial ones, normalise sexual violence against children.
‘We know there is a link between viewing child sexual abuse imagery and going on to commit contact offences against children.’
Dan Sexton, chief technical officer at the IWF, added: ‘Our worry is that, if AI imagery of child sexual abuse becomes indistinguishable from real imagery, there is a danger that IWF analysts could waste precious time attempting to identify and help law enforcement protect children that do not exist.
‘This would mean real victims could fall between the cracks, and opportunities to prevent real life abuse could be missed.’