Artificial intelligence could further fuel an epidemic of child sexual abuse, Britain’s top law enforcement agency has warned, as it said that one in every 50 men poses a risk to children.
The National Crime Agency (NCA) estimates that up to 830,000 adults – 1.6% of the adult population – represent some degree of sexual danger to children, a figure labelled “extraordinary” by its director general, Graeme Biggar. He added that online abuse images were having a “radicalising” effect that “normalised” such behaviour.
The rapid advance of artificial intelligence (AI) means the threat to young people will only increase as fake images flood the internet, Biggar said – while other experts warned that instruction manuals on exploiting the new technology are already circulating online.
The head of the NCA, the agency that spearheads the fight against serious and organised crime, said: “We assess that the viewing of these images – whether real or AI-generated – materially increases the risk of offenders moving on to sexually abusing children themselves.”
Understanding of the existing threat was still growing, Biggar said, and most child sexual abuse (CSA) offending involved viewing images. Eight out of 10 people arrested in connection with such abuse are male, and Biggar agreed that this meant around 2% of men posed a risk.
Unveiling the NCA’s annual threat assessment, Biggar said: “We estimate that there are between 680,000 [and] 830,000 adults in the UK that pose some degree of sexual risk to children. These are extraordinary figures: roughly 10 times the prison population.
“They partly reflect a better understanding of a threat that has historically been underestimated, and partly a real increase caused by the radicalising effect of the internet, where the widespread availability of videos and images of children being abused and raped, and groups sharing and discussing the images, has normalised such behaviour.”
The NCA’s National Assessments Centre produced the figures and said its methods were sound.
It examined the results of online investigations into CSA. Of the offenders identified online, only one in 10 were already known, registered child sexual offenders; the other nine in 10 had not previously been identified.
Researchers therefore multiplied the known number of registered sex offenders by about 10 – which is how a register numbering in the tens of thousands yields an estimate in the 680,000 to 830,000 range.
The NCA said: “Our confidence in the validity of this figure is further informed by available intelligence and subject-matter expertise.”
Biggar said those active in online abuse forums were already excitedly discussing what AI could do and warned that this was just “the start of it”.
He added: “The use of AI for child sexual abuse will make it harder for us to identify real children who need protecting, and further normalise abuse.”
Evidence has emerged of guides circulating online for those interested in abuse images. The Internet Watch Foundation said the technology was being used to produce “astoundingly realistic” images of children as young as three to six years old, using sophisticated image-generation tools.
The IWF said it had found an online guide aimed at helping offenders train an AI tool and refine their prompts in order to return the most realistic results.
The IWF’s chief executive, Susie Hargreaves, urged the prime minister, Rishi Sunak, to make AI-generated CSA material a focus of the global AI safety summit he hosts in the autumn: “The prime minister needs to treat the serious threat it poses as the top priority when he hosts the first global AI summit later this year.”
She added: “Offenders are now using AI image-generators to produce sometimes astoundingly realistic images of children suffering sexual abuse.”
The IWF said instances of AI-generated material were still low, as use of the technology was only just beginning to spread. Between 24 May and 30 June, it investigated 29 reports of webpages containing suspected AI-made material, confirming that seven of them contained AI-generated CSA material, with some containing a mix of real and AI images.
AI-generated images of child sexual abuse are illegal in the UK under the Coroners and Justice Act 2009, which contains provisions on making and possessing indecent “pseudo-photographs” of children, although the IWF would like to see the legislation amended to cover AI images directly.
The Ada Lovelace Institute, a data and AI research body, said on Tuesday that the UK must strengthen its regulation of the technology. Under current government proposals for overseeing AI, regulation is devolved to existing regulators, an approach that the institute said did not adequately cover the use of AI in areas such as recruitment and policing.
The institute said in a report analysing the UK’s proposals for AI regulation that it welcomed Sunak’s commitment to global AI safety, but added that the government should also address its domestic regime.
“Efforts towards international coordination are very welcome, but they are not sufficient,” said Michael Birtwistle, an associate director at the institute. “The government must strengthen its domestic proposals for regulation if it wants to be taken seriously on AI and achieve its global ambitions.”
The institute recommended that the government consider introducing an “AI ombudsman” who would support people affected by AI. It also said ministers should introduce new legislation to give better protections “where necessary”.
A government spokesperson said the forthcoming online safety bill, due to become law this year, contained provisions for the removal of CSA material from online platforms.
The threat assessment report also details the changing pattern of drug use in Britain. Last year, it said, record quantities of drugs were available, causing prices to fall. Analysis of wastewater in some urban areas showed a 25% increase in cocaine use during 2022.
The NCA said that 40 tonnes of heroin were consumed in 2022, as well as 120 tonnes of cocaine.
Source: The Guardian