The UK government has announced four new laws to combat child sex abuse images generated by artificial intelligence (AI).
Officials say these measures will make the UK the first country to criminalise the possession, creation, or distribution of AI tools designed to produce child sexual abuse material (CSAM). Offenders face up to five years in prison.
It will also become illegal to possess AI-generated "paedophile manuals", which teach individuals how to use AI to sexually abuse children. Those caught will face a maximum sentence of three years in prison.
Home Secretary Yvette Cooper said: “AI is now putting online child abuse on steroids.”
She warned that the technology was "industrialising the scale" of child sexual exploitation and said the measures "may have to go further".
Other laws will criminalise running websites where paedophiles share CSAM or provide advice on grooming children. This will carry a maximum sentence of 10 years.
Border Force will gain powers to order individuals suspected of posing a sexual risk to children to unlock their digital devices for inspection upon entering the UK. Offenders could face up to three years in prison, depending on the severity of the images.
AI-generated CSAM can look highly realistic, often combining real-life children’s faces with computer-generated elements.
In some cases, the voices of actual children are used, re-victimising innocent survivors.
These fake images are also used for blackmail, coercing victims into further abuse.
According to the National Crime Agency (NCA), around 800 arrests are made each month over threats posed to children online.
It estimates that 840,000 adults in the UK pose a danger to children both online and offline, making up 1.6% of the adult population.
Cooper continued: “Perpetrators are using AI to groom or blackmail children, distorting images, and drawing them into further abuse.
“The most horrific things are taking place, and it is becoming more sadistic.
“Technology doesn’t stand still, and our response cannot stand still if we want to keep children safe.”
Some experts believe the government could go further. Professor Clare McGlynn, an expert in online abuse laws, said the changes were “welcome” but left “significant gaps”.
She urged the government to ban “nudify” apps and tackle the “normalisation of sexual activity with young-looking girls” in mainstream pornography.
She warned that such content, while involving adult actors, mimics child sex abuse and remains legal in the UK despite bans in other countries.
The Internet Watch Foundation (IWF) reported a surge in AI-generated CSAM.
In 2024, confirmed reports rose by 380%, with 245 cases compared to 51 in 2023. Each report can contain thousands of images.
A 2023 IWF study found that, over one month, 3,512 AI child sex abuse images were discovered on a single dark website.
The number of Category A images, the most severe, had risen by 10% compared to the previous year.
IWF interim chief executive Derek Ray-Hill said: “The availability of this AI content further fuels sexual violence against children.
“It emboldens abusers and makes real children less safe.”
He welcomed the government’s announcement, calling it a “vital starting point”.
Lynn Perry, chief executive of Barnardo's, supported the measures, saying: "Legislation must keep up with technology to prevent these horrific crimes."
She urged tech companies to introduce stronger safeguards and called on Ofcom to ensure the Online Safety Act is enforced effectively.
The new laws will be introduced as part of the Crime and Policing Bill, which is set to reach Parliament in the coming weeks.