British Tech Firms and Child Safety Officials to Examine AI's Capability to Create Exploitation Images

Tech firms and child protection organizations will be granted authority to evaluate whether AI systems can generate child exploitation images under new UK laws.

Significant Rise in AI-Generated Harmful Material

The announcement came alongside findings from a safety monitoring body showing that reports of AI-generated CSAM have risen dramatically in the past year, from 199 in 2024 to 426 in 2025.

New Legal Structure

Under the amendments, the government will permit designated AI companies and child safety groups to inspect AI models – the underlying systems for conversational AI and image generators – and verify they have adequate protective measures to prevent them from creating images of child sexual abuse.

"Fundamentally, this is about stopping exploitation before it happens," said Kanishka Narayan, adding: "Specialists, under rigorous conditions, can now detect the risk in AI models early."

Addressing Legal Challenges

The amendments were necessary because creating and possessing CSAM is against the law, meaning AI creators and other parties could not generate such images even as part of an evaluation regime. Previously, authorities had to wait until AI-generated CSAM had been uploaded online before they could act.

This law aims to avert that problem by helping to stop the creation of such material at its source.

Legislative Structure

The government is introducing the amendments as modifications to criminal justice legislation, which also establishes a prohibition on possessing, producing or distributing AI systems designed to generate child sexual abuse material.

Real-World Impact

This week, the minister visited the London headquarters of Childline and listened to a simulated call to advisers featuring a report of AI-based exploitation. The call depicted an adolescent seeking help after being extorted with an explicit deepfake of themselves, created using AI.

"When I hear about children experiencing extortion online, it causes intense frustration in me and justified concern among families," he said.

Concerning Data

A leading online safety organization stated that instances of AI-generated exploitation material – such as online pages that may include multiple images – had significantly increased so far this year.

Instances of category A material – the most serious form of exploitation – increased from 2,621 visual files to 3,086.

  • Female children were overwhelmingly targeted, accounting for 94% of prohibited AI images in 2025
  • Portrayals of infants to toddlers increased from five in 2024 to 92 in 2025

Industry Reaction

The law change could "represent a vital step to ensure AI products are secure before they are launched," commented the chief executive of the internet monitoring organization.

"AI tools have made it possible for victims to be targeted repeatedly with just a few clicks, giving offenders the ability to create potentially limitless quantities of sophisticated, photorealistic child sexual abuse material," she added. "Material which further commodifies survivors' suffering, and makes children, particularly girls, less safe online and offline."

Counseling Session Information

Childline also published details of counselling sessions in which AI was mentioned. AI-related risks raised in the sessions include:

  • Using AI to assess body size and appearance
  • Chatbots discouraging children from talking to safe adults about abuse
  • Being bullied online with AI-generated content
  • Digital blackmail using AI-manipulated images

Between April and September this year, Childline conducted 367 support sessions in which AI, chatbots and associated topics were discussed, significantly more than in the equivalent period last year.

Half of the mentions of AI in the 2025 sessions related to mental health and wellbeing, including using AI assistants for support and AI therapy apps.

Jason Moore