British Tech Firms and Child Protection Officials to Test AI's Capability to Create Exploitation Content

Under new British legislation, technology companies and child safety organizations will be authorised to test whether AI systems can generate child exploitation material.

Significant Increase in AI-Generated Harmful Material

The announcement coincided with findings from a protection monitoring body showing that reports of AI-generated child sexual abuse material (CSAM) have more than doubled in the past year, rising from 199 in 2024 to 426 in 2025.

Updated Legal Structure

Under the changes, the government will allow approved AI companies and child safety groups to inspect AI models – the underlying systems behind conversational AI and image generators – to ensure they have sufficient safeguards to stop them from producing images of child exploitation.

"Ultimately about preventing exploitation before it occurs," declared Kanishka Narayan, noting: "Specialists, under rigorous conditions, can now identify the danger in AI models promptly."

Addressing Legal Challenges

The amendments have been introduced because it is illegal to create and possess CSAM, meaning that AI creators and others could not generate such content even as part of a testing regime. Previously, authorities had to wait until AI-generated CSAM had been uploaded online before dealing with it.

This legislation is designed to avert that issue by making it possible to halt the creation of such material at its source.

How the Changes Will Be Enacted

The government is introducing the changes as amendments to the criminal justice legislation, which also establishes a prohibition on possessing, creating or distributing AI systems designed to generate child sexual abuse material.

Real-World Consequences

This week, the official visited the London headquarters of a children's helpline and listened to a mock-up call to advisors featuring an account of AI-based exploitation. The call depicted a teenager seeking help after being blackmailed with a sexualised deepfake of himself, created using AI.

"When I hear about children facing blackmail online, it is a cause of intense anger in me and rightful concern amongst families," he stated.

Concerning Data

A leading online safety organization reported that instances of AI-generated abuse material – each of which can be a webpage containing numerous files – had increased significantly so far this year.

  • Instances of category A material – the gravest form of abuse – increased from 2,621 visual files to 3,086
  • Girls were overwhelmingly targeted, making up 94% of illegal AI images in 2025
  • Portrayals of newborns to toddlers increased from five in 2024 to 92 in 2025

Industry Response

The law change could "constitute a vital step to ensure AI products are secure before they are launched," stated the head of the online safety organization.

"AI tools have enabled so victims can be victimised repeatedly with just a simple actions, providing offenders the ability to make possibly limitless quantities of advanced, photorealistic child sexual abuse material," she continued. "Material which further exploits survivors' suffering, and makes children, particularly female children, less safe on and off line."

Counselling Session Information

The children's helpline also released details of counselling sessions in which AI was mentioned. AI-related harms discussed in the sessions include:

  • Using AI to rate body size and appearance
  • AI assistants dissuading children from talking to trusted adults about abuse
  • Being bullied online with AI-generated content
  • Online blackmail using AI-manipulated pictures

Between April and September this year, the helpline conducted 367 counselling sessions in which AI, conversational AI and related terms were mentioned, significantly more than in the equivalent timeframe last year.

Half of the mentions of AI in the 2025 sessions related to psychological wellbeing, including using chatbots for support and AI therapy applications.
