Pedophiles are using artificial intelligence (AI) technology to create and sell realistic-looking child sexual abuse material, the BBC has revealed.

Some access the images by paying subscriptions to accounts on mainstream content-sharing sites such as Patreon. Patreon says it has a “zero tolerance” policy regarding such images on its site.

The National Police Chiefs’ Council said it was “outrageous” that some platforms were making “huge profits” while failing to take “ethical responsibility” for the content trafficked through them.

And GCHQ, the government’s intelligence, security and cyber agency, said in the report: “Child sex offenders embrace all technologies and some believe that the future of child sexual abuse material is in AI-generated content.”

The creators of the abuse images use AI software called Stable Diffusion, which was designed to generate images for use in art or graphic design.

Artificial intelligence allows computers to perform tasks that normally require human intelligence. Stable Diffusion lets users describe, in words, any image they want, and the program then creates it.

However, the BBC discovered that the software is being used to create images of child sexual abuse, including the rape of babies and toddlers. UK police teams that investigate online child abuse say they have already come across such content.


A computer-generated “fake image” depicting child sexual abuse is treated the same as a real image and is illegal to possess, publish or transfer in the UK.

The National Police Chiefs’ Council (NPCC) lead for child protection, Ian Critchley, said it would be wrong to argue that, because no real children are depicted in such “synthetic” images, no one is harmed.

He warned that a pedophile could go “from thought, to synthetic, to ending up abusing a living child”.