Paedophiles steal real photos of children and use AI to make them sexualised

An investigation by the Times found disturbing images on Midjourney

Paedophiles are using artificial intelligence (AI) to transform real photos of children from the internet into sexualised images, an investigation has discovered.

Some users of Midjourney are creating large numbers of sexualised images of women, children and celebrities, according to the Times.

The AI tool is extremely popular, with 14.5 million registered members, and is known for creating realistic-looking images from text prompts.

Parents have been warned to bear this in mind before posting photos of their children publicly online.

The images created with the platform are so realistic that many have believed them to be real. An image of Pope Francis wearing a huge white puffer jacket with a cross hanging from his neck set the internet alight earlier this year.

Among the celebrities the programme has been used to create explicit deepfakes of are Kim Kardashian and Jennifer Lawrence. One account holder alone has produced 57,000 images of Kate Upton since October.

The San Francisco-based company states that content should be “PG-13 and family friendly”; however, it admits the new technology “does not always work as expected”.

Virtual child sexual abuse images are illegal in England and Wales but not in the United States.

The NSPCC described the sexualised depictions of children as “unacceptable”.

“By hosting child abuse content they are putting children at a very real risk of harm,” said Richard Collard, the associate head of child safety online policy at the charity.

He added: “It is incredibly distressing for parents and children to have their images stolen and adapted by offenders.” He warned parents to only post pictures to trusted contacts and stressed that ultimately tech companies must take responsibility for the offensive material.

Midjourney told the Times it would ban users who breached its rules.

It comes after the NSPCC warned that the growing use of virtual reality headsets to explore the Metaverse will expose children to new risks online.

Paedophiles are using headsets to view and store child abuse images, police data showed for the first time, according to the charity.
