Children Make Indecent Images of Other Children Using AI

The UK Safer Internet Centre (UKSIC) has found that some children are using AI image generators to create indecent images of other children. Although only a small number of reports have come from schools so far, the charity is urging immediate action to prevent the issue from escalating.

Many children may not fully grasp that what they are creating legally constitutes child sexual abuse material. That is why UKSIC is pushing for a joint effort from teachers and parents, making clear to young people that curiosity is no defense for breaking the law. In the UK, it is illegal to make, possess, or share such images, whether they are real or AI-generated.

There is also a risk that these images could be used in harmful ways, such as blackmail, or simply circulate online without the children involved realizing the serious consequences of their actions.

Separately, a study by RM Technology of 1,000 students found that roughly a third are using AI to view inappropriate content. Tasha Gibson of RM Technology pointed out that many students are more clued-up on AI than their teachers, creating a knowledge gap that makes it harder to keep children safe online and prevent misuse. With AI's growing popularity, bridging that gap is becoming crucial.

Interestingly, there’s some debate among teachers about whether educating kids on the dangers of such material is the responsibility of parents, schools, or the government. But UKSIC believes in a collaborative approach, where schools and parents work together.

David Wright, the director of UKSIC, emphasized the need for immediate steps to address the problem. He noted that young people may not fully understand the seriousness of their actions, but said such harmful behavior should be anticipated as new technologies like AI image generators become more widely available.

Victoria Green of the Marie Collins Foundation, a charity that supports children affected by sexual abuse, highlighted the potential lifelong damage such material can cause. Even when the images are not created with intent to harm, once shared they can end up in the wrong hands and on abuse sites, where they are further exploited by sex offenders.
