Discord Trust and Safety Bans AI-Generated CSAM


Discord Trust and Safety is implementing comprehensive changes to its child safety policies, including prohibiting AI-generated child sexual abuse material (CSAM) and teen dating.

  • Discord explicitly bans AI-generated CSAM and any text or media content that sexualizes children.
  • Teen dating servers are strictly prohibited to protect underage users from potential exploitation and grooming.
  • Parental control tools through the Family Center feature enable parents to monitor their children’s activities on the platform.

The head of Discord Trust and Safety announced that the popular chat platform is changing and clarifying its child safety policies, including those around teen dating and AI-generated child sexual abuse material (CSAM), following an NBC News investigation last June into child safety on the platform.

Discord is an American Voice over Internet Protocol (VoIP) and instant messaging social platform with over 300 million monthly users aged 13 and above. It is especially popular with teenagers, thanks to features that make it a versatile communication tool, including text, voice, and video messaging, as well as file sharing (images, videos, and documents).

The NBC News investigation uncovered numerous Discord servers promoting child abuse material, along with servers advertised as teen- or child-dating platforms that solicited explicit images from minors. It also identified cases where adults were prosecuted for offenses involving communications on Discord, including kidnapping, grooming, sexual assault, and sextortion.

“We built Discord to be different and work relentlessly to make it a great and safe space for teens,” said Discord Trust and Safety in their blog post announcement. “Child safety is a societal challenge and responsibility, and we are committed to continuing our work with industry partners as well as our close partnership with law enforcement to combat harmful content online and strengthen industry moderation practices and safety initiatives.”

The Washington Post recently reported an alarming rise in AI-generated child sex images across the internet, including on Discord. One crucial aspect of the policy update therefore tackles the proliferation of generative AI tools that produce fake content sexualizing children. Discord will now explicitly ban AI depictions of child sexual abuse as well as the sexualization of children in text chats.

The platform has become a gathering place for communities focused on generating AI images, some of which have sexually explicit themes. As a result, Discord Trust and Safety’s updated policy now encompasses “any text or media content that sexualizes children, including drawn, photorealistic, and AI-generated photorealistic child sexual abuse material.” The goal is to prevent the normalization of child sexualization in any context.

Discord has also explicitly banned teen dating on its platform. Experts have warned that online dating exposes teenagers to potential exploitation and grooming by adults. Their Safety Center emphasized the risk associated with such relationships, stating that the company recognizes teen dating as a significant harm vector for predators targeting teens. Under the new policy, teen dating servers are strictly prohibited, and users engaging in this behavior will face consequences. “When we become aware of these types of incidents, we take action as appropriate, including by banning the accounts of offending adult users and reporting them to the National Center for Missing & Exploited Children (NCMEC), who subsequently work with local law enforcement.”

We went into a Discord chatroom and asked a number of avid Discord users for their thoughts on the new changes. The general consensus was that this is a great start, but some fear the changes will push predators underground, making them harder to catch. One user, Elie00001, wrote, “My only concern is that we do not get another case of the scams and gambling websites where they adapted to the new system and became harder to detect.” Discord’s previous crackdown on malicious actors and bots only forced them into secrecy. As for teen dating, user Abobo doubts the ban will be effective, since users can easily take their conversations elsewhere: “The effort is commendable but ultimately, only shifts the problem elsewhere.”

Discord Trust and Safety has gone further by adding parental control tools to complement these policy updates. The newly launched Family Center tool allows parents and children to opt in so that parents receive updates on their children’s activity on the platform. While parents cannot view the content of their kids’ messages, they can monitor their children’s friends and interactions.

Discord Trust and Safety continues to proactively scan images uploaded to its platform using PhotoDNA, a proprietary image-identification and content-filtering technology widely used by online service providers, as part of its ongoing efforts to protect children and teens from harm.
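PhotoDNA’s algorithm is proprietary, but the general idea behind this class of tools is perceptual hashing: an image is reduced to a short fingerprint, and uploads are compared against fingerprints of known abusive material, with small differences (recompression, brightness changes) still producing a close match. The sketch below is a deliberately simplified illustration of that idea using a basic “average hash”; it is not PhotoDNA, and all names and data in it are hypothetical.

```python
# Simplified illustration of perceptual-hash matching (NOT PhotoDNA,
# whose actual algorithm is proprietary and more robust).

def average_hash(pixels):
    """Compute a 64-bit hash from an 8x8 grayscale grid (values 0-255).

    Each bit records whether the corresponding pixel is brighter than
    the image's mean brightness.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests near-duplicates."""
    return bin(h1 ^ h2).count("1")

# Hypothetical 8x8 grids: an "original" image and a slightly
# brightened copy, standing in for a recompressed re-upload.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
altered = [[min(255, p + 10) for p in row] for row in original]

d = hamming_distance(average_hash(original), average_hash(altered))
print(d)  # a small distance: the altered copy still matches
```

Because the hash reflects coarse brightness structure rather than exact bytes, minor edits leave most bits unchanged, which is what lets a service flag re-uploads of known material without storing the images themselves.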


Inside Telecom provides you with an extensive list of content covering all aspects of the tech industry. Keep an eye on our Intelligent Tech sections to stay informed and up-to-date with our daily articles.