Tech Companies Directed to Shield Children from 'Toxic' Content

Ofcom, the UK’s media regulator, has issued a stark warning to social media platforms: adhere to new online safety regulations or face severe consequences, including potential bans for users under 18.

The warnings are accompanied by draft codes of practice that demand companies bolster age verification systems and reconfigure algorithms to shield children from “toxic” content.

The urgency for tighter regulation comes in the wake of tragic incidents in which children suffered fatal consequences after encountering harmful online material. Despite the introduction of these rules, parents who have lost children to such incidents have voiced dissatisfaction with the pace of change, describing it as moving “at a snail’s pace.”

Meta and Snapchat, in response to the new guidelines, have cited existing measures that they say already offer enhanced protections for younger users and give parents control over the content their children can access.

Ofcom’s Draft Codes of Practice

Ofcom’s draft codes, central to the enforcement of the UK’s Online Safety Act, outline over 40 specific measures aimed at safeguarding young users. Key among these are modifying algorithms to prevent harmful content from populating children’s feeds and implementing more stringent age checks for viewing potentially damaging material.

Dame Melanie Dawes, Ofcom’s chief executive, emphasized the gravity of the situation in a BBC interview, declaring that failure to comply with the new regulations would result in public “naming and shaming” of non-compliant companies and could escalate to outright bans for underage users.

The new regulations are set to take effect in the latter half of 2025, with a public consultation period ending on July 17. Following this, companies will have three months to conduct risk assessments and adjust their platforms in line with Ofcom’s guidelines.

The initiative by the media regulator has received backing from various quarters, including Technology Secretary Michelle Donelan, who called on tech companies to proactively engage with the new regulations and avoid potential penalties. Similarly, voices from the tech industry acknowledge the need for improved age verification technologies to effectively enforce these measures.

However, the response from bereaved families and advocates, like those who have publicly appealed to Prime Minister Rishi Sunak and opposition leader Sir Keir Starmer for stronger online safety laws, suggests that the proposed changes may not yet be sufficient. These families are pushing for more ambitious action, including integrating mental health and suicide prevention into the school curriculum, to better protect children in the digital age.
