
On Tuesday, National Society for the Prevention of Cruelty to Children (NSPCC) Chief Executive Chris Sherwood voiced concern over the alarming number of recorded crimes, urging the government to strengthen child internet safety by improving safeguards in the Online Safety Act.
Governments should know that the children of today are the leaders of tomorrow.
NSPCC Chief Executive Sherwood described the 39,000 child sexual abuse image crimes recorded last year as “deeply alarming,” condemning the “unacceptable loophole” in messaging services that is leaving children at risk.
Tech Endangering Online Child Safety
According to the NSPCC, Snapchat was the application most frequently cited as endangering child safety online, ranking highest in the number of child exploitation cases. Child exploitation and online protection organizations argue that Snapchat’s system facilitates services that “harm children and go undetected.”
Home Office data shows 38,685 child exploitation crimes recorded in England and Wales in 2023/24, an average of more than 100 a day, clear evidence that current child protection measures are failing to keep children safe online.
In the roughly 7,300 cases where police recorded the platform involved, 50% were on Snapchat, 11% on Instagram, 7% on Facebook, and 6% on WhatsApp.
The NSPCC, Barnardo’s, and other charities have appealed to the home and tech secretaries to strengthen the Online Safety Act. Although the law is enforced by Ofcom, the charities claim its code of practice includes a loophole whereby platforms need only act where it is “technically feasible.”
The NSPCC is calling on technology companies to ensure their platforms do not become ‘safe havens’ for abusers, warning that abuse can otherwise fall into the blind spot created by end-to-end encryption. The case of a 13-year-old victim illustrates the scale of the issue.
Child Internet Safety Exposes Big Tech Social Platforms
Child safety online is in danger. The social media applications that offer a view-once feature mostly belong to US-based companies, such as Meta Platforms’ Instagram, Facebook, and WhatsApp, as well as Snapchat. The view-once concept began in 2011 with the release of the app “Picaboo,” which was rebranded as Snapchat later that same year.
Snapchat was designed to encourage real, authentic conversations in the moment, not to endanger children online. However, with people from all backgrounds using the platform, social media companies struggle to control users and their intentions while they engage online.
According to Socialfly NY’s report, ‘21 Snapchat Statistics Marketers Need to Know in 2024’:
- Ages 18–24: Approximately 38.5% of users
- Ages 13–17: Around 20%
- Ages 25–34: Approximately 22%
- Ages 35 and older: Significantly lower percentages
Snapchat’s primary user base consists of teenagers and young adults, which can make it a target for online predators and other malicious actors. This highlights the need for strong government safety measures and parental guidance; without them, reporting child safety concerns online becomes a burden that strains the mental health of children and parents alike.
The fact that there is little or no control over users online shows how much of a jungle social media platforms can be. It’s also a warning to parents that their children’s innocence can be taken in a matter of seconds with the click of a button.