Apple Under Fire for Insufficient CSAM Reporting 

Apple has been accused of insufficiently reporting the prevalence of child sexual abuse material (CSAM) on its platforms.

The accusation came from the UK’s National Society for the Prevention of Cruelty to Children (NSPCC), after the tech giant reported only 267 cases of CSAM to the US National Center for Missing & Exploited Children (NCMEC) last year.

The figure is strikingly low compared with Google, which reported 1.47 million cases, and Meta, which reported 30.6 million.

In 2023, other major platforms also disclosed significant numbers of CSAM cases: TikTok reported 590,376 potential cases, X (formerly Twitter) 597,087, and Snapchat 713,055. US-based tech companies are legally required to report detected CSAM to the NCMEC, which in turn refers cases to the appropriate law enforcement agencies worldwide.

Inconsistencies and Encryption 

The NSPCC emphasized that Apple was implicated in 337 recorded cases of child abuse images in England and Wales between April 2022 and March 2023 alone, a number that surpasses the company’s global reporting for the entire year.

The Guardian, which was the first to report on the NSPCC’s claim, noted that Apple services such as iMessage, FaceTime, and iCloud use end-to-end encryption, making it difficult for the iPhone maker to access content shared by users. By contrast, WhatsApp, which also uses end-to-end encryption, reported nearly 1.4 million suspected CSAM cases to the NCMEC in 2023.

Richard Collard, head of child safety online policy at the NSPCC, raised concerns about the discrepancy between the number of child abuse image crimes recorded in the UK and Apple’s global reporting figures.

Collard stated, “Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the roll out of the Online Safety Act in the UK.”  

But First, Privacy 

Back in 2021, Apple announced plans for a system that would scan images before they were uploaded to iCloud and compare them against a database of known CSAM images provided by the NCMEC and other organizations, but it later abandoned the plan over user privacy concerns.

Apple, for its part, declined to comment on the issue, pointing instead to a previous statement that “children can be protected without companies combing through personal data.”
