Apple removes CSAM tool from blog, says plans did not change

Apple has quietly removed all content related to its child sexual abuse material (CSAM) detection tool from its blog, following harsh criticism from various organizations and individuals.

First announced in August, the CSAM tool is designed to detect and report child sexual abuse imagery on iPhones and iPads. The child safety initiative combines several approaches to identifying this content, such as scanning users’ iCloud Photos libraries for known CSAM, a communication safety feature that alerts children and their parents when sexually explicit images are received or sent, and expanded CSAM-related guidance in Siri and Search.

Once the announcement went public, the feature drew heavy backlash from experts, including university and security researchers, privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook’s former security chief Alex Stamos, policy groups, and politicians.

While numerous companies, individuals, and organizations voiced their displeasure with the detection feature, some of the most prominent opponents were Apple’s own employees, who argued it amounts to a direct breach of users’ privacy rights, even when those users are children.

The controversy surrounding the CSAM tool arose from the fact that the company would be extracting hashes of users’ iCloud Photos and comparing them against a database of known hashes of sexual abuse imagery.
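
For illustration only, the sketch below shows the general idea of hash-based matching, not Apple’s actual implementation: it uses an ordinary cryptographic hash from Python’s hashlib where Apple describes a perceptual “NeuralHash,” and a plain in-memory set where Apple describes a private set intersection protocol. All names, paths, and hash entries here are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-bad image hashes. In Apple's described
# system these are perceptual NeuralHash values supplied by child-safety
# organizations, not SHA-256 digests of file bytes.
KNOWN_BAD_HASHES = {
    "placeholder-hash-value-1",
    "placeholder-hash-value-2",
}

def image_hash(path: Path) -> str:
    """Return a hex digest for an image file.

    A cryptographic hash only matches byte-identical files; a perceptual
    hash is needed for visually similar images to collide.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(library_dir: str) -> list[Path]:
    """Return files in a photo library whose hashes appear in the database."""
    matches = []
    for path in Path(library_dir).glob("*.jpg"):
        if image_hash(path) in KNOWN_BAD_HASHES:
            matches.append(path)
    return matches

if __name__ == "__main__":
    flagged = scan_library("Photos")  # hypothetical library folder
    print(f"{len(flagged)} matching image(s) found")
```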

According to MacRumors, the blog alteration happened between December 10 and 13. While some key information has been removed from the company’s blog, two of the three detection features remain.

The first is designed to warn children when they receive photos containing nudity via SMS or iMessage. The second delivers additional information whenever a user searches for child exploitation content via Siri, Spotlight, or Safari Search.

Released earlier this week with the latest iOS 15.2 update, “Expanded Protections for Children” demonstrated a much more controversial approach to detecting CSAM.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the tech giant stated back in September. 

Now, Apple spokesperson Shane Bauer has explained that the company’s stance on the matter has not changed since its initial announcement postponing the release of the CSAM tool.

It is worth highlighting that the iPhone maker’s statement did not announce a complete cancellation of CSAM detection, given that its functionalities are still described on the site.

To Apple, this method provides the means it needs to identify and report users whose libraries contain known abuse imagery to the authorities, without being forced to jeopardize its customers’ privacy.

In parallel, the tech titan also said that user data encryption is not affected, since the analysis runs on users’ devices.

One thing is certain, though: once the CSAM tool’s launch date is announced, Apple will have delivered all three child-protection features, each serving a different purpose.

