Facebook and Instagram Under Scrutiny for Child Addiction Risks

The European Union (EU) has launched a formal investigation into Meta’s handling of child safety on its social platforms, Facebook and Instagram.

The Union’s probe, which could result in significant fines for the social networking giant, underscores growing regulatory scrutiny of Meta and its impact on young users, particularly regarding addictive behavior.

Compliance with DSA 

The European Commission will evaluate Meta’s compliance with the Digital Services Act (DSA), which requires online platforms to implement robust child protection measures. These include preventing minors from accessing inappropriate content and ensuring their privacy and safety.

Non-compliance could lead to fines of up to 6% of global revenue or mandated changes to Meta’s platforms.

The Commission’s concerns focus on whether Facebook and Instagram exploit the vulnerabilities of minors, fostering addictive behavior. It also questions the effectiveness of Meta’s age verification methods.  

“We want young people to have safe, age-appropriate experiences online,” a Meta spokesperson told CNN, highlighting the company’s decade-long effort in developing protective tools and policies. 

Despite these assurances, regulators remain unconvinced.  

“We are not convinced that Meta has done enough to comply with the DSA obligations to mitigate the risks to the physical and mental health of young Europeans,” stated Commissioner Thierry Breton. The investigation follows multiple lawsuits in the US, where Meta faces allegations from school districts and state attorneys general over youth mental health and child safety issues.

Questioning Regulatory Intentions 

Beyond the headlines and regulatory actions, a critical question arises: are the EU and the US Congress genuinely prioritizing child safety, or are they more concerned with maintaining their public image?

As the EU and US Congress continue their high-profile battles with Meta and other tech giants, the tech industry and the public must scrutinize whether these efforts genuinely enhance child safety or simply serve as political posturing. Effective protection for children online requires more than regulatory rhetoric; it demands enforceable actions that prioritize the welfare of young users above all else.

At the End of the Day… 

Recent events highlight these shortcomings.  

Earlier this month, an investigation by the New Mexico attorney general into the dangers of Meta’s platforms led to the arrests of three men for attempted child sexual abuse. Similarly, the EU has repeatedly clashed with Meta over issues such as disinformation, illegal content, and election interference, with limited success in driving meaningful change.

Meta’s platforms remain under intense scrutiny as both the EU and US grapple with the broader implications of digital safety. While regulatory bodies emphasize their commitment to child protection, critics argue that these measures may be more performative than effective. The tech industry and the public must remain vigilant, ensuring that efforts to safeguard young users are genuine and impactful. 

