Meta Ignored Warnings About VR Sexual Abuse of Minors, Whistleblowers Allege

On September 9, two former Meta researchers told a Senate committee that the tech giant’s virtual reality products exposed children to sexual content, harassment, and other adult activity, and that Meta shut down their research into these harms.

Jason Sattizahn and Cayce Savage testified before Congress in Washington, D.C., describing how Meta ignored their safety recommendations for its virtual reality products and, shortly afterward, halted further safety studies.

Not only that, but during the same period, newly leaked internal documents showed that Meta’s AI chatbots had previously been allowed to flirt with children and generate other harmful content.

Clearly, Meta has lost its grip on its own technology, especially when it comes to kids.

The testimony adds to a broader picture of Meta caught between racing to stay ahead on innovation and protecting children.

Legislators are calling for stricter controls as evidence and testimony show children being subjected to real harm online. Meanwhile, Meta continues to push its virtual reality products, raising concerns that it is putting its bottom line ahead of children’s safety.

Is VR Safe for Teenagers?

Sattizahn and Savage, who belonged to the team responsible for Meta’s VR research on teens, described how these platforms expose kids to bullying, sexual harassment, nudity, and sexual propositions.

“Meta is aware that children are being harmed in VR,” Savage told the committee.  

She said it is not uncommon for kids in VR to be bullied, sexually assaulted, asked for nude photos, solicited for sex acts by pedophiles, and consistently exposed to adult content like gambling and violence. According to her, VR age restrictions were routinely violated, and Meta actively prevented her from measuring how prevalent these harms were.

Sattizahn reported that adults would sometimes use VR headsets to subject children to sex acts over audio.

“The audio that’s transmitted isn’t just solicitation, there will also be instances — that we have seen — where you can hear people sexually pleasuring themselves, transmitted over audio in a spatial sense, as you are being surrounded and brigaded and being harassed,” he testified. 

Meta has tried to dismiss the charges, arguing that the claims are nonsense built on selectively leaked internal documents. The whistleblowers, however, maintain that the evidence shows Meta prioritizes engagement over safety.

Is Meta VR Worth It, at the Expense of Your Children?

AI adds another layer of concern. Internal Meta documents leaked earlier showed that chatbots were permitted to draw children into romantic or sexual talk, provide false medical advice, and even make insulting remarks about Black people.

Meta confirmed the authenticity of the documents but said the unsafe examples were removed after Reuters questioned them. The 200-page “GenAI: Content Risk Standards” guide shows that such risky behavior was previously deemed acceptable.

Under some circumstances, the AI could generate violent or sexualized content. Experts say the guidelines raise deep ethical and legal questions.

“Legally we don’t have the answers yet, but morally, ethically and technically, it’s clearly a different question,” Evelyn Douek, a Stanford Law researcher, said. 

According to Savage, most parents are unaware of the risks. Lawmakers are now focused on the metaverse’s dangers for kids and are advocating stricter legislation regulating social media.

Senators are therefore underlining the importance of improved parental controls in Meta’s VR products and want to require companies to take responsibility for harms caused by negligence toward child safety.

Leaked Meta VR files reveal that children were often put in severe danger under lax monitoring. The testimonies also come amid increasing scrutiny of Meta’s broader corporate culture.

Previous whistleblowers, including Frances Haugen, have raised alarms over teen mental health, the spread of misinformation, and the company’s pursuit of profit over safety. Sattizahn said that Meta’s response to congressional investigations has been more damage control than meaningful reform.

As Meta continues to expand its investments in virtual reality, the safety-versus-innovation debate is coming to a head. Legislators are considering bills to hold tech companies accountable and make it easier for victims to take action.

In the end, the question remains: at what cost will Meta continue to build the metaverse, and will children be safeguarded in the process?
