If you have ever opened a videogame chat lobby, you know what toxic behavior sounds like across the microphone. You will inevitably hear some flattering remarks targeting your family members or their respective private parts. While these issues have always been there, and most gamers are used to the occasional inappropriate remark or hurtful comment, the Metaverse is a different story. The attention paid to such misdeeds scales up in tandem with the tech giants that are racing to grab at the financial opportunities of this new technology, and the dark side of the Metaverse becomes a real issue worth addressing if companies want to pave the way for further business.
Any vice or sin you can come up with will be found in the virtual world, as it is a reflection of the real world, except in the virtual world, people's inhibitions are left by the wayside. Think of any game you've played that involved violence. You probably felt no guilt slaughtering thousands of Nazi Zombies.
In the Metaverse, however, you are not playing against mindless zombies or lines of code with life-like animations; you are interacting with people. The problem arises when the same inhibitions you lack while interacting in virtual worlds spill onto other real people who share the virtual space with you, and your data, your money, and your identity are all in there with you as well.
What happens when we let human vices roam free in a space that is currently underregulated, misunderstood, and wild with human activity? We may be seeing the dark side of the Metaverse emerging as we speak.
The New York Times acquired an internal memo from Meta encouraging personnel to volunteer to test the Metaverse. According to a company spokesperson, a stranger recently harassed one tester's virtual self in the virtual reality game Horizon Worlds. The Verge broke the story first, and Meta claims to have learned from the incident. Since incidents in virtual reality occur in real time and are rarely recorded, it can be difficult to spot wrongdoing there.
Internet abuse is nothing new. In its traditional form, it has proved fatal, with several people taking their own lives after experiencing online harassment.
The Metaverse and the technologies that underpin it, however, make the problem worse. With VR headsets and, in certain cases, full-body haptic suits that can transmit sensations, the abuse becomes more visceral.
This abuse is particularly damaging to women and young people. Many women have spoken out about the harassment they experience on virtual reality platforms, even starting support groups to help them cope.
The pornographic problem in the Metaverse has two main aspects: sexually explicit content and sexually explicit behavior. Regarding the former, according to the BBC, two major apps—VRChat and Roblox—allow anybody over 13 to attend a Metaverse strip club.
These online clubs of ill repute, part of the darker side of the Metaverse, feature many explicit goods, such as dance poles and contraceptives. Sexually explicit action via avatars, however, is the real issue here. Beyond the fact that avatars can remove their clothing and mimic intercourse, the BBC investigation found evidence of what looked to be adult men's avatars approaching, touching, and trying to groom an avatar that belonged to a researcher posing as a 13-year-old girl.
Of course, such things exist on the internet and are as strictly regulated as can be, but there remains a gap in the regulation of the Metaverse space thus far that warrants concern, especially with regard to the safety of minors and children.
Gambling is one part of the Metaverse that is winning big, with online casino owners making millions per month in some cases. If mature adults choose to take risks and play with their money, that is their choice, but children have as much access to crypto casinos as adults do.
The problem is that since players are gambling with tokens issued by the casino itself, those tokens are not considered real money. Courts, in turn, dismiss many lawsuits targeting this type of gambling.
Your data is being bought and sold as we speak; that is the reality of the Web2 internet we grew familiar with. Social media is a huge network of user-generated content, and the data collected about you includes things such as your social and financial engagements and your biometrics.
In the Metaverse, advertisers and developers could be able to monitor how users spend their time and attention and how and with what they interact. This raises the question: how much of our time and attention are we willing to sell?
Mental Health Effects
Despite its numerous benefits, technology can also lead to addiction and mental health issues, as evidenced by the effects of social media on young minds. The recently leaked internal documents known as the Facebook Papers underlined this, and according to a Facebook (Meta Platforms) whistleblower who worked on the project, the Metaverse will be addictive as well.
The Internet, especially social media, can cause or exacerbate mental health conditions like anxiety or depression, attention deficit disorder, eating disorders, and body dysmorphic disorder, according to research as well as whistleblower reports and leaked documents.
We are all currently more or less hooked on a dopamine release machine called social media. Every time we get a notification or a like, beat a level, or even trade crypto, we get a small hit of dopamine. It is worth exploring how such effects on the human brain unfold in the dark side of the Metaverse, where such pleasure triggers are cranked up to 11.