Scheller Explores the Metaverse: Addressing Fundamental Human Rights in a Virtual World

The metaverse is expected to change the ways we play, work, and interact, but are we ready to address the human rights abuses that may lie ahead in a VR world?

Life-altering technology or the latest digital fad? The definition of the metaverse is evolving as fast as the systems that underpin it are being developed. Will the metaverse become a new, fully immersive way of life, or will it simply be just another word for online interaction? Scheller Explores the Metaverse is a new series in which Georgia Tech Scheller College of Business faculty, students, and alumni share their perspectives on the metaverse and everything it entails.

In today’s feature, Katsiaryna Siamionava, Ph.D. candidate in Information Technology Management, discusses the issue of user safety and human rights in a virtual world.

Today, the metaverse may sound as unbelievable and uncertain as the World Wide Web once sounded to early users of the Internet. Yet the idea of a digital world that resembles reality and integrates multiple sensory aspects of human perception into a single, limitless online experience is fascinating.

But the real question is, how realistic do we want it to be? More specifically, are we ready to handle this new world of unprecedented complexity in terms of technology and human interaction? 

The metaverse is likely to contribute to e-commerce, education, entertainment, and much more, and this review does not seek to diminish those possible benefits. At the same time, I would like to focus on the communication aspects of the metaverse, specifically fundamental human rights. In this area, we can already spot challenges awaiting us as the metaverse unfolds.

The metaverse reminds me of Plato's ideal city, where the wise live in peace and harmony. Everything may be perfect about the dream of an ideal meta-world, but the people themselves never will be.

One concern is tracking and collecting evidence of violations committed by people through their avatars. Just recently, Meta, the parent company of Facebook, launched an investigation after a user was groped by a stranger's avatar in Horizon Worlds. Some of these issues may be resolved by better educating users about privacy settings. For instance, Meta already offers tools that let users manage their avatars' physical boundaries. But what can be done about verbal abuse, which leaves little trace and may come unexpectedly?

A study by the Center for Countering Digital Hate has already found evidence of minors being exposed to hate speech in the VRChat app on Meta's VR platform. In addition, Roblox, a popular metaverse game for children with 202 million monthly active users, admits to an ongoing fight against "condos," virtual sex games that keep reappearing on the platform. While such games are "unlisted" and not common, the existence of spaces where adults and minors may engage in sex-related discussions or even virtual sex is troubling. Roblox is attempting to address these challenges by introducing parental control systems, but it is not yet clear how effective and consistent such controls can be across different metaverse apps.

Metaverse apps will introduce platforms for near-real-life communication through chat and voice, and for physical interaction via avatars. While the technology is exciting, it is important to invest the time needed to establish reliable systems that track user compliance with terms and conditions and protect against human rights violations.
