The Metaverse Has A Sexual Harassment Problem And It’s Gonna Get Worse

Nina Jane Patel designed her avatar, a cartoon-like version of herself with blonde hair and freckles, and entered Meta’s Horizon Venues, a 3D digital world, using a virtual reality (VR) headset. She had been there for less than a minute before a group of male avatars touched and groped her avatar without her consent, taking photos as they harassed her. Patel detailed the “horrible experience” in a December 2021 Medium post. It “happened so quickly and before I could even think of putting up the security barrier,” she wrote. “I froze.”

Patel is not the only woman to report digital sexual harassment on Meta’s virtual reality platforms or in other digital worlds of the metaverse. When Facebook rebranded itself as Meta last fall, it embraced the concept of living a digital life in 3D metaverses, virtual worlds where people can meet and play. But the early prevalence of digital sexual harassment, which can include non-consensual touching, verbal harassment, and simulated sexual assault on avatars, raises questions about whether this new immersive technology can solve the problems of the old.

It’s not that the metaverse is creating new opportunities for digital sexual harassment – social media has always been plagued by gendered harassment – but virtual reality technology is dissolving the gap between the physical self and the digital self, creating immersive experiences that heighten both realism and emotional connection. Users watch as digitally rendered hands grope a representation of their own bodies, and it all feels increasingly real, just as the designers of the metaverse intended. “All of these innovations and technologies can make a digital life feel real, with real feelings, and that has exacerbated the impact of sexual misconduct in the metaverse,” said Michael Bugeja, an Iowa State University professor who teaches media ethics and technology.

For those like Patel who experience digital sexual harassment, it can be degrading and emotionally devastating. It’s “surreal,” it’s “a nightmare,” Patel wrote. Despite this, and despite Big Tech’s history of ignoring the concerns of groups particularly vulnerable to online harassment, sexual harassment and its consequences are often glossed over or ignored by developers. It’s a problem that needs a solution, especially as haptic technology – technology that mimics the sensation of touch – evolves.

Researchers at Carnegie Mellon University recently developed a VR headset attachment that sends ultrasonic waves to the mouth, allowing people to experience sensations on the lips and teeth (fingertips are the other haptics hot spot, where it is easier for developers to send sensation signals). With the mouthpiece, players can feel spiders, raindrops, and even a jet of water on their lips. They can simulate brushing their teeth. But headlines ran with the advance, stating that users would soon be able to kiss each other in the metaverse and feel the kiss on their physical bodies. For now, that’s not true. A lip-to-lip configuration would require too large a haptic signal for this type of technology, said Vivian Shen, one of the researchers behind the attachment. Her research is an example of how developers are looking for ways to mimic touch that feels intimate and real. Another company, Teslasuit, has introduced a full-body haptic suit that resembles a wetsuit. It can reproduce the feeling of bullets, for example, or a hug. Then there’s Meta: Last year, it unveiled a haptic glove prototype, far more sophisticated than its current hand controllers, that would allow people to feel an object’s weight and texture when lifting it in a digital world.

Better and more realistic haptic technology is coming to the metaverse even as harassment persists. Women on almost every social media and gaming platform have told stories of egregious harassment, stories so common that their arc is now familiar. According to a recent Pew survey, 33% of women under 35 say they have been sexually harassed online, compared to 11% of men. Lesbian, gay, and bisexual people are also more likely to be harassed. It is therefore not surprising that digital sexual harassment was commonplace in the earliest virtual worlds. In 1993, journalist Julian Dibbell wrote in the Village Voice about a “rape in cyberspace” in LambdaMOO, one of the first text-based online worlds. A player known as Mr. Bungle used a “voodoo doll” game feature that allowed him to attribute actions to other characters in the community rather than his own, even though he was the one behind the keyboard. It created a narrative in which some players performed non-consensual sexual acts on others. Dibbell’s report raised the same questions about the boundary between the digital self and the physical self that are now elevated in the metaverse. This debate has been simmering for nearly 30 years, and the tech community is no closer to a consensus on what digital sexual assault means or how to prevent it.

Bugeja has studied what he calls “avatar rape” for more than a decade, and witnessed it in 2007 on a virtual beach in Second Life, where he watched a male avatar “violently violate” another who was sitting in a bar on the boardwalk, drinking a martini. This made him a fierce critic of the platform’s use for education – some saw Second Life as the future of gathering to work and learn (a familiar refrain now used to market the metaverse). He worried that digital harassment would trigger a traumatic reaction in someone who had been sexually assaulted in real life. “I say, don’t get rid of the metaverses,” Bugeja said. “I say, let’s put more controls in the metaverses. Let’s focus less on the profit margin.”

Tech companies tend to be reactive rather than proactive when it comes to preventing harassment, and even then they can be slow to act. In 2016, Jordan Belamire, a player of the VR video game QuiVr, wrote a Medium post about her experience of being groped in the game. Players appeared in it as a floating helmet and a set of disembodied hands. Belamire wrote that one approached her and rubbed her chest, an image made more real by the VR headset she was wearing and the immersive nature of the game. QuiVr published an in-game fix that allowed users to push away other avatars. In 2018, someone reported to The Sims team that a player with special credentials had abused their power and shared sexual fantasies with teenagers. EA, which operates The Sims, waited three months to remove the player.

Many platforms have anti-harassment policies, but enforcement is lacking, and some put the burden on those who are harassed to take action instead of blocking aggressors or proactively preventing abuse. Women who stream on Twitch have had their images manipulated and sexualized, despite a company policy that prohibits “unsolicited objectifying statements relating to the sexual body parts or practices of another person,” among other forms of harassment. It’s true that anyone who feels uncomfortable can instantly log out of a game or remove their headset to exit the metaverse, but this forces the victim to take responsibility for others’ misbehavior. It’s also an excuse – especially since Big Tech knows that digital sexual harassment is a persistent problem – that allows metaverse builders to ignore the problem rather than proactively make the metaverse a place where women don’t have to come in with an exit plan. After early reports that a beta tester in Meta’s VR world was groped last fall, the company reviewed the report and determined that the player should have used a tool called “Safe Zone,” a protective bubble that prevents other people from touching or talking to players. That would have insulated her from harassment. It also would have interrupted her interactive experience in the virtual world, while her harassers could continue playing.

It would make more sense for games to require players’ consent before they can touch each other, rather than offering an exit only after harassment occurs, said Jesse Fox, an associate professor of communication at Ohio State University. “If we can’t solve the problems, then ask yourself: Is this ready to be a public space? That’s something companies don’t want to do because they only care about profits and money and not people,” Fox said. “If we haven’t thought about this, or if we have people who are having negative experiences, should we let this continue?”

After Patel wrote about being digitally harassed in late 2021, Meta added a feature called “Personal Boundary” in February. It creates an invisible four-foot bubble around avatars in Horizon Worlds. People can use it to stop everyone, or just strangers, from getting too close. They can also turn it off. “We believe Personal Boundary is a powerful example of how virtual reality has the potential to help people interact comfortably. This is an important step, and there is still a lot of work to be done,” the company said in its announcement. Patel agrees.

“I think it’s a step in the right direction,” Patel said of Meta’s new protective bubbles in an email to Morning Brew. “Everyone in this industry must accept the ethical responsibility that rests with us as stewards of the future metaverse.” But whether the tool has curbed digital sexual harassment is unclear. A spokesperson for Meta declined to provide statistics on the number of people who have reported being harassed in Horizon Worlds.

Metaverse platforms will encourage people to spend more time and money on their avatars and digital lives – purchases that are investments in, and validations of, the importance of these digital selves. But until these companies fully invest in safety, it will be difficult for workplaces, schools, and brands to ethically lure people there with the promise of mind-blowing, futuristic experiences. Developers pushing metaverses will need to come to a consensus on what digital harassment means, and the creators of those metaverses should listen to those who are affected. If they don’t, the new digital worlds cannot hope to escape the mistakes of the old ones.
