The metaverse is still in the concept stage, but the latest attempts to create virtual worlds are already facing an age-old problem: bullying.
Bloomberg technology columnist Parmy Olson told the BBC's Tech Tent programme about her own “creepy” experiences.
And one woman likened her own traumatic experience in virtual reality to sexual abuse.
Meta, Mark Zuckerberg’s company, has now announced a new feature, Personal Boundary (in Spanish, “Límite personal”), which began rolling out on February 4. It prevents avatars from coming within a certain distance of one another, creating more personal space for people and making it easier to avoid unwanted interactions.
This feature prevents others from “invading your avatar’s personal space,” Meta said.
“If someone tries to enter your ‘personal limit,’ the system will stop their forward movement when they reach the limit.”
It is available in Meta’s Horizon Worlds and Horizon Venues software.
The firm said it was a “powerful example of how virtual reality has the potential to help people interact comfortably” but acknowledged there is more work to be done.
For some, the news will be welcome.
“I had some moments where it was awkward for me as a woman,” Olson said of her VR interactions.
She had been visiting Meta’s Horizon Worlds, the company’s virtual reality platform where anyone over the age of 18 can create an avatar and hang out.
To do so, users need one of Meta’s virtual reality headsets, and the space offers the chance to play and chat with other avatars, none of which have legs.
“I could see right away that I was the only woman, the only female avatar, and these men surrounded me and watched silently,” Olson told Tech Tent.
“Then they started taking pictures of me and handing them to me, and I had a moment when a guy came up to me and said something,” she continued.
“In VR, if someone is close to you, their voice sounds as if they are literally speaking in your ear. And that took me by surprise,” she added.
Olson experienced similar discomfort on Microsoft’s social virtual reality platform.
“I was talking to another woman, and within minutes of chatting, a guy came over and started talking to us and kept saying inappropriate things to us, and we had to block him,” she said.
“I have since heard of other women who have had similar experiences,” she said.
The tech columnist indicated that while she wouldn’t describe it as harassment, it was “creepy and awkward”.
Nina Jane Patel went much further days ago when she told the Daily Mail that she had been abused in Horizon Venues, comparing the experience to a sexual assault.
Patel described how a group of male avatars “touched” her and subjected her to a string of sexual advances. They photographed her and sent her a message saying: “Don’t pretend you didn’t love it.”
Meta responded to the newspaper with an apology: “We want everyone to have a positive experience and easily find the safety tools that can help in a situation like this, and help us investigate and take action.”
Opportunities and Threats of the Metaverse
Moderating content in the nascent metaverse will be a challenge, and Meta CTO Andrew Bosworth admitted that it would present “bigger opportunities and bigger threats”.
“It might feel much more real to me if you were abusive toward me, because it feels much more like a physical space,” he said in an interview with the BBC late last year.
But he assured that people in virtual roles would have “much more power” over their environments.
“If I were to silence you, you would cease to exist for me and your ability to harm me would be nullified immediately,” he said.
Bosworth questioned whether people would want the kind of moderation that exists on platforms like Facebook when they have chats in virtual reality.
“Do you really want the system or a person listening in on you? Probably not,” he said, answering his own question.
“So I think we have a privacy trade-off: if you want a high degree of content safety, or what we would call integrity, well, that trades off against privacy.”
And in Meta’s vision of the metaverse, where different spaces are run by different companies, that trade-off becomes even more complex as people move from the Meta-controlled virtual world into others.
“I can’t guarantee the privacy or integrity of that conversation,” he said.
Olson agreed that it was going to be “a very difficult thing for Facebook, Microsoft and others to work out.”
“When you’re scanning text for hate speech, it’s hard but it can be done; you can use machine learning algorithms,” she said.
“(Instead,) processing visual information about an avatar or how close one is to another, that’s going to be very computationally expensive, it’s going to consume a lot of computing power; I don’t know what technology can do that,” she reasoned.
The ethics of virtual worlds
Facebook is investing $10 billion in its metaverse plans, and part of that will have to go toward building new ways to moderate content.
“We’ve learned a lot in the last 15 years of internet discourse (…) so we’re going to bring all that knowledge to do the best we can to build these things from scratch, to give people a lot of control over their own experience,” Bosworth told the BBC.
Beth Singler, an anthropologist at the University of Cambridge who has studied the ethics of virtual worlds, said: “Facebook has already failed to learn about what goes on in internet spaces. Yes, they have changed some of their policies, but there is still stuff out there that shouldn’t be.”
There is more to be learned from games, she thinks, where Second Life and World of Warcraft have offered virtual worlds for years, limiting who avatars can talk to and the names players can choose for them.
Meta’s decision to use avatars without legs may also be deliberate, she thinks: most likely a technical choice driven by the lack of leg sensors, but perhaps also a way to limit the problems from the waist down that could arise if avatars had a complete physical presence.
However, having strict rules about how avatars look can bring its own problems for those “trying to express a certain identity,” she added.