Earlier this month, reports emerged that police in the UK were investigating an alleged rape in the metaverse for the first time. A girl under 16 years old said she was attacked while playing a virtual reality (VR) game. Although not physically injured, she is said to have been left “distraught” after her avatar — or digital character — was assaulted by several adult men in a virtual “room”, according to the Daily Mail.
Police said that, given VR’s immersiveness, she suffered the same psychological and emotional trauma as someone raped in the real world.
The case has once again thrown a spotlight on the safety of the metaverse — a loosely defined term given to immersive virtual reality technologies — for women and girls. Two years ago, psychotherapist Nina Jane Patel said her avatar was mobbed by a group of males on Meta’s Horizon Venues. When the attackers proceeded to “virtually gang-rape” her character, Patel froze.
As current legislation defines rape and sexual assault as physical offenses, where does that leave victims in the virtual space? And what needs to change? We spoke with Elena Martellozzo, an associate professor in criminology at Middlesex University, who is co-leading new research into children’s safety in the metaverse. Her words have been edited for clarity and length.
This isn’t the first time concerns about sexually motivated attacks in the metaverse have been raised. Do we have a sense of the scale of the problem?
Children and women, and people in general, don’t report. Primarily, they might not report because they don’t know how to, what to do or where to go. This case that you’ve brought up is, to me, a wake-up call to the potential harms of the metaverse.
Debate rages in online forums over whether sexual assault in the virtual world should be taken as seriously as in real life. Where do you fall on that?
First of all, we shouldn’t really make a distinction. But often what happens in cyberspace, like in the metaverse, is not considered as impactful as “real” crime. And I think that’s the argument we need to change, because it is real abuse, and the impact feels real.
I [also] think it’s a kind of defense mechanism for lots of platforms to protect what they’ve created. And what they’ve created is not thought through. For example, Meta’s response to harassment was to introduce personal bubbles for avatars [a virtual social distance function]. But it was really only developed after reports of abuse, a kind of reactive step.
How should virtual 3D spaces be governed? What can we put in place to protect women and girls?
There should be some effort to make tech companies more responsible and to embed that safety by design. Almost like when you develop a car, you have to make sure you have brakes, lights and seatbelts — all those very important elements that make you a safer driver, a safer passenger and a safer pedestrian. Unfortunately, we’re not quite there in the online space.
In the UK, the Online Safety Act regulator, which is OFCOM [the telecoms watchdog], is not due to produce guidance on violence against women and girls until 2025. Steps can be taken before then…these timetables suggest that we’re not quite there in terms of prioritizing women and girls.
It doesn’t have to be something magical. It could be quite simple…a kind of support mechanism if something goes wrong. Like you would call the police.
How do we define what is a crime in the metaverse? Are we even able to police it right now?
This is a very interesting question. I don’t think we have the answer yet. We have this wonderful Online Safety Act, which I think is a massive step forward. [But] crime is always ahead of us unfortunately. That’s why we need to work collaboratively with those tech companies that create wonderful platforms before they’re public. Let’s sit down and reflect, is this safe? Have women and young girls been consulted before this can be released for everybody to use?
It’s wonderful you can go inside the Tate gallery through immersive technology if you don’t have the means financially to travel to London. We don’t want to stop that. What we want to stop is those criminals, I’d like to call them, who could take advantage of these platforms.