“The Metaverse is wide open and undefended, like airports in the days before bombs and metal detectors, like elementary schools in the days before maniacs with assault rifles. Anyone can go in and do anything that they want to. There are no cops. You can't defend yourself, you can't chase the bad people. It's going to take a lot of work to change that—a fundamental rebuilding of the whole Metaverse, carried out on a planetwide, corporate level.”—Snow Crash, 1992
In the context of the novel, the hero muses on these thoughts after a deadly computer virus begins spreading through the Metaverse. But so far, the most virulent infection in the Metaverse is basically human nature.
And in a virtual world, human nature at its worst can burst out in unexpected ways.
Philip Rosedale often tells the story of how, in Second Life's early days, he and his development team gave the beta users creation tools that were powerful enough to remake the virtual world however they chose. They expected to see it become a strange alien reality—especially since all avatars can fly at will.
They were surprised to discover how it actually evolved. As Hunter Walk told me for The Making of Second Life:
While a substantial minority of early users did create castles, steampunk cities, and other fantastic realms, Second Life on the whole quickly began to seem like a fever-dream version of California at its most stereotypical. Giant suburbs sprang up everywhere; nightclubs, casinos, and shopping malls were digitized into existence; and all along the virtual world's waterfronts came homes with picture windows and sundecks.
“It quickly became Malibu,” as academic and game industry analyst Nick Yee observes. “A very consumer, materialistic-oriented world.”
None of this was consciously planned or encouraged by Linden Lab. (Recall again that Second Life itself was inspired by Burning Man, a proudly noncommercial communal arts festival.)
How did it happen, anyway?
The core reason is an early decision the company made that is so standard, most virtual world developers don't even think to do otherwise:
Second Life's only default avatar options were realistic humans. They were highly customizable, to the point where someone could, say, turn their avatar into an orc-like monstrosity, but the default setting for each adjustable feature started with a conventionally attractive male or female human.
With that in place, Malibu quickly followed.
“Once users are presented with a believably human template,” as Yee explains, “you want chairs and furniture and cars where your bodies can sit and drive around, and you need large virtual closets to put all your virtual clothing in, and people are building these beautiful cantilevered houses by the beach because that's what people do in the real world.”
While Linden Lab may have hoped Second Life users would define reality according to their wildest imaginations, the realistic human avatars shaped how much of the world would evolve, and with them came all the social problems typically associated with wealthy beach enclaves in the real world.
“[That's] where the racism and the sexism comes from,” as Yee puts it. “Because when the avatars are sufficiently human to make human assessments upon, our inherent human biases come clawing into the digital world…. It's almost unavoidable, because once you have bodies that are anywhere near realistic, people feel the need to dress up their bodies, and to look cooler than the next person. And suddenly you have this whole economy based around selling bodies and hair and body parts.” (The problem worsened, as I explained in the previous chapter, when Second Life avatars became ultrarealistic.)
To be clear, Second Life is overall an endlessly creative community that for the most part is highly positive and supportive. At the same time, Linden Lab making attractive humans the default was (and is) a conduit for much player-to-player abuse.
I began noticing that as an embedded journalist very early on. Female avatars that were poorly customized often got roundly ridiculed by many in the user community; people who adjusted their avatars to look fat or disproportionate (because in a virtual world, why not?) were jeered at and harassed.
Worse still, avatars that weren't white regularly received racist comments and trolling. In The Making of Second Life, I write about a woman, white in real life, who customized her avatar to look like an attractive Black woman—and was instantly hit with racist jibes, even by people she thought were her friends.
In the same way that Nick Yee's early Stanford studies suggest that users unconsciously bring the unwritten rules of eye contact and social distance with them into these virtual worlds, “a lot of racial norms, gender norms, and sexual harassment follows us in,” he tells me. “We shouldn't be surprised by it now.”
(Many of these insights, by the way, are featured in Yee’s seminal 2014 book, The Proteus Paradox: How Online Games and Virtual Worlds Change Us—And How They Don't.)
A metaverse platform is inherently a virtual world, but as Yee's book title suggests, any assumption that a virtual world will offer a complete escape from our daily prejudices and cruelties is sorely shortsighted.
In retrospect, Linden Lab could have prevented much of this problem by making nonhuman avatars also (and equally) available to new users by default, helping to nudge them to consider other avenues and cultures of expressiveness beyond conventional humans—and making it clear that human avatars were only one option among many, no one type more of a “right” choice than another.
Instead, the problem persisted and continues to this day. Kavya Pearlman, an award-winning cybersecurity expert, experienced this herself while still at Linden Lab. A convert to Islam, she was given a traditional headscarf for her Sansar avatar, kindly created by another user so that she could express her personal faith in the virtual world.
One day as Kavya was attending an event there, someone said out loud, “Oh, is she bringing Sharia law to Sansar?”
“So I then stopped using that avatar,” Kavya tells me now. “Because people literally perceived me as this threat that was coming to their culture.”
“I think a lot of people have this assumption that when you create a VR world, you're creating a clean slate utopia,” as Yee puts it. “My research, and the point of the book, was always no, you carry all this baggage, and you need to be very conscious of that baggage because people just revert back to the behaviors that our biological brains are hardwired to learn.”
The other hardwired human behavior, of course, is horny.
At the peak of Second Life's media exuberance, as former Linden Lab PR head Catherine Smith recalls, “I got a call from a German reporter who had—this is quite a story—who had found a young woman who was using her mother's Second Life account to augment her monthly allowance by having cybersex.”
Catherine Smith soon faced a barrage of bad PR when stories around Second Life's extreme sexual content emerged. But it was a foreseeable outgrowth:
“If you're gonna give people freedom to do things, how are you going to moderate it? There's so many intricacies with opening up an environment like this, that you've got to just be prepared—whatever you're dealing with in real life, you're probably going to be dealing with in a virtual space as well. I don't think people understand how complicated and how multifaceted it is.”
Rampant virtual sex was another inevitability set in motion by Second Life's human avatar default. An online game where the player characters are (for example) squat, spherical robots will also have its share of juvenile players humping and grinding into each other, but because the activity doesn't look explicitly sexual, they will soon get bored with it.
Failing to deal with this content and behavior immediately can deeply tarnish a metaverse platform's brand, and worse—as Meta learned the hard way in 2022, when female users of its Horizon Worlds were immediately beset by virtual sexual assault.
Second Life learned that around 2006. “It did not help Linden to get branded as a freak show, particularly through the decade wandering the desert,” as cofounder Cory Ondrejka puts it now.
I tell these stories to illustrate how a simple design decision made early in a metaverse platform's life can lead to enormous unforeseen social and content moderation consequences, many of them bad. Go with human avatars by default at the start, and reap a community management and PR whirlwind.
“The thing that people forget when designing virtual worlds … [is] every decision you make about bodies and social interaction is a really important decision,” as Nick Yee puts it. “All these variables that you code create spontaneous unpredicted events that have interesting social psychological consequences.”
Perhaps more than for any other online platform, community management of a metaverse platform is a vexing challenge, as difficult, nuanced, and constantly demanding as governing a real country. (And recall again that the leading metaverse platforms have populations that are numerically larger than those of most nations!)
While no perfect solution exists to deal with all community management issues in the Metaverse, there's much wisdom from the last 30 years to follow, to keep the abuse tamped down to a dull roar. But first let's dive deeper into three key challenges.
It's an unresolved paradox: Metaverse platforms promote themselves as the successors to social media while also prohibiting activity that's essential to human socialization. Which is to say virtual sex—in every possible variety that frisky avatars can pull off together.
So far, most strategies to preempt this problem are comically Victorian: Roblox bans avatars from even holding hands, and Meta’s Horizon Worlds restricts its users to avatars that resemble floating hand puppets with no legs, reportedly for technical reasons—but which, by a nice coincidence, also means they come without even the possibility of having genitals.
But freely allowing unrestricted virtual sex, as Second Life learned, is fraught with its own hazards. The company's openly laissez-faire approach implicitly invited depictions of violent or just plain bizarre sexual fetishes; confronted by video and screenshots of this content in the media, buttoned-up organizations once interested in Second Life as a platform were belatedly sent scurrying.
So the paradox remains: No evangelist can honestly promote the Metaverse as the next great leap in social technology without acknowledging the inevitability of virtual sex, both between consenting adults and—just as key—the inevitability of attempted abuse, where virtual sex is not consensual.
This problem has existed for so long that game developers gave it a cheeky name many years ago: TTP.
That's the amount of time it takes players to figure out how to use the content creation tools to create a virtual penis. (Hence, Time To Penis.) It usually happens during the first day of launch.
How to deal with virtual sex plagues even metaverse platforms dedicated to enabling free user creativity. Most people may not be interested in pixel porn, or even in having to see it—but at the same time, some fiercely are.
Linden Lab wrestled with pleasing both sides and focused their solution on defining rules for public and private spaces in the virtual world.
“There's a reason that you have public and private citizens in real-world societies and the rules are fundamentally different for each of them,” as Cory Ondrejka explains now. “There are certainly worse starting points than that, I think. Deciding to have a very strict set of rules everywhere in the universe is probably limiting.”
This approach also had limited success. Most reporters who depicted Second Life as a wasteland of sex perverts did not base that perspective on actual visits to Second Life; social media and discussion sites like Reddit are effectively the Internet's public space, and photos and videos of Second Life at its raunchiest were everywhere there, even becoming snarky memes. The practice continues to this day, with assorted smartass YouTubers airdropping into the virtual world in search of cheap jibes.
It's easy to chortle mockingly at avatars engaged in pixel sex, but an important point should not be lost: Many people turn to virtual sex for deeper reasons than simple interactive porn: young LGBT people, for instance, who want to safely and anonymously explore their sexual orientation in a virtual world, or individuals severely isolated and alone in real life for any number of reasons, still yearning for intense human connection.
The essential power of a multiuser virtual world is also its greatest liability: the immersive sense that you are really there with other people, which is usually thrilling and engaging.
Except when some of those people turn out to be cruel provocateurs or worse. Hateful or trolling words on social media can already scar the target; when abusive words are spoken in voice chat or even expressed with 3D building tools, it's almost as if the troll is invading the victim's personal space, heightening the stress and anxiety even further.
Oftentimes the abuse includes physical attacks on avatars, which makes it feel even more invasive. In the infamous sexual assault in Meta's Horizon Worlds, for example, a woman's avatar was surrounded by male avatars who virtually groped and thrusted at her.
“It's like you're giving a megaphone to a baby boy,” says acclaimed game designer Jenova Chen. “They will test the boundaries. Or they will say something kind of disturbing and anger-inducing to see how much response they get. So that's why to me the Internet by default is not neutral, the Internet by default is kind of toxic.”
Female avatars and avatars who appear as racial minorities tend to bear the brunt of the worst harassment in virtual worlds. Gamergate proved this was not simply casual churlishness, but also a concerted, highly political effort to drive women, LGBT+ people, and other vulnerable groups out of online games and other virtual worlds.
“How I personally think about online bullying today is different than how I thought about bullying 20 years ago,” as Cory Ondrejka puts it to me. “I have a lower tolerance today. I have more empathy for what it feels like to be on the receiving end of it. And I would build systems from the beginning that enable less random bullying. The public norms I would set at a more restrictive place than where Linden Lab started. Because we've got way more evidence, frankly, of how people are able to channel their hatred into online spaces and be terrible to each other.”
Player-to-player abuse can even metastasize into the real world to the point where it becomes a national security issue.
This is not hyperbole. We saw a variation of this in the early days of Second Life, which were the years after 9/11. Back then, Rohan Gunaratna, an expert in Islamist terrorism, informed me that Al Qaeda–affiliated jihadists were experimenting with Second Life as a near-anonymous Internet platform from which to communicate with each other as avatars, potentially using its 3D creation tools to plan future attacks. Criminal or state-sponsored money laundering within metaverse platforms, via their virtual currencies, is also an ongoing concern.
So I was not surprised to learn, via the Snowden leaks of 2013, that NSA agents had also gone under virtual cover as Second Life avatars.
All this happened even though Second Life was (and still is) a niche platform of some 600,000 regular users. It's unlikely that bad actors affiliated with various authoritarian regimes have missed the strategic opportunities inherent in far larger platforms, like Roblox and Fortnite—especially when millions of their own citizens are already active users themselves. At minimum we should expect subtle disinformation and discord-sowing campaigns, the kind wrought by Russian intelligence during the 2016 U.S. election, to play out in the virtual world.
As I touch on in Chapter 4, metaverse platforms are still dominated by users who are minors.
When it comes to crafting metaverse platform policies for kids, however, solid reference data is difficult to come by, digital education expert Anya Kamenetz tells me. The companies themselves refuse to share their internal data with researchers, while academic social scientists do not move at the velocity of the Internet—and tend to be biased against kids playing online games and immersive experiences.
“There's a bias in the world of researchers that is hung up on moral panic,” as Kamenetz puts it, referring to studies on the media's impact on ADHD and anxiety, for instance, along with the perennial games-and-violence question. “They're always trying to tie into worst-case scenarios.”
As I discuss in the Introduction, there is research to suggest that playing in metaverse platforms is overall a positive experience for children. But then again, in some cases it can contribute to antisocial and self-destructive behavior. The lack of definitive data on this topic, even though at least one in two children is active on a metaverse platform, is a smoldering crisis.
“How do we actually safeguard children in these ecosystems where they are going to exist? We need to enable trust,” Kavya Pearlman tells me. “There needs to be custom engineering of that experience, so that the impact on the child's brain is not net negative.”
We have barely begun to address the behavior-shaping algorithms on our social media platforms for adults. But metaverse platforms loom above them with similar unaddressed problems—except that here, teens and children still comprise the majority of the total user base.
“The National Institutes of Health should already be doing this kind of research, to study the impact on children's brains so we can enable these technologies,” Kavya Pearlman tells me. “But we don't have those answers. Because we didn't allocate the money to study them. We just let Roblox run wild.”
If bad behavior is inevitable to human nature, so is a desire to create and enforce rulesets that minimize and curb it.
As a design reference for metaverse platform developers, Anya Kamenetz suggests visualizing the ideal playground that parents would feel safe letting their own kids roam free in:
There's a fun variety of places to explore and installations that encourage fun and creativity; there are many other kids to play with. At the same time, the parents also want to be able to see what their kid is doing at a glance; they also want to see the entry and exit points of the playground and be assured that their kid will not leave the area without their knowledge. The parent also wants to know there are responsible adults actively monitoring the playground, and that they have a fair and consistent escalation system for resolving disputes when arguments break out among the kids. Just as key, the playground monitors should have an identification process for other adults entering the playground, to ensure they are not bad actors.
While the concept of the soccer mom long ago became a cliché, Kamenetz advocates for a new role: the Minecraft Parent. Someone who engages with and encourages their kid's metaverse platform activity, even getting involved directly. (It's fairly common for parents to play Roblox and Minecraft with their kids.)
Platform developers could do more to incentivize parents to take an active role in their kids' virtual activity—for instance, by offering discounts on parent/child subscriptions. Or what if a parent were given rewards for regularly logging in, such as power-ups and special content they can then pass on to their kid?
In other words, give parents special protective and fostering powers in the virtual world that echo a child's elevated vision of them. Kamenetz broadly agrees with the idea, citing the beloved online education resource Khan Academy, which has a “coach mode” that parents can take on, creating shared pedagogical quests for their kids to fulfill.
XR security expert Kavya Pearlman doesn't believe the core problem can be solved without some government oversight. “That's where regulation starts to come in. We will say, ‘Hey, you will not code for engagement because we already know children can get addicted, so we cannot do that for immersive technologies. Otherwise, they'll just stay there.’
“So we need to put our foot down as a regulatory measure, and say that you will not advertise and code for engagement—you will code for well-being.
“You will code for well-being, and prove to the court, prove to the regulators, that you're coding for well-being. And it's possible.” Somewhat ironically, Chinese regulators are taking a more hands-on approach to protecting their children's safety, forbidding algorithms in apps that encourage excessive and negative usage.
As head of the XR Safety Initiative (XRSI), Kavya is also drafting self-policing standards for metaverse platforms to adopt. She believes companies will ultimately want to adopt them out of self-interest—and self-preservation.
“We're trying to build that measuring stick and so if they don't do it by themselves, we're going to tell regulators. And then we're gonna ding them on all the things that they're not doing right. And then they will be forced to retrofit all this regulatory stuff that comes after. So this is their opportunity or window to be proactive, to change their business tactics.”
VRChat, as I discuss in Chapter 7, has an extremely open and powerful toolkit. Users can (and surely often do) use it to create virtual porn of all varieties. But you will rarely see any of it posted as videos or screenshots on social media.
VRChat's founders Graham Gaylor and Jesse Joudrey decline to say much on that topic except to suggest any porn that does exist is overshadowed by so much other content. “My guess would just be that most people coming to VRChat are the sort of folks that are interested in making and building things, and so that's the type of content that tends to have the most reach,” as they put it.
But the other reason is emblazoned in VRChat's Community Guidelines:
Live-streaming, advertising or publicly sharing content that is sexually explicit in nature or simulates sex acts is not permitted. Doing so may result in moderation action being taken against your account up to (but not limited to) banning of the offending user account depending on the severity of the act in question.
The wording expresses a precise subtlety: Simulated sex in and of itself is not prohibited—it's just publishing screenshots and video of it on social media and elsewhere that's forbidden. Do whatever you like in your VRChat world with other consenting adults, in other words, just don't blemish VRChat's brand by posting it in public.
It's an elegant solution to the Metaverse's sex paradox. Fighting against the inevitability of consensual sexual activity in the virtual world is a lost cause; preventing it from defining the platform is not.
Veteran developer Jim Purbrick, who vainly warned Meta to prepare for virtual sexual assault, likens the online moderation challenge to keeping the peace in the offline realm:
“If everyone in a real environment decided to riot tomorrow,” as he puts it, “no police force would be able to deal with that situation. It relies on most of the people most of the time acting well.”
On the highest level, that does require laying down specific, easy-to-understand rules.
“You have to be super clear about that stuff. Especially because we have an environment where people have been growing up playing video games since they were tiny, where video games they're playing, maybe they're the only actual human being [in it] and everything else is an NPC.”
Enforcement of rules also has to be done in real time. This runs counter to how social media platforms are run, where a user can be flagged for toxic content but the issue is adjudicated days or even weeks later. “If you got strangers meeting each other in real time,” as Purbrick puts it, “you have to have real-time ways of responding to bad behavior.”
Finally: “You want those laws to permeate the environment and become norms.” This, and not virtual world policing, is what ultimately keeps a virtual world's (relative) harmony.
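To make Purbrick's real-time point concrete, here is a minimal sketch of how it differs from social media's days-later adjudication queue. The report categories and response actions are illustrative assumptions, not any actual platform's moderation system:

```python
import time
from dataclasses import dataclass

@dataclass
class AbuseReport:
    reporter: str      # player filing the report
    target: str        # player being reported
    kind: str          # e.g., "voice_harassment", "personal_space"

# Hypothetical victim-controlled protections that take effect instantly,
# without waiting days for a human moderator to adjudicate the report:
INSTANT_ACTIONS = {
    "voice_harassment": ["mute_target_for_reporter", "notify_live_moderator"],
    "personal_space":   ["enforce_personal_bubble", "notify_live_moderator"],
}

def handle_report(report: AbuseReport) -> list[str]:
    """Apply instant protections now; human review can still follow later."""
    actions = INSTANT_ACTIONS.get(report.kind, ["notify_live_moderator"])
    for action in actions:
        print(f"[{time.strftime('%H:%M:%S')}] {action}: {report.target}")
    return actions

handle_report(AbuseReport("ana", "troll42", "personal_space"))
```

The design choice worth noticing: the person being harassed gets immediate, self-serve protection, while human adjudication can still happen afterward.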
Anil Dash, a veteran Internet thinker and CEO of Glitch, a web app development platform, has advice applicable to online communities of all kinds: Actively promote social norms around good positive behavior, through simple adages and memes as well as stated policies. Once inculcated, the community itself will then enforce those mores among each other and new users.
“They tell each other ‘we don't do that [trollish behavior] here’,” says Anil. “That has more power than banning or any policy things we can do … we don't want to be in an arms race of bad actors, and [community norms are] super powerful.”
But those norms take time to establish and can't be imposed on a code level or through automation. They're grown during the early beta phase of a metaverse platform, as the company's community management cultivates and rewards norms, and those norms are passed on by the user community to new members.
Another way of putting it: Hire humanities graduates to run community management. They tend to be best attuned to the nuances of social behavior and have the skills to communicate and encourage good behavior in a clear and fun way.
In his book The Metaverse, Matthew Ball briefly touches on a concept of interoperability that's different from how it's generally understood: not enabling 3D asset transfers, but addressing metaverse platforms' broadly shared desire to tamp down on racism and other antisocial behavior by trolls. Trolls are the small but incredibly radioactive subset of users whose main goal is (with an impressively dedicated, sociopathic zeal) to abuse these platforms and their players until they are finally banned from a particular world … after which, typically, they migrate to another platform and once again fire up their special feedback loop of assholery.
In cases like this, Ball suggests, interoperability on the user account level would be a powerful way to identify and blacklist such users.
“So what you're talking about is interoperation of data and identity,” he tells me. “This is much easier technically. And I think it's a lot more powerful.
“The classic example is credit score systems. Banks used to believe that their credit information on customers was the single most important thing that they had, because it allowed them to make the best judgments on who to lend to. The problem is no one benefits from default. And so there were customers who would have poor credit with Bank A and go to Bank B to get a loan. So they opened up their credit systems to the benefit of all.”
Major metaverse-facing companies are beginning to hammer out a similar shared system, but directed at trolls.
“We are seeing with Epic, with Microsoft, with Sony, and myriad different startups, an effort to say, ‘Let's interoperate not just our communication suites, but cross-reference, corroborate, and integrate our player information.’ So that someone who behaves poorly on game A or platform A can't just shift to game B or platform B. Because no one, not players, not publishers, not platforms, benefits from toxic behavior.” Airbnb and VRBO are already doing this with their own data, “because bad hosts and bad renters hurt everyone, including the commissions that need to be paid by good users.”
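To make the credit bureau analogy concrete, here is a minimal sketch of what such a shared trust-and-safety registry might look like. Every name, field, and threshold here is hypothetical—an assumption for illustration, not any existing industry API:

```python
from dataclasses import dataclass, field

@dataclass
class AbuseRecord:
    platform: str   # member platform that issued the sanction
    offense: str    # category: "harassment", "hate_speech", etc.
    severity: int   # 1 (mild) to 5 (ban-worthy)

@dataclass
class SharedTrustRegistry:
    """A hypothetical cross-platform registry, analogous to a credit bureau:
    member platforms report sanctions against a shared, verified identity,
    and query it before granting full privileges to a new account."""
    records: dict = field(default_factory=dict)

    def report(self, identity_id: str, record: AbuseRecord) -> None:
        self.records.setdefault(identity_id, []).append(record)

    def risk_score(self, identity_id: str) -> int:
        # Naive scoring: sum severities across all member platforms.
        return sum(r.severity for r in self.records.get(identity_id, []))

# Platform B checks an account previously sanctioned on Platform A:
registry = SharedTrustRegistry()
registry.report("user-123", AbuseRecord("PlatformA", "harassment", 4))
if registry.risk_score("user-123") >= 4:
    print("Restrict to text chat pending a probation period.")
```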
As I hint in Chapter 8, this specific version of interoperability definitely has value (if carefully managed to avoid falsely identifying innocent users as trolls). It's also a corollary to my principle “Only community must be interoperable.”
Call it “Only tools that support community must be interoperable.”
This chapter started with the story of how a standard design decision—conventionally attractive human avatars—biased a virtual world's community in a very consumerist direction, while also indirectly leading to a small but steady amount of toxic behavior.
But if bad design choices can lead to negativity, good ones can encourage positivity.
Or as Nick Yee puts it, referring to toxic behavior: “The way to get around that is to break reality in productive ways.”
Jenova Chen's upcoming metaverse project may be the most ambitious in that regard, drawing from the design of Sky: Children of the Light, his highly successful virtual world, which will be incorporated into the new project.
“So when we approached our metaverse design for the society of Sky, we wanted to simulate what reality is. In reality, we are social animals by nature, and there's social consequences.”
Chen sees those consequences emerging from our evolution as a species, when ostracization from a hunter-gatherer tribe did not mean being banished to 4chan but being abandoned in the wild to fend for oneself. “And so, to me, it's: how can we simulate that evolutionary biology?”
“[Designing] how people socialize is so delicate that I feel like if people who design the Metaverse do not pay attention, they could easily create a very tricky situation.” After 20 years of writing about the chaos and drama eruptions from poor social design, I'd call this an understatement.
Chen's innovation is to change the game structure, so it's not based on defeating opponents (a la Fortnite) or earning achievement badges (a la Roblox) or leveling up as in traditional MMORPGs.
“We don't have role-playing-game leveling—but we have leveling for relationships.” Only at mutual level two, for instance, can you share your name with another player or even share a hug. While this might seem artificial, Chen suggests it models how strangers become friendly only after several random street encounters; it takes a while for a level of trust to develop.
These mechanics, he adds, do not come with a reward beyond the activities you share with a friend—there are no “karma points or other benefits,” as is common in many virtual worlds and social platforms. (“That's like the worst way to start a friendship,” says Chen. “We learned that human relations cannot be meddled with any gamification.”)
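For illustration, here is a minimal sketch of how mutual relationship leveling might gate social actions in code. The specific levels, unlocked actions, and names are my own assumptions, not Sky's actual implementation:

```python
from collections import defaultdict

# Social actions unlocked at each mutual relationship level (assumed values):
UNLOCKS = {1: {"wave"}, 2: {"share_name", "hug"}, 3: {"chat"}, 4: {"gift"}}

class RelationshipGraph:
    def __init__(self):
        self._levels = defaultdict(int)  # player pair -> mutual level

    @staticmethod
    def _pair(a: str, b: str) -> tuple:
        return (a, b) if a < b else (b, a)  # order-independent key

    def deepen(self, a: str, b: str) -> None:
        """Both players agree to level up the relationship together."""
        self._levels[self._pair(a, b)] += 1

    def can(self, a: str, b: str, action: str) -> bool:
        level = self._levels[self._pair(a, b)]
        unlocked = set().union(*(UNLOCKS.get(i, set()) for i in range(1, level + 1)))
        return action in unlocked

graph = RelationshipGraph()
graph.deepen("ayu", "ben")
print(graph.can("ayu", "ben", "hug"))  # False: hugs unlock at level 2
graph.deepen("ayu", "ben")
print(graph.can("ayu", "ben", "hug"))  # True
```

Note that the gate is mutual: neither player can unilaterally escalate intimacy, which is the gradual trust-building dynamic Chen describes.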
FIGURE 9.1 Avatars in Sky: Children of the Light
Courtesy of Thatgamecompany
Following Yee's insights, the avatars in Sky are humanoid but genderless, evoking Studio Ghibli characters, and are intentionally without race—their skins are a dark-grayish hue.
“We don't really care about gender either because a lot of people, their gender in their mind and the gender of their body is different. And it doesn't really matter if they are a man or a woman in real life. As long as they are who they are in the virtual world.”
Chen's team also minimizes class bias through Sky's free-to-play monetization system. Where most metaverse platforms have a sliding scale of avatar enhancements based on quality and price—so, as in real life, it's fairly easy to spot a wealthy person with a single glance at the finery of their clothing—Sky's system is based around gifts you can buy for other players.
“Half of our design is about how you buy things for other people,” as Chen puts it, “rather than buy things for yourself. So it's altruistic spending. Yeah, I think right now, about half of the players who use our season pass, they're getting a season pass from a friend.”
Often this gifting crosses countries as a bridge across economic disparity, with (for instance) relatively well-off players in Japan and the United States gifting friends in Sky based in the Philippines.
Over 22 percent of Sky's revenue now comes from player-to-player gifts—a truly impressive number and proof that Chen's approach is working.
As is Sky's success as a virtual world: In 2022, it boasted some 20 million monthly active users, with 600,000–1.5 million concurrent players. A global online community with a population the size of Taiwan, architected around friendship.
A better metaverse can be built.