Mapping Out the Metaverse with Machine Learning
Since Facebook announced plans to become a ‘metaverse company,’ the internet has been abuzz trying to figure out what that even means. Plenty of folks have offered half-decent definitions, but many don’t appreciate the ‘digital map’ that is already looming over us. Taking a quick jaunt through philosophy, this article explores just how “meta” our high-tech world already is, and why machine learning will be key as the metaverse expands.
What’s in this article:
- Facebook’s ‘metaverse’ talk sparks conversation
- Metaverse vs hyperreality—two approaches to simulation
- Where does ML/AI fit into the picture?
- How we already “embody” the internet
- Where the metaverse ends
- ML/AI is already in the metaverse
Facebook’s ‘metaverse’ talk sparks conversation
New words are used regularly before they take on an exact meaning. That’s how most language works, especially in constantly-changing industries like machine learning. The more we talk about it, the closer we get to agreeing on what terms like ‘metaverse’ mean.
The metaverse emerged nearly three decades ago in science fiction, and it is currently one of Mark Zuckerberg’s favorite words. With Facebook’s looming presence as a machine learning business, it’s time we ML/AI innovators stop twiddling our Oculus controllers and ask: what is the metaverse? What makes it lucrative? How can we refine our definition of the metaverse to more clearly see where it starts and where it ends?
But what is the ‘metaverse’?
A quick recap: the ‘metaverse’ is an umbrella term for the digital realities that are meant to augment the real world. It’s like using VR to go grocery shopping. The metaverse is a bunch of copies of reality that interact with the world we live in. Matthew Ball’s quasi-manifesto lays out some very helpful criteria.
Folks familiar with ML/AI can immediately see how they fit into this overlaid digital realm. For example: a lot of work has to be done before you can virtually grocery shop; someone has to teach the model where the produce aisle is, how to pick up a cucumber, and whether or not the refrigerator doors have handles. Even when the human body fails, machine learning depends on real people to make sense of the world. The construction of the metaverse is not exempt from the irony of the information age: in order for humans to rely on machines, machines need humans to first teach them. A mosaic of inputs makes models work, models that are already underpinning the metaverse fantasy.
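To make that "humans teach machines first" point concrete, here is a minimal, hypothetical sketch of what those human-made annotations might look like before they ever reach a model. The file names, bounding boxes, and labels are invented for illustration; real annotation pipelines (including ours) are far richer, but every one of them starts with records like these:

```python
# Hypothetical annotation records: before a model can "find the produce
# aisle" or "pick up a cucumber," a person has to label frames by hand.
annotations = [
    {"frame": "store_001.jpg", "box": (34, 80, 120, 200), "label": "cucumber"},
    {"frame": "store_001.jpg", "box": (300, 10, 420, 310), "label": "fridge_door_no_handle"},
    {"frame": "store_002.jpg", "box": (55, 60, 140, 180), "label": "produce_sign"},
]

def label_counts(records):
    """Tally labels—the raw human input every supervised model starts from."""
    counts = {}
    for r in records:
        counts[r["label"]] = counts.get(r["label"], 0) + 1
    return counts

print(label_counts(annotations))
# {'cucumber': 1, 'fridge_door_no_handle': 1, 'produce_sign': 1}
```

A production dataset would hold millions of such records, but the asymmetry is the same: the mosaic of inputs is assembled by people, one label at a time.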
Metaverse vs hyperreality—two approaches to simulation
Fortunately, Zuckerberg and other futurists aren’t the first to imagine the kind of stacked, illusory reality that defines the metaverse. In his work “Simulacra and Simulation,” the postmodernist philosopher Jean Baudrillard put forward the concept of ‘hyperreality’—a simulation copied and re-copied until it no longer refers to any original, and so becomes a reality of its own. Think of the uncanny valley, or Disney World. As Baudrillard puts it:
“The real is produced from miniaturized units, from matrices, memory banks and command models – and with these it can be reproduced an indefinite number of times…It is a hyperreal: the product of an irradiating synthesis of combinatory models in a hyperspace without atmosphere.”
A lot of words to say: once you can copy and paste an image, you can do it as much as you want. The same way going ‘hyper’ is making an object out of itself (like a massive Lego block made of Lego blocks), building the ‘metaverse’ is taking several digital versions of reality and making one, singular universe—ideally, including the one we actually live in.
Baudrillard’s idea of “hyper-“ helps us think of simulation as more than a cryogenic sleeping pod where we upload our consciousness. Instead we get to think, “if we put everything together, will it look exactly like where we started?”
From Borges’ perfect map to Google’s Street View
A fun explanatory metaphor: Borges’ map of the kingdom. TLDR: A king orders his mapmakers to create the most detailed map possible. Eventually, the map becomes so detailed that it is the same size as the kingdom itself, perfectly covering the entire land in parchment. The king and his citizens love the perfect map so much that they live on top of the map instead of on the earth it represents. To Baudrillard, this is hyperreality.
Now think about Google’s Street View feature. Users can zoom out to see the whole world or zoom in to stitched-together photos of the storefront they’re standing at. In the metaverse, the map doesn’t have to cover the kingdom anymore. Instead, composite digital realities are ready to be accessed by our technology at any time.
The Metaverse will require countless new technologies, protocols, companies, innovations, and discoveries to work. And it won’t directly come into existence; there will be no clean “Before Metaverse” and “After Metaverse”. Instead, it will slowly emerge over time as different products, services, and capabilities integrate and meld together.
– Matthew Ball, The Metaverse: What It Is, Where to Find it, Who Will Build It, and Fortnite
Why do we want the metaverse?
Now, Facebook doesn’t necessarily want us to live in “The Matrix.” After all, our pockets and watches are where the current metaverse begins. It’s lucrative due to its accessibility, not its immersion. Companies that push for metaverse functionality want to ensure that users have constant access to anything relevant to where they are, even if it’s a little different from what they might see, hear, or smell.
This small variation is key; it’s what the metaverse’s success depends on. Social media isn’t a 1:1 with in-person communication; it’s supposed to make communication better. In the same way, the metaverse is profitable only if a better version of reality is ‘laid on top of’ our own. The metaverse needs to be enriched, enhanced, mediated.
Where does ML/AI fit into the picture?
So—machine learning. What will its role in the metaverse be? Or, more importantly, what’s it already doing?
The easy answer is: everything. Depending on what time of day you open your Maps app, it might suggest a bar or a coffee shop. After enough trips, it knows your home from your office—something a paper map usually struggles to represent. Our history of interactions with the world, collected and synthesized by the technology we use, fuels inferences made about us. We’re writing our own maps that intersect with every other map, a kind of digital autocartography. There is a virtual version of your neighborhood that can guess what each house will order for dinner—a What-To-Eat map, powered by machine learning, making sense of the data. Artificial intelligence enriches the map of the metaverse.
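As a toy illustration of that kind of inference, here is a hedged sketch of how visit history might separate ‘home’ from ‘office.’ The place names and the hour-based voting rule are assumptions made for illustration—no real Maps product is this simple—but the principle is the same: repeated interactions become labels.

```python
from collections import Counter

def infer_home_and_office(visits):
    """Guess which place is 'home' and which is 'office' from visit history.

    visits: list of (place, hour) tuples, hour in 0-23.
    Overnight visits (22:00-06:00) vote for 'home';
    business hours (9:00-17:00) vote for 'office'.
    """
    night = Counter(p for p, h in visits if h >= 22 or h < 6)
    day = Counter(p for p, h in visits if 9 <= h < 17)
    home = night.most_common(1)[0][0] if night else None
    office = day.most_common(1)[0][0] if day else None
    return home, office

visits = [("elm_st", 23), ("elm_st", 2), ("elm_st", 5),
          ("main_st", 10), ("main_st", 14), ("cafe", 13)]
print(infer_home_and_office(visits))  # ('elm_st', 'main_st')
```

Swap the hand-written voting rule for a learned model over thousands of signals and you get the autocartography described above.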
How we already “embody” the internet
Zuckerberg aspires toward an embodied Internet: a place where “instead of just viewing content—you are in it,” as he told The Verge. Which does, admittedly, sound very “The Matrix.” But I think it’s a little more nuanced than just Roblox meets Facebook.
Embodiment is when an idea or concept becomes real, tangible. So an embodied internet is when the digital manifests. This might be traffic patterns, credit scores, or dinner suggestions. To think about it simply—where do you go without your phone? For most folks, it’s nowhere. Today, being connected to the world through your phone isn’t just a habit, it’s peace of mind. We’re fluent in reaching through our technology and connecting with others, the same way that spoken language used to only connect two people standing next to each other. Our social reality depends on our wifi-enabled extremities.
This embodied internet becomes a sort of digital theory of the flesh—our personal, social, and digital identities are not fragmented things, but cohere together into one self. To aspire towards an embodied internet isn’t “entering the simulation.” Rather, it’s deeply enmeshing the body with technology that connects us to the metaverse. Almost like a cyborg doing yoga.
Where the metaverse ends
The same way we can speculate on the metaverse’s beginnings based on currently available technology, we can also imagine its end based on the limitations we currently face. Whether they be structural, psychological, or extra-human, our current iterations of the metaverse have their problems.
Data Shift—imagine the machine learning models that uphold the metaverse are tectonic plates. As they slowly slide across the world, they move land masses and create volatile events at their fault lines. This same thing can (and does) happen to high-performing models. As the situation that shaped your AI system changes (historical vs. new data, strengthened or weakened correlations, new project needs, etc.) your results will slowly “drift” further and further away from your previous success. How will we make sure that the metaverse changes at the same rate as the reality it’s supposed to augment?
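One common way practitioners watch for this kind of drift is the Population Stability Index (PSI), which compares a model’s score distribution at training time against what it sees in production. Below is a minimal sketch, assuming scores normalized to [0, 1]; the bin count and thresholds are conventional rules of thumb, not universal constants:

```python
import math

def psi(expected, actual, bins=10, lo=0.0, hi=1.0):
    """Population Stability Index between a training-time score
    distribution (expected) and a live one (actual).

    Rule of thumb: PSI < 0.1 is stable, 0.1-0.25 is moderate drift,
    and > 0.25 signals significant drift.
    """
    def hist(xs):
        counts = [0] * bins
        for x in xs:
            i = min(int((x - lo) / (hi - lo) * bins), bins - 1)
            counts[i] += 1
        # tiny floor keeps log() defined for empty bins
        return [max(c / len(xs), 1e-6) for c in counts]

    e, a = hist(expected), hist(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

# Identical distributions: no drift.
train_scores = [i / 100 for i in range(100)]
print(round(psi(train_scores, train_scores), 4))  # 0.0

# Scores shifted upward in production: significant drift.
live_scores = [min(s + 0.3, 0.999) for s in train_scores]
print(psi(train_scores, live_scores) > 0.25)  # True
```

Monitoring a metric like this continuously is one way to notice the tectonic plates moving before the fault line gives way.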
The Moon says “Slow Down!”—A few weeks ago, a viral tweet showed Tesla’s autopilot feature mistaking the full moon for a yellow traffic light, causing the car to slow down. Situations like this aren’t rare; it’s how we improve our models. However, as we construct a more robust metaversal network, where else will “moons” tell us to “slow down”? How might unexpected or self-regulated parts of our world break, confuse, or play with the systems that uphold our metaverse?
Metaverse Anxieties—Most folks have wondered if we’re already in a simulation. So much so, in fact, that scientists had to disprove it. Now that we’re constructing the metaverse as a lucrative tool, this anxiety could regain traction. But the worry goes both ways; just as humans are afraid of being stuck in the digital, there is also a fear that we’ll end up making the virtual world just as unfair and unjust as our own. Will we harness the metaverse or just get trapped inside? What’s worse—man-made prisons or digital ones?
ML/AI is already in the metaverse
While it’s not clear when or how the metaverse will fully emerge, it’s pretty evident that we’ve already taken our first steps. Art, defense, realty, and myriad other industries are already bracing for the interoperable VR jungle. In the same way that machine learning is revolutionizing every part of the economy, designing for the metaverse seems to be the logical next step. While Zuckerberg’s public commitment to the metaverse isn’t out of the blue, it still changes the conversation. As more and more leaders turn towards the metaverse, industries like machine learning will prove essential to our new hyper-world.
We are a group of scientists, engineers, and entrepreneurs with a vision for better AI. With backgrounds primarily in Machine Learning and Computer Vision, the Innotescus team understands the importance of having full control over and insight into data used to train Machine Learning models.
For media inquiries, please contact: firstname.lastname@example.org