It’s so easy to get mad at Meta and Mark Zuckerberg. The tough bit comes later, when it’s time to articulate the things I’m actually mad about. After all, the Metaverse is coming – isn’t this what I always wanted?
My answer, for the time being, is a loud and decisive “no”. I believe that if you care about how technology helps you learn, teach, or explore, you should also consider saying “no” to the Metaverse as it is now being imagined and designed. This article will hopefully help you see why I think that way – it will hopefully also help me see my own points more clearly.
Disclaimers, preambles, etc.
I am writing this as a teacher, learner, education professional, and tech enthusiast. I am trying to squeeze many things into this text, and it’s possible that I will gloss over things which deserve a more detailed explanation. If something is unclear, let’s have a chat – see the end of this article for how to contact me.
I am also writing this from the privileged position of a well-off white man in a developed country. This means I will almost certainly be partially, or completely, blind to how the Metaverse discourse plays out in several other contexts (although this is a point I’ll try to address below). This isn’t meant as an apology – again, I’m inviting you to talk to me about this instead.
Finally: I’m writing this as an introverted open-source fanatic on an ongoing mission to de-Google and de-Face as much around me as possible. So, yeah, biased. 🙂
Got that? Let’s go.
Metaverse Problem 1: from the start, it’s missing out on 80%+ of possible experience
Hiro feels even at this moment that something has been torn open in the world and that he is dangling above the gap, staring into a place where he does not want to be.
Neal Stephenson, “Snow Crash”
The first actual, physical thing you associate with VR and AR setups these days is the thing that sits over your eyes. From VR goggles to Google Glass, what you see is what these projects aim to change first – sometimes the only thing they aim to change. Sure, you may move around, jump, make gestures – but the feedback mainly reaches you through the eyes.
As of 2021, three modes of AR (Augmented Reality) are possible with widely available technology. The visual works as described above; the auditory relies on sound; the haptic communicates with the user through the sense of touch (a more elaborate version of your smartphone vibrating in your pocket).
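To make that concrete: even an ordinary browser already exposes all three of these channels. Here is a minimal TypeScript sketch – the helper names are mine, not from any AR SDK – using the standard DOM, Web Audio, and Vibration APIs:

```typescript
// A rough sketch of the three AR feedback channels a stock browser offers.
// Helper names are illustrative; the underlying APIs (DOM, Web Audio,
// Vibration) are web standards.

// Visual: overlay a text cue on top of whatever the user is looking at.
function visualCue(text: string): void {
  const label = document.createElement("div");
  label.textContent = text;
  label.style.cssText =
    "position:fixed;top:1rem;left:1rem;padding:0.5rem;background:#000;color:#fff";
  document.body.appendChild(label);
}

// Auditory: play a short tone via the Web Audio API.
function auditoryCue(frequencyHz = 440, durationMs = 200): void {
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  osc.frequency.value = frequencyHz;
  osc.connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + durationMs / 1000);
}

// Haptic: buzz the device where the Vibration API is supported
// (the "smartphone in your pocket" example, made explicit).
function hapticCue(pattern: number[] = [100, 50, 100]): void {
  if ("vibrate" in navigator) {
    navigator.vibrate(pattern);
  }
}

// Olfactory / gustatory: no web API exists. That absence is rather the point.
```

Each of the three takes a handful of lines; the other two senses don’t even have an API to call.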
But if you were to sit through the Metaverse slide decks or promo videos, you would be forgiven for thinking that only one of these modes really counts. What you see is the big ticket. This is what everyone seems to be ready to build. The Metaverse is coming for your eyeballs, first and foremost.
This is bad enough if you happen to not rely on your eyeballs much – good luck building and scaling anything in Zuckerberg’s world for blind or partially sighted users! – but the accessibility implications aren’t the only worrying thing about this obsession with the gaze.
From the very start, the Metaverse project gives up on 4 of the 5 senses available to us. Taste and smell aren’t used at all, because the technologies aren’t readily available; hearing and touch, as noted above, are treated as afterthoughts. Fair enough – but the flip side of this is a vicious cycle: AR and VR technologies for taste and smell aren’t readily available because (guess what?) nobody uses them.
Paradigm after paradigm goes by, and we slide from one technological mode to another, relying on what is seen. The trick missed by Vannevar Bush is the one missed by Tim Berners-Lee, and the one which the Metaverse designers are apt to miss as well: if the medium is indeed the message, then a gaze-biased Metaverse misses at least 80% of it.
This argument could be made much stronger. Any VR/AR that falls back on vision will also fall back on the legacy which the concept of the gaze brings with it. Who sees? Who is seen? Whose gaze dictates the perspective – who is rendered invisible – who is objectified by the gaze? These problems won’t go away just because we’ve got a brand new name for our shiny new platforms.
The math isn’t in favour of the sight-based Metaverse, even if we only decide to count the senses. If we go beyond them – attempting to dream up a Metaverse which relies on critical thinking or emotional intelligence to build and re-build itself, instead of replicating the sensory experience alone – then Zuck’s vision becomes even more limited. The limitations don’t end there, however.
Metaverse Problem 2: lack of diversity is a feature, not a bug
All these beefy Caucasians with guns. Get enough of them together, looking for the America they always believed they’d grow up in, and they glom together like overcooked rice, form integral, starchy little units. With their power tools, portable generators, weapons, four-wheel-drive vehicles, and personal computers, they are like beavers hyped up on crystal meth, manic engineers without a blueprint, chewing through the wilderness, building things and abandoning them, altering the flow of mighty rivers and then moving on because the place ain’t what it used to be. The byproduct of the lifestyle is polluted rivers, greenhouse effect, spouse abuse, televangelists, and serial killers. But as long as you have that four-wheel-drive vehicle and can keep driving north, you can sustain it, keep moving just quickly enough to stay one step ahead of your own waste stream.
Neal Stephenson, “Snow Crash”
Google’s machine learning models, which underpin most of its AI research, have been criticised for being trained on data that isn’t diverse enough.
Facebook’s “echo chamber” effect is reflected not only in how it leads its users to misinformation, but also in how it hires and retains its employees.
Twitter’s own research recently acknowledged that, if left unchecked, its AI algorithm will become biased towards boosting and amplifying voices from one side of the political debate over another.
I could go on. Read Shoshana Zuboff’s “The Age of Surveillance Capitalism” for more of this. For the purposes of this piece, here’s my point:
The mostly-white, mostly-male, mostly-American perspective will be baked into every kilobyte of the current Metaverse. The people who’ll plan, design, code, implement, test, market, and sell you the Metaverse will be the ones deemed “appropriate” by Meta, Google, Pearson, or any other company big enough to jump on the bandwagon. The research they’ll use will be aimed at where the money is, and where the corporate locus of power resides. The AI models used to build and govern the Metaverse will be trained on white, male, and American faces, ideas, language, accents, and norms.
It would have been prudent to at least slow down the march towards the Metaverse a little; to listen to people like Timnit Gebru, or Zuboff; and to change, at least a little bit, the way in which Big Tech thinks, codes, learns, and operates. Some hopeful indications already show that such changes could be possible.
We’re not waiting; nobody’s waiting. Facebook needs a new brand image, and all Big Tech firms need new frontiers, now. Ready or not, the Metaverse is coming, and it’s as tech-bro-heavy as Zuck’s first big idea.
In addition to the limitations of the current Metaverse, then, we’re about to be sold a project whose diversity leaves a lot to be desired, on every level. What makes it much worse is how it’s being sold to us.
Metaverse Problem 3: buying into a top-down privacy nightmare
When you are wrestling for possession of a sword, the man with the handle always wins.
Neal Stephenson, “Snow Crash”
Let’s say that Mark Zuckerberg’s Metaverse won’t be the only game in town. Let’s assume that a few other companies – some media conglomerates, some big publishers, a few internet providers – also build their alternatives. Does this change anything?
No: the problems will remain. As long as there’s a corporate entity at the other end of the End User Licence Agreement (of course there will be an EULA…what did you imagine?), the following can be guaranteed to happen.
First, users will automatically assume the roles of passive consumers. Interactions and modifications will be numerous but limited, and rolled out in a fashion designed to retain and maximise “engagement” (oh look, my avatar now does a “thumbs-up” in his new branded t-shirt!). What we get is what the company deems appropriate to release – and not a megabyte more.
Second – related to the above: the environment will never be 100% open. The source code will never be made publicly available. Users won’t be trusted to build their own versions, for whatever purpose. Those who try will be breaking the law.
Third – and likely one of the main reasons for the second: any corporate-issue Metaverse will be using its users, in the same way in which Facebook, Google, Twitter and other media companies have been treating their user base as a raw resource. What we do in a Facebook Metaverse will help Facebook learn more about us, and serve us better ads at more appropriate moments. That’s why the source code will be kept deliberately obscure – cut out the bits that spy on us, and no Metaverse is worth its maintenance fee any longer.
Finally – each corporate flavour of the Metaverse will be carefully positioned to maximise its (and its tech partners’) gains in the key demographic (whatever that is, in each instance). This means corporate oversight over appropriate content. It means corporate governance over what is and isn’t possible in each Metaverse instance. It means sloppy coding and constantly evolving system requirements – if you need to buy new gear to keep up with the Metaverse, the Metaverse is happy to suggest recommended purchases there and then. And, crucially, it means an addictive, sugary-sweet, pain-free experience. This is fine if that’s what you’re after – it’s not fine if you can’t afford to pay, if you want a more unconventional narrative, or if you’re after a low-tech implementation.
This is where we’ve got to, then: every Zuckerberg-like incarnation of the Metaverse is likely to a) severely limit our experiences, sensory or otherwise, favouring the gaze; b) offer a vision based on an unfair, flawed, biased and outdated understanding of the world; and c) repeat the corporate, privacy-violating, technological, and legal oppression which has led to disappointing outcomes of other technological advances so far.
Lovely.
What is to be done?
See, the world is full of things more powerful than us. But if you know how to catch a ride, you can go places.
Neal Stephenson, “Snow Crash”
The Amazons and Googles of our current timeline weren’t inevitable in the early days of the Web. Their current dominance is the result of several overlapping factors (again, read Zuboff!).
For Zuckerberg’s Metaverse, and all those to follow, the situation is at once different and similar.
It’s different, because Big Tech’s head start in just about any game involving bits and money is hard to overcome.
It’s similar, though, because their success isn’t inevitable. The Metaverse may be the next big thing; it may be a flash in the pan; or it may catch on without any of the big corporate players ending up among the winners.
This means that there is still time for everyone to get onto the playing field. The big boys will be there soon, and they’ll make it all about them. But the playing field is big enough for many Metaverses. Sure, Zuck’ll have his own.
But I want a metaverse for learning. Several, in fact – a well-governed metaverse for young learners to explore safely, and a well-amped-up metaverse for adult learners to get their hard knocks and resilience from. I want a non-corporate metaverse for sharing skills and practices which will help communities live, work, feed and defend themselves in a rapidly heating world.
I want a metaverse for makers, DIYers, those who cherish old technologies. I want a punk metaverse. I want a literary metaverse. And, hell yeah – I want a myriad of adult content metaverses, which won’t need to deal with the stigma of “undesirables” and the burden of being deplatformed by the big social media.
Open-source VR and AR solutions still exist. VR goggles can be built on the cheap (yes, they suck – go and build better ones!). The technology still runs on code which anyone can teach, learn, modify, and master – code which could, hopefully, reach the remaining four senses too.
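As a pointer to where to start, here is a bare-bones sketch – assuming nothing more than a WebXR-capable browser and the @types/webxr type definitions – of how little code it takes to ask an open standard for an immersive session. Rendering the scene itself is left out for brevity:

```typescript
// A bare-bones sketch against the open WebXR standard: detect support,
// then request an immersive VR session. Assumes a WebXR-capable browser
// and the @types/webxr definitions; no vendor SDK involved.

async function enterVR(): Promise<void> {
  if (!navigator.xr) {
    console.log("WebXR isn't available in this browser.");
    return;
  }

  const supported = await navigator.xr.isSessionSupported("immersive-vr");
  if (!supported) {
    console.log("No immersive VR here – a cheap cardboard-style viewer still counts.");
    return;
  }

  // In practice this must be triggered by a user gesture (e.g. a button click).
  const session = await navigator.xr.requestSession("immersive-vr");
  session.addEventListener("end", () => console.log("VR session ended."));
}
```

None of this sits behind an End User Licence Agreement – which is precisely why it’s worth teaching, learning, and modifying.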
If we’re to keep learning, teaching, and exploring, we need more metaverses which aren’t dreamed up by the usual suspects.
Step back, Zuck: these are the folks I want designing my Metaverse
Besides, interesting things happen along borders—transitions—not in the middle where everything is the same.
Neal Stephenson, “Snow Crash”
Teachers and learners in the Philippines, who are struggling to stop the world’s biggest post-pandemic learning loss.
The legal and advocacy teams who, for the past 20 years, have been fighting to keep e-books accessible to people with print disabilities – the same Sisyphean struggle every three years.
The Fairphone people + the Framework people + the Raspberry Pi people + the Nintendo Game Boy people + the 8-bit ghost of Sir Clive Sinclair.
Edward Wong Hau Pepelu Tivrusky IV.
My mum + your mum.
You.
(Photo by Daniam Chou on Unsplash)
I am an editor, author, translator and teacher based in the UK.
I am always looking to get involved in new projects. My areas of expertise:
ELT publishing – print and digital
Language learning
Translation – POL-ENG-POL, non-fiction
Editorial project management
Does it look like we could work together? Download my CV or get in touch via e-mail.