Serendipity and the Metaverse
A new post in my series on workflows for thinking, and a new essay looking at Mark Zuckerberg's vision for the next revolution in computer interfaces.
A few updates here at Adjacent Possible before the weekend…
Earlier today I published the latest installment in my series on designing a workflow for thinking: “Seven Types of Serendipity.” It opens with a classic quote from one of the 20th century’s most eclectic thinkers:
“One thing a person cannot do, no matter how rigorous his analysis or heroic his imagination,” the Nobel laureate Thomas Schelling once observed, “is to draw up a list of things that would never occur to him.” Schelling’s original point was part of a larger argument defending the intellectual exercise of wargaming. Well-designed war games—like the ones created by the RAND Corporation, where Schelling worked in the late ’50s—were a way of helping military strategists stumble across new opportunities or vulnerabilities that would otherwise never have occurred to them. But there’s a general applicability to Schelling’s quip as well: so much of trying to think imaginatively is figuring out ways to trick your brain into coming up with an idea that would not have occurred to it otherwise. Now, it’s true that one way to do this is obvious: go read—or have conversations with—other people, and borrow whatever interesting new ideas you uncover from those interactions. But I think there’s a more general problem here, one that requires more devious methods to get around. And it boils down to the question: how do you surprise yourself? What part of your workflow supports unplanned discoveries?
The post goes on to explore different kinds of serendipitous discovery, with examples ranging from Lin-Manuel Miranda stumbling onto a Hamilton melody by selecting the wrong measure in Logic Pro, to the 18th-century chemist Joseph Priestley inventing soda water because he happened to move next door to a brewery. As you may know, that series is only available to paying subscribers, so if you’re interested, you might want to subscribe so you can follow along. I’ve got some exciting new additions to the series planned for the next month or two.
In other news, The Wall Street Journal asked me to contribute an essay to their Tech Year In Review about one of 2021’s defining buzzwords: the metaverse. I had a lot of fun writing this piece; in a way, it brought me back to my first book, Interface Culture, which is now almost a quarter of a century old. Some of you might have seen the elaborate concept video that Facebook/Meta produced to showcase Zuckerberg’s vision for the metaverse. There are many cringe-inducing segments, and of course there’s the whole issue of whether Facebook is the right company to build the next generation of software interfaces. But the video does make one thing clear: Zuckerberg really does seem to believe that the metaverse—which he seems to be defining in very VR-centric ways—is the next big paradigm shift in our software interfaces. So I tried to wrestle with that question directly.
Modern computing history has seen three “inevitable” paradigm shifts in how we interact with our digital devices, advances that began as fringe experiences but swiftly became ubiquitous: the graphic interface, popularized by Apple Computer with its introduction of the Macintosh in 1984; the hypertext links of the World Wide Web, which went mainstream in the ’90s; and the multitouch interface introduced with the iPhone in 2007, now almost without exception the standard interface for all mobile interactions. Will the metaverse eventually find its way into this pantheon?
I am generally pretty skeptical about this transition, in part because Zuckerberg keeps talking about it as a radical improvement in our sense of “presence” online. I think both the mass adoption of Zoom and the frustrations we all feel about it have shown that there is still a massive amount of headroom left in terms of improving the quality of straightforward video/audio-based virtual meetings, improvements that would be vastly more human-centric than anything involving VR goggles:
For most of us, I suspect, presence doesn’t mean dressing up as an oversize robot floating in a space station. It means experiencing our friends and family through the full bandwidth of human connection: facial expressions, subtle vocal cues, all experienced in an environment that we can feel and touch with our unmediated senses. We aren’t far from a world where we can have remote conversations in which people’s images are captured in 8K video and full-fidelity sound and displayed life-size on a wall-mounted screen. The experience wouldn’t involve abandoning the real world for the metaverse; instead, other rooms populated by actual people could simply open up adjacent to ours, creating a powerful illusion of presence without goggles or glasses.
I also wonder whether VR, if it does become more mainstream, will actually start reshaping our physical spaces.
In the Meta promotional video, we see numerous sequences where users are actively exploring a fully virtual environment—particularly when playing games or exercising. Needless to say, this requires a lot of empty space to make it work in practice. Having a fencing match with a virtual opponent in the middle of a tropical rainforest may sound appealing—until you accidentally trip over your real-world coffee table.
It is true that VR technology can scan rooms and create boundaries that protect you from unwanted collisions, but if we’re going to be doing extensive physical activity in the metaverse, we’re going to need a lot of empty space, preferably lined with Nerf-like surfaces to soften the inevitable blows.
There is some precedent for this, of course. The rise of television triggered changes in home design that ultimately became second nature to us. Perhaps we will eventually have empty “imagination rooms” in our homes where we can explore the metaverse without physical impediments.
You can read the whole piece here. May your weekend be filled with actual presence!
sbj