What wearing Apple’s Vision Pro headset does to our brains
Apple’s mixed reality headset heralds a new era of “spatial computing.” We are not ready.
The tech world got very fired up in 2013 when Google released a video of a wild concept. It showed a first-person view of a guy walking around Manhattan, texting friends, following map instructions, making a video call.
It was all pretty normal, except that instead of pulling a phone out of his pocket, the texts, maps, and videos appeared to float in front of his eyes.
The gadget that would make all this possible was called Google Glass, a $1,500 device you wore like glasses, with a tiny display placed in front of one eye and a camera that could see (and record) everything you could see. You could theoretically go about your daily routine, giving voice commands to the internet-connected headset and making bright distracting screens a thing of the past.
It's a dream the tech industry is still chasing. There are a few names for this concept. Mixed reality and augmented reality are the generic ones. Metaverse, popularized by Mark Zuckerberg when he renamed his company Meta, is the one you've probably heard of the most. Spatial computing is the name Apple recently adopted for its $3,500 Vision Pro headset, which goes on sale February 2.
Whether it succeeds or flops, as Google Glass ultimately did, the Apple Vision Pro will bring lots of fresh attention to the mixed reality concept. How that attention shapes the development of the technology could dictate how we use computers in the future. Apple reportedly sold as many as 200,000 units in the Vision Pro presale, a sign that at least a few people are interested in spending a lot of money to see what Apple thinks we eventually should do. Will we all walk around town wearing headsets and taking video calls while we watch the sunset? Or will a small group of us end up playing video games in a completely (and probably lonely) virtual world?
It's not clear what Apple thinks you should do with its mixed reality headset right now. The company's marketing materials seem to suggest you should use it in your living room like a fancy TV that can also make futuristic video calls. The device has not one but 12 cameras on the front of it, not to mention a LiDAR sensor, six microphones, and a TrueDepth camera that can scan faces. Like the Meta Quest, the new Apple headset can do fully immersive virtual reality, but it's also capable of combining virtual elements with images of the real world, kind of like the Google Glass concept did.
But whereas Google Glass let wearers look out at the real world with minimal interference, the Apple Vision Pro headset essentially turns the world into a giant screen. Inside the headset are two 4K micro-OLED screens with a staggering 23 million pixels that recreate the real world outside the wearer as a virtual one. All those cameras, sensors, and screens combine to create the headset's marquee feature, one that truly differentiates it from past smart glasses: real-time passthrough video.
Passthrough, which also exists on the Meta Quest 3, allows the wearer to see the real world while wearing the goggles. Put simply, your eyes seem to pass right through the opaque front of the headset so you can see the room you're sitting in and even interact with other humans. It's also a way to get a taste of that Google Glass fantasy, where you can walk around with your face computer, looking at a world with a useful digital layer superimposed upon it. The big problem, of course, is that nobody knows how looking at the world through a screen, and only a screen, could screw up your brain and your relationship with society.
Apple's vision and its inevitable distortions
While it's tempting to wax philosophical and start talking about how this echoes Plato's Allegory of the Cave, there are more practical questions about what the Apple Vision Pro's arrival will mean for the future of computing. Many of us are already using computers almost all the time, namely by craning our necks to look down at the miraculous slabs of glass we call smartphones. The smartphone freed computing from the computer and turned it into something we can carry with us everywhere. Smartphones have already changed how we see and interact with the real world, as anyone who's watched someone disappear into their iPhone knows. So if a radically new device, like a headset or a pair of glasses, becomes the dominant computing technology, it will change how we see the world. The question is how.
A group of researchers at Stanford's Virtual Human Interaction Lab has been trying to figure out the answer. The researchers tested the Apple Vision Pro alongside a range of other headsets with passthrough video capabilities, including the Meta Quest 3, and explored the psychological impacts of living life in passthrough video by wearing these headsets for hours at a time, even venturing out in public to try things like riding a bike or eating a meal while seeing the world through a computer. They shared their findings in a new paper that reads like a cautionary tale for anyone considering wearing the Vision Pro anywhere but the privacy of their own home.
One big problem with the passthrough video technology is that cameras, even ones as high-tech as those in the Vision Pro, don't see the way human eyes see. The cameras introduce distortion and can't match the remarkably high resolution at which our eyes and brains perceive the world. What that means is that everything looks mostly real, but not quite.
The Stanford researchers explained that objects close to your face end up looking unusually large in passthrough, so eating with a fork is a real nightmare. Headset wearers also tended to underestimate distances, which makes pressing elevator buttons a fun guessing game. And because nothing looks quite right, headset wearers "tended to move tentatively and slowly while walking." You can almost picture them looking lost and overwhelmed at all times.
The time spent observing the world through the headset isn't the most alarming part of the experiment. It's once they took the headsets off that the researchers' minds really started playing tricks on them. The human brain can generally adapt to changes in vision and effectively correct for distortion, and it's possible that Apple could push software updates that solve the distortion problem. But when the headsets came off, it took time for the researchers' brains to readjust, so they kept misjudging distances even in the real world. Many also reported symptoms of simulator sickness (nausea, dizziness, headaches) that will sound familiar to anyone who's spent much time using a VR headset.
Then there are the social effects. While wearing Google Glass in public invited ridicule a decade ago, headsets like the Meta Quest and, soon enough, the Apple Vision Pro will look more familiar to present-day observers. You don't see people wearing them in public a lot, but it's not out of the question. The researchers trying to live life with passthrough-capable headsets said they didn't get many negative reactions to what they were doing. Instead, they felt incredibly self-conscious navigating the world through goggles, and some experienced something called "social absence." The researchers described the sensation in the paper: "People in the real world simply felt less real. ... Being in public could sometimes feel more like watching TV than interacting face-to-face. It was often embarrassing to interact with strangers while wearing a headset."
Jeremy Bailenson, who led the study and is the founding director of Stanford's VHIL, has spent the last 25 years studying the psychological effects of virtual and mixed reality. He doesn't think this technology should be used all day long, or even every day, in part because of how socially isolating it can be. His lab has developed a framework it calls DICE, an acronym for "dangerous, impossible, counterproductive, or expensive": the use cases that make the most sense for employing VR and mixed reality technology.
"Training firefighters, rehabilitating stroke victims, learning art history via sculpture museums, time-traveling to understand climate change are all examples that fit squarely in DICE," Bailenson told Vox. "Checking your email, watching movies, and general office work do not. Let's use the medium when it earns its keep."
The inevitable growing pains of technological progress
Setting aside the Plato's cave of it all, it's obvious that, on a practical level, this new future of face computers is, at the very least, going to be awkward for everyone.
The Apple Vision Pro may be a marvel of modern industrial design befitting its four-figure price tag, but it still looks like a gaudy set of ski goggles. Apple even tried to make the headset feel less awkward for people who aren't wearing it with a feature called EyeSight, which shows the wearer's eyes on a display built into the front of the headset, but people still think it's creepy. Meanwhile, in an effort to make the headset less bulky, Apple gave it a tethered external battery that wearers have to carry in their pockets, lending the whole outfit a bit of a cyborg vibe. It's all far from natural.
David Lindlbauer, a professor who leads the Augmented Perception Lab at Carnegie Mellon University, doubts that we'll see people talking to their friends while wearing Vision Pro headsets at coffee shops in the near future. It's simply strange to talk to someone whose face you can't fully see.
"Socially, we're not used to it," Lindlbauer said. "And I don't think we even know if we ever want to get used to this asymmetry in a communication where I can see that I'm me, aware of the device, can see your face, can see all your mimics, all your gestures, and you only see a fraction of it."
But the situation might be temporary. Let's not forget that it wasn't that long ago when wireless earbuds, including Apple AirPods, meant that people were standing on street corners seemingly talking to no one when in fact they were on a phone call. If the hardware continues to get better and smaller, we might see people standing on street corners wearing regular-looking eyeglasses, staring off into space, tapping their fingers in the air because their glasses are projecting a digital layer onto the physical world. They could be looking for a friend down the street while simultaneously watching a TikTok video and talking to a voice assistant. That kind of thing will definitely look unusual the first time you see it.
This is still the big dream for computing. Google Glass did not end up being the mixed reality device that changed the world, but you have to assume that Apple is investing billions in mixed reality for a reason. It will likely take years for the Vision Pro and its successors to take off the way the iPhone or iPad did, but one day, Apple wants people using a headset that turns the entire world into a computer. That's what the Apple Vision Pro promises to do. "The era of spatial computing has arrived," said Apple CEO Tim Cook in early January. The Vision Pro, he said, "will redefine how we connect, create, and explore." The catch, for now, is that it will probably just do those things while you're inside your home.
Mixed reality, or spatial computing, or whatever you want to call it, is still an experiment. We're not ready for it on a number of levels. The hardware isn't there yet, or we wouldn't have to wear heavy ski-goggle-shaped headsets and carry battery packs in our pockets. Perhaps if the display technology could replicate the resolution of human eyesight with no distortion or limitations, we would avoid all the negative consequences of passthrough video. The world will also need to ease into the idea of face computers, just like it had to get used to the idea of personal computers, and just like we're still getting used to the ubiquity of smartphones. And our brains will need to do some adjusting to understand where the virtual world ends and the real one begins.
So don't expect to run into too many people wearing Vision Pro headsets in public just yet. Even Apple seems to be gently discouraging the idea. Most of the promotional material for the Vision Pro sets it up as a productivity device (a way to make your spreadsheets look like they're in your living room) or an entertainment system (a way to watch movies in 3D on screens the size of your entire field of vision, although Netflix, YouTube, and others will not have apps ready when the headset launches). So it's chiefly an indoor toy. Tech analyst Benedict Evans noticed something funny in the videos Apple released to developers last summer to showcase what the Vision Pro can do: "Apple doesn't show this being used outdoors at all, despite that apparently perfect pass-through. One Apple video clip ends with someone putting it down to go outside."
It will be a while before we get that magical headset, the lightweight glasses that put a computer screen on your world. Lindlbauer, the Carnegie Mellon professor, doesn't think we'll still be using smartphones in 15 years. Whatever replaces them will be more exciting, and how that technology works will depend on what people do with gadgets like the Apple Vision Pro. If the people who buy the headset don't like seeing the world through a screen, watching the moments of their lives fly by like shadows on the wall of a cave, someone will just have to build something better. And hopefully cheaper.