Vision Pro is touted by Apple as a new era in spatial computing – what does that even mean?

Last summer, Apple announced Vision Pro as a ‘revolutionary spatial computer’. Vision Pro launched earlier this month, but many people are wondering “What the hell is spatial computing?” That’s a good question, and I think I have the answer. By combining the best elements of a computer, a headset, and a smartphone, Apple has created a new kind of computer.
Plain Language Vs. Word Salad
In my previous post, I made the case that language is important. In that article, I used VR as a case in point. Decades after the term Virtual Reality was first coined, it still hasn’t caught on with the masses. In a way, it has only gotten more confusing. In recent years, we’ve seen Augmented Reality, Mixed Reality, and even Extended Reality. That’s AR, MR, and XR in addition to the original VR. All those new terms essentially describe VR with some form of awareness of the world around you. There has to be a better way.
Early on, Apple recognized that VR can be very isolating. You put on a VR headset, and you are cut off from the world around you until you physically take the headset off. As early as 2016, Apple CEO Tim Cook was praising augmented reality (which adds content onto the world around you) because it did not isolate the user. Apple built support for AR into iOS and iPadOS years ago, and has a number of tools for developers. ‘Spatial computer’ feels like plain language compared to all the word salad that came before.
VR Is Dead, Long Live VR!
That may sound like a bold proclamation, but let me explain. A pure VR headset is one that is virtual reality only. Newer headsets have cameras, and possibly other sensors, to bring in the outside world. Since all the AR/MR/XR headsets have some way to shut out the outside world and be virtual reality only, most (if not all) of the headsets sold today are AR/MR/XR devices. So pure VR headsets are dead, but immersive virtual reality apps and experiences are alive and well.
Apple Vision Pro is essentially an AR/MR/XR headset on steroids. Since it doesn’t fit the definition of anything that came before it, Apple is calling it spatial computing and raising the bar in some important ways.
Going Big On Sensors
To answer “What Is Spatial Computing?” we need to better understand what sets Apple’s product apart. In a nutshell: sensors and input. Those fancy ski goggles are loaded with them. Vision Pro has a total of 12 cameras that capture everything from the world around you to precise hand movements to what your eyes are focusing on. A LiDAR scanner senses walls, floors, doors, and other objects. Four of something Apple calls “Inertial Measurement Units”, but I call them fancy motion sensors. An ambient light sensor (so it can automatically adjust the display). And even a flicker sensor that detects the flicker from lights and screens.
All those sensors and cameras mean that Vision Pro does more than bring in the outside world. It also detects objects, motion, hand gestures, and more. You don’t need to think about it, and you don’t need to use special controllers. It is spatially aware of your surroundings.
Putting The Computing In Spatial Computing
A spatial computer is not just a headset powered by an off-the-shelf smartphone chip. Apple designs its own chips, where other companies mostly buy off-the-shelf parts and slap them together. In the case of Vision Pro, there is a whole computer in there. And then some! On the technical side, it runs Apple’s M2 chip with an 8-core CPU and a 10-core GPU. It also has a 16-core Neural Engine. This video does a good job of explaining what makes a neural engine so special. On top of that, there is a chip Apple calls the R1, which is 100% dedicated to reducing latency. In a headset, latency is the time between when something happens in the world around you and when you see it in the headset. According to published comparisons, Vision Pro is nearly 4 times better than the competition on latency.
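To make that “nearly 4 times better” claim concrete, here is a quick back-of-the-envelope sketch. The 12 ms figure is Apple’s published number for how fast the R1 streams new camera images to the displays; the competitor figure is purely an illustrative assumption for comparison, not a measurement of any specific headset.

```python
# Back-of-the-envelope photon-to-photon latency comparison.
# vision_pro_ms comes from Apple's stated R1 figure (12 ms);
# competitor_ms is an assumed, illustrative value only.

def latency_advantage(ours_ms: float, theirs_ms: float) -> float:
    """How many times lower our latency is compared to theirs."""
    return theirs_ms / ours_ms

vision_pro_ms = 12.0   # Apple: R1 streams new images within 12 ms
competitor_ms = 45.0   # assumption: typical passthrough latency elsewhere

print(f"Vision Pro: {vision_pro_ms:.0f} ms")
print(f"Competitor: {competitor_ms:.0f} ms (assumed)")
print(f"Advantage:  {latency_advantage(vision_pro_ms, competitor_ms):.2f}x")
```

With those numbers, the advantage works out to 3.75x, which is where “nearly 4 times better” comes from.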
Market Strategy
When Apple introduces a new product, there is usually a loud chorus from one corner of the internet or another crying about how somebody else had it first. They often are not wrong, but they are missing the point. Apple’s market strategy has never been to be first to market. The strategy is about being best to market.
There were MP3 players before iPod. There were smartphones before iPhone. Tablets before iPad, smart watches before Apple Watch. And of course, there were headsets before Vision Pro. Nothing that came before really fit the bill of a spatial computer, though.
Apple has been working on their headset for a long time. According to this report, the first patent related to Vision Pro was granted way back in 2007. Even that paints an incomplete picture. Some of the tech in Vision Pro was carried over from iPhones, and a lot of that DNA is intermingled with Macintosh.
Spatial Computing Ticks All The Right Boxes
The processing power of a computer. The stunning displays and sound. All the sensors and cameras. Apple has ticked all the right boxes and created a computer that is spatially aware of your surroundings.
By bringing to bear all its experience as a company that designs chips and makes computers, software, and smartphones, Apple has an advantage over the competition. The result is the first spatial computer with all those features and all that functionality. It does the heavy lifting in a lot of areas so that the user doesn’t have to. It just works.

