What is ‘spatial computing’ and why is Apple pushing the new buzzword?

Now that Apple’s highly anticipated Vision Pro headset has hit store shelves, you’ll likely see more people wearing the futuristic glasses that are supposed to usher in the era of “spatial computing.”

It’s an esoteric term for a technology that Apple executives and their marketing gurus are trying to push into the mainstream.

They are doing so while avoiding more commonly used terms such as augmented reality (AR) and virtual reality (VR) to describe the transformative powers of a product touted as potentially as monumental as the iPhone, which was released in 2007.

“We can’t wait for people to experience the magic,” Apple CEO Tim Cook said Thursday while discussing the Vision Pro with analysts.

The Vision Pro will also be among Apple’s most expensive products at $3,500 (€3,255) – a price that has most analysts predicting the company will sell only 1 million or fewer devices in its first year.

But Apple sold only about 4 million iPhones in its first year on the market and now sells more than 200 million annually. So there is precedent for a product that initially looks like a niche item eventually becoming enmeshed in the way people live and work.

Spatial computing: a future everyday word?

If that happens with the Vision Pro, references to “spatial computing” could become as ingrained in modern language as mobile and personal computing – two previous technological revolutions in which Apple played an integral role.

So what is spatial computing?

It is a way of describing the intersection between the physical world around us and a virtual world manufactured by technology, one in which humans and machines can harmoniously manipulate objects and spaces.

Performing these tasks often involves elements of AR and artificial intelligence (AI) — two subsets of technology that help enable spatial computing, says Cathy Hackl, a longtime industry consultant who now runs a startup working on apps for the Vision Pro.

“This is a crucial moment,” Hackl said.

“Spatial computing will allow devices to understand the world in ways they have never been able to do before. It will change human-computer interaction, and eventually every interface – be it a car or a watch – will become spatial computing devices.”

As a sign of the excitement surrounding the Vision Pro, more than 600 newly designed apps will immediately be available for use on the headset, according to Apple.

The range of apps includes a wide selection of television networks, video streaming services (although Netflix and Google’s YouTube are notably missing from the list), video games, and various educational options.

On the work side, video conferencing service Zoom and other companies offering online meeting tools have also built apps for the Vision Pro.

Sinister side of spatial computing

But the Vision Pro could reveal another troubling side of the technology if its use of spatial computing is so compelling that people start to see the world differently when they’re not wearing the headset and come to believe that life is far more interesting when viewed through its lenses.

That scenario could worsen the screen addictions that have become endemic since the iPhone’s debut and deepen the isolation that digital dependency often cultivates.

Apple is far from the only leading technology company dabbling in spatial computing products.

In recent years, Google has been working on a three-dimensional videoconferencing service called “Project Starline”, which relies on photorealistic images and a ‘magic window’ so that two people sitting in different cities feel as though they are in the same room together.

But Starline still hasn’t been widely released.

Facebook’s parent company, Meta Platforms, has for years been selling its Quest headset, which can be thought of as a spatial computing platform, although the company hasn’t positioned the device that way.

Vision Pro, on the other hand, is backed by a company with the marketing skills and customer loyalty that tend to create trends.

While it could be heralded as a breakthrough if Apple realizes its vision with Vision Pro, the concept of spatial computing has been around for at least two decades.

In a 132-page research paper on the topic published in 2003 at the Massachusetts Institute of Technology, Simon Greenwold argued that an automatically flushing toilet could be considered a primitive form of spatial computing.

Greenwold supported his reasoning by pointing out that the toilet “senses the user’s movement to activate a flush” and “the space in which the system turns on is a real human space”.

‘Minority Report’

The Vision Pro is, of course, much more advanced than a toilet. One of its most attractive features is its high-resolution screens, which can play back three-dimensional video recordings of events and people, making those encounters seem as if they are happening all over again.

Apple has already laid the groundwork for sales of the Vision Pro by including the ability to record what it calls “spatial video” on the premium iPhone 15 models released in September.

Apple’s headset also responds to a user’s hand gestures and eye movements, an attempt to make the device feel like an extension of the wearer’s own body.

While wearing the headset, users can also use just their hands to pull up and arrange a series of virtual computer screens, similar to a scene with Tom Cruise in the 2002 film ‘Minority Report’.

Spatial computing “is a technology that begins to adapt to the user rather than the user having to adapt to the technology,” Hackl said.

“It should all be very natural.”

It remains to be seen how natural it will seem when you sit down to dinner with someone else wearing the glasses, instead of occasionally staring at their smartphone.
