
What you need to know now about the Apple Vision Pro Headset

June 23, 2023

Apple introduced the world to the Vision Pro headset at the Worldwide Developers Conference (WWDC) this year. The headset challenges the way we think about immersive technology: it is a full computing platform powered by the same M2 chip found in the MacBook Air. It brings apps into your physical space, which Apple calls “spatial computing.” While MR (Mixed Reality), AR (Augmented Reality), and VR (Virtual Reality) are not new concepts, Apple has introduced a new way to think about how humans should use computers.

Seeing the digital in the real world really does feel like the next big thing.

Joanna Stern, The Wall Street Journal

(https://youtu.be/bwUZUG8x2MI)

Despite the hype, immersive technologies have had little impact on the work of brand marketers and design leaders. With Apple's entrance, we can expect a seismic shift. 

The Apple Vision Pro headset merges the physical and virtual worlds by combining digital elements with the physical environment. Instead of disconnecting users from reality, Apple's headset integrates physical and virtual reality. Keep reading to learn more about what Apple did differently with the Apple Vision Pro and how brand leaders should respond. 

“Look through, not at”

Tim Cook introduced the headset by saying, “Look through, not at.” Looking through a headset to see the physical world isn't new technology, but the phrase frames a new way of thinking about the headset wearer.

Before Apple's announcement, users visiting virtual worlds were forced to be represented by simplistic or cartoonish avatars. These avatars often didn't match the user's own sense of identity, especially when looking down in VR to see cartoon hands, which further accentuated the rift between self-identity and virtual identity. XRHealth, a company that creates VR software for mental health patients, also encountered this problem. Chief Engineering Officer Xavi H. Oromí explained that many patients reported feeling as though their body did not belong to them.

To feel comfortable and act like yourself, you have to inhabit a body you feel ownership of. What body could be better than the one you're in? Apple's concept, “Look through, not at,” places the user at the center of the experience. Even when you are fully immersed in a virtual world, you will always see your own hands.

“What body could be better than the one you're in?”

When you need to connect with others in a virtual meeting or FaceTime call, you use your “digital persona.” That's Apple's name for an avatar, but conceptually, it is different from an avatar. Sensors on the headset scan the user's face to create their photorealistic digital persona.

Thanks to sensors inside the headset, your own facial movement will control your digital persona. Apple's headset aligns your internal self-image and your external identity, so you can always be your authentic self. Other headset users can call in and bring their own life-size digital personas (although they will be disembodied as well. Apparently, legs in the metaverse are not just a challenge for Meta.)

Not only does “Look through, not at” capture the wearer's experience, but it also applies to the observer in the room with the wearer. When you're wearing the Apple headset, others can see your digital persona's eyes and whether you're engaged in an experience or using an app. Apple calls this EyeSight.

Here's why I believe this feature is a game changer. Recently, I was setting up a mixed-reality headset demo in New York City while having a conversation with a coworker. Even though I could see them through my headset, they couldn't see me, and that made for an awkward interaction; I had to lift the headset to make eye contact. If you've worn a headset, you know what I'm talking about. Apple has made wearing a headset a little more human.

An interface that doesn't disappoint

The rest of the headset is exactly what you'd expect from an Apple product. It has a familiar home screen and an innovative user interface floating in the air that responds to ambient lighting. The Vision Pro will launch with a full App Store thanks to out-of-the-box compatibility with most iPhone and iPad apps. The device caters to both personal and business use cases, with support for core Microsoft 365 apps at launch.

The device won't come with any controllers. Instead, it is controlled through eye tracking, hand gestures, and voice commands. To type, you can use the floating keyboard, dictate with your voice, or pair an Apple Keyboard. 

This is huge. Before joining Aquent Studios as the Metaverse Practice Leader, I worked in the Accenture Metaverse Continuum Business Group, where we onboarded 60,000 employees to the Meta Quest 2 headset. The controllers were not intuitive at all, and users over the age of 40 struggled to get the hang of them, even after using the headset many times.

The best controllers we have are our hands and our voice. You've been mastering them since birth, so there is no need to learn a new skill just to navigate spatial computing.

Apple's user interface feels organic because your eyes are the cursor: you select by looking at an element and click with a pinch of your fingers. YouTuber Marques Brownlee tested the Apple Vision Pro and reported that it only took a few minutes before it felt like second nature.
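For design and engineering teams, one practical implication (my own illustration, not a claim from this announcement) is that standard SwiftUI controls already respond to look-and-pinch input, so familiar interaction code largely carries over. Here is a minimal sketch, assuming an ordinary SwiftUI view compiled for visionOS; the view and state names are hypothetical:

import SwiftUI

// Standard SwiftUI controls respond to gaze + pinch on Vision Pro:
// looking at the button targets it, and a pinch fires the same action
// a tap would trigger on iPhone.
struct PinchToSelectView: View {
    @State private var selectionCount = 0

    var body: some View {
        VStack(spacing: 20) {
            Text("Selected \(selectionCount) times")
            Button("Look at me, then pinch") {
                selectionCount += 1  // gaze + pinch on visionOS, tap on iOS
            }
            .buttonStyle(.borderedProminent)
        }
        .padding()
    }
}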

In Apple's Platforms State of the Union, Edwin Iskandar, Senior Engineering Manager of visionOS, said, “On Apple Vision Pro, users with physical and motor disabilities can interact with their device entirely with their eyes, their voice, or a combination of both.” Up to now, using headsets has been a highly active and physical process. In the past, I speculated that headsets would never be accessible for people facing mobility challenges, yet Apple's headset can be used by people who are paralyzed from the neck down. This alone should convince any skeptics of the technology's long-term staying power.

Who is the Vision Pro for?

All of these great features come at a price. The $3,499 price tag on Apple's new headset induces sticker shock, which may leave readers asking, “Who is this headset even for?” According to a source who attended a dinner with the Apple Vision Pro team at the AWE XR conference ahead of the announcement, the headset will target high-spending Apple consumers. Tom Ffiske of Immersive Wire speculates that the majority of experiences will operate on a subscription model and target affluent users, suggesting that brands seeking to reach affluent technophiles should build experiences for the Vision Pro.

With Disney+ entering the space and promising richer ways to watch movies and play games than a theater or living room can offer, the platform could be the ideal place for any brand looking to connect with the affluent family consumer. Brands in travel, retail, healthcare, fitness, entertainment, and DTC, along with the designers who serve them, can all find ways to deliver unique experiences that generate real business value.

What strategies should brands implement to build experiences for Apple Vision Pro?

It's time to define your spatial strategy. How do you want to show up? What kinds of experiences do you want to share with your audience? How will you connect your community together in a shared spatial environment?

This powerful spatial computing device is now part of the Apple ecosystem. Just as you decide whether your iPhone app should also run on iPadOS, you now need to decide whether to support it on the Vision Pro's visionOS. Then you need to consider how your app will be different as a spatial experience.
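For teams with an existing iPhone or iPad app, that decision can start small. As a rough sketch (my own illustration, not Apple sample code), the same SwiftUI view can compile for iOS, iPadOS, and visionOS, with a conditional-compilation check reserving spatial touches for the headset; the view name here is hypothetical:

import SwiftUI

// One SwiftUI view shared across iPhone, iPad, and Apple Vision Pro.
struct WelcomeView: View {
    var body: some View {
        VStack(spacing: 16) {
            Text("Hello, spatial computing")
                .font(.largeTitle)
            Text("The same SwiftUI code can run on iPhone, iPad, and Apple Vision Pro.")
        }
        .padding()
        #if os(visionOS)
        // visionOS only: a glass background helps a floating window
        // respond to the room's ambient lighting.
        .glassBackgroundEffect()
        #endif
    }
}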

Design leaders should familiarize themselves and their teams with Apple's Principles of Spatial Design and learn the new developer tools. Vision Pro app development is different from other types of Apple app development because developers can embed 3D objects within an app's window frames. This makes the content truly spatial: walking around the window lets you see every side of the 3D object.
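To make that difference concrete, here is a rough sketch (again my own illustration, not Apple's sample code) of a visionOS app that presents a 3D asset in a volumetric window so viewers can walk around it. The app and asset names, such as "Robot," are hypothetical placeholders for your own 3D content:

import SwiftUI
import RealityKit

@main
struct SpatialShowcaseApp: App {
    var body: some Scene {
        WindowGroup {
            // Model3D loads a 3D asset and renders it inside the window's bounds.
            Model3D(named: "Robot") { model in
                model
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            } placeholder: {
                ProgressView()  // shown while the asset loads
            }
            .padding()
        }
        // A volumetric window gives the content real depth, so walking around
        // the window reveals every side of the 3D object.
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}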

Our world is becoming more spatial. Define your spatial strategy and decide how you want to show up before your competition makes that decision for you. It's easier to start out in front than to play catch-up later.

Creating and implementing a spatial strategy can be challenging. Delivering spatial content requires many new skills and capabilities, and you don't have to do it alone. Aquent Studios has a background in developing spatial experiences for brands like Apple, Meta, Microsoft, NASA, and many others. Connect with us to discuss your needs around building spatial experiences and opportunities for your brand.