Early Thoughts on My Time Inside



Over the last three weeks I’ve had the chance to spend significant time in Apple Vision Pro. While my full review is still marinating, I want to share some early thoughts on using the headset and what it means for the XR industry overall.

It has to be said that anyone who doesn’t recognize how consequential the launch of Vision Pro is fundamentally doesn’t understand the XR industry. This is the biggest thing to happen since Facebook bought Oculus in 2014.

It’s not just Vision Pro itself, it’s Apple. Everything about the device suggests the company does not consider the headset a tech demo. It’s in this for the long haul, likely with a forward-looking roadmap of at least 10 years. And even then, it’s not that Apple is magical and going to make the best stuff in the industry all the time; it’s that the things it does well are going to set the bar for others to compete against, resulting in accelerated progress for everyone. That’s something Meta has needed for a long time.

The moment feels remarkably similar to when the iPhone launched 17 years ago. At the time a handful of tech companies were making smartphones that mostly catered to enterprise users, a relatively small market compared to consumers at large. Then the iPhone came along, designed as a smartphone that anyone could use, and most importantly, a smartphone that people would want to use.

And while the iPhone did less than a BlackBerry at launch, it steadily caught up to the same technical capabilities. Meanwhile, BlackBerry never caught up to the same ease-of-use. Less than 10 years after the launch of the first iPhone, BlackBerry had been all but put out of the smartphone business.

Compared to Quest, Vision Pro’s ease of use is so very much like the original iPhone vs. the BlackBerry that it’s not even funny. Quest’s interface has always felt like a sewn-together patchwork of different ideas, offering little in the way of clarity, intuitiveness, or cohesion.

What Apple has built on Vision Pro from a software standpoint is phenomenally mature out of the box, and it works exactly like you’d expect, right down to making a text selection, sharing a photo, or watching a video. Your decade (or more) of smartphone, tablet, and PC muscle memory works in the headset, and that’s significant. Apple is not fooling around: it knew Vision Pro would be the first headset of many, so it has built a first-class software foundation for the years to come.

Luckily for Meta, it has a much more diversified business than BlackBerry did, so it’s unlikely to get pushed out of the XR space—Zuckerberg isn’t going to let that happen after all this time. But Meta’s piles of cash no longer guarantee dominance of the space. Apple’s presence will force the single most important thing that Meta has failed to do in its XR efforts: focus.

Even if there hadn’t been a single native Vision Pro app on the headset at launch, it would still be impressive how well most iPad and iPhone apps work right out of the box. Technically speaking, the apps don’t even realize they’re running on a headset.

As far as the iPad and iPhone apps know, you’re using a finger to control them. But in reality you’re using the headset’s quite seamless look+pinch system. Scrolling is fluid and responsive. Drag and drop works exactly like you’d expect. Pinch zoom? Easy. In a strange way it’s surprising just how normal it feels to use an iPad app on Vision Pro.
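
To make that concrete, here’s a minimal sketch of ordinary iPad-style SwiftUI code; it isn’t from Apple or this review, and the view and asset names are hypothetical. Nothing in it is visionOS-specific, which is the point: on Vision Pro the system delivers look+pinch input to the same gesture handlers a finger would drive on an iPad.

```swift
import SwiftUI

// Ordinary iPad-style SwiftUI with no visionOS-specific code anywhere.
// On Vision Pro, the look+pinch system drives the same gesture handlers
// that a finger would drive on an iPad, so this view "just works".
struct PhotoCard: View {
    @State private var zoom: CGFloat = 1.0

    var body: some View {
        ScrollView {                      // scrolling: look + pinch-and-drag on Vision Pro
            Image("sample-photo")         // hypothetical asset name
                .resizable()
                .scaledToFit()
                .scaleEffect(zoom)
                .gesture(
                    // Pinch-to-zoom on iPad, or the system's indirect pinch
                    // input on Vision Pro; the app can't tell the difference.
                    MagnificationGesture()
                        .onChanged { value in zoom = value }
                )
        }
    }
}
```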

There are more than 1 million iPad and iPhone apps that can run on Vision Pro out of the box. That means the vast majority of the apps you use every day can be used in the headset, even if the developer hasn’t created a native visionOS app. As a testament to Apple really thinking through the technical underpinnings and leveraging its existing ecosystem, apps which expect a selfie cam are instead shown a view of your Persona (your digital avatar). So apps with video calls or face filters ‘just work’ without realizing they aren’t even looking at a real video feed.
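
As an illustration of that substitution (my own sketch, not code from the article), here’s the bare-bones AVFoundation front-camera setup that countless iPhone and iPad apps already ship. The assumption, per the behavior described above, is that on Vision Pro this same code simply receives the Persona feed instead of a real camera image.

```swift
import AVFoundation

// Standard front-camera capture setup, exactly as an iPhone/iPad app would
// write it. The app asks for the "front camera" and receives frames; on
// Vision Pro the system substitutes the user's Persona for that feed.
final class SelfieFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()

    func start() throws {
        guard let camera = AVCaptureDevice.default(
            .builtInWideAngleCamera, for: .video, position: .front
        ) else { return }

        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "selfie.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Each frame arrives here whether it came from a physical selfie cam
    // or (on Vision Pro) from the rendered Persona.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hand frames to the app's video-call or face-filter pipeline here.
    }
}
```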

And it’s really impressive how you can seamlessly run iPad apps in compatibility mode, flat Vision Pro apps (called Windows), and 3D Vision Pro apps (called Volumes), all in the same space, right next to each other. In fact… it’s so easy to multitask with apps on the headset that one of the first bottlenecks I’m noticing is a lack of advanced window management. It’s a good problem for the headset to have; there are so many apps that people actually want to use—and they can run so easily side-by-side—that the software isn’t yet up to the task of organizing it all in a straightforward way.
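
For anyone curious what that Window vs. Volume distinction looks like in code, here’s a minimal sketch of how a native visionOS app declares both in SwiftUI; the app and view names are hypothetical, but the scene declarations follow the standard visionOS pattern. The system then lets the user place both in the shared space alongside compatible iPad apps.

```swift
import SwiftUI

// Hypothetical content views, for illustration only.
struct BrowserView: View { var body: some View { Text("Flat window content") } }
struct GlobeView: View   { var body: some View { Text("3D content goes here") } }

@main
struct ExampleVisionApp: App {
    var body: some Scene {
        // A flat "Window": a 2D pane the user can place anywhere in the room.
        WindowGroup(id: "browser") {
            BrowserView()
        }

        // A "Volume": a bounded 3D region for spatial content, shown
        // alongside flat windows and compatible iPad apps in the same space.
        WindowGroup(id: "globe") {
            GlobeView()
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
    }
}
```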

For now apps pretty much just stay where you put them. But sometimes they get in the way of each other, or open in front of one another. I expect Apple will tackle this issue quite soon with some kind of window manager reminiscent of window management on macOS or iPadOS.

Being able to run the apps you already know and love isn’t the only benefit that Apple is extracting from its ecosystem. There are small but meaningful benefits all over the place. For instance, being able to install the same password manager that I use on my phone and computer is a game-changer. All of my credentials are secured with Optic ID and can be auto-filled on command in any app. That makes it a breeze to sign into the tools and services I use every day.
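
That autofill works because sign-in fields on Vision Pro are the same standard ones apps already use on iPhone and iPad. Below is a minimal sketch (a hypothetical view, not from the article) of a login form that opts into system credential autofill, which the headset then gates behind Optic ID.

```swift
import SwiftUI

// A plain sign-in form. Tagging the fields with textContentType is what
// lets the system (and an installed password manager) offer autofill;
// on Vision Pro the fill is approved with Optic ID rather than Face ID.
struct SignInView: View {
    @State private var username = ""
    @State private var password = ""

    var body: some View {
        Form {
            TextField("Email", text: $username)
                .textContentType(.username)
            SecureField("Password", text: $password)
                .textContentType(.password)
            Button("Sign In") {
                // Send the credentials to the app's auth backend here.
            }
        }
    }
}
```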

And then there are things like Apple Pay, which already knows my credit card info and shipping address. On supported apps and websites, buying something is as quick as double-clicking the Digital Crown to confirm the purchase. Compare that to typing your info into each individual app through a slow virtual keyboard.
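
For context on what “supported apps” means: an app offers Apple Pay through the standard PassKit request flow, roughly like the sketch below, with a hypothetical merchant identifier and line items for illustration only. On Vision Pro, the resulting payment sheet is what gets confirmed with a double-click of the Digital Crown.

```swift
import PassKit

// Minimal Apple Pay checkout sketch. The merchant identifier, networks,
// and line items are placeholders; a delegate (omitted) would handle the
// authorized payment.
func startApplePayCheckout() {
    let request = PKPaymentRequest()
    request.merchantIdentifier = "merchant.com.example.shop"  // hypothetical
    request.supportedNetworks = [.visa, .masterCard, .amex]
    request.merchantCapabilities = .capability3DS
    request.countryCode = "US"
    request.currencyCode = "USD"
    request.paymentSummaryItems = [
        PKPaymentSummaryItem(label: "Example Widget",
                             amount: NSDecimalNumber(string: "19.99")),
        // By convention the last item is the total, labeled with the merchant name.
        PKPaymentSummaryItem(label: "Example Shop",
                             amount: NSDecimalNumber(string: "19.99"))
    ]

    // The system presents the payment sheet; on Vision Pro the user
    // confirms it by double-clicking the Digital Crown.
    let controller = PKPaymentAuthorizationController(paymentRequest: request)
    controller.present { presented in
        if !presented { print("Unable to present the Apple Pay sheet.") }
    }
}
```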

And then there’s AirDrop, FaceTime, etc. It really adds up and starts to make it feel like you can do everything you want to do inside the headset without needing to take it off and go to your phone or computer.

It’s clear that Apple has spent a long time obsessing over the details of the Vision Pro user experience. Just one example: after I set up the headset it had already integrated the custom HRTF for personalized spatial audio that I had scanned for my AirPods on my iPhone a year or two ago. So without any additional step I’m getting more convincing spatial audio any time I’m using the headset.

So there’s a lot to like about the headset out of the box. It’s capable of so much of what you already do every day—and then it mixes in interesting new capabilities. But as much as you might want the headset to be your everything-device, there’s no doubt that its size and weight get in the way of that goal. Vision Pro’s comfort is in the same ballpark as other headsets in its class (though it might be a little more difficult to find the most comfortable way to wear it).

One of the most interesting things to me about Apple Vision Pro is that it shows price isn’t what’s holding headsets back from being smaller and more comfortable (at least not up to $3,500). Apple didn’t find some kind of novel, more expensive tech that makes Vision Pro meaningfully smaller than the $500 Quest 3, for instance.

There’s a path forward, but it’s going to take time. Meta’s ‘holocake’ lens prototype, which uses holographic optics, is probably the next step on the form-factor journey for MR headsets. But R&D and manufacturing breakthroughs are still needed to make it happen.

In the end, headsets aiming for all-day productivity will need to pass the “coffee test”: you should be able to drink a full mug of coffee without bumping it into the headset. I’m not even joking about this—even if a headset is otherwise perfect, wearing something that prevents you from doing basic human things like drinking is a tough trade-off that most won’t make.

– – — – –

So that’s a smattering of the thoughts I’ve had about the headset—and what its existence means more broadly—so far. You can expect our full Vision Pro review soon, which will include a lot more technical detail and analysis. If you’ve got questions for that full review, fire away in the comments below!
