
Apple Vision Pro Hands-On: Very Good In Some Aspects, A Bit Bad In Others

Finally, I have been able to put my hands on an Apple Vision Pro. Thanks to the very kind people of the Helsinki XR Center, who let me spend more than one hour with it, and to my buddy Gianni, who let me have another half an hour, I had enough time to play around with the device and form an informed opinion about it. And I would like to share it with you, even if I’m sure that you have already read a thousand opinions about it… but mine is the only one where you will find the word “potato”, so it’s worth reading.

Keep in mind that this is a “hands-on” post and not a full detailed review, because the time I spent with the device is not enough to form a final opinion on it (I would have needed at least a week). So some of the impressions I’m writing here may actually change after intensive and prolonged use.

Design

Since I’m Italian, I always like to start with the design of the device. I would say the Vision Pro is well manufactured, and you can see it has been made with premium materials. Apple is one of the tech companies that cares the most about design and beauty, and the Vision Pro reflects this.

But still, I can’t say I fully love it. The main reason is that the lateral white plastic parts feel totally unrelated to the rest of the body of the device, which is made of black and gray materials, usually fabric. They look out of context and, in my opinion, slightly ruin the overall aesthetics.

That said, let’s have a look at the headset. On the front, you can see the covering glass of the device, with the EyeSight display behind it.

Front view of the headset

If you look more closely, you notice that the lower part hosts the various tracking cameras and sensors.

Tilted top view

On the left side, you can see the left white plastic frame, which holds one of the speakers and the connector for the battery.

Left view

On the right side, there are the other speaker and the dial that you rotate to adjust the fit of the headset around your head.

Right view

On the back, there is a soft strap that rests on your nape when you wear the device.

Back view

From the top, you have a full overview of the wearing system of the headset, plus you can see the Digital Crown button and the top button that you can use to give commands to the headset. The top button is also the one you use to turn on the device.

Top view

From the bottom, you have a better view of the speaker holes. You can also see the nose pad, and you notice more sensors used to track you and the environment.

Bottom view

If you look inside it, you mostly see the facemask and the lenses, which have a pretty unique shape. What surprised me about the lenses is that while I was cleaning them with a cloth, I noticed that they are concave and not convex.

Inner view

Visuals

Visuals are the part of the Vision Pro that impressed me the most. But they are also the clearest example of why I defined the Vision Pro “Good Good Bad Bad” soon after my first tests.

The “Good Good” part is the resolution. It is simply incredible. After handing me the headset, my friend Jussi suggested I try the “Environments” demo, which basically encloses you in a few enhanced 6DOF photos of places. I tried a few of them, one set on the moon, and one on some rocks close to a cliff. And I can guarantee you that it has been one of the fucking coolest VR demos I have had in my 10 years in the field: the sense of immersion was literally over the top… I’ve never felt immersed like that. The digital environment has such a high resolution, and is displayed so well to the eyes, that you can only go in awe when you try it. The subtle ambient audio completes the magic, and if you try it, you feel really there. When I was on the moon, it felt so good, so relaxing. When I moved towards the cliff, I was afraid of falling down. It was crazy good; I would like to return there.

Nice video, but trust me… seeing it from inside is a totally different experience

Every experience that tries to show high-quality graphics is amplified on the Vision Pro, because the resolution is very good, the processor is strong enough to render good-looking 3D assets, and the colors of the display are pretty bright. I also didn’t see any screen-door effect, even if I didn’t explicitly try to focus on the pixels. Everything was so magical.

The FOV is also good enough: I’ve heard that it is slightly smaller than the Quest 3’s, but in the short time I had with the device, I never had the sensation of being inside binoculars or anything like that.

But before you get too excited by my words and start celebrating,


let me go to the “Bad Bad” parts. First of all, the display has some motion blur issues, and things do not look that good while you rotate your head: this is especially noticeable in the passthrough (more on that in a while). Then the lenses suffer from aberrations in the periphery: if you look at the very left or the very right, you will see that things appear distorted there. It is just a tiny slice of your field of view, but a part of me kept noticing it, and I found it pretty disturbing. It is not so bad as to ruin your experience, but it is like having a little stone in your shoe: you can still walk, but once in a while, it gives you discomfort.

The lenses of the device

It seems they optimized everything for the scenario where the user sits down and focuses on a big screen in front of him to watch a movie. For that, you just need a very high resolution in the center of the field of view. We will see soon that the visuals are not the only thing they seem to have optimized for the seated use case…

Comfort

The first thing I noticed when I grabbed the headset was that it was heavier than it seemed. While all the VR hardware companies are working hard to make headsets that are lighter and more balanced, Apple had the genius idea of putting glass and aluminum on the front side, together with all the computational units.

I guess this is the only explanation for having made the headset so front-heavy

To wear the headset, you just put it on and then rotate a knob to tighten it around your head. The first time I put it on, I could feel its weight heavily pressing on my face. The standard strap was also not good enough, and I had the constant sensation that the device was not fully stable on my head, as if it was about to fall at any time (this was just a sensation… it actually never slipped down). It’s also a pity that the lateral frames do not rotate up and down, so I could not make the device fit my face perfectly. Just to show you how bad the weight distribution is: when my friend Jussi came out of a long session with the AVP, he had a big red mark on his forehead. I’ve heard many of these problems are improved with the over-the-head strap, but I did not have the occasion to try it.

Apple Vision Pro and its detachable facemask

It’s a pity they made these poor weight distribution choices, because the materials are very good and feel good on the body. The rear strap that goes around the nape feels incredibly soft and comfortable: I loved it. Regarding the facemask, it’s great that it is magnetic, so it can be easily detached and reattached. There are different facemasks for different face shapes, which also helps with comfort.


The battery is attached to the device via a wire, but I have to tell you that the wire was rarely an issue for me. With the battery in my pocket, and with the Vision Pro experiences usually not being very dynamic, I was fine most of the time. I consider the wire a minor problem for the comfort of the Vision Pro.

Tracking

The positional tracking of the headset is good: you move around, and the augmentations stay fixed in the place you left them, with no noticeable wobbling even when you rotate your head. This elevates the Apple Vision Pro above the many cheap AR headsets that have rather imprecise tracking when the user is moving. I also tried to make some sudden movements, and the headset kept tracking me accurately. But in some occasional moments, it has small hiccups, small jumps, where you see the virtual elements slightly snapping. I never saw something like this on Quest 3, so I think Meta still has an edge over Apple for what concerns tracking.

Talking about the things that Meta does better, we can mention hand tracking. According to various tests and reports, Apple’s hand tracking on Vision Pro is slower and less accurate than Meta’s. The good thing is that you almost never notice this while using the headset, because Apple shows your real hands and not virtual ones, so you can not see all its tracking problems. Anyway, I verified that hand tracking works quite well, and the hands are tracked correctly as long as they are in the field of view of the device. Outside the FOV, the tracking of the hands degrades, and at a certain point, it is lost entirely. I wouldn’t trust the Vision Pro for making an archery game.

Video posted by a developer that shows the lag of the hand tracking of the Vision Pro at release time
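
As a side note for developers: on visionOS, apps can read the raw hand-tracking data through ARKit, but only inside an immersive space and after the user grants permission. Here is a minimal sketch (the helper function is hypothetical, just for illustration) of how you could watch the tracking state and see it drop when the hands leave the sensors’ view:

```swift
import ARKit

// Hypothetical helper: must run inside an ImmersiveSpace, and the user
// must grant the hand-tracking permission first.
func observeHands() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        // isTracked flips to false when a hand exits the cameras' view,
        // which is the degradation described above
        print(anchor.chirality, anchor.isTracked)
    }
}
```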

Regarding eye tracking, I have to say that it is rather accurate, but not perfect. As with all the eye-tracking systems I’ve tried in my life (7Invensun, Tobii, Meta, Apple, Microsoft, …), its performance degrades when you start looking at the periphery of your vision. So while looking at menu items more or less in front of you leads to accurate results, looking at buttons more in the periphery leads to some misdetections. Of course, eye tracking quality improves a lot with eye calibration, so if someone hands you a Vision Pro, start with the eye calibration process, or your experience in using the eyes as a controller for interaction is going to be very frustrating.

Audio

One of the speakers of the Vision Pro

The audio of the Vision Pro is provided through two little speakers embedded in the two lateral frames of the visor. This is a solution similar to the one provided by almost all other headset vendors. I’m not an audio expert, so I can not speak in detail about the audio quality. But I can say I tried many applications (including one about DJing), and I found the audio quality to be good.

Performance

The M2 chipset of the Apple Vision Pro gives it a lot of horsepower. But still, it has to handle a lot of data, especially for its very high-resolution displays. I agree with Ben Lang when he says that the passthrough vision never misses a frame: it is always fluid, in whatever condition you operate. But I can not say the same about the applications: I had the impression that a few of the ones I tried were not entirely smooth, even if there were not that many graphical elements in the scene. So I guess developers have to optimize a lot to make a title run smoothly on the Vision Pro.

Battery

The battery looks like a silver brick that you put in your pocket. I thought it was slimmer, while actually it is as thick as two smartphones stacked on top of each other.

It is connected to the headset via a special circular connector. Luckily, the battery itself has a USB-C port, so charging it is pretty easy.

Top, bottom, and side views of the battery

Passthrough

Before trying the Vision Pro, I had read some people in the VR community saying that the Quest 3 has basically the same passthrough at one-seventh of the price, so I was expecting a similar experience. I do not know what these people have smoked (a lot of fanboyism, I guess), but the two passthroughs are completely different.


The resolution of the passthrough of the Vision Pro is crazy good. It is not “identical to reality”, as Ian Hamilton said during his first run with the device, but it is very crisp. You can see that you are watching the world through a screen and you perceive a bit of noise, but the quality is incredible. I was able to take notes on my phone very easily while still wearing the Vision Pro. The passthrough also comes with almost no distortion, so you don’t have that “underwater effect” that you get on Quest 3 when you move your hand in front of the device (I know, v66 improved this…).

Everything was so magical with the passthrough until I took a step forward while rotating my head, and I was like “What the hell??!”. Motion blur kicked in, and I saw my reality becoming super fuzzy. Every time you rotate your head, you perceive this strong blur effect: it is incredibly disturbing and disappointing and ruins the experience completely. The motion blur in the passthrough is so visible that it is difficult to forget about it.

The fun thing about the passthrough of the Vision Pro is that you can put a Quest 3 on top of it and see the passthrough of the passthrough (passthrough-ception!)

At the end of the day, if you are seated and you are just watching something in front of you, the passthrough of the Vision Pro is amazing. But if you have to make an experience with people moving in the room, I would still go for the Quest 3.

EyeSight Display

Let’s talk about the display on the front of the device that should let people see the eyes of the user. The marketing material showed us how you can really see the user’s eyes, as if the user had no headset on his face. The sad reality is that it is an image of potato quality (yes, I’ve said the word). Even worse, it’s not clear when it displays the eyes and when it displays a color animation, so it is very confusing both for the person in the headset and for the one watching him.

Here you can see a mix of the blue animated visuals and my eyes provided by EyeSight

But notwithstanding all of these issues, I still think it’s better than not having it. Even if the eyes are not accurate, seeing them still makes the experience of speaking with someone wearing a VR headset more human. I personally believe that Apple shouldn’t abandon this technology and should instead work on improving it.

User Experience

Hand-Gaze Interactions

Everyone knows that the Vision Pro can be controlled by using your eyes and your hands: you look at objects, and then you perform an “air tap” to activate them. Everyone was excited by this feature, but a few months ago I criticized this system, saying that while it is very innovative, it has the drawback of trying to use the eyes as a mouse, which is a bad idea because our eyes are not meant for that. Some people told me that I shouldn’t have written such an article without having tried the headset first.
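
For the developers among you, here is a minimal sketch of how this look-and-pinch input surfaces in a visionOS app with a RealityKit scene (the view and entity names are mine): the system draws the gaze highlight by itself and only delivers the final tap to the app, without ever exposing the raw eye data.

```swift
import SwiftUI
import RealityKit

struct PinchDemoView: View {
    var body: some View {
        RealityView { content in
            // A simple sphere that can receive gaze+pinch input
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            sphere.name = "demoSphere"
            sphere.components.set(InputTargetComponent())    // makes it tappable
            sphere.components.set(HoverEffectComponent())    // system gaze highlight
            sphere.generateCollisionShapes(recursive: false) // needed for hit testing
            content.add(sphere)
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Fires on "look at the entity + pinch"
                    print("Tapped \(value.entity.name)")
                }
        )
    }
}
```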


Now I have tried the device, and in fact, thanks to this test, I have changed my mind completely: now I think that trying to use the eyes as a mouse is a bad idea, because our eyes are not meant for that. In fact, after 15 minutes of using the eye-hand interface, I wanted to throw the headset out of the window (then I remembered its cost and changed my mind). Whenever I could, I got close to objects and poked them with my finger (the fun thing is that the menu disappears if you get too close to it or too far from it, so I had to poke it from a distance). A few friends told me “You have to get used to it”, but I find it pretty absurd that I should get used to a “natural interface”… if it is natural, if it is well thought out, I do not have to get used to it, I just behave naturally. The human brain is very plastic and can get used to everything, even controlling the menu of an MR headset with a dildo between the toes, but this does not mean that this should be the interface of all the next devices (even if it would be pretty fun).

(Video by Apple)

Why am I so critical about it? Well, there are a few reasons. The minor one is that eye tracking is not 100% accurate, and it is frustrating that sometimes you are looking at something and the system is actually highlighting the button next to it. An input system should be very reliable, and currently eye tracking is not. There is also an additional problem: what do you do in these cases? Do you look at the item harder? Do you open your eyes wider? With a hand-based interface, you can move your hand around misdetected objects, but with the eyes, it’s hard to understand what to do when the system fails.

But the major reason is that… using the eyes as a mouse is a bad idea because our eyes are not meant for that (I guess you’ve already heard this sentence). Eyes are made to EXPLORE. With our eyes we look around, we help our brain create a map of the surroundings, we redirect our attention when there is some audio or visual stimulus. We perform a lot of involuntary movements with the eyes (saccades). We do not perform actions with our eyes: they are just meant as sensors that give the brain data. We humans evolved for thousands of years to use our eyes in these ways, and I don’t think that Mr. Tim Apple can change this. So what are the side effects of this decision of using the eyes as a mouse? Let me tell you a few:

  • Midas Touch: whatever you look at starts reacting, so you look at the icons of the menu and they all start doing animations. This is annoying because you are just looking at the menu options: you do not want the system to react in any way until you have made a decision. The good news is that you can get used to it;
  • Selection of the wrong item: since eyes are meant to explore, they keep doing their job, and this may mean that you select the wrong item. Let me give you an example of something that happened during my last test: I wanted to select an application, the file explorer, so I looked at its icon, but while I was doing my air tap to launch it, my eyes decided to have a look at the “Apple Music” icon. Can you guess which application started? Yeah, that one. And as ThrillSeeker says in his latest video: what about a “Save File” dialog shown when you exit an application? Maybe while you are clicking on “Save”, your eyes land on “Exit without saving” and bam, you’ve lost hours of work;
  • Eye strain: the eyes are not meant to do this job, so using them this way leads to mental and eye strain. After 20 minutes of using the headset, I had to remove it because my eyes wanted a break.

ThrillSeeker’s video about VR UX got lots of attention lately

So as much as I like the idea of the eye+hand interface, and I’m glad that Apple is experimenting with it, I don’t think that it is the final solution for the UX of XR headsets. I would never use on my computer a mouse that works just 90% of the time and otherwise clicks on random parts of the screen. Remember also that the Vision Pro is Apple’s first step on its road towards everyday AR glasses. And I guess that if I’m crossing the street while wearing AR glasses, I can not focus on the interface of the device, because I have to keep looking in front of me and check if cars are arriving from the side. And at the same time, when I am just wandering around, my eye movements can not be interpreted by the system as inputs. This means that using the eyes to interact with glasses may have issues when you are outdoors. I see the eyes more as “context providers” for AR interactions: for instance, if the system identifies that I keep staring at a bakery, maybe it can suggest whether it is a good idea to enter or not. But “eyes as a mouse” every time, everywhere, is for me a dead end.

Regarding the usage of the hands, it kinda works, but I sometimes had misdetections between swipe and click: since the gestures are “micro”, it is very easy to confuse one movement with another. I’ve also noticed that if you are standing, your hand gestures are not always detected when the arm stays relaxed at your side: you have to move your hand slightly forward, which is more tiring. The experience of interacting is much better when you are seated, because the arm relaxes on the armchair or on your lap, and it is “forward enough” for the headset to detect it. As I’ve already said one million times, this is a headset optimized for seated use.

Damon Hernandez using the Vision Pro

Still, after so many criticisms, I also have to say something good to counterbalance: there are some moments when the eyes and hands work together so well that it really feels magical. I remember that at a certain point I managed to use the keyboard with my eyes and the air tap very well: I just looked at letters, pinched, and I was writing in a very relaxed way, something that never happens in VR. This is to point out that it is not that “hand and eye” is bad per se: I would just use it in specific contexts and not as the main interaction means of the whole headset and its applications.

Menus and UI

The home menu of the device

The Apple Vision Pro has an initial home menu where you can select the application to launch. You have already seen it one million times, so I don’t have to describe it. After you launch an application, it runs either in exclusive mode or together with the rest of the apps around you. If an app is not immersive, it is associated with a floating window. Below the window, there are a circular button to close it and a little horizontal bar to move it around in space. I did not have enough time to play with moving the windows around me as if I had a 3D desktop, but from the little I did, it is pretty cool.
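
For reference, this shared-versus-exclusive behavior maps to how a visionOS app declares its scenes. A minimal sketch, with scene ids of my own choosing:

```swift
import SwiftUI
import RealityKit

@main
struct ExampleApp: App {
    var body: some Scene {
        // A WindowGroup becomes a floating window living together with the
        // windows of the other apps; the close button and the move bar
        // below it are drawn by the system, not by the app.
        WindowGroup(id: "main") {
            Text("Hello, visionOS")
                .padding()
        }

        // An ImmersiveSpace is the "exclusive mode": while it is open,
        // the apps around you are hidden.
        ImmersiveSpace(id: "immersive") {
            RealityView { _ in
                // immersive 3D content would be added here
            }
        }
    }
}
```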

You can use the Digital Crown button to open the menu of the application you are running, or alternatively, you can look up until you see a little icon that confirms that your movement has been detected and then shows the menu. Using this eye movement is a bit uncomfortable if you have to do it often, but once in a while it is ok.

All the system UIs have the Apple style that every one of us is very familiar with, and they look very classy and polished.

Official image of the UI of the Store of the device (Image by Apple)

Virtual keyboard

To input text in the various input forms, you can choose to use voice input or to type on a floating virtual keyboard. On the keyboard, you can either use the gaze+pinch mechanic or poke the keys directly with your fingers. None of these input methods is ideal, and the virtual keyboard is barely usable, like all the virtual keyboards of the other headsets. It’s interesting that, as I was saying before, when gaze+pinch works, the keyboard is very relaxing to use for short inputs, but the moment your eye direction is not detected well and you press one letter instead of another, it becomes frustrating.

If you have to use the Apple Vision Pro for productivity, buy a physical keyboard and connect it to the headset.

Body Passthrough

One of the coolest things about the Vision Pro, together with the resolution, is that it lets you see your upper body. You do not need an avatar, because you just see yourself, especially your real arms and your real hands. It is not a Persona or a digital reconstruction: it is the image of your body cut out from the camera frames. The quality of this visualization is very good: it is slightly noisy and the colors are a bit toned down, but still, it’s like truly seeing yourself. And the segmentation is very well made too: the hands were cut out in a precise way. The only way I could break it was by occluding my arm with my hands, and I had to do it on purpose to break the system. This feature is much more presence-inducing than seeing an avatar, because you truly see your body, and it is always superimposed on the application you are using, so it feels like your body is interacting with the immersive experience.

This is how you see your hand and arm (Image from Road To VR)
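
For what it’s worth, apps don’t render this cutout themselves: the OS composites your real hands and arms on top of the virtual content, and an immersive app can only ask for them to be shown or hidden. A minimal sketch, assuming a visionOS immersive space:

```swift
import SwiftUI
import RealityKit

struct BodyPassthroughScene: Scene {
    var body: some Scene {
        ImmersiveSpace(id: "demo") {
            RealityView { _ in
                // immersive content here
            }
        }
        // .visible keeps the camera cutout of your hands/arms on top of
        // the virtual content; .hidden fully occludes your real body
        .upperLimbVisibility(.visible)
    }
}
```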

The only drawback of this choice is that it masks any tracking fault: on Quest, if I see that my virtual hand is going haywire, I know that I can not use it for interacting; on Vision Pro, I would not notice that and would still try to use my passthrough hand to click a button even if the fingers are all mis-detected.

Other people’s passthrough

When you are in an immersive environment and someone comes close to you, you see him or her appearing in your virtual world. This is very useful for safety, because you see the people around you, so you don’t risk punching them while you are in VR (or you can aim better at their face, if that is instead your purpose). Technically speaking, this is well implemented: it’s great that the headset is able to show you just the person in front of you, cutting him or her out from the background. But on the experience side, it looks a bit cringe to see these “ghosts” in your VR experience; I think they ruin the immersion.

The reality crown

(Video by Apple)

One of the flagship features of the Apple Vision Pro is that you can rotate its crown to select how much reality you want: you can choose between being fully immersed and having your virtual reality occupy only a part of your environment. This is very cool, but in my limited time with the device, I never felt the need to use it, except when I wanted to momentarily speak with someone while I was immersed.
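
As far as I understand, developers don’t implement this blending themselves: an app just opts into the “progressive” immersion style, and the system handles the crown dial. A minimal sketch:

```swift
import SwiftUI
import RealityKit

struct DialableSpace: Scene {
    // With .progressive, the user turns the Digital Crown to dial how
    // much of the real room stays visible around the virtual content
    @State private var style: ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "dialable") {
            RealityView { _ in
                // immersive content here
            }
        }
        // .full is classic total immersion; the app lists the styles it supports
        .immersionStyle(selection: $style, in: .progressive, .full)
    }
}
```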

Virtual Reality

Tim Apple is never happy to read “Virtual Reality”, because we should say “Spatial Immersive Applications”. Well, he can call it whatever he wants (let’s remember we kept calling every kind of digital application “metaverse”, so names are not that important), but he should guarantee that these things work correctly, because full immersivity is very important. VR works very well on the Vision Pro: at the beginning of this article, I already told you how the immersive Environments were crazy good. But the problem is that your “room scale” setup is ridiculously small. You take one step forward, and the headset already shows you the passthrough of the surroundings to warn you of potential risks. This is immersion-breaking, so you can only make very static applications… maybe seated ones. Even in the Environments, it is so annoying that you are on the moon, but you can not walk on it.

Content

I’ve tried a good bunch of content on the Vision Pro: an app to draw in the air, the concert app AmazeVR, the DJing application djay, the photo visualizer, an app about the Mars Rover, another one about dinosaurs, a visualization of the heart, the NBA app, Marvel’s What If…, the demonstration app JigSpace, and many more. There is already enough content to spend a few hours strolling around… but I had the impression there is not enough to keep using the headset regularly. Most of the content didn’t even look that deep to me; some apps looked like demos. I know that Apple doesn’t like to say that this is a devkit, but this is exactly the content situation that you have on a devkit. And if it looks like a duck and quacks like a duck…

I guess I love turbines now

The demos that shined the most were the ones that exploited the high resolution of the headset. For instance, Megan Thee Stallion in the AmazeVR concert looked gorgeous, and the turbine inside JigSpace was the most beautiful turbine I’ve ever seen in XR. But some other demos could have been a bit bolder: for instance, the dinosaurs demo is cool because it makes you see a dinosaur in high resolution, but the dinosaur always stays inside its AR portal; it never jumps into the space with you. I also tried the NBA app, and I found it funny that I had this very crisp resolution and a giant screen in front of me to watch a choppy 2D video in potato quality of a basketball match…

This is one of the best experiences you can have on the device now

I also tried the first part of “What If…”. It’s a very polished application, with an interesting story, a strong IP, and well-executed graphics. Maybe the interactions are a bit dull, though… it seemed to me one of those storytelling experiences that lets you do some interactions to keep you engaged with the story, but never becomes a true game. There I also had some problems with hand tracking: the experience is all about gestures, and sometimes some of my gestures were not detected (maybe a room lighting issue)… but since there was no way to see a virtual representation of my hands, I had no idea what was going wrong and how I could fix it.

Ah, and regarding all the iPad apps available: I don’t care. If I want to use iPad apps, I’ll buy an iPad… if I have a Vision Pro, I want to go immersive.

Final considerations

Apple Vision Pro and its battery

After this first run with the Vision Pro, I have mixed opinions about it. I would define it as a very good devkit that introduces some innovations in our space, but that still lags behind on some other features. For a company’s first headset, it is a very good one, so kudos to Apple for it. But it is clearly not a mature product, and in comparison, the Quest 3 looks very strong, because it is a product of the 10 years of experience that Meta has in the industry, with a lot of feedback received from real users.

If I had to select one feature that truly impressed me, I would pick the clarity and resolution: the moment I saw myself in the moon environment, with those crisp visuals and believable audio, my brain clicked. I’m very rarely surprised by anything, but the Vision Pro delivered one thing that truly amazed me, because it showed me what true immersion in VR is. This is enough for me to confirm that this device is bringing something new to the ecosystem, and that its release has been a watershed moment for XR. This device has many drawbacks, but that thing, that simple thing, made me forget everything else.

That feature alone was enough to amaze me

If I had to choose something that disappointed me, I would say the user experience. I’m not a fan of the choice of optimizing the whole experience around the seated use case. XR, be it AR or VR, is about exploration. Mixed Reality should be about the merging of realities, not just about showing a damn virtual screen in front of me. Ah, and for the last time: using your eyes as a mouse is a bad idea.

The store is still new and the immersive content available is very limited, as in every devkit.

For all these reasons, I think that the only people who should buy the Vision Pro are the people who would buy an Apple devkit, that is: developers, XR/tech enthusiasts, and Apple fanboys. All the others can be very happy with the Quest 3 for now. But still, if you can get a demo of the Vision Pro, do it, because that immersion quality will give you a glimpse of what the future of VR will be.


