GravityXR’s reverse passthrough put my eyes on a headset

Two days ago, when I visited GravityXR’s offices, I was introduced to one of their chips dedicated to reverse passthrough. Now, on the show floor of VR AR Expo China, I was finally able to try it and see my eyes displayed on an MR headset! It was cool and disturbing at the same time… let me tell you why in this quick article!

[Disclaimer: I have been paid for my accommodation in Shanghai by the organizers of the event, since I’ve been both a speaker and a media partner]

Reverse passthrough

The term “reverse passthrough” was coined by engineers at Meta when they presented a research project at SIGGRAPH that showed the user’s eyes on the outside of the headset via an external display. Thanks to this, MR should feel less isolating: the people around you wouldn’t see your face covered by a brick, but something that kinda reminds them of your full face. This can be especially useful when you are in passthrough mode, because you can see the other people and, thanks to this technology, they can see what you are actually looking at (e.g., at them or at some virtual element).

The promotional images of Apple reverse passthrough technology (Image by Apple)

Apple has been the only big company to ship something like this in an official product: the Apple Vision Pro, thanks to its EyeSight technology, shows on the outside a representation of the user’s eyes when the user is in mixed reality. While promising and innovative, the feature got a lot of negative feedback. The visualization of the eyes has a very low resolution, so it doesn’t look as realistic as in the promotional images. Plus, to offer it, Apple made the headset heavier, bulkier, and much more expensive… and it is definitely not worth it.

I still think that this feature may have value, so I was happy I could try it at GravityXR.

GravityXR’s reverse passthrough

GravityXR currently offers three chips, and one of them, the EB-100, has been conceived to provide reverse passthrough on headsets (or to show human faces on robots). You need a dedicated chip because reverse passthrough is not easy to do, and you don’t want to steal computational resources from the main chips of the headset for it.

The G-X100-M0 reference design headset has a pretty traditional form factor

Reverse passthrough is currently offered as a module that works together with the G-X100-M0 reference design, which is a powerful mixed reality headset connected to a PC. The main reason is that the M0 prototype supports eye tracking, and you need eye tracking to properly show the user’s eyes on the outside screen.

To perform the reverse passthrough operation, the system first needs a scan of the user’s face. I had to go to the GravityXR booth, where they took a photo of my face and then used a phone app to record the 3D depth of my face for about 2–3 seconds. This information was then fed to a dedicated system, which, after a few minutes, created a model of my face that could be uploaded to the reverse passthrough module.
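Just to make the flow clearer, here is a minimal, purely hypothetical sketch of the enrollment pipeline I went through. All the names here (FaceModel, build_face_model, enroll_user) are my own invention for illustration, not GravityXR’s actual API: the point is just the data flow from photo + depth capture, through an offline reconstruction step, to an upload to the headset module.

```python
from dataclasses import dataclass

@dataclass
class FaceModel:
    """Stand-in for whatever proprietary face model the system builds."""
    vertices: list
    texture: bytes

def build_face_model(photo: bytes, depth_frames: list) -> FaceModel:
    # Placeholder: the real reconstruction takes a few minutes and
    # runs on a dedicated system, not on the headset itself.
    return FaceModel(vertices=[], texture=photo)

def enroll_user(photo: bytes, depth_frames: list, upload) -> None:
    """Capture -> reconstruction -> upload to the reverse passthrough module."""
    model = build_face_model(photo, depth_frames)
    upload(model)  # send the finished model to the headset module
```

The key design point the sketch tries to capture is that reconstruction happens offline, once per user, so the headset only has to store and animate the finished model.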

At that point, I could wear the headset and have my face on it!

Hands-on with the reverse passthrough

This has been one of those rare cases where I could not judge the performance of a system while wearing the headset: I had to wait and watch the video that was recorded of me from the outside. On the inside, I was just seeing a mixed reality experience with a giraffe in front of me.

You can watch the video and see how well GravityXR’s reverse passthrough fit my face

When I watched the video of the reverse passthrough on my face, I thought it was both super cool and a bit weird at the same time. Super cool because the representation of my eyes was quite crisp: they were clearly visible, not just a blurry approximation like on the Vision Pro. They truly looked like my eyes, and thanks to some optical magic of the display used for the feature, the eyes always appeared at an apparent depth that was not that of the display, but that of my face. So those really were my eyes, on my face. In this regard, the mission of the reverse passthrough was absolutely accomplished.

And it was cool that, thanks to eye tracking, the representation of my eyes was really showing where I was looking. I could move my eyes up, down, left, right, and their virtual counterpart moved accordingly.
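To give an idea of what the eye-tracking-driven rendering has to do, here is a toy sketch. Everything in it (the function name, the 12 mm eyeball radius, the whole projection) is my own simplification for illustration, not GravityXR’s actual algorithm: it just maps the gaze angles coming from eye tracking to a 2D pupil offset to be drawn on the external display.

```python
import math

def gaze_to_pupil_offset(yaw_deg: float, pitch_deg: float,
                         eyeball_radius_mm: float = 12.0) -> tuple:
    """Map a tracked gaze direction to a pupil offset on the face plane.

    yaw_deg / pitch_deg: gaze angles from eye tracking (0, 0 = straight ahead).
    Returns the (x, y) offset in millimeters at which to draw the pupil.
    """
    x = eyeball_radius_mm * math.sin(math.radians(yaw_deg))
    y = eyeball_radius_mm * math.sin(math.radians(pitch_deg))
    return (x, y)

# Looking straight ahead keeps the pupils centered...
print(gaze_to_pupil_offset(0, 0))   # (0.0, 0.0)
# ...while looking 30 degrees to the right shifts them by ~6 mm.
print(gaze_to_pupil_offset(30, 0))
```

A real implementation would of course also handle eyelids, blinks, and the per-eye differences that, as described below, are exactly where artifacts creep in.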

When the technology worked well, the results were pretty good

It was all great, until it wasn’t. The effect of the eyes can be amazing when everything is perfectly aligned, but it becomes disturbing when there are problems. For instance, sometimes the pupils move too slowly, or one pupil moves in one direction and the other in a slightly different one. Or the eyes are not exactly aligned with the face. Or the eyes look slightly too big for the face. Or one eyelid appears half closed while the other is fully open. In these cases, the uncanny valley effect becomes strong, and what your brain perceives in front of you looks like a Frankenstein human, some sort of mutated being. It feels very unnatural at times.

I don’t look very good in this picture

I guess this is why Apple went for blurring things. If the screen is too sharp, you notice every little problem with the representation of the eyes, and things can become pretty disturbing.

The Steve Buscemi effect

Help, I’m trapped inside a headset!

Talking about disturbing things… GravityXR uploaded my profile to the headset and kept it there for all the people trying it at their booth! This means that when I arrived at the booth, I found a guy using the mixed reality HMD, and I could see my eyes and nose installed on his face! It was mildly disturbing; it was like someone had trapped my soul inside the device and was putting it on everyone’s faces…

This guy doesn’t look so good with my eyes on…

Chinese people looked very weird with my eyes. It reminded me of all those photomontages that you find online where people put Steve Buscemi’s eyes on the faces of random people!

I know, when you opened this article you didn’t expect to see an image with the Spice Girls having Steve Buscemi’s eyes, but here we are…(Image by chickswithstevebuscemeyes)

Final impressions

It was very cool trying a fully working reverse passthrough solution at GravityXR, which confirmed itself as one of the most interesting companies at this year’s VR AR Expo. The test also made me understand both the good things and the difficulties of providing such a feature. When it works, it is definitely wonderful and lets you see the faces of your friends in VR more realistically. But when there are problems, things become uncanny or even disturbing very fast.

I still think this technology has a lot of potential, because we want to look people in the eyes, and with it, MR can be less isolating. But at the same time, I think it is probably still a bit experimental. It probably shouldn’t be installed by default on headsets, but should rather be an accessory that the more tech-savvy people may decide to install on their devices.

Skarredghost: AR/VR developer, startupper, zombie killer. Sometimes I pretend I can blog, but actually I've no idea what I'm doing. I tried to change the world with my startup Immotionar, offering super-awesome full-body virtual reality, but now the dream is over. But I'm not giving up: I've started an AR/VR agency called New Technology Walkers, with which I can help you realize your XR dreams through our consultancies (Contact us if you need a project done!)