Qualcomm releases XR2 5G reference design, paving the way for the headsets of the future
After announcing the impressive XR2 chipset in December, today Qualcomm has released the related XR2 reference design, upon which most of the upcoming standalone XR headsets will be based. This is very important news for the future of AR and VR, so let me explain everything about it.
Qualcomm XR2 reference design
A few months ago, Qualcomm announced its XR2 chipset, which should be the beating heart of all the upcoming AR and VR headsets based on Qualcomm technology. The news went viral across the XR magazines because the new chipset was so much more powerful than its predecessors and also enabled features able to make AR and VR more realistic (more on this soon). However, that announcement regarded only the chipset and not a reference design, so it was a good teaser, but the technology could not yet be applied to real headsets. Today, instead, the XR2 reference design has been shown, together with some examples of possible AR and VR glasses based on it. This paves the way for manufacturers to create the standalone headsets of the upcoming years.
Chipsets and reference designs
If you’re new to this world, you may be confused by my use of the words “chipset” and “reference design”. Don’t worry, I’ll explain them in a very easy way.
The chipset is the central unit of the headset: it is like its brain, the hardware that computes the images you see and provides hardware acceleration for video encoding, computer vision, AI, etc… The more powerful the chipset, the bigger the brain of your headset, which can therefore have more features and offer more realism.
The reference design is a set of guidelines that Qualcomm gives to hardware manufacturers on how to create a new headset that uses one of its chipsets. So, for instance, Qualcomm says that you can pair that chipset with that model of display, that model of cameras, etc… and you’re guaranteed that the resulting headset will work well. Qualcomm also specifies, for example, how to design the device so that the heat produced by the chipset is handled properly. And so on.
As you can imagine, this speeds up a lot the design of a new headset, since companies have half of the work already done and just have to focus on choosing the features they want to implement in the device, selecting the hardware suppliers, and giving the headset the look they prefer. Then the hard part is implementing the firmware and OS magic.
Qualcomm chipsets and reference designs are extremely important in the XR industry. Just think of the fact that Qualcomm powers almost all the most famous AR and VR headsets on the market, like the Oculus Quest, Vive Focus Plus, and HoloLens 2. One big exception is Magic Leap, which has gone its own route.
Reference design features
The new reference design announced today by Qualcomm is far superior to the headsets currently available on the market. Quoting the American company, “the [XR2] reference design has 2x the CPU and GPU performance, 4x more video bandwidth, 6x higher resolution and 11x AI improvement compared to our current widely adopted XR platform”. And by “widely adopted XR platform”, they mean the Oculus Quest. Twice the computational power and 6x the resolution of the Quest: this is enough to make us go crazy.
The XR2 reference design allows for 2K x 2K panels for each eye. I have tried the HP Reverb, which has 2K x 2K per eye, and I can assure you that that resolution is crazy: the screen door effect becomes only a sensation, a slight disturbance in the images you see. Having it on a standalone would be awesome, and I think that adding a diffuser like in the Valve Index could make the SDE disappear completely. Think about it.
And if this wasn’t enough, the Qualcomm XR2 enables the possibility of having many cameras onboard, with up to 7 used simultaneously. “The reference design supports up to seven cameras. It features two internal cameras, one for each eye to support eye tracking. It also includes four external cameras, two RGB cameras for MR experiences and two for head tracking which can also be used to generate accurate depth maps. This reference design allows partners to assemble different configurations with an additional camera for facial and lip tracking or a second monochrome camera for controller tracking.”
So, the classical inside-out tracking cameras can now also be used to generate depth maps, and this can have important applications for environment reconstruction, smarter safety boundary detection, AR with occlusion, and also the first examples of AR cloud experiences. Since the XR2 provides a lot of acceleration for AI algorithms, this could also give devs the power to understand the environment around the user via object recognition, enabling lots of new experiences.
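Just to give you an idea of how depth can come out of two tracking cameras, here is a minimal Python/OpenCV sketch of classic stereo matching. It is my own toy example: the camera parameters and image file names are made up for illustration and have nothing to do with the actual XR2 stack.

```python
# Minimal sketch: depth from two rectified tracking cameras (OpenCV).
# The camera parameters and file names below are placeholders, not real XR2 specs.
import cv2
import numpy as np

FOCAL_PX = 450.0      # focal length in pixels (assumed)
BASELINE_M = 0.10     # distance between the two cameras in meters (assumed)

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo: finds, for each pixel, how much it shifts
# between the two views (the disparity).
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to float

# Classic pinhole relation: depth = focal_length * baseline / disparity
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = FOCAL_PX * BASELINE_M / disparity[valid]

print("median scene depth: %.2f m" % np.median(depth_m[valid]))
```

On a real headset this kind of computation would of course run on the chipset's dedicated computer vision hardware, not on the CPU like in this sketch.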
And apart from those cameras, we have:
- Eye-tracking cameras, which can be used for foveated rendering, thanks to the support for this technology in the XR2 chipset via the embedded “Adreno foveation”. This will further boost the 2x computational power, and will be fundamental to drive the 2K per eye panels of the reference design (see the little foveation sketch after this list). Qualcomm has also partnered with Tobii, and its Tobii Spotlight Technology may be able to offer foveated rendering inside these new headsets;
- Facial tracking cameras: I’m not sure many companies will embed them, since they’re not fundamental, but they could be great for improving realism in social XR experiences, as recent experiments on Neos have shown;
- Passthrough cameras. The Qualcomm reference design allows for high-resolution RGB passthrough cameras, making every VR headset an AR one as well. Passthrough AR has a lot of potential (e.g. in prototyping and simulation), and many enterprise vendors are experimenting with it (see the Varjo XR-1 or Cosmos XR), especially because it can provide brighter colors and a bigger FOV (110° vs 52°) than see-through AR. As the developer of a fitness game employing it, I can’t help but be super happy about that.
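Here is the little foveation sketch I promised above: a toy Python example of the core idea, that is shading the screen tiles far from the gaze point at a lower rate. It is only my own illustration of the concept; it is not the actual Adreno foveation feature or the Tobii API.

```python
# Toy illustration of foveated rendering: render at full detail only the
# screen tiles near the gaze point reported by the eye tracker.
import numpy as np

def shading_rate_map(width_tiles, height_tiles, gaze_uv,
                     inner_radius=0.15, outer_radius=0.35):
    """Return a per-tile shading rate: 1.0 = full resolution, lower = coarser.
    gaze_uv is the gaze point in normalized [0, 1] screen coordinates."""
    ys, xs = np.mgrid[0:height_tiles, 0:width_tiles]
    u = (xs + 0.5) / width_tiles
    v = (ys + 0.5) / height_tiles
    dist = np.hypot(u - gaze_uv[0], v - gaze_uv[1])  # distance of each tile from the gaze

    rates = np.full((height_tiles, width_tiles), 0.25)  # periphery: 1/4 resolution
    rates[dist < outer_radius] = 0.5                    # mid ring: 1/2 resolution
    rates[dist < inner_radius] = 1.0                    # fovea: full resolution
    return rates

# Example: user looking slightly to the left of the screen center
rates = shading_rate_map(16, 16, gaze_uv=(0.4, 0.5))
print("approx. fraction of shading work saved: %.0f%%" % ((1.0 - rates.mean()) * 100))
```

The whole point is that the GPU spends its power only where the eye can actually appreciate detail, which is what makes 2K x 2K per eye feasible on a mobile chipset.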
Talking about controllers, there’s not much to say: the reference design allows for more or less the same technologies we have today, letting manufacturers choose the best solution for them, be it IR tracking, electromagnetic tracking, or other similar approaches.
“In addition to components, the reference design includes an IR emitter for hand tracking and head tracking with simultaneous localization and mapping (SLAM) to coexist. […] The reference design has embedded technology partners including the Atraxa electromagnetic tracking technology from Northern Digital Inc. (NDI) that enables accurate, low latency 6DoF controller and peripheral device tracking without line-of-sight restrictions.”
It seems that Qualcomm is trying to encourage vendors not to use cameras for controller tracking, but technologies that allow for reliable tracking in all conditions, even when the controllers are occluded by the body of the user. This could free the processor from having to analyze the input of 4-6 cameras in search of the controllers, but may introduce interference problems. I’m curious to see if Oculus will really want to abandon its amazing Insight tracking in favor of this kind of tech.
Thanks to its AI superpowers, the reference design has also been built to allow for voice commands. One of the possibilities the chipset offers is an always-on microphone, so that the user can interact with the HMD at any time just by using vocal commands, even in noisy environments. The chipset also offers the possibility of contextual awareness, which is described this way: “Contextual awareness provides an added layer of confidence and safety by letting users focus on their XR experience while the device monitors their real environment by distinguishing noises from the user’s background that may require attention, such as a crying baby or ringing doorbell, and will prompt the user accordingly”.
I’m not a huge fan of voice commands, honestly: I think they’re less usable than just selecting a menu item, and they also make you look like an idiot if you use them in rooms where you are not alone. But the power of voice understanding may lie in implementing virtual assistants (see the success that tools like Alexa are having), in triggering complex tasks (that would otherwise require you to select many items), or in increasing accessibility for people with limited hand mobility. So it is welcome news.
Contextual awareness, instead, is interesting because, together with passthrough vision, it can limit the isolation of the user while he/she is inside VR. A headset that can understand when it is better to take it off because something needs our attention will act like an “audio chaperone” that helps us not miss important things because of VR. When such an alert arrives, I guess the user could turn on the passthrough and verify in an instant whether something really needs his/her attention.
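To make the concept a bit more concrete, here is a toy Python sketch of an always-on audio monitor. It is only my own illustration: a simple loudness threshold stands in for the real sound classifier (doorbell, crying baby, etc.) that would run on the chipset's AI engine.

```python
# Toy illustration of "contextual awareness": an always-on audio loop that
# flags loud events in the background while the user is in VR.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 16000
BLOCK_SECONDS = 0.5
LOUDNESS_THRESHOLD = 0.1   # arbitrary RMS threshold for this toy example

def on_audio_block(indata, frames, time, status):
    rms = float(np.sqrt(np.mean(indata ** 2)))
    if rms > LOUDNESS_THRESHOLD:
        # In a real headset this would trigger a notification or the passthrough view.
        print("Something in the room may need your attention (RMS %.3f)" % rms)

with sd.InputStream(channels=1, samplerate=SAMPLE_RATE,
                    blocksize=int(SAMPLE_RATE * BLOCK_SECONDS),
                    callback=on_audio_block):
    print("Monitoring the environment... press Ctrl+C to stop.")
    sd.sleep(10_000)   # monitor for 10 seconds
```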
New headsets based on this Qualcomm reference design will also have fast hardware video decoding, so users will be able to enjoy 8K 360° video at 60 FPS! VR movies will have a realism never seen before. Also porn, of course, I know you were thinking about it 😀
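To understand why hardware decoding matters so much here, consider some quick back-of-the-envelope math (assuming a typical 7680 x 3840 equirectangular frame for “8K 360°”, which is my assumption, not a Qualcomm spec):

```python
# Why 8K 360° video at 60 FPS needs hardware decoding: raw pixel throughput.
width, height, fps = 7680, 3840, 60   # assumed 8K equirectangular layout

pixels_per_second = width * height * fps
print(f"{pixels_per_second / 1e9:.2f} billion pixels decoded per second")
# ~1.77 billion pixels/s, i.e. roughly 14x the pixel rate of plain 1080p60 video:
print(f"{pixels_per_second / (1920 * 1080 * 60):.1f}x the pixel rate of 1080p60")
```

No mobile CPU can push that kind of pixel rate in software, which is exactly why a dedicated decoder block in the chipset is the enabler.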
The icing on the cake is 5G. Finally, a standalone headset may have a 5G connection and not only Wi-Fi. The reference design includes the X55 5G Modem-RF System, which provides native support for both 5G mmWave and sub-6 GHz connections.
To demonstrate that the XR2 reference design may actually provide “Boundless XR over 5G”, Qualcomm has partnered with NVIDIA and ZeroLight to create the first “Boundless XR over 5G retail experience”. It works as follows: there is a car configurator made by ZeroLight, which lets you create the car of your dreams before buying it, running on a new Qualcomm XR2 headset. The device is connected via a 5G network to the NVIDIA CloudXR solution. NVIDIA CloudXR provides edge servers always close to the users, so that it can render XR content fast thanks to NVIDIA graphics cards and stream it with very little latency. CloudXR renders the whole configurator experience (with the supreme graphical quality offered by NVIDIA cards) and streams it via the 5G network to the headset, which shows it to the user. Notice that before showing the frame to the user, it actually has to be re-projected to his/her current head position. This is because rendering plus streaming introduces a little latency, so when the frame arrives at the headset, the head position has changed with respect to when the frame rendering was requested.
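To visualize what this re-projection means, here is a minimal sketch of the idea. It is my own toy example that handles only yaw rotation as a horizontal image shift; it is not NVIDIA CloudXR or Qualcomm code.

```python
# Minimal sketch of the re-projection ("timewarp") idea: the frame was rendered
# for the head pose at request time, so before display we shift it to compensate
# for how much the head rotated while rendering + streaming took place.
# This toy version handles only yaw, approximated as a horizontal pixel shift.
import numpy as np

def reproject_yaw(frame, yaw_at_render_deg, yaw_now_deg, horizontal_fov_deg=110.0):
    """Shift the rendered frame horizontally to match the current head yaw."""
    height, width = frame.shape[:2]
    pixels_per_degree = width / horizontal_fov_deg
    yaw_delta = yaw_now_deg - yaw_at_render_deg          # how much the head turned meanwhile
    shift_px = int(round(yaw_delta * pixels_per_degree)) # corresponding image shift
    # np.roll wraps around; a real compositor would instead reveal black/vignetted edges.
    return np.roll(frame, -shift_px, axis=1)

# Example: a 2K x 2K eye buffer, head turned 2 degrees to the right during streaming.
eye_buffer = np.zeros((2048, 2048, 3), dtype=np.uint8)
corrected = reproject_yaw(eye_buffer, yaw_at_render_deg=0.0, yaw_now_deg=2.0)
```

Real systems re-project the full 3D rotation (and sometimes translation, using depth), but the principle is the same: the last correction happens on the headset, right before the frame hits the display.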
This way, a standalone headset can show high-quality graphical content that today only PC VR can offer. And the user doesn’t even notice all of this happening: he/she just thinks that the experience is running on the device. Of course, this is only a showcase, since 5G is not widespread yet, but it is a very welcome one. We all hope for a future where VR headsets are very light and get their content streamed from the cloud. CNET estimates that 1 billion people will have access to 5G by 2023, so hopefully in a few years this can become a reality.
The reference design just released by Qualcomm looks like a horrible gray shoebox, which is why the American company has collaborated with the Chinese manufacturer Goertek to create some viable prototype designs of how AR and VR headsets implementing this technology may look. You can see them here below. The AR headset looks a bit like a Chinese HoloLens, while the VR headset is cool because it resembles a pair of big ski goggles.
Honestly, I expected the VR one to be a bit smaller, because after trying the Huawei glasses, I understood that we need headsets that are light and trendy… but I also understand that the Huawei can be that sexy because it is not an all-in-one headset. I also think that the VR reference created by Goertek is wrong on so many levels from an ergonomic point of view (I expect a comment on this from the expert Rob Cole), because it replicates the same big error of the Oculus Quest, putting all the weight on the front side and making it completely unbalanced. But these are just concepts, and I’m sure that real manufacturers will do a better job (we’ve already seen Pico putting the battery on the back of the Neo 2, for instance).
Keep in mind that, as I stated above, a reference design offers various possibilities, but then it’s the job of whoever implements it to decide what to actually integrate into the device. For instance, the 5G modem is not mandatory, and some models may ship with just Wi-Fi. Manufacturers must find a compromise between features and price, as always. XR2 headsets won’t all be the same.
Price is an important topic: a headset implementing all the features described above (7 cameras, eye tracking, electromagnetic controller tracking, 2K per eye panels, high-res RGB cameras, etc…) can hardly cost $400 like a Quest. In fact, the only headset we know of implementing the XR2 chipset, the Lynx R-1, costs $1,500! It remains to be seen whether this reference design can power consumer headsets, or whether in the beginning it will be devoted only to the prosumer and enterprise markets. I bet on the second option, with the consumer headsets of 2020 being powered by the Qualcomm 845 chipset (remember that the XR2 is a variant of the Snapdragon 865). Anyway, this is just speculation of mine… we’ll see.
And that’s it for this big announcement! As you have read, there are lots of upcoming innovations for standalone headsets, from foveated rendering to voice commands, not to mention 5G. I can’t wait to try a next-gen headset based on this reference design! And you? What do you think of this XR2 reference design? What are you most excited about? Let me know in the comments or by writing to me on my social media channels!
PS If you like this content, consider becoming a donor on Patreon or at least subscribing to my newsletter!
(Header image by Qualcomm)
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.