Qualcomm Snapdragon AR2 Gen 1 Platform and AR Glasses

Qualcomm announces the AR2 chip to pave the way toward the augmented reality glasses of the future

At the Snapdragon Summit, Qualcomm has just made an announcement that I think will be very important for the future of augmented reality. The company has unveiled the Snapdragon AR2 Gen 1 chip, which will power the next generation of slim augmented reality glasses.

Unfortunately, I haven't been able to go to Hawaii to enjoy the live announcement (and some drinks by the seaside), but I attended a preview of the presentation dedicated to the press, so I'm able to tell you everything I've learned about this new chip, and why I think it is so important. Follow me, I promise you'll get some juicy technical details, too!

What is the Snapdragon AR2 Gen 1

Launch video of the Snapdragon AR2 Gen 1

AR2 Gen 1 is the new chip that Qualcomm has designed specifically for augmented reality glasses. Hugo Swart underlined that building AR glasses poses unique challenges that are different from those of building a VR headset or a hybrid MR/VR headset (like the Quest Pro). The first of all is power: the glasses need to consume very little power, and also dissipate very little of it as heat. Then there is the form factor, which has to be very small if we want augmented reality glasses to look like standard sunglasses. On top of that, you have to add high-speed connectivity, quality of visuals, and so on.

For this reason, using the XR2 processor, which has been so successful in VR and MR headsets, is not possible. You need a dedicated unit, and that's why Qualcomm has worked on the Snapdragon AR2, which is a solution completely dedicated to AR.

There was an interesting question during the press conference about why we can't just use the AR2 for VR headsets, too, then. The answer was that the AR2 can only drive displays with lower resolutions than the XR2 can. So now you know why there is a need for two distinct lines of products: one for VR/MR headsets and the other one for AR glasses.

Why AR TWO Gen ONE?

Here you can see the difference in purpose between the XR line and the AR line. In the future, an AR1 chipset may be released (Image by Qualcomm)

The next question may be "why this complex name?". Well, Qualcomm has recently changed its naming convention. Let's start from the easiest part: it is a "Gen 1" because Qualcomm hopes (and we hope, too) that there will be future generations.

Regarding why it is "AR 2" if this is the first version: Qualcomm is trying to follow the same convention it uses with its XR chipsets. The XR2 (and its variants) is the chip that powers important devices like the Quest, Quest 2, and Quest Pro, while the XR1 powers less demanding headsets like, for instance, the Vive Flow. The AR2 is so meant to power premium-quality augmented reality glasses. Qualcomm may decide in the future to create an AR1 chip to power some kind of AR smartglasses: imagine something like the Ray-Ban Stories, but with some lightweight AR features. This is not certain yet, but it may happen, and that's why the AR line keeps the same numbering as the XR line.

The evolution of AR

I think that the fact that an important company like Qualcomm has created a fully dedicated line just for AR glasses is a great sign of its commitment to following the path that will lead us to AR glasses we want to wear all day. Hugo Swart showed during the presentation a very important slide in this sense:

The timeline of AR devices imagined by Qualcomm. It’s good to see that starting from next year we will see more lightweight AR glasses (Image by Qualcomm)

I think it summarizes very well where we were, where we are now, and where we could be. First of all, Qualcomm is very clear that right now it is impossible to make AR glasses that are fully standalone: there is the need for external computational power. This may be local (a smartphone) or remote (the cloud). Of course, right now the most popular option is a smartphone connection. We all have in mind glasses connected to the phone, like the Nreal Light, which was the first to show a fashionable design for augmented reality. This has been the first step for mobile augmented reality.

Then we have seen a new reference design by Qualcomm that offered the same connection, but wireless. This was great because it finally removed the wire attached to your glasses, but honestly speaking, I have seen no successful fully-AR glasses implementing this solution. We had some smartglasses doing that, though: the Ray-Ban Stories, for instance, work wirelessly with a phone and run on a Qualcomm chip.

According to Qualcomm, the next 2-3 years will see a further evolution of this distributed processing paradigm. The glasses will start integrating more onboard processing power, while still staying slim. This will make them much more efficient and performant, even if they will still need a smartphone to work, especially to render the experiences. The AR2 chip fits exactly into this description: it gives new power to AR glasses, enabling performance that was impossible before.

After 2025, there will be new paradigms: probably we'll start entering the "5 to 10 years" timeframe where everything becomes possible.

Qualcomm Snapdragon AR2 Gen 1 3-chip design

Let's get into the real deal of this chip: it's not one chip, it is three chips. I could make some jokes about it being "one and three", but I will avoid it for today. The three chips are called AR Processor, AR Co-processor, and Connectivity.

The 3 parts that compose the Qualcomm Snapdragon AR2 Gen 1 system (Image by Qualcomm)

I was kinda confused about why there was the need to separate this chip into three sub-chips, and the confusion increased even more when I discovered that the three chips are scattered all around the glasses: two chips are in the two temples, and the third is at the center of the glasses, in the part that goes over your nose.

Anyway, as soon as the reasons behind this design were explained to me, everything made total sense. The reasons for having three chips are:

  • A single big chip would make the frames too big and bulky, while if you split it into multiple parts, it is easier to fit them inside a thinner frame
  • A single big chip would concentrate too much heat, and this would be bad for the circuits and bad for the wearer, who would get a BBQ on one side of his/her face
Yes, I’m stupid and I make stupid memes
  • Having multiple chips also reduces the wiring needed between the various parts of the glasses, and this makes them more efficient and thinner

So distributed processing power is the key. Now let's see all these three chips in detail.

AR Processor

The AR Processor is the main chip of the glasses, that is, their main brain. It is built on a 4nm process, and its main purpose is to understand the environment around the user and show the virtual elements in front of his/her eyes.

The internal architecture of the Qualcomm Snapdragon AR2 Gen 1 chip (Image by Qualcomm)

The AR Processor's main feature is tracking. It is the one that analyzes the camera data, finds the features, and reconstructs the pose (rotation + position) of the glasses in the world. We were shown a video comparing the feature detection capabilities of this chip with those of the previous one, and the improvement was impressive: the AR2 was able to detect many more features in the environment. "Features" are distinctive image points that inside-out tracking algorithms find in the camera images and use for tracking: the more features, the more precise and stable the algorithm is, so with this chip, tracking is much improved. Besides being more performant, it also consumed less memory and less power.
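
If you want to visualize what "finding features" means in practice, here is a minimal toy sketch in Python using OpenCV's ORB detector. It is obviously not the algorithm running on the AR2 (Qualcomm hasn't disclosed its tracking pipeline), just an illustration of the concept: detect distinctive points in two consecutive camera frames and match them, so that a pose solver can estimate how the device has moved between the frames.

```python
# Toy illustration of "features" in inside-out tracking, NOT Qualcomm's pipeline.
import cv2

def detect_and_match(prev_frame, curr_frame, max_features=500):
    """Detect image features in two consecutive camera frames and match them.

    The matched pairs are what a SLAM/VIO pose solver would use to estimate
    how the glasses moved between the two frames: the more good matches,
    the more precise and stable the tracking.
    """
    orb = cv2.ORB_create(nfeatures=max_features)

    gray_prev = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray_curr = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)

    kp_prev, des_prev = orb.detectAndCompute(gray_prev, None)
    kp_curr, des_curr = orb.detectAndCompute(gray_curr, None)

    # Brute-force matching of the binary descriptors between the two frames
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_prev, des_curr)
    return kp_prev, kp_curr, sorted(matches, key=lambda m: m.distance)
```

A real headset does this continuously (plus IMU fusion, map management, and relocalization) on dedicated low-power hardware, which is exactly the kind of work the AR2's tracking block is built to accelerate.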

Together with tracking, it is also good at doing everything related to environment understanding, like, for instance, object detection or hand tracking. It also supports capturing photos and videos from the device.

For all of these computations, the AR processor supports up to 9 cameras.

You may wonder why I'm not talking about rendering here. Well, because, as I've explained before, Qualcomm believes in distributing processing power: if we want the glasses to be lightweight, some functionalities should be offloaded to the smartphone. Rendering is the main feature that gets offloaded, so the glasses perform no rendering of the virtual elements whatsoever.

But they still have to show the virtual elements on the displays. And this operation is more complex than you may think. Considering that the glasses talk wirelessly with the phone, there is latency between the rendering request, the moment the frame is actually rendered, the moment it arrives back at the glasses, and the moment it is shown on screen. All these latencies must be taken into account when showing the image in front of the user's eyes, because in the meantime his/her head, of course, has moved. For this reason, the AR Processor features a high-performance and power-efficient XR reprojection engine, which is able to take the rendered frame and transform it so that it is shown the right way to the user.
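
To give an idea of the concept (not of Qualcomm's actual hardware implementation, which is not public), here is a rough Python sketch of rotation-only reprojection: the frame rendered for the old head pose is warped with a homography computed from the newest head pose. This is the general principle behind any XR reprojection / late-warping engine.

```python
# Rough sketch of rotation-only reprojection ("timewarp"), NOT the AR2's engine.
import numpy as np
import cv2

def reproject_frame(frame, K, R_render, R_display):
    """Warp a rendered frame from the render-time head rotation to the
    display-time head rotation.

    K         -- 3x3 intrinsics matrix of the virtual camera / display
    R_render  -- 3x3 world-to-camera rotation used when the frame was rendered
    R_display -- 3x3 world-to-camera rotation measured right before display
    """
    # Homography mapping render-time pixels to display-time pixels.
    # Valid only for pure rotation; compensating translation needs depth data.
    H = K @ R_display @ R_render.T @ np.linalg.inv(K)
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))
```

A real engine does this per eye, adds lens-distortion correction, and runs in dedicated hardware to keep power consumption low, but the math above is the core idea.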

AR2 Connectivity

The Connectivity chip (powered by Qualcomm® FastConnect™ 7800) is in charge of connecting the glasses to the network, in particular to the Wi-Fi link that connects them to the smartphone. This chip works with Wi-Fi 7 (which I didn't even know existed) to reduce the connection latency down to 2ms. This is impressive. Low latency is fundamental for distributed rendering, and this chip helps a lot in achieving it.

From this render we can imagine how AR2-based glasses can fit on the head of a person. They don’t look like standard glasses yet, but we are close (Image by Qualcomm)

AR2 Co-processor

Ok, we have the brain and we have the connectivity unit, so why the heck do we need a third processor? Well, it turns out that it is, again, fundamental to making the glasses more lightweight and thermally efficient.

The AR Co-processor sits in the middle of the glasses, so it can stay quite close to the eye-tracking sensors and also to some cameras. All these sensors can be connected directly to it, so it can pre-process some of the information and then send just the results to the AR Processor. For instance, the Co-processor has the responsibility of performing eye tracking, foveated rendering, and iris authentication, so it can connect directly to the eye-tracking cameras, which are close to it, calculate the eye rotations, and then send just this small piece of data to the Processor.
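
To get a feeling for why this matters, here is some back-of-the-envelope arithmetic. The numbers are hypothetical (Qualcomm hasn't shared the specs of the eye cameras), but they show the order of magnitude: streaming raw eye images to the main Processor would require megabytes per second, while sending a pre-computed gaze direction requires just a few kilobytes per second.

```python
# Back-of-the-envelope comparison with made-up (but plausible) numbers:
# raw eye camera streams vs. pre-computed gaze vectors sent to the Processor.
eye_cam_pixels = 320 * 240       # hypothetical eye camera resolution
bytes_per_pixel = 1              # 8-bit grayscale
frame_rate = 90                  # Hz, hypothetical eye-tracking rate
num_eye_cameras = 2

raw_bytes_per_s = eye_cam_pixels * bytes_per_pixel * frame_rate * num_eye_cameras
gaze_bytes_per_s = num_eye_cameras * (3 * 4) * frame_rate  # one 3-float vector per eye

print(f"Raw eye camera data:    {raw_bytes_per_s / 1e6:.1f} MB/s")   # ~13.8 MB/s
print(f"Pre-computed gaze data: {gaze_bytes_per_s / 1e3:.2f} KB/s")  # ~2.16 KB/s
```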

Let me show you the wiring with and without this co-processing unit. This would be the wiring of the glasses without the Co-processor:

Notice how much wiring is needed in the glasses to connect the sensors to the chip if there is only one chip (Image by Qualcomm)

As you can see, there are a lot of wires, and some of them also have to transmit lots of data (like the ones connected to the cameras). This means more problems with thermals and especially with the size of the frames. One particular problem that Hugo mentioned is that one of the most difficult parts to design when making AR glasses is the hinge: if you want to make a foldable hinge, like the one of standard glasses, you must have very few wires going through it. The Co-processor helps with that: many sensors connect directly to it, and then from it, just a few wires go to the main Processor. This means fewer wires going through the hinges. You can see that in this other picture:

Notice how the wiring has been impressively reduced thanks to the coprocessor (Image by Qualcomm)

The Co-processor also allows the main Processor to be smaller, so it can fit better inside the frames.

Results

The main features of the new Qualcomm Snapdragon AR2 Gen 1 (Image by Qualcomm)

Qualcomm claims that this new AR2 chip consumes less than 1W, which is an impressive result: it is 50% less power than what the XR2 uses. It is also quite powerful: since AI is very important for AR (which relies a lot on environment understanding and sensor tracking), this chip has been supercharged for AI and offers 2.5x better AI performance than the XR2.

Regarding dimensions, this new design is 40% smaller than the previous AR viewer reference design based on the XR2.

A single chip would make the glasses too bulky (Image by Qualcomm)
Thanks to the use of a distributed architecture, the glasses' frames can be thinner (Image by Qualcomm)

Ecosystem

Qualcomm is working with selected OEMs to build AR glasses based on this chip. The names are the usual suspects: Lenovo, LG, Niantic, Nreal, Oppo, Pico, Qonoq, Rokid, Sharp, TCL, Vuzix, and Xiaomi. The first glasses are in the works, but we will most probably have to wait until the second half of 2023 to see one of them released to the market. The name "Pico" caught my attention: this means that ByteDance/Pico is working on augmented reality glasses, too, and it plans to compete with Meta on that front as well.

The Qualcomm Snapdragon AR2 is also compatible with Snapdragon Spaces, so it fits completely into the hardware+software ecosystem that Qualcomm has been building in these years.

Hugo Swart also mentioned that new interesting partnerships with Adobe, Microsoft, and Niantic are taking shape, but he has not unveiled what they are about. Niantic is the easiest to guess, because we already know it is working on some sort of XR glasses reference design. The other two are more puzzling: Adobe is not a hardware company, so maybe it wants to bring its software suite to AR glasses. And Microsoft may be working with Qualcomm on its future AR glasses now that HoloLens is dead. But this is just my speculation.

Microsoft has released an official statement about this collaboration, but it is one of those statements that mean nothing:

Microsoft worked closely with Qualcomm on the platform requirements for Snapdragon AR2 to help define the purpose-built, foundational technologies to unlock new possibilities in AR experiences […] Snapdragon AR2 platform innovations will revolutionize headworn AR devices that will transform immersive productivity and collaboration and we look forward to seeing the innovation that Qualcomm and its partners will bring to market.

Rubén Caballero, Corporate Vice President of Mixed Reality, Devices & Technology, Microsoft

Final considerations

A picture of the chip (Image by Qualcomm)

I am pretty excited about this announcement because it shows me that there is a path that will slowly lead us to lightweight AR glasses that we can wear all day. We are not there yet, but we are getting there, and it seems that there is a roadmap.

Also, the three-chip design is quite original, and it's good that there are creative solutions in place to try to reach the goal we all have. I know, these are not the first glasses that use multiple chips: HoloLens, for instance, also had a separate unit to perform positional tracking. But it is the first time I see this on a Qualcomm reference design upon which many other glasses will be based.

I can't wait to see glasses based on this reference design, and especially to try them. I always repeat it on my blog: headsets have to be evaluated on the head, not on paper, and my recent experience with the Quest Pro confirmed this even more (yes, a review of it is coming in a few weeks). So I would really love to try an AR2-based device to see how good it is and evaluate the improvements over the current designs (like the Nreal Light).

The premises are anyway very good: the removal of the tether and the improved tracking performance are my favorite improvements. I am also curious about the FOV and resolution of these glasses, which, being vendor-dependent, have not been revealed by Qualcomm. These, together with the price, will also be important in determining the future success of these AR glasses. I guess we'll all discover more about this in 2023.

For now, I just enjoy this important announcement, hoping it will be a stepping stone toward having widespread AR glasses!

(Header image by Qualcomm)


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
