
Snap Spectacles 5 hands-on: a nice devkit towards a brighter future

At AWE Europe, I was able to try the latest iteration of Snap Spectacles, the fifth one. I came out of this test with mostly positive opinions, but also a few negative ones. Let me tell you what these glasses are like in my usual very-detailed hands-on description (which will include a few jokes, as usual).

(Notice: this article is a “hands-on impressions” one because I had just something like 15 minutes to try the device, which is not enough to write an exhaustive review like the one I wrote about the Quest 3S. Take whatever I’m writing with a grain of salt, because first impressions of a device may turn out to be wrong after just a single short use)

Snap Spectacles 5

Two people wearing Snap Spectacles (Image by Snap)

If there is a bit of renewed hype about augmented reality, it is because, within a few days of each other, both Snap and Meta showed their new AR glasses on stage: while Meta showed the Orion prototype, Snap showed the fifth edition of its Spectacles glasses. Spectacles are a product already on the market, and they have evolved from a device that could just shoot photos and videos into augmented reality glasses that creators can use to create AR lenses for Snapchat.

The second edition of Snap Spectacles, which was mostly made for taking photos and short videos and publishing them on Snapchat. The device has evolved a lot since then (Image by Snap)

These glasses have been launched as pure devkits that interested creators can rent for $99/month (€110 if you are in the EU), with a minimum commitment of 12 months. After creators stop paying, the glasses have to be returned to Snap. The very recent news, announced by Snap at AWE EU, is that Spectacles are also coming to Europe: to be exact, to Germany, France, Spain, Italy (yay!), Austria, and the Netherlands.

According to a report by The Verge, these glasses are expensive and difficult to manufacture, so Snap plans to build only around 10,000 units. It’s another devkit projected into the future of augmented reality, but one that is not suitable for consumer AR today.

Specifications

Can I make a stupid joke about Snap glasses using a Snap-dragon chipset? (Image by Snap)

Some relevant specs taken directly from the Snap Spectacles website:

  • Display
    • Type: see-through
    • Technology: waveguides, with Liquid crystal on silicon (LCoS) miniature projectors
    • FOV: 46° diagonal
    • PPD: 37
  • Audio
    • Integrated stereo speakers
    • 6-microphone array for audio input
  • Input: voice, hand, mobile companion app
  • Chipset: 2x Snapdragon processors with distributed computing (model name undisclosed)
  • Battery duration: 45 min
  • Sensors:
    • 2x full-color, high-resolution cameras
    • 2x infrared computer-vision cameras
    • 6-axis IMUs for inertial sensing
  • Connectivity: WiFi 6, Bluetooth, GPS/GNSS
  • Weight: 226g

You can find the full specifications list on this page: https://www.spectacles.com/spectacles-24

The purchase dilemma

Concept image of some creators building for the Snap Spectacles glasses (Image by Snap)

While I was waiting for another media representative to finish his hands-on with the glasses, I had the opportunity to speak with a few nice people from Snap. They introduced the glasses to me, saying that they are a new device that can give creators new ways of expressing themselves.

They were all smiling until they asked me if I had questions. I immediately asked, “If I get the glasses, and I make some amazing lenses using Lens Studio, can I sell them in the store?” and they replied, “No, it is currently not possible”. At that point, I naturally asked, “I am an XR developer: why should I spend $100/month for at least 12 months, for a total of around $1,200, if I have no way to earn even a dollar from it?”. Silence. It was the first time Snap invited me for a demo, and after this question, I realized it was probably also the last.

After a couple of seconds of embarrassment, one of the guys told me, “There are other, indirect ways to earn money with these glasses”. When he said that, my brain started spinning, thinking about what it could mean. The only thing that came to my mind that resembled this sentence a bit was what happens in those “Fake Taxi” adult videos, where the girl hopping into the taxi has no cash to pay for the ride, and the driver says, “There are other ways to pay for this taxi”. I started thinking that I probably had to have sex with the Snap people to earn money with the glasses, but then the guy went on with his reasoning and explained that, for instance, Snap has a fund for creators: it is possible for people who have an amazing idea for a lens to get money to develop it. Furthermore, the glasses can be used by creators to do cool stuff, share it on social media, and get enough visibility that brands pay them to develop Snap lenses. And anyway, getting this devkit lets creators gain experience with this development environment, so that in the future, when there is a commercial version, they will have an unfair advantage over the other creators who are instead using it for the first time.

Intrigued by the last point, I asked when the first commercial Spectacles edition could be released. The answer was that no one knows: Snap is iterating on its product until it has in its hands glasses that are good and affordable enough to be released on the market for everyone to enjoy.

Have I been convinced by this explanation of the business value of getting Spectacles?

(GIF via GIPHY)

I agree that there may be some value for the people who can exploit these “indirect ways of making money”, but if you are outside of this circle, it is only a cost. If it were sure that the 6th edition of the device would be the consumer one, then spending $1,200-1,800 to get an advantage over the competition might make sense. But with a consumer release a few years away, the value is less clear. I would suggest getting the glasses if you have a budget to invest in pure R&D, if you have a close relationship with Snap through which you can get some funds, or if you are a popular Snap creator who can use them to enrich your personal brand. But if you are an XR creative team on a shoestring and you don’t come from the Snap ecosystem (that is, me), these glasses are not economically a fit for you.

After these preliminary considerations, I was given the device, and the demo session could start. Let me tell you my first impressions of these innovative glasses!

Hands-on Snap Spectacles 5

Comfort & Design

I pretty much like the design of the Spectacles. To be honest, I’ve liked the design of almost all the editions of the Snap Spectacles up to now. Spectacles are usually pretty stylish, and this fifth edition is no exception. Even if they are pretty bulky, you can see from the pictures above that they look pretty good on people. I’ve also shot some pictures of the glasses from all sides, so you can better see what they look like:

Front view
Left view
Right view
Top view
Bottom view

Their problem is that they are indeed quite bulky. These are fully standalone AR glasses, and it is pretty remarkable that Snap managed to cram all that technology into stylish frames. But still, they are big and heavy. 226g is not a lot per se, but when I was handed the Spectacles, they felt much heavier than I expected. Regular sunglasses weigh around 30-40g, so these glasses are 6-7 times heavier than standard ones. The frames are also pretty tall and thick, and they made the top of my ears bend while I was wearing them. After 10 minutes of usage, the top of my nose was also hurting a bit.

As I will repeat many times during this review, I think that Snap did as much as it could, but this is the current state of technology: it is not yet possible to make the lightweight AR glasses of our dreams.

Visuals

Looking through the fifth iteration of Snap Spectacles is not much different from looking through other AR headsets like the XREAL Air 2 Ultra, HoloLens 2, or Magic Leap 2: like all the other see-through devices, it shows you virtual elements that are slightly transparent, but with a quality that is good enough to make them usable for AR purposes. I would say the glasses probably have more in common with HoloLens 2 than with Magic Leap 2 because of the constrained FOV and the visual artifacts.

The FOV has a square-ish proportion and feels very constrained while you are using the device. 46° is not much, and Snap is not even using the trick of making the FOV end at the frame of the glasses, so your view window really looks cut out. You start noticing the small FOV especially when you run lenses like the one that lets you draw around you: it’s very hard to free-draw when you can only see the stuff that is straight in front of you.

The FOV of the glasses, as recorded by Evan Spiegel while on the stage (Image by Snap)

As for the quality of the visuals, I would say that it is nice, but there are many visual artifacts due to the waveguides. I don’t know what to properly call them, so I will just define them as “rainbows” until Karl Guttag replies to this post and tells me their technical name. When I put on the glasses, I started seeing rainbows appearing in the lower-left part of my vision. Besides this big rainbow, sometimes when I rotated my head, the colors of the virtual elements separated into rainbows until I stopped the rotation. I jokingly commented that I hadn’t seen so many rainbows in front of me since the last time I was at a gay pride.

Apart from these inclusive artifacts, the screen also seemed to suffer from some persistence issues. When I rotated my head quickly, the visuals started to blur until I stopped again. I don’t know if this is a problem with the waveguides or with the reprojection, but I would bet on the latter.

One cool feature of Spectacles is that you can use the internal menu to set the dimming of the lenses: they can become slightly darker for outdoor usage or more transparent for indoor usage. It’s interesting that Snap is thinking about outdoor use, while most of the competitors are only working on indoor scenarios. I tried the feature, and seeing the lenses change their opacity following a menu choice of mine was pretty rad.

As you can see, the visuals are affected by the same problems that all the other AR glasses have.

The lenses of the glasses

Audio

I’m not an audio expert, so I can’t tell you all the details about the audio playback. But as a standard user, I can tell you that the integrated speakers of the device did their job the whole time.

Tracking & Input

Spectacles are 6DOF glasses. I can’t judge the quality of the positional tracking very well because, as you will see in the next paragraph, the glasses often stuttered because of the low computational power available. For this reason, I can’t say if the stuttering was due to tracking issues or computational issues, but I would bet on the latter. The only thing I can say is that I noticed no drift when moving or rotating my head, so the tracking seems quite good.

The headset has no controllers and mostly works through hand tracking. Hand tracking doesn’t look like the best around in terms of speed, accuracy, and tracking FOV, but it is good enough that you can use your hands to interact with the UI and do free-form drawing.

In some applications, the glasses can use the Snapchat companion app to transform your phone into a gamepad for the little games running on the glasses. This is pretty cool: I tried it, and it works fairly well.

Using your phone as your controller (Image by Snap)
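Snap hasn’t detailed how the companion app talks to the glasses, but the general pattern behind any phone-as-gamepad setup is simple: the phone captures touch (and possibly IMU) events and streams them to the headset over a local connection. Here is a minimal, purely hypothetical TypeScript sketch of the phone side, with an invented endpoint and message format, just to illustrate the idea and not Snap’s actual protocol:

```typescript
// Hypothetical sketch of the phone side of a phone-as-gamepad setup:
// capture touch input and stream it to the glasses over a local socket.
// The endpoint and message format are invented for illustration.
const socket = new WebSocket("ws://glasses.local:8080/input");

// Stream normalized touch coordinates as a virtual joystick.
window.addEventListener("touchmove", (e: TouchEvent) => {
  const t = e.touches[0];
  const msg = {
    type: "stick",
    x: (t.clientX / window.innerWidth) * 2 - 1,  // -1..1 left/right
    y: (t.clientY / window.innerHeight) * 2 - 1, // -1..1 up/down
  };
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(msg));
  }
});

// A tap acts as a button press.
window.addEventListener("touchstart", () => {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify({ type: "button", id: "A", pressed: true }));
  }
});
```

The nice thing about this pattern is that the glasses only have to parse a tiny stream of input messages, which costs almost nothing on their constrained chipset.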

Computational power

These glasses feature an undisclosed chipset by Qualcomm, which from the description looks like the Snapdragon AR2, or an evolution of it. The AR2 is a chip made specifically for AR glasses: it is small, lightweight, and engineered for thermal dissipation. Of course, since it is a chip that has to work in very constrained settings, it has its limitations: it can’t have the same power as the Snapdragon XR2 Gen 2 of a Quest 3, which is a much bigger device.

This was clearly visible to me while running the experiences: the performance of the lenses I tried was rarely completely smooth. When we tried the one about drawing together, after the three of us in the room drew a few lines, the experience started looking a bit choppy to me. If you create for these glasses, you have to create simple experiences and over-optimize them.
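To give an idea of what “over-optimizing” means in practice, here is a small generic TypeScript sketch (not tied to any Snap API) of object pooling, a classic technique for constrained chipsets: instead of allocating a new object for every stroke you draw and letting the garbage collector clean up, you recycle a fixed set of pre-allocated objects, avoiding allocation spikes and stutter. The `Stroke` type is a hypothetical example of mine:

```typescript
// Minimal generic object pool: pre-allocates instances and recycles them,
// avoiding per-frame allocations and garbage-collection hiccups.
class ObjectPool<T> {
  private free: T[] = [];

  constructor(
    private create: () => T,
    private reset: (item: T) => void,
    initialSize: number,
  ) {
    for (let i = 0; i < initialSize; i++) this.free.push(create());
  }

  acquire(): T {
    // Reuse a free instance if available, otherwise allocate a new one.
    return this.free.pop() ?? this.create();
  }

  release(item: T): void {
    this.reset(item); // Clear state so the next user starts clean.
    this.free.push(item);
  }
}

// Hypothetical stroke object for a drawing lens.
interface Stroke { points: number[]; }

const strokePool = new ObjectPool<Stroke>(
  () => ({ points: [] }),
  (s) => { s.points.length = 0; },
  64, // pre-allocate enough strokes for a typical session
);

const stroke = strokePool.acquire();
stroke.points.push(0, 1, 2);
strokePool.release(stroke); // recycled instead of garbage-collected
```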

Again, this is the current state of the technology, and I was not expecting anything different.

Battery

The battery of the demo devices was put under heavy strain during the event. Since I had only 15 minutes with an (over-used) demo unit, I can’t tell you much about the estimated 45-minute battery duration. The only thing I can tell you is that while I was playing, at a certain point the glasses told me they were running out of juice, and then they died a few seconds later. This confirmed to me that, in general, they don’t hold a lot of charge.

User Experience

The menu of Snap Spectacles, as seen from the live stream of the stage during the launch (Image by Snap)

Snap created a whole runtime specific to its glasses: it is called SnapOS, and the applications running on it are the “lenses”.

As soon as I put on the glasses, I went through a tutorial that guided me through the main hand interactions of the device. There was nothing surprising here: with your hands, you can pinch, pinch and drag, and also use two hands to zoom. I liked that the tutorial was simple but cute (it used some visually cute jellyfish) and effective.

The tutorial also taught me how to use the menu: on the back of your left hand, you can see a digital clock, and by tapping on it with the fingertip of your right hand, you can access the settings menu, through which you can configure some parameters like the dimming of the lenses, the brightness of the visuals, etc… On the palm of the hand, instead, there are some buttons to close the current lens (application), favorite it, or open the main menu. Tapping with one fingertip on the other hand’s palm is a nice way to have some haptic feedback with the virtual menu buttons, but for some reason I can’t exactly pinpoint (maybe the fact that hand tracking was not perfect), I had a little sensory mismatch while doing that, and I didn’t feel this haptic magic.

The main menu shows a list of lenses (i.e. applications) that you can run. There are various tabs that let you choose which experiences you want to see: for instance, you can select your favorite lenses, single-player lenses, multiplayer lenses, and so on.

All interactions with the menus happen via pointing and pinching with your hands. Far interactions are pretty good this way, but I was a bit confused by the fact that there are no near interactions: if I got close to a menu button, I couldn’t just tap on it, but still had to pinch it. I think this should be fixed in a future software update.

You can have a look at the interface in the video of the demo that Evan Spiegel himself gave during the launch of the glasses:

Content

There are already quite a few lenses that you can try on these glasses. I tried something like 4-5 experiences. One was about playing with a Peridot, giving it cuddles and food: it was pretty cute. Another one was about boxing, a bit à la FitXR, but it was quite hard to play because hand tracking is not very reactive on these glasses yet, so when I punched stuff, the tracking of my hands got a bit lost.

Then I tried a game about using a little helicopter to grab some virtual objects and throw them in a virtual bin. For this, I needed to use the phone as a gamepad, and I can tell you that it worked fairly well.

I tried the AI-powered “Imagine Together” that Spiegel showed in the video above. It was fun saying whatever I wanted and seeing a 3D object generated by AI appear in front of me. The only problem is that at a certain point, I said “A potato wearing Spectacles glasses”, and the AI created a 3D potato wearing something similar to the Ray-Ban Meta. I suspect that someone in the Snap AI department is actually playing for Meta’s side…

The last relevant lens that I tried was a multiplayer one. A Snap guy in the room launched a shared drawing session; then I found the session to join in the main menu, and the system asked me to move around and scan the room. As soon as the OS found enough common reference points between me and the other guy, we found ourselves in the same shared drawing experience, and we all started drawing together. It worked very well: hand tracking was accurate enough for freeform drawing, and I could see where the other people were and what they were drawing. And it was fantastic that the shared setup happened so easily, with us just scanning the same room for a few seconds. In the end, we were 3 people drawing together… and when the drawings became too many, my headset started to stutter, so I closed the application.
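A quick aside on how this kind of colocation generally works, based on my understanding and not on any Snap documentation: once both headsets recognize the same physical reference points, each one knows the pose of a shared anchor in its own coordinate frame, and that is enough to convert content from one device’s frame to the other’s. A toy TypeScript sketch, using 2D poses for brevity:

```typescript
// A 2D rigid pose (position + heading), enough to show the colocation math.
interface Pose { x: number; y: number; theta: number; }

// Compose two poses: apply `b` expressed in the frame defined by `a`.
function compose(a: Pose, b: Pose): Pose {
  const cos = Math.cos(a.theta), sin = Math.sin(a.theta);
  return {
    x: a.x + cos * b.x - sin * b.y,
    y: a.y + sin * b.x + cos * b.y,
    theta: a.theta + b.theta,
  };
}

// Invert a pose: the transform that undoes it.
function invert(p: Pose): Pose {
  const cos = Math.cos(p.theta), sin = Math.sin(p.theta);
  return {
    x: -(cos * p.x + sin * p.y),
    y: -(-sin * p.x + cos * p.y),
    theta: -p.theta,
  };
}

// Each headset tracks the shared anchor in its own world frame
// (example values, just for illustration).
const anchorInA: Pose = { x: 2, y: 1, theta: Math.PI / 2 }; // seen by device A
const anchorInB: Pose = { x: -1, y: 3, theta: 0 };          // seen by device B

// Transform from A's world frame to B's world frame:
// T_B_A = T_B_anchor * inverse(T_A_anchor)
const aToB = compose(anchorInB, invert(anchorInA));

// A stroke drawn at some pose in A's frame can now be shown in B's frame:
const strokeInA: Pose = { x: 0.5, y: 0.2, theta: 0 };
const strokeInB = compose(aToB, strokeInA);
console.log(strokeInB);
```

This also explains why the setup only requires scanning the same room for a few seconds: all the system needs is one well-localized common anchor.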

People building with Lego together on their Snap Spectacles glasses (Image by Snap)

Since this is a devkit, there is not enough content to have fun for months, but there is already enough to keep you busy for a few hours. For the rest, you can develop lenses by yourself using Lens Studio. More complex lenses can be made using JavaScript or TypeScript.
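As a taste of what development looks like, here is a minimal sketch of a TypeScript component in the style of Lens Studio’s scripting API, based on my understanding of it (double-check Snap’s current documentation before relying on it, as names may differ): it just spins the object it is attached to.

```typescript
// Minimal Lens Studio-style TypeScript component (a sketch based on my
// understanding of the API; verify names against Snap's documentation).
@component
export class Spinner extends BaseScriptComponent {
  // Rotation speed in degrees per second, editable in the inspector.
  @input
  speed: number = 45.0;

  onAwake() {
    // Bind a callback that runs every frame.
    this.createEvent("UpdateEvent").bind(() => this.onUpdate());
  }

  onUpdate() {
    const transform = this.getSceneObject().getTransform();
    // Incremental rotation around the vertical axis for this frame.
    const radians = (this.speed * Math.PI / 180) * getDeltaTime();
    const step = quat.angleAxis(radians, vec3.up());
    transform.setLocalRotation(step.multiply(transform.getLocalRotation()));
  }
}
```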

The content dilemma

(Video by Snap)

Talking about content, I asked the Snap people why they force creators to use Lens Studio to build the applications for SnapOS. I mean, Lens Studio is nice and is used by many creatives, but constraining people to use it makes me think that these glasses are just an add-on to Snapchat. I think instead that Snap should aspire to create a great general AR ecosystem, like other companies such as Microsoft and Magic Leap have already tried to do.

I would never use Lens Studio, and especially JavaScript, to make a complex experience. I’m a Unity developer, and my team uses Unity every day to create amazing applications. Why should we use JavaScript? I think that if Snap wants to go on with the Spectacles project, at a certain point it should start detaching it from Snapchat, or it will just remain an add-on to a social media platform. A cute add-on where you can try some nice little experiences and nothing more. Furthermore, I think that especially if I pay $99/month, I should be given a way to use my favorite production tools and not… JavaScript. I’m not saying they should remove Lens Studio; I’m saying they should also add other options.

I told all of this to the people at Snap there. I guess they loved me: after my hard questions on monetization, my feedback on the ergonomics, and my criticism of their production tools, they told me, “Thanks for your honest feedback. We like direct questions”, which is PR jargon to kindly say

(GIF via GIPHY)

(By the way, I like to joke about it, but I really want to thank the Snap people for giving me the opportunity to try these glasses and for kindly listening to all my honest feedback)

Final considerations

Me trying the Spectacles glasses

Going hands-on with Snap Spectacles was one of my highlights of AWE EU. I think Snap managed to create an interesting gadget, which can only be sold as a devkit because it suffers from the problems that all current AR glasses have, and because it is too expensive to be sold to consumers.

This is a device that is projected into the future: it is a step towards the consumer edition of Spectacles, which will probably come in a few years and will fix all the problems I’ve described above. In the meantime, Snap will anyway have to think about what to do with this platform: does it want it to just be a cute Snapchat add-on that launches “lenses”, or does it want to join the AR arena with a more general-purpose device? And if it takes the second path, it should make sure that developers have a clear monetization path, so as to foster a rich ecosystem. As a developer, I’m here waiting: as usual, I’m open to all opportunities.


