
Holo-Light showed me that AR cloud rendering may already work

There is a lot of talk about cloud rendering, and at AWE I was able to try the solution proposed by German startup Holo-Light. It was interesting to see that it worked quite well.

Holo-Light

Holo-Light is a company providing cloud rendering solutions for XR. I have known the company for a while, and I finally met some of its representatives at AWE this year. They told me that they started out providing XR solutions some years ago and noticed how hard it is to optimize content for AR and VR headsets, especially standalone ones. Customers always want to see high-quality models, and working on optimization and polygon reduction often led to results that were not satisfying enough, either in terms of performance or visual quality. So the company looked for an alternative approach and started working on cloud rendering to offer high-end graphics on any headset: since all the rendering happens on cloud servers and gets streamed to the headset, which just acts as a display, it is possible to have high-end graphics even on standalone devices.

How remote streaming with ISAR SDK works (Image by Holo-Light)

Holo-Light released its SDK, called ISAR SDK, some months ago to let people develop applications that work through cloud rendering. Holo-Light's approach is different from that of other solutions, like NVIDIA CloudXR, where you develop a standard SteamVR application and then have a system that streams this SteamVR application from the cloud to the headset. Here it is the developer who adds a dedicated SDK for cloud rendering and builds the application from the beginning with cloud rendering in mind. The two approaches are different, and both have their pros and cons.

XRnow

Applications developed with the ISAR SDK required the developer to set up a server (on a commercial cloud like AWS, or on-premise) to perform the actual streaming. But at AWE, Holo-Light announced a new service, called XRnow, that basically offers already-configured cloud rendering machines as a service for your experiences developed with the ISAR SDK. This increases the usability of the solution a lot: if I developed an experience that needs cloud rendering, I would really have no desire to spend my time installing, configuring, and tuning a companion cloud rendering server. The possibility of paying for already set-up machines makes this solution much more appealing. XRnow works through servers on the AWS cloud, but companies that want a deployment on a proprietary cloud for security purposes can use it that way as well.

I was shown that XRnow also has a store that hosts cloud-rendered applications. This store can be accessed from all devices (AR headsets, VR headsets, mobile phones, etc…), and when an experience is launched, if the device is compatible with that experience, it gets executed; otherwise, you are shown a QR code that you can scan from a compatible device to launch the experience there. At the moment, XRnow is compatible with HoloLens 2, Oculus Quest 2, and Android 10 tablets.
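The launch flow just described is essentially a device-compatibility check with a QR-code fallback. Here is a minimal sketch of that logic; the device names, experience names, and functions below are my own illustration, not Holo-Light's actual API:

```python
# Hypothetical sketch of the XRnow store launch flow:
# run the experience if the requesting device is supported,
# otherwise show a QR code that a compatible device can scan.

SUPPORTED = {"engine-demo": {"hololens2", "quest2", "android10-tablet"}}

def launch(experience: str, device: str) -> str:
    """Decide whether to stream directly or fall back to a QR code."""
    if device in SUPPORTED.get(experience, set()):
        return f"streaming '{experience}' to {device}"
    # Scanning the QR code from a compatible device launches
    # the experience there instead.
    return f"showing QR code for '{experience}'"

print(launch("engine-demo", "hololens2"))  # streams directly
print(launch("engine-demo", "laptop"))     # falls back to the QR code
```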

What I like about this solution, which at the moment is aimed at the enterprise market, is that it offers a quite complete package for companies that need an application exploiting cloud rendering. You have the SDK, the store where to host the applications you build with it, and the servers that run the cloud rendering, all hassle-free. You just pay the price of the service, and everything is already set up for your company, without having to worry about any technical detail.

Holo-Light has also just announced that it has become the first AR/VR streaming provider to be named a Unity Verified Solutions Partner.

If you want to read more about this service, you can head to the announcement post on the Holo-Light website.

Hands-on XRnow

I was given a HoloLens 2 headset and used it to launch the XRnow store and run one of the experiences hosted there. In this experience, I could see three interactive models: a car, a big engine that could be exploded, and a big model of the Earth. The models were quite big and complex, and there was no way they could be visualized at a decent framerate on a limited device like HoloLens 2.

When the experience started and I saw the first part of it, with the big Earth and some satellites around it, I asked if that was already cloud rendered, and the answer was yes. It was quite cool because I couldn't tell that it was streamed from the cloud. I moved around a bit, got some close-ups, and everything looked more or less like a standard HoloLens 2 application. I just had a bit more stuttering than usual when moving around the model, but applications on HoloLens rarely look totally fluid anyway. It was interesting to see such a complicated model running on the standalone HoloLens: it would have been impossible without cloud rendering. The same goes for the other two parts of the experience: everything seemed more or less like an experience running locally on a HoloLens 2, and if I hadn't been in Holo-Light's private demo room, I would have thought it was just a normal non-optimized HoloLens app with some little framerate problems. Even the interactions responded pretty well: when I performed the gesture to explode the car engine, it happened immediately.

This video shows what kind of models you can have with cloud rendering on HoloLens 2. The stuttering in the video is much worse than what I experienced.

I had already tried NVIDIA CloudXR with an on-premise server, so I already knew how advanced remote rendering technologies are (and if you have tried Virtual Desktop, you most probably know that as well). But this time was special for me because it was the first time I experienced cloud rendering from a commercial cloud: the whole system was working through AWS servers, so not dedicated machines installed 5 m from me, but servers in an unknown location far away from me. This made everything more interesting because it showed me that cloud rendering is already commercially viable. I mean, I tried a cloud-rendered app from a commercial cloud and I didn't notice it was cloud rendered… this is huge!

There are anyway some equally huge caveats to consider before getting too enthusiastic:

  • I tested the system only for a few minutes, and this is not enough to form a complete opinion on this solution;
  • My tests were only on AR experiences, where streaming lags are less likely to cause noticeable side effects like motion sickness;
  • The applications I tested had few and simple interactions: playing Beat Saber in Expert+ mode, you would notice the problems of input lag much more;
  • I anyway had a bit more stuttering in the experience, and a very slight lag in some moments;
  • HoloLens applications rarely run very fluidly, so even if the streaming introduces a slight stutter, I consider it quite acceptable. A test on Oculus Quest 2, where most commercial applications run well at 90Hz, would have been more significant;
  • We were in the Bay Area, one of the most technologically advanced regions of the world. Amazon surely has servers in the region, so the streaming was happening from a close location. I wonder if I would get the same results launching the same application from my city here in Italy, and things would probably be even worse in the countryside. The problem of cloud rendering is not making the system work: it is that you must have a server close to you to get acceptable latency, and this is not always possible at the moment.
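That last point can be quantified with a quick back-of-the-envelope calculation. Light in optical fiber travels at roughly 200,000 km/s (about two-thirds of c), so distance alone puts a hard floor under the round-trip time, before encoding, decoding, routing, and Wi-Fi hops add their own delays. The distances below are illustrative assumptions, not measurements from my test:

```python
# Propagation-only latency floor for cloud rendering at various
# server distances. Real motion-to-photon latency will be higher,
# since this ignores encoding, decoding, and network hops.
FIBER_SPEED_KM_S = 200_000  # ~2/3 of the speed of light in vacuum

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds over fiber."""
    return 2 * distance_km / FIBER_SPEED_KM_S * 1000

# Same metro area vs. same country vs. another continent:
for d in (50, 500, 5000):
    print(f"{d:>5} km -> {round_trip_ms(d):.1f} ms round trip")
```

With a nearby server the propagation delay is a fraction of a millisecond and the codec dominates; at intercontinental distances, physics alone eats most of a typical latency budget, which is why server proximity matters so much.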

Final impressions

Concept image of people interacting on a cloud-rendered engine (Image by Holo-Light)

I came out of my visit to Holo-Light fairly satisfied. The XRnow solution is a smart idea to offer companies a complete package to develop and use cloud-rendered experiences, which can be, for instance, training applications showcasing complex models of some machines. I was quite impressed by my tests, because I was able to verify that cloud rendering from a commercial cloud already works very well for an AR experience if you have a server close enough: you may notice little difference from an application running locally. Of course, there is still a lot of infrastructure to be built before we can all use cloud rendering from wherever we are with good performance and at an affordable price on all devices, but it is interesting to see that in some contexts this kind of solution is already usable today. Fingers crossed for the future!

(Header image by Holo-Light)


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
