
Meta Hyperscape hands-on: a wonderful glimpse of the future

When I wrote the article about my 5 main highlights of Meta Connect, I included Meta Hyperscape, the solution that lets you scan environments and then step inside them. I guess this surprised a few of you, who were expecting a mention of Meta Avatars or some of the new games. But after trying the beta of Hyperscape available on the Horizon Store, I’m not only convinced that I was right to consider it one of the main pieces of news of the show, I also think it’s much bigger than I imagined. Let me tell you why.


Gameplay video of my first 10 minutes in Meta Horizon Hyperscape

A little refresher in case you missed this news at Connect: Mark Zuckerberg announced at Meta Connect a solution called Hyperscape. It consists of a mobile app that lets you scan an environment (e.g. your room) with your phone and upload the data to the cloud, where the space is reconstructed as a set of Gaussian Splats. Then there is a VR app for Quest that lets you enter the environment you’ve just scanned, either alone or with other people. Meta is not the first company offering something like this: a few months ago I tried a similar service by Varjo, for instance.

The service is currently available in beta only in the US, and the VR client features only six predefined environments for now.

How to install Hyperscape

Hyperscape is available on the Meta Horizon Store. It requires Quest runtime v69, so if you are still on a previous version, make sure to update your headset by going to the System Settings and forcing an update.

The app is available only in the US… but I mean, I’m Italian, so I’m not exactly someone willing to respect the rules… so I launched a VPN on my PC to make it look like I was in the US, went to the Hyperscape page on the web version of the Horizon Store while logged in with my Meta account, and asked to Get and Install the app. I put the headset on my head, and voilà, Hyperscape was on my device. The download was lightning fast because the app weighs only 32 MB!

Launching the app

Hyperscape is so lightweight that I guess it is a native app… which is also consistent with how responsive it should be. The application surprisingly launches as a 2D window, which becomes immersive only after it has performed some checks. I’ve never seen functionality like that outside of WebXR, so I wonder how they did it (maybe thanks to the new Spatial SDK?).

The intro 2D window that checks the network connection

While it is a 2D window, Hyperscape runs some network checks and launches the full application only if your connection is good enough. If it detects problems, it shows a dialog flagging that your network is not fast enough and warning that you may have some issues. You can choose to close the app or to continue notwithstanding the warning. My suggestion is that, unless your network speed is really potato or you are very prone to motion sickness, you should go on anyway, so that you can experience this interesting application.

The network check is important because this application relies on cloud streaming. According to people who analyzed the app’s data, Hyperscape uses the rumored Meta Avalanche service for cloud rendering and streaming.
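Just to make this gate more concrete, here is a minimal sketch of how a bandwidth check before a streaming session could work. This is purely my own illustration in Python, with a placeholder test URL and an assumed threshold, and certainly not Meta’s actual logic:

```python
import time
import urllib.request

# Assumed values: the real app surely uses its own endpoints and thresholds.
TEST_URL = "https://example.com/speedtest/1MB.bin"  # hypothetical test payload
MIN_MBPS = 25.0  # assumed minimum bitrate for a comfortable cloud-rendered session

def measure_download_mbps(url: str) -> float:
    """Download the payload once and return the measured throughput in Mbit/s."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        data = response.read()
    elapsed = time.perf_counter() - start
    return (len(data) * 8) / (elapsed * 1_000_000)

if __name__ == "__main__":
    mbps = measure_download_mbps(TEST_URL)
    if mbps < MIN_MBPS:
        # Like Hyperscape, warn but still let the user continue at degraded quality.
        print(f"Slow network ({mbps:.1f} Mbit/s): expect blur, lag, or freezes.")
    else:
        print(f"Network OK ({mbps:.1f} Mbit/s): starting the immersive streaming session.")
```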

Navigating the environments

As I’ve said, Hyperscape currently features only six environments: one is a car garage, four are the spaces where some small indie artists paint or craft objects, and the last one is Mark Zuckerberg’s previous office. Hyperscape presents this choice in a little floating menu window in mixed reality. I started with the cars but then visited all of them. I have to say that the four artists’ spaces are quite similar, and I would have personally preferred a bit more variety, but still, they are all worth a look.

The selection menu with the possible environments to visit

When you choose a space, you are teleported into it in an immersive way, and a popup explains what that environment is about. Then you are presented with the controls: with the left thumbstick, you can teleport, while with the right thumbstick, you can snap turn. The A button shows the info window of the space you are in again, while the Menu button on the left controller returns you to the environment selection window.

Input is pretty straightforward
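For the developers among you, this is the classic comfort-first locomotion scheme. Here is a tiny illustrative sketch of how teleport plus snap turning is typically implemented (my own Python pseudocode with assumed values, not tied to any real SDK nor to Hyperscape’s code):

```python
from dataclasses import dataclass

SNAP_ANGLE_DEG = 45.0   # assumed snap-turn increment
STICK_THRESHOLD = 0.7   # how far the thumbstick must be pushed to trigger an action

@dataclass
class Player:
    x: float = 0.0
    z: float = 0.0
    yaw_deg: float = 0.0  # heading around the vertical axis

    def teleport(self, target_x: float, target_z: float) -> None:
        """Left thumbstick: jump instantly to the point aimed at on the floor."""
        self.x, self.z = target_x, target_z

    def snap_turn(self, stick_x: float) -> None:
        """Right thumbstick: rotate by a fixed angle when the stick crosses the threshold."""
        if stick_x > STICK_THRESHOLD:
            self.yaw_deg = (self.yaw_deg + SNAP_ANGLE_DEG) % 360.0
        elif stick_x < -STICK_THRESHOLD:
            self.yaw_deg = (self.yaw_deg - SNAP_ANGLE_DEG) % 360.0

# Example: teleport two meters forward, then snap 45° to the right
player = Player()
player.teleport(0.0, 2.0)
player.snap_turn(1.0)
print(player)  # Player(x=0.0, z=2.0, yaw_deg=45.0)
```

Both choices avoid smooth motion, which also helps with comfort when the content is streamed with some latency.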

Inside the spaces, there are some spots marked with little circles: if you point at one with your right controller and press the Trigger button, a little pop-up explains something about the object the dot is on. Most of the time, to be totally honest with you, this additional information was not very interesting to read.

Uhm, ok, I guess?

If you have ever tried a VR virtual tour experience for real estate, you know what I’m talking about: you just go around a place and click on some interesting points to get some relevant info.

Gaussian Splats

These lamps have not been reconstructed very well, so they look like they are made of watercolors

Gaussian Splats are an amazing new rendering technique, but since they are something new, people are still experimenting with how to use them at their best. When I tried Varjo Teleport, I mentioned how the environments I visited looked as if they were painted in watercolors. Here the reconstructions are much better, and I did not have the same sensation, but still, things looked slightly blurred, as if every object was a bit softer or slightly more distorted than it should have been. A few objects, probably the ones that were not scanned properly, still looked watercolored. Reflective surfaces showed artifacts, and when I moved my head, their appearance changed noticeably. Sometimes you notice some “paint strokes” floating in the air, which are probably bugs in the reconstruction. And if you get close to an object, you notice that its surface looks painted.

This is a logo on a jacket. From a distance, it looks fine, but going very close to it, you start seeing it as if it were made of blurry strokes
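These artifacts make more sense if you keep in mind what a splat scene actually is: not textured geometry, but millions of soft, semi-transparent blobs that are projected and blended on screen, so areas covered by too few (or too large) blobs inevitably read as brush strokes. Here is a heavily simplified sketch of the compositing idea, my own Python illustration and not Meta’s renderer:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Splat:
    center: np.ndarray  # 3D position of the blob
    radius: float       # blob size (real splats use a full anisotropic covariance)
    color: np.ndarray   # RGB in [0, 1]
    opacity: float      # peak alpha of the blob

def shade_pixel(splats: list[Splat], ray_origin: np.ndarray, ray_dir: np.ndarray) -> np.ndarray:
    """Blend the splats a camera ray passes near, front to back."""
    # Sort by distance along the (normalized) ray so closer blobs are composited first.
    splats = sorted(splats, key=lambda s: np.dot(s.center - ray_origin, ray_dir))
    color = np.zeros(3)
    transmittance = 1.0
    for s in splats:
        # Alpha falls off as a Gaussian of the distance between the blob center and the ray.
        to_center = s.center - ray_origin
        closest = ray_origin + np.dot(to_center, ray_dir) * ray_dir
        d = np.linalg.norm(s.center - closest)
        alpha = s.opacity * np.exp(-0.5 * (d / s.radius) ** 2)
        color += transmittance * alpha * s.color
        transmittance *= 1.0 - alpha
    return color

# Two overlapping blobs blend into a soft gradient: this softness is exactly
# why a poorly scanned object looks like a bunch of blurry "paint strokes".
splats = [
    Splat(np.array([0.0, 0.0, 2.0]), 0.3, np.array([1.0, 0.2, 0.2]), 0.8),
    Splat(np.array([0.1, 0.0, 3.0]), 0.5, np.array([0.2, 0.2, 1.0]), 0.6),
]
print(shade_pixel(splats, np.zeros(3), np.array([0.0, 0.0, 1.0])))
```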

Don’t misunderstand me: as you can see from the images and the videos in this article, the reconstruction is definitely good. But in the pictures, you easily miss all the imperfections that you immediately notice in VR, and which prevent you from having a full sense of immersion.

Cloud streaming

The application is cloud streamed, and you can easily notice that. Sometimes the effects are clearly visible: I had lag, small freezes, and also some weird things happening in the periphery of my vision. But even when the network behaves well, there is that subtle sense of slight blurring and slight lag that is always present in streamed experiences. This is for sure one of the reasons why this app is in beta: the streaming should be improved. I also wonder if, since the app is officially available only in the US, the streaming servers are located only in North America, meaning that we Europeans get a worse experience.

The huge sense of presence

These paintings look amazing. If I told you that this is an image I took in a physical place, you would believe me

Most of the time, your experience will be affected by some of the problems I’ve mentioned above, and your brain will easily notice that you are in a reconstructed environment and not a real one. Some objects are blurred, others have reconstruction imperfections, and so on. In fact, when I launched the car garage environment, I was not particularly impressed by it. I found it nice, but that was it.

But there are a few spots in the experience where the space in front of you looks real, and in those moments you go WOW. It happened to me the first time at the workbench of one of the creators: I looked at the desk in front of me, and for a split second it looked real, and my mind clicked. For that moment, I truly believed I was there in that creator’s space, in front of his desk. Moving around, I lost the sensation because of the imperfections, but later, in another space, I had it again.

The quality of the reconstruction is overall very good, but in some places it is excellent, and there you can really feel immersed. And what is impressive is that these environment reconstructions are truly 6DOF: I could teleport everywhere I wanted and snap turn however I wanted. Virtual tours are usually 3DOF, and you can only jump between specific viewpoints. Here I could be anywhere; it was like being in another space. If they can improve the quality of the splats, this could truly feel like teleporting to another location.

I thought it might just be me being amazed by this app, so I also had a non-VR person try the experience, and she was amazed the whole time. So it’s not only my impression: it is definitely good.

The endless possibilities


I got pretty excited by Meta Hyperscape, so as an entrepreneur and developer, I immediately started thinking about all the possibilities enabled by this technology. The first thing I thought is that this will disrupt virtual tours: if I were Matterport or another company making money with virtual tours, I would be concerned, because with Hyperscape, everyone with a phone (and some technical knowledge) could scan their own space and create a virtual tour of it. And this virtual tour would be fully 6DOF: people could freely move inside it, perceive the objects with proper depth, and so on. It would not be just a set of 360° photo bubbles, but full navigation of another space. I know that Hyperscape is in beta, but when it is finalized, this service will be perfect for visiting homes remotely.

The social aspect is another thing that has always fascinated me about reconstructed 3D spaces. I could scan my new home and invite my friends to visit it in VR. Or I could scan a space at a specific moment I want to remember (e.g. a corner of the restaurant at my wedding party) and revisit it alone or with my family whenever I want. I’m pretty sure this is what Meta is primarily interested in, and I see great potential in it: just as today you take a picture to remember a moment, in the medium-to-long-term future maybe you will be able to save a full environment.

Mark Zuckerberg’s office is a glimpse of that: I will probably never have the opportunity to be invited by Marky Z to his office, but with Hyperscape, I was there. It could also become a service offered by celebrities to let you visit their environments, their favorite spaces, or the locations of their events…

I am in the same place as Mark Zuckerberg. With just a few hundred billion less…

Another idea could be creating VR games set in real spaces. Indie developers with limited budgets for 3D graphics could, for instance, scan their rooms (after removing the pizzas and the Red Bulls :P) and set up an escape room or a point-and-click adventure there.

As Varjo’s PR told me at AWE about their Teleport service, this can also be useful in the B2B sector: for instance, if a company has to set up a stage, a booth, or something like that, it could build it, scan the environment, and send it to the CEO, who can step inside and verify whether he likes it.

I think this is an enabling technology that, as soon as it is stable, will offer many new opportunities in the XR space. And don’t get me started on what could be possible with interactions: if some of the scanned objects could be grabbed or interacted with (e.g. turning on a light), that would be amazing.

A note on cloud rendering and feasibility

That moving black band you see in the periphery of my vision is an artifact of the streaming

This application relies on cloud streaming, probably streaming the splats that have to be shown on your display. Cloud streaming is amazing for many reasons, but it is also the cause of some of the problems described above: lag, latency, visual artifacts, etc. It also introduces an additional problem: price. Streaming from the cloud is pretty expensive, and I wonder what cost per minute per user Meta is paying for Hyperscape. I also wonder if streaming was chosen because Meta wanted to test its Avalanche service, or if there is a real necessity behind it, like the environments being so big that a full download would take too much time, or the computation needed by the splats being so heavy that it must be done in the cloud. I bet the choice of using teleportation and not allowing smooth locomotion is also conditioned by these performance considerations.
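Just to give a back-of-the-envelope idea of why a full download could be impractical (all the numbers below are my assumptions, not figures from Meta): in the common uncompressed 3D Gaussian Splatting format, every splat stores position, rotation, scale, opacity, and spherical-harmonics color coefficients, which adds up to roughly 230-250 bytes, so a room captured with a few million splats would weigh hundreds of megabytes before any compression.

```python
# Rough estimate of the raw size of a Gaussian-splat scene (assumed figures, not Meta's data).
floats_per_splat = 3 + 4 + 3 + 1 + 48   # position, rotation, scale, opacity, SH color terms
bytes_per_splat = floats_per_splat * 4  # 32-bit floats -> ~236 bytes per splat
num_splats = 3_000_000                  # a plausible count for a well-scanned room

total_mb = num_splats * bytes_per_splat / 1_000_000
print(f"{bytes_per_splat} bytes per splat, ~{total_mb:.0f} MB for the whole scene")
# ~708 MB uncompressed: easy to see why cloud rendering (or heavy compression) is tempting.
```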

I ask all these questions because I’m wondering about the feasibility of this as a commercial application. Is it currently too expensive for a startup to create a similar infrastructure? And if Meta is the one offering it, what will the price be? Will Meta try to make money out of it? Answering these questions will help in understanding whether all the ideas above can become real products in the short term. Because if the costs are too high, no company will build a product out of it in the coming months. But if it is affordable, we could see new startups exploiting this new technical marvel.

Final considerations

I’ve been impressed by Meta Hyperscape. It’s a glimpse of a future in which we will be able to visit other real spaces without moving from home and have the true impression of being there. This is an enabling technology that, in my opinion, will have many ripple effects in our space in the medium-to-long term. But for now, it is still a bit rough and probably expensive. If you truly want to feel like you are in Mark Zuckerberg’s office, you’d still better wait for him to offer you lunch (where you eat some BBQ sauce, of course)…

(Header image by Meta)


