
AWE 2022: SyncReality turns your room into a cross-realities playground

One of the most interesting products I tried at AWE was SyncReality, a solution that turns any physical space into a mixed reality experience. This is something I had thought about doing myself, so I was very happy to see that someone is carrying that vision forward for real. Let me explain everything…

Cirillo

I have been connected on LinkedIn for a while with Cyril Tuschi, an award-winning filmmaker turned virtual reality entrepreneur. I had the pleasure of meeting him in person at the last AWE US, where his company SyncReality had a small booth. I had a few friendly chats with him since he’s a nice guy… and quite soon I started Italianizing his name into “Cirillo”: this is how I will refer to him for the remainder of this article. Either Cirillo or Chief Cirillo Officer, both sound good.

By the way, every day at AWE US, Cirillo invited me to visit his booth, and I always said “I’ll come later, don’t worry” while I was actually chasing some hard-tech stuff like Mojo Vision contact lenses or Leia 3D displays. After AWE finished, I realized he probably should have worried about me never coming later… because I actually forgot to visit his booth. Ouch.

This AWE EU started in the same way, with me meeting Cirillo everywhere (I guess he has some twins, because he was literally everywhere) and always promising him to “come later”. In the afternoon of the last day, when I found the booth of SyncReality in front of me, I said to myself “maybe late… no, c’mon, you can’t do this to Cirillo again” and finally entered the booth. After having screamed “Cirillo” a few times for no reason and having hugged my friend, I was introduced to what the company SyncReality does, which is actually quite cool. At that moment, I regretted not having visited it before.

SyncReality’s Vision

One of the videos from Oculus Connect 5 that has stuck in my mind until today is the one where Michael Abrash showed a real space transformed into a virtual one that has exactly the same shape, but a different mood. So imagine being in your bedroom, and a headset is able to recreate there a map of a game that has exactly the same shape and “obstacles” as your real space, but that is actually, for instance, a spaceship environment, or some alien home. This is simply amazing because it would offer an experience that is virtual reality, but a virtual reality completely rooted in your real reality. You can re-live your real space in many different ways. I’ll call it “mixed reality” or “cross reality” because it blends information from both realities, even if technically it is just virtual reality.

This video is from 2018, but I still dream about it today

SyncReality’s vision is exactly that of making this possible for everyone. To be more specific, it wants to create a complete set of tools that let you do this operation in a very easy and adaptable way.

SyncReality tools

SyncReality trailer

SyncReality wants to offer tools for developers to create “mixed reality” experiences starting from real spaces. The experience should be able to adapt completely to the physical environment it is running in, modifying the size of the virtual environment to match the real one, and taking care of substituting desks, tables, chairs, and sofas with virtual reality counterparts.

While Cirillo was telling me this, I stopped him immediately and asked him why they were doing this, given that Meta already offers a tool to define the layout of your room so that you can use that information in your Unity project... it is a bit clunky, but it works. He told me that SyncReality works with that too, but it actually extends it by doing much more.

First of all, SyncReality is now launching for Quest 2 and Quest Pro, but plans to be cross-platform. Then, the system is able to take the data from the Meta Scene Setup, if it is present; otherwise, it offers other ways to define the room layout. For instance, it lets you define the room space and the objects in it in an easy way by just touching the corners of the room and of the various items with your bare hands. Or it can take an existing room scan made with tools like Matterport and detect the shape of the room and its contents automatically.
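For the developers among you, here is a very rough sketch in Unity C# of what the corner-touching setup boils down to conceptually. This is not SyncReality’s code (I have not seen its internals): it just shows how a handful of points touched with the hands could be turned into a simple box-shaped room layout.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch only, NOT SyncReality code: it just shows how a handful
// of corner points touched with the hands could become a box-shaped room layout.
public static class RoomFromCorners
{
    // Returns the smallest axis-aligned box containing all the touched points.
    // A real setup would also need walls, floor height, furniture, and so on.
    public static Bounds ComputeRoomBounds(IReadOnlyList<Vector3> touchedCorners)
    {
        if (touchedCorners == null || touchedCorners.Count == 0)
            return new Bounds(Vector3.zero, Vector3.zero);

        Bounds bounds = new Bounds(touchedCorners[0], Vector3.zero);
        for (int i = 1; i < touchedCorners.Count; i++)
            bounds.Encapsulate(touchedCorners[i]);

        return bounds;
    }
}
```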

But the point is not just having a room layout: the interesting feature that SyncReality wants to offer is the ability to transform it into a meaningful experience. For this, the SDK lets you define some rules about how to substitute the various real objects with virtual counterparts: so, for instance, if you are developing a horror game in Unity, you can say that every time there is a real sofa in the room, it should be substituted with a virtual coffin with a vampire inside, while every time there is a real chair, there should be a zombie seated on a virtual chair. The virtual environment is thus built dynamically starting from the characteristics of the real environment, with the virtual elements placed exactly in the same position as the real ones.

A real space can be turned into a virtual one with the exact same layout (Image by SyncReality)
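Just to make the idea of these substitution rules more concrete for the developers reading this, here is a hypothetical sketch in Unity C#. All the names are mine and do not come from the actual SyncReality SDK, which I have not been able to use yet: it only illustrates the concept of mapping the semantic labels of real furniture to virtual prefabs placed in the same position.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch only, NOT the actual SyncReality SDK API: the names here
// are mine, and they just illustrate the idea of "substitution rules".
public class RoomSubstitutionRules : MonoBehaviour
{
    [System.Serializable]
    public class Rule
    {
        public string realObjectLabel;   // e.g. "SOFA", "CHAIR", "DESK"
        public GameObject virtualPrefab; // e.g. a coffin, a seated zombie, a workbench
    }

    // A detected piece of real furniture: a semantic label plus its pose and size.
    public struct DetectedObject
    {
        public string label;
        public Vector3 center;
        public Quaternion rotation;
        public Vector3 size;
    }

    public List<Rule> rules = new List<Rule>();

    // Spawns a virtual counterpart for every detected real object that has a rule.
    public void BuildVirtualRoom(IEnumerable<DetectedObject> detectedObjects)
    {
        foreach (DetectedObject obj in detectedObjects)
        {
            Rule rule = rules.Find(r => r.realObjectLabel == obj.label);
            if (rule == null || rule.virtualPrefab == null)
                continue; // no rule defined for this kind of object

            // Place the virtual object exactly where the real one is.
            Instantiate(rule.virtualPrefab, obj.center, obj.rotation, transform);
        }
    }
}
```

In the horror game example above, you would simply assign the coffin prefab to the “SOFA” label and the seated zombie to the “CHAIR” label, and the runtime would do the rest in whatever room it finds itself in.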

This is different from what is usually done now. Currently, people creating this kind of experience scan the space where they want to play and then re-create the virtual space by hand so that its objects coincide perfectly with the layout of the real space. This, of course, is good for making some demos or some LBVR experiences, but it is not suitable for distributing this kind of content on the stores, for instance, because it could be used only by people with the exact same room layout. SyncReality, instead, makes sure that once you design your mixed reality content, it gets adapted to the room layout of every user’s home, because the runtime can detect the room layout and where the elements inside it are, and then use the rules created by the developers to know where to put the virtual content depending on the location of the real one. This means that, for the first time, this kind of physical mixed reality content may become distributable on the stores. And I also imagine that with the help of the upcoming Shared Anchors, it will also be possible to create local multiplayer games running in physical spaces: you could invite your friends with a Quest 2 to your place and then play a shooter game with them in your bedroom that has been transformed into a sci-fi arena.

Another interesting feature of this development kit is that the system tries to adapt the virtual elements to fit the space occupied by the real ones. Continuing the example described above, let’s suppose you said that in your application, every sofa should become a coffin… the problem is that there are many sofas sold commercially, and they all have different shapes and dimensions. So a single 3D asset of a coffin can’t fit them all. For this reason, the SDK tries to elastically enlarge and shrink the virtual assets to fit their physical counterparts.

Real environments can become virtual (Image by SyncReality)
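Again, just to give the developers a rough idea of the concept: below is a naive sketch in Unity C# of how a virtual asset could be stretched non-uniformly to fill the bounding box of the real object it replaces. It is my own simplification, not SyncReality’s actual algorithm, which for all I know may deform the assets in a much smarter way.

```csharp
using UnityEngine;

// Hypothetical sketch only, NOT SyncReality code: a naive way to "elastically"
// stretch a virtual prefab so it fills the bounding box of the real object it replaces.
public static class FitToRealObject
{
    public static GameObject SpawnFitted(GameObject prefab, Vector3 realCenter,
                                         Quaternion realRotation, Vector3 realSize)
    {
        // Instantiate at the origin with no rotation, so its size is easy to measure.
        GameObject instance = Object.Instantiate(prefab, Vector3.zero, Quaternion.identity);

        // Combined bounds of all the renderers in the instantiated prefab.
        Renderer[] renderers = instance.GetComponentsInChildren<Renderer>();
        if (renderers.Length > 0)
        {
            Bounds bounds = renderers[0].bounds;
            for (int i = 1; i < renderers.Length; i++)
                bounds.Encapsulate(renderers[i].bounds);

            if (bounds.size.x > 0f && bounds.size.y > 0f && bounds.size.z > 0f)
            {
                // Non-uniform scale so each axis of the virtual asset matches
                // the corresponding axis of the real object (e.g. coffin vs. sofa).
                Vector3 scale = instance.transform.localScale;
                scale.x *= realSize.x / bounds.size.x;
                scale.y *= realSize.y / bounds.size.y;
                scale.z *= realSize.z / bounds.size.z;
                instance.transform.localScale = scale;
            }
        }

        // Finally, move the stretched object onto the real one.
        instance.transform.SetPositionAndRotation(realCenter, realRotation);
        return instance;
    }
}
```

Of course, a plain non-uniform scale like this can distort an asset in ugly ways if the proportions are very different, which is probably one of the problems a dedicated SDK has to solve better than my little sketch does.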

It is important that there is a good fit because you want your virtual space to be exactly identical to the real one. You want that if there is a virtual chair in the same place as a real chair, the user can sit on the virtual chair to sit on the real one. You want that if the user touches the virtual plane that has been put on top of a real desk, when the virtual fingers touch the virtual plane, the real fingers touch the real desk. And this is why you want perfect mapping: so the user, first of all, can play safely, because he knows that all the virtual obstacles correspond to real ones, so he can walk freely in the virtual space knowing exactly what to avoid. And also, he can have enhanced presence, because when he touches the virtual objects, he also touches the real ones they represent, for a natural haptic sensation that feels magical.

https://gfycat.com/nicecluelessauklet
Your space can be turned into a virtual world

Cirillo told me that the company has just announced the alpha release of its responsive spatial design tools. The main target for them is game developers, but enterprise developers are more than welcome, too. If you are interested in checking it out, you can do that at SyncReality’s official website: https://syncreality.com/

Hands-on SyncReality

Of course, at AWE I wasn’t able to have a proper hands-on as a developer because it would have been pretty weird to give me a laptop and ask me to code there in front of them. So I can’t tell you what the experience of using the SDK is like. But I can tell you what the experience was like as a user of a demo built with the current alpha of the SDK.

I was lucky that as soon as I put the headset on to try the experience, everything crashed. I’m serious when I say it was luck, because this forced the SyncReality team to redo the room configuration in front of me. I could see a colleague of Cirillo’s putting the headset on and walking around the room with passthrough activated, touching with his bare hands all the corners of the objects in the room: sofas, tables, chairs, etc… Seen from the outside, it looked like a very easy operation, and in fact, in a few minutes, the room was properly configured, and I could run the demo again. This was for me proof that the room setup offered by the system is pretty easy to use.

After that, I put on the headset, and I found myself in a virtual reality environment where all the objects in the room had been transformed into their virtual reality counterparts. I could move freely with my headset on, knowing that the shape of everything was perfectly replicated in virtual reality, so I never risked stumbling into anything. I tried to check the quality of the mapping, and it was definitely good: every time I was touching a virtual object with my hands, I was touching a real one as well. I made no quantitative evaluation of the error, but let’s say that qualitatively, the matching seemed very well made. I also tried to sit on the real sofa that had been transformed into a big group of virtual rocks, and while I was a bit scared of sitting on a virtual element, I soon realized that there was a real solid sofa under my bottom and sat down comfortably. It was cool to be able to physically interact with all the virtual objects, knowing that they represented real elements. It was interesting that the room had become something else in virtual reality… while still being the original room.

Me chilling on a sofa. I was able to find it and sit on it while always being in virtual reality. This is the power of environmental mapping

There was also a feature that let me cast some balls that activated passthrough AR in a circular region, in case I wanted to verify the mapping between real and virtual objects. Casting a few of them around, I could confirm that the mapping seemed good in every part of the room.

So, I can confirm that the system works and works well. Anyway, I have a few remarks to add to this positive review:

  • The demo showcased just a single simple environment. I had no way of verifying the adaptation system in action in multiple rooms, so I don’t know how well that part of the system (which is, in my opinion, the coolest one) works
  • The demo was very simple and static. I would like to try a demo with a real game adapted to an environment, also because there are some interesting game design challenges at stake. It’s ok that the game can adapt its environment to the real one, but what changes do these differences in environment introduce into the game balancing? Is it possible that the game in my room is more difficult than the one in your room? Does the SyncReality SDK also help in balancing this? These questions still have no answer for me
  • The sense of touch was actually a bit of a disappointment. This was a huge surprise because one of the coolest things about SyncReality is that when you touch a virtual element, you really feel you are touching something, because in that position there is a real element. On paper, this should increase immersion. In reality, for me, it diminished immersion. For instance, when I sat down on the sofa and touched the virtual stones it was made of in the virtual world, my brain had a sensory mismatch: I was seeing rigid and cold stones, but my bottom could feel a comfortable cushion, and my hands could touch soft fabric. At that moment, the magic was interrupted, because I could tell that the reality around me was fake. I wonder if there is a way to improve this.

Final remarks

I don’t know what I was doing here, I guess some arthritis dance

I think that SyncReality is pursuing a very intriguing vision, which is that of giving new life, through mixed reality, to the spaces you live in every day. I think that this could become a powerful thing in the mixed reality future that someone refers to with the M-word.

Of course, its product is still in the early stages, so it can’t do miracles, and I also think that the monetization of this company is still something that should be evaluated with care.

But at the end of the day, SyncReality has potential. As a developer who wanted to do something similar, I would like to experiment with it, and if you are interested too, I suggest you have a look at their website and join the alpha. Cirillo would surely be happy about that. And don’t say that you are going to do it later, because trust me, it won’t happen…

(Header image by SyncReality)


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
