Last year, in November, I finally went hands-on with 2Sync, an innovative solution that creates a compelling blend of the real and the virtual. I don’t know why I waited so long to write this article, but better late than never, so follow me and discover why this startup won me over.
2Sync
You all know that I’m a big fan of true mixed reality, the one that truly mixes the real world and the virtual one. 2Sync aims at exactly that, by letting you enter a virtual world that replicates the shape of your real one. Let me explain this easily with an example.
Imagine that you want to play a virtual game powered by 2Sync, like one of those zombie wave shooters that were so popular in 2016. Imagine also that you are in your kitchen, with the table where you usually eat in front of you and a sofa one meter beyond it (probably the interior designer of such a room would deserve the death penalty… but this is just an example, not the description of a real room!). When you launch the game, the 2Sync system would detect what you have in the room, and would, for instance, transform the table into a big virtual crate and the sofa into a big bench with some broken bones on it. It would basically detect the layout of your room and integrate the real elements into the virtual world, for added realism. Now, if you were to play the same game in your bedroom, you would find in the virtual world other elements replicating the layout of this new room. The game would adapt completely to your physical environment… a bit like a responsive website adapts to your reading device.
Why is this cool?
I think this is great for a few reasons. The first is that it is technically cool, and as a nerd, I’m a big fan of every innovative technology.
The second is that it increases safety: systems like the Guardian on Quest warn you about exiting the play area, but do not warn you about obstacles you have in your space. And having done my fair share of demos in public exhibitions, I can safely tell you that almost no average consumer understands the meaning of the safety grid when it appears, and lots of times I had to go and physically prevent people from going where they should not. A visible obstacle in the game world is instead a clear sign, and something we instinctively want to avoid: if I see a big crate, I just move away from it because I do not want to bang my head on it (unless I have just read a “VR is dead” article… in that case, I would bang my head on it on purpose). You don’t need to explain anything… it just works.
The third good reason is that you can exploit what 2Sync calls “passive haptics”. If you see a big virtual crate, reach out with your hands, and actually touch an object there (e.g., the physical table), your brain clicks, and the crate feels much more real. If you have a real sofa and in the virtual world it becomes a rock, you can actually sit on the rock as you would do in real life (because you are actually sitting on the sofa).
2Sync can create a bridge between the real and the virtual world, and magically transform your room into something else, letting you play wonderful adventures inside it.
What are its potential problems?
I told you about the cool stuff, but now it’s time to also talk about the problems (it wouldn’t be a Skarredghost article, otherwise).
The first problem is that this solution makes sense only for experiences where you actually walk inside the space. It can’t apply to games where you move around with your thumbsticks, because the game must be anchored to your physical space. Most at-home games are not made like that, though, so it is suitable mostly for out-of-home installations, in my opinion. Home games are still doable, and can still be a lot of fun, but VR people usually want to do bold things like being a furry cat-bird and flying in the sky, and this plugin is not made for that.
A consequence of the above point is that this system requires a relatively large walking area. If you have a small room full of clutter, you cannot play a game that requires walking inside it. This is one of the various points that make me think that this solution is ideal for playing in dedicated spaces, for instance, inside arcades.
Then there is the sensory mismatch: passive haptics are cool, but they do not always work well. For instance, if in the virtual world you see a rock and in the real world you have a comfy sofa, when you sit on the virtual rock, you actually perceive a soft material under your bottom, and so you have a sensory mismatch which can break the magic.
Safety also works only if the plugin does its job well: if the virtual crate that replaces your real table is smaller than the table or slightly offset from it, you still risk hitting the real object while playing. And don’t get me started on moving elements: what happens if you move a chair around the space mid-game?
The last problem I can think of is less technical and more business-oriented. The solution is cool… but how many devs may be interested in building this kind of experience? It’s clear that there is a market for that, but is the market big enough to make this company generate enough revenue? I guess only time can tell.
How does 2Sync work?
2Sync provides an SDK for developers who want to build apps with it. If you build a Unity or Unreal app with it, the app at startup will scan the environment, detect the walls and the objects in it, and then proceed with the generation of the virtual world that fits the real environment. I wondered if the scan happened using a custom technology or the room data provided by the Meta Room Setup, so I launched a demo on my Quest, and I verified that it uses the data from the scene setup.
The generation happens following some rules that the developer defines at development time. The rules explain how the system generates the walls, how to substitute the objects, and so on. So, for instance, the system can be instructed to substitute tables with crates, and the system will scale and/or add crates until the shape of the table is perfectly replicated.
This is the special part of this system: with 2Sync, the app can adapt to any real environment. You could easily hardcode an app that works in one very specific environment, because the Quest already tells you where the tables, sofas, chairs, and so on are. But it’s hard to build an app that is “responsive”, one that knows how to fill every possible environment with virtual elements so that they feel coherent. With its development-time rules, 2Sync aims at building exactly that.
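To make the “rules” idea more concrete, here is a minimal sketch of how a rule-based substitution system like the one described above could work. To be clear: the 2Sync SDK’s real API is not public, so every name and data structure below is invented for illustration; the only thing it tries to capture is the logic of “for each scanned object, apply a developer-defined rule and tile the object’s footprint with virtual props”.

```python
# Hypothetical sketch of 2Sync-style substitution rules.
# All names here are invented; the real SDK works inside Unity/Unreal.
from dataclasses import dataclass

@dataclass
class RealObject:
    category: str   # e.g. "TABLE", as labeled by the headset's scene setup
    width: float    # footprint in meters
    depth: float
    height: float

@dataclass
class Rule:
    target_category: str
    prop_name: str   # virtual prop to spawn, e.g. "crate"
    prop_size: float # edge length of one prop instance, in meters

def apply_rules(objects: list[RealObject], rules: list[Rule]) -> list[tuple[str, int]]:
    """For each detected real object, pick the first matching rule and
    compute how many prop instances are needed to tile its footprint."""
    placements = []
    for obj in objects:
        for rule in rules:
            if rule.target_category == obj.category:
                cols = max(1, round(obj.width / rule.prop_size))
                rows = max(1, round(obj.depth / rule.prop_size))
                placements.append((rule.prop_name, cols * rows))
                break  # first matching rule wins
    return placements

# A kitchen like the one in the example: a table and a couch
room = [RealObject("TABLE", 1.6, 0.8, 0.75), RealObject("COUCH", 2.0, 0.9, 0.8)]
rules = [Rule("TABLE", "crate", 0.8), Rule("COUCH", "rock", 1.0)]
print(apply_rules(room, rules))  # [('crate', 2), ('rock', 2)]
```

A real implementation would of course also position, orient, and scale each prop so that its collision volume matches the real furniture, but the “responsive” principle is the same: the developer writes the rules once, and the system applies them to whatever room it finds at runtime.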
Multiplayer
One important thing to know about the 2Sync SDK is that it supports colocated multiplayer: this means that multiple players can play the same game in the same physical space. All the players would see the same virtual environment, and for all of them, the virtual space would replicate the shape of the real one.
I think this feature is very important, because, as I’ve said before, this software seems to me ideal for location-based entertainment more than at-home use. And in LBVR (Location-Based VR), playing a game together with other people and having social interactions with them is one of the most important things. So with 2Sync, you don’t only see virtual objects replicating real ones, you also see virtual avatars where the real people are. This enables a lot of fun, as we’ll see later on.
Availability
The 2Sync SDK is compatible with Unity and Unreal Engine. It supports building for the most popular VR headsets, like Quest and Pico, and it allows you to create both Virtual Reality and Mixed Reality experiences. The SDK is currently in closed beta, and you can apply on the 2Sync website to be allowed to use it.
If you are not a dev, there is currently a demo application on the Quest Store that you can download and try at home. I have to warn you, though, that I tried it and it is actually a very simple one.
Hands-on 2Sync
At the NextReality event in Hamburg, I was able to go hands-on with a couple of demos of 2Sync, which were kindly offered to me by Moritz Loos, one of the cofounders of the company.
He had a small dedicated space with a sofa and a table, and he launched the experience for me and another guy who was there. The game was a sort of simplified version of the Siege of Heaven mode of In Death: Unchained: basically, a wave-based defense game. We stood on something like a tower, enemies came at us from all directions, and we had to kill them with a bow and arrow. Sometimes, bonuses appeared in our room, and we had to go grab them.
The game was simple, but fun. The first thing I saw was the avatar of the other guy, and we started interacting and having fun together. We were both fun guys, and we started doing stupid things, so the game became hilarious. As I’ve said before, multiplayer interactions can make any game more fun. And we also played it super-well, because we set the top score of those two days of demos! Thanks to the 2Sync SDK, the table in front of us had become a crate and was fully integrated into the virtual environment. The sofa was visible in VR, too. The advertised replication of the real environment worked fairly well.
The big problem of this demo, which is the same as the one you can try at home, is that the integration of the virtual elements is only aesthetic: you see these virtual crates in your room… and that’s basically it. They have no purpose: you don’t use them, and nothing pushes you to touch them. The game I tried was basically a wave shooter: whether the table was in the virtual world or not, nothing changed for me. This showed me that when you build for this technology, you should also design your game around it, otherwise it feels like a gimmick. You should give these generated objects some utility, a reason for being there (apart from safety, which is still very important).
After this demo, he let us briefly try other demos, and it was super cool to try different virtual environments replicating our real room. It was like one of those movies where you visit many parallel universes (any other fans of Everything Everywhere All at Once here?). In one, the table was a crate, in another, a piece of wood, and in another crazy game, it was even made of gummy sweets!
It was a fun trip. When Moritz launched a game about shooting skeletons, I decided to “try” the sofa: I walked to where the game had placed a virtual element replicating it, sat down, and kept shooting. Even while fully in VR, I knew exactly where the sofa was, so I could safely move towards it and sit on it. That was super cool.
Final impressions
I think that 2Sync is trying to do something very interesting: bringing your real world into virtual reality, for increased realism and safety of your immersive experience. The system works fairly well from my hands-on experience, and it is also fun in multiplayer.
Its real problem is that, to fully give its best, experiences should be designed around it, to make it clear why you are bringing your real elements into your virtual world. It becomes a problem of game design, and it is not an easy problem to solve, either: if these virtual elements have a use in the game, how can you ensure the game is balanced in all possible room configurations, including environments that have no furniture inside?
I think that this system still gives its best for location-based experiences, where you have full control of the space, both in terms of the dimensions and the elements inside it. If you are in that business and you think a system like that can be useful for you, head to the 2Sync website and ask to be part of the beta. I personally can’t wait to try a full-fledged game that can optimally merge the virtual and the real…