On June 21st, on the occasion of the French Music Fest, electronic music legend Jean-Michel Jarre held a virtual reality concert. The event, dubbed “Alone Together”, has been a true success: hundreds of thousands of people followed its live stream on YouTube and Facebook, while thousands attended it in VR on VRChat.
I’ve been honored to be one of the people taking part in this project organized by VRrOOm, and it was the first time I organized something watched by so many people at once. I learned a lot in the process, and I want to share with you what happened behind the curtains, satisfying some of the curiosities you may have, teaching you some lessons, and making you laugh a bit.
A little backstory…
Let me tell you how the project was born. At the end of May, Louis Cacciuttolo of VRrOOm contacted me and Lapo Germasi of MID Studio to tell us that he was discussing with the French Minister of Culture an event in VR for the French Fête de la Musique, an initiative held every year to celebrate music in France. Since all concerts are currently banned because of the coronavirus, the Minister was looking for new ways of holding the event, and Louis proposed doing the initiative completely in VR. VRrOOm has been investing for months in organizing artistic events and live performances on top of VRChat together with me and MID, so Louis proposed that the Minister organize a virtual concert with a French artist entirely in VRChat, with something done in the real world as well to bridge the gap between virtual and real.
This all sounded very cool, but there was a little problem: the Music Fest was on June 21st, giving us less than one month to organize such a big initiative. Considering the bureaucratic time needed by the Minister to approve the project, this was reduced even more, leaving us more or less just 3 weeks to organize a full-fledged music fest in VR. It sounded like an amazing but completely crazy mission: who would be so stupid as to take part in it? Count me in, of course!
Alone Together
Louis started gathering people for a project that had to be delivered at an insane speed. First of all, he had to find the star of the show: a music festival without any famous artist would be pretty boring. After getting some “no”s, Louis managed to sign a deal with the electronic music legend Jean-Michel Jarre, famous for having been a pioneer of electronic music in the 70s and 80s. Louis is a very passionate person, and when he told us that JMJ would be part of our show, he had a bright light of excitement in his eyes. The same light was also present in the eyes of the other people during the Zoom calls we had later on with the artist, while I was more like…
After having found the star, it was time to create a team that could deliver the project, so Louis (apart from me and MID Studio) contacted digital artist Pyaré (Pierre Friquet) to be the art director of the project, Vincent Masson to create all the visuals for the show, and Georgiy Molodtsov to help with the organization of the event, especially the streaming part. Many other talented people and companies came later on: Jean-Baptiste Friquet, SoWhen?, Atelier Daruma with director Mathias Chelebourg, Seekat, MindOut, Urszula Gleisner, and of course the support of the host, VRChat.
This group of crazy people worked really hard for three weeks to develop this project, which was called Alone Together to underline the fact that it was an event where all people were together enjoying music while being alone at home, in lockdown because of the coronavirus. The experience was all in VRChat, but at the same time there was a physical installation at the Palais Royal in Paris where some people could enjoy the show and then connect virtually to Jean-Michel Jarre to ask him questions, so as to create a bridge between the real and virtual world.
For the VR part, the idea was to create something like a big virtual concert with animations and visual effects that are not possible in real life and that everyone could join from home. Super cool, isn’t it? Yes, it was!
Working alone together
I can say that I have been very lucky to work with a group of talented people from many countries (Italy, Russia, France, etc…) on this project. But at the same time, I have learned that when you are around 15-20 people working on the same project remotely, you can’t keep the same freestyle organization as when you are 2-3 people working in the same office. So let me give you some suggestions for the future VR events that you may organize.
(This section of the article is more about project management… if you find it boring, you can skip it and go directly to the tech details section)
Mind the timing
Every event requires some time to be organized, but remember this simple rule: whatever time you think is necessary, it will take more, much more. I have organized a workshop in 1 week and this concert in 3 weeks, but really, “don’t try this at home”, as WWE says.
Events require time for planning, execution, and testing. If you don’t have time, everything becomes messy. All the errors I will discuss below are not due to the fact that we were not capable, but were caused by the insane timeframe of 3 weeks to properly organize such an ambitious project. So, plan in advance: for a concert in VR like this one, probably at least 2 months are required to prepare it well and without rushing. For a simple event (like a little conference with 50 attendees and 2 speakers), 2 weeks are necessary. For a big event like the Educators in VR Summit, I would suggest at least 4 months of preparation. Estimate how much time you need, and don’t try to be faster than that.
Take your time, otherwise first of all you will have no time to organize the event properly, and things will have bugs or problems. Then the people working on it will become very stressed, and will probably have quarrels (which luckily we didn’t have in our experience, but I have had in other rushed projects) or go into burnout. Working every night is not healthy, trust me, so avoid doing it. Been there, done that.
Ah, of course the last week will always be crazy and you’ll work night and day to iron out the last details, but make sure that it stays just one week and not more.
Hire a project manager
If your project is big enough, either you have a project manager, or you are going to have big trouble. We were in a hurry and tight on budget, so we started doing stuff without thinking about hiring a project manager, but we regretted this choice later on.
This is one of the lessons that I want to share with you: no matter how in a hurry you are, or how tight on budget you may be, if your project involves more than 5 people and they are not working in the same place, hire a project manager. Otherwise, you end up with:
- No clear deadlines (apart from the delivery date);
- No coordination between the teams;
- No one with a clear overview of what is happening who can make all the people work as cogs of a big machine.
The project manager is like the conductor of an orchestra. It may seem completely useless from the outside, but he/she is actually the one who guarantees an optimal performance from the team. Since we didn’t have one in our team, this is what happened:
- None of us had an exact picture of what all the other members were doing;
- None of us always had a clear list of his/her tasks. Sometimes I had to figure out what I had to do by reading various chats, where everything that had to be done was scattered a bit here and a bit there in conversations with other people;
- We had no clear idea of the timing from day 1. What were the milestones of the project? For what day? And what features should be dropped if we were late for a milestone? None of this was very clear;
- We didn’t always know who we should refer to if we had problems with something;
- There were no clear times for calls to perform status updates, stand-up meetings, etc.
Luckily, we were all very professional people, and we managed to deliver even without a PM, but looking back, I think it was not a good choice on our part. We would have had fewer headaches with someone keeping a high-level picture of what was happening and coordinating all of us. Take this as a piece of fundamental advice.
Write clear specifications, then develop, then test
Writing specifications is boring, but it is necessary. Someone should prepare one or more documents that will be the guidelines for all the members of the team. Without this, everything is a mess. The documents should be clear and straight to the point. And once written, they should change as little as possible.
Theoretically, you should write the specs, then get them approved by the decision-makers, after which everyone can start developing; finally there is the testing and polishing, which may require some re-work on parts of the project. If you don’t follow this method, the organization becomes more confused. Due to the little time we had, the various parts of this perfect schema overlapped, so some things were changed after having already been developed, leading to a waste of time, and the testing phase was very short, giving us problems, as we’ll see later on. Try to follow this simple structure when doing something.
Be careful with Slack
As another smaller piece of advice, ask your team to use Slack chats wisely. Slack and similar tools are fundamental for communication between team members, but they can also become an issue if not used correctly. Some pieces of advice from my experience in this and other projects:
- Don’t use Slack to communicate new specifications for a project. Every team member should have access to a specification document (or multiple documents) that he/she can consult to understand what he/she should do. If all the tasks and decisions are scattered inside Slack chats, everyone must read all the chats to reconstruct their exact tasks, and this carries the risk that some information gets lost;
- Ask the team members to respect the topic of each chat: if a chat is about “3D”, I as a developer should never need to read it. If I have to read all the chats, it is a big waste of time, and it also generates confusion;
- Don’t write too much. If your team members log in in the morning and find 1000 unread messages from the night before, either they waste a lot of time reading them, or they just ignore all of them, missing important info.
Lots of times I skipped all the Slack notifications because I had to choose between investing my time in reading all of them or in actually developing the project. I often found myself asking my friend Lapo Germasi “Hey man, can you make me a summary of the last 1000 messages on Slack?” and he was kind enough to keep me up to date with all the info. Use Slack wisely, otherwise it just generates confusion.
For another big project I am now using Microsoft Teams, and I appreciate that Teams subdivides every channel into threads that start from a single post, so it is easier to skip all the threads that seem uninteresting from their first post. But this doesn’t solve all the problems, and it can be even worse if people don’t manage the thread mechanism correctly.
Ask for the help of the host
If you are not using your own platform, try to build a close relationship with the company behind the platform that you are using. In this, we behaved very well: we contacted VRChat, which supported us by helping with some technical problems and by giving the event great visibility, featuring it on all its portals. Thanks VRPill for all that you have done for us!
Of course, to do that you must have an event that is significant for the platform itself. If you are organizing a meeting for the fans of the “Video paused. Are you still watching?” popup on YouTube, who are probably 3 people on the whole Earth, no platform will care about you. But if you are organizing an event that has a very cool topic (e.g. a very innovative artistic performance) or one or more VIP guests (singers, actors, top tech leaders, etc…), for sure the platforms will want to hear from you. Remember that we are in the early days of VR and these platforms are actively looking for interesting events and use cases to survive, so they will like every cool initiative you can offer them. Don’t be shy and shoot an e-mail!
The technical secrets behind the concert
Some people asked me to reveal the technical secrets behind this great concert, and having been the lead developer of the concert, the guy uploading the final VRChat world for the others to enjoy, I think I’m the right person to give you some juicy technical details on how we tried to solve some problems, where we succeeded, and where we failed.
Why VRChat?
The first question you may ask is why we chose VRChat some months ago, when we started the adventure of VRrOOm Events. There are many other social VR spaces (Altspace, ENGAGE, Sansar, etc…), and I like many of them, so why VRChat?
The reasons are mostly these:
- VRChat is very popular, much more than all the others together, so a public world there has a potential userbase bigger than on the other platforms. Sansar, for instance, is a very ambitious project, but it has few online users every day, so whatever you do there will have little daily visibility (I mean, you can do a successful event there, like Lost Horizon, but it is difficult to make a successful permanent installation);
- You can customize everything in VRChat by developing in Unity. You can have your 3D models for the avatars, you can create your 3D environments, you can code custom interactions. And all of this using Unity, the tool that I already use every day. This is something extremely powerful, that very few other systems offer.
- VRChat knows de wei.
This doesn’t mean that VRChat is necessarily the best solution for your events: every event has its own peculiar requirements. For instance, for a workshop, I would choose ENGAGE, because it has better audience moderation and supports PPTX. For a recurring stand-up comedy event, I would choose Altspace, because it has better community tools. And so on. You have to know the environments available and select the best one for you. Contact us if you have to organize an event and you don’t know what to choose 🙂
Hosting thousands of people in VRChat
The biggest problem that haunted us all the time was how to make the concert available to thousands of people. Jean-Michel Jarre is very popular, and for sure many people would want to be part of his great concert. But all social VR worlds, despite what they may claim, can’t host many people in the same room. ENGAGE can have only 50, VRChat only 40, Mozilla Hubs 25. Only Somnium Space and Virbela can give you hundreds of people together. So, how could we make thousands of people enjoy the VR concert of JMJ?
VRChat allows at most 40 people in every instance of a world, and every person can be in only one single instance. If you are not familiar with VRChat terminology: you can publish a world in VRChat, but the “world” is like a stamp used to create various clones of itself. When you select a world, like “The Void”, you don’t enter the world itself, but an instantiation, a clone, of it. People joining a world are actually spread across different instances of it; the “world” itself is just a reference for all of them. The instances are completely unrelated and unlinked, so what happens in one can’t influence the others. You don’t even know how many instances will be created or how many people will be in each: someone could create a new instance and stay alone there because he hates other people. Some instances will be public, others more private. So, basically, everything is completely random.
We put our minds to work, looking for the best possible compromise to let people in all the instances enjoy the concert. Remember that due to the limitations of social VR platforms, you must always find compromises between what you want and what you can have, so be prepared to find workarounds for everything.
We came up with this master plan:
- 40 people would be in the main instance, the one with all of us organizers and Jean-Michel Jarre. This would be the “VIP” instance, with important people from the VR and music landscapes, and the “main” concert hall, where we would control the whole execution for a perfect concert experience;
- All the other people in VRChat would access automated worlds with live music. We got a lot of criticism for this, because some thought we had created only one room with the real concert and many others with a fake, low-quality one. The reason is that it is technically impossible to control all the instances of the same world together. JMJ could be inside only one of the instances, not in every one of them. And we couldn’t control all the instances that would be created to trigger the animations and the videos there, because they were too many and completely unpredictable in number and timing (new instances were spawned and destroyed every minute). We didn’t want to leave the concert in VRChat available to only 40 people, so we tried to find the best possible way to offer the biggest possible number of people a good VR experience (even if not as good as the one in the main VIP room). It was a way to democratize the concert. Furthermore, we didn’t want to make everything a pre-recorded concert, because we wanted to offer a live experience, to make everyone “alone together” listening to the same live music. JMJ would be live streamed via YouTube as a 2D holographic avatar with pre-recorded animations in every instance, so as to offer a good experience to everyone;
- All the people on the go could watch the YouTube/Facebook live stream and enjoy the concert in 2D.
Tracking the artist
JMJ was performing as a full-body avatar that could jump, clap his hands, and move his whole body like in a real concert. How was it done? Easy peasy, with Vive Trackers.
VRChat lets you have full-body tracking only through Vive Trackers or via software like Driver4VR that emulates the trackers via SteamVR drivers. The problem when you don’t own the platform is that you have to follow its rules, so we were forced to use Vive Trackers or equivalent equipment.
The initial proposal was to use multiple Kinects to emulate the trackers, but having used Kinects in VR for 3 years with my first startup, I warned people that there was not enough time to employ them, and in the end, I was right. Even Optitrack gave problems: Optitrack is perfect for body tracking, but in your own custom application. With VRChat, you can’t directly provide the body joints; you can only emulate the position of virtual Vive Trackers. So the required steps were performing Optitrack tracking and then emulating Vive Trackers from the data offered by Optitrack, so that the skeleton used by VRChat could show a pleasant body pose. I think it’s clear that this is pretty difficult, and since we were limited in time, we were not able to put these cool solutions to work with the desired quality.
So, after various tests, the tracking techies decided to use Vive Trackers. Wise decision: if you are in a hurry, go for the safest road.
Streaming the audio
Jean-Michel had his avatar, and he had to play his music. But how to inject music into VRChat?
The option most used by the community is emitting the music through the line-in (the mic) of the computer, as the voice of the avatar. But while we were testing it, Louis told us that the quality of the music was not good enough for a high-profile artist like JMJ. Another option was streaming the audio via YouTube or similar, but YouTube streaming introduces something like 30 seconds of delay, so the movements of JMJ’s avatar and the music he was playing wouldn’t have been in sync. This road was not viable either.
Thanks to the great work of Georgiy Molodtsov, we found a tool called TopazChat, developed by a Japanese developer, Hirotoshi, that provides super-fast, high-quality audio streaming for VRChat. It worked like a charm and made our concert possible. If you need a similar solution, I sincerely advise you to check out TopazChat.
Streaming the background videos
Streaming the background 360 videos was much easier, since VRChat offers many tools out of the box, plus you can also use others from Unity. Here it is all a matter of deciding whether you want synchronization or not, and what resolution you want. It’s very easy to do, and there is no need for me to go into details about it.
Triggering the animations
Let’s come to the pain point of the experience that many people had in the public worlds. We had JMJ, who in the VIP world was a 3D avatar with audio coming from TopazChat, and in the public worlds was a 2D avatar streamed over YouTube. We had the 3D environment, designed by Vincent Masson and optimized by MID’s Victor Pukhov. We had the trippy background videos rendered by Vincent Masson. We even had the 3D animations designed by Vincent and optimized by Victor. Now it was time to put all of this together.
The idea of the concert was to activate a different background video and a different 3D animation of flying objects for every song of Jean-Michel Jarre, so as to create a special discotheque with trippy visual effects in sync with the songs. But how to do that?
VIP World
VRChat doesn’t recognize songs and couldn’t activate things automatically for every song. Putting a timer was not an option either, because JMJ could have some delays, or could want to speak a bit longer between songs, so we couldn’t hard-code the animations. The only way was to trigger them by hand.
We created a special VJ room on top of the dancing hall, with a VJ panel whose texture was rendered only on the front face: this meant that the VJ (Georgiy) could see it, but the people dancing down on the floor couldn’t. The VJ panel had all the VRChat triggers necessary to start the different videos, trigger the different animations, and control everything. Georgiy and the others designed a precise timeline of when to activate what, and he was there every minute pressing buttons to move parts of the stage and activate 3D animations, 360 videos, and such. After some rehearsals, he was very proficient with this tool, and he was a perfect DJ for the whole event.
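If you want an idea of what that console logically did, here is a minimal plain-Unity sketch. The real panel was built with VRCSDK2 trigger components, not a script, and all the class and field names below are invented for illustration: one button per song, each enabling its set of animated objects and starting its background video. The panel itself can simply be a quad with a default, backface-culled material facing the VJ booth, so the crowd below never sees it.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical plain-Unity analogue of the VJ console.
// The real panel was made of VRCSDK2 triggers, not of code like this.
public class VjConsole : MonoBehaviour
{
    [System.Serializable]
    public class SongCue
    {
        public string label;              // e.g. the song name, just for the inspector
        public GameObject animationRoot;  // flying objects / stage animation for this song
        public VideoClip backgroundVideo; // trippy 360 background for this song
    }

    public SongCue[] cues;
    public VideoPlayer backgroundPlayer;  // renders onto the dome / background sphere

    // Called by a UI button on the console, one button per song.
    public void TriggerCue(int index)
    {
        // Enable only the animation set of the selected song.
        for (int i = 0; i < cues.Length; i++)
            cues[i].animationRoot.SetActive(i == index);

        // Switch the background video to the one matching the song.
        backgroundPlayer.clip = cues[index].backgroundVideo;
        backgroundPlayer.Play();
    }
}
```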
The experience in the VIP room was so cool because it had been designed with exact timings, and everyone there could enjoy the concert as it had been conceived, with every song giving the right audiovisual sensations.
Public worlds
Unluckily, in the public worlds we couldn’t offer the same amazing experience. The reason is that the way VRChat is designed didn’t help us.
We had no way to manually trigger animations in all the public instances, because we had no idea of how many instances would be created and when, so we were forced to use some automation. The plan was that while the 2D-streamed holographic avatar of JMJ was performing the music, some automated animations would play, trying to follow the scheduled timeline of the concert. If JMJ didn’t respect the timing exactly, the automation would be slightly out of sync, but that shouldn’t have been a big deal.
But here comes the problem: we couldn’t even know when the new instances would be created. I mean: the concert was scheduled for 9.15, and if someone created an instance at 9.15, everything would be in sync. But what about people doing that at 10.15? We couldn’t give them the same timing, otherwise they would have seen everything with one hour of delay. And what about people joining at 8.15 instead? They couldn’t see everything before the others, also because they could have spoiled the experience for the other fans of Jarre. We could have added a one-hour delay for them before the animations started, but if we did that, the delay would have applied to every instance created later on, even the ones created at 9.15, for which the animations should have started immediately. Summarizing: whatever we did, we were doing it wrong.
In a normal development case, you can use some C# coding and solve everything with DateTime.UtcNow, but VRChat prohibits C# coding for security reasons... and we couldn’t even use the new Udon visual scripting system introduced in VRCSDK 3, because the new SDK didn’t support video streaming yet, and we needed videos. We were therefore forced to use the less powerful VRCSDK 2, which has a very limited set of interaction blocks. Without a scripting system, it is impossible to query the time, so it is impossible to start the animations at absolute times, leaving us with only very dirty workarounds.
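Just to make the point concrete, this is a minimal sketch of that “normal development case”: with plain C# available, a single script using DateTime.UtcNow could start (or fast-forward) every cue at an absolute time, no matter when the instance was created. The names and timings are illustrative only, and of course none of this is allowed in a VRCSDK2 world.

```csharp
using System;
using UnityEngine;

// Illustrative only: absolute-time scheduling with normal C# scripting,
// which VRChat's SDK2 worlds do not allow.
public class ShowScheduler : MonoBehaviour
{
    // Scheduled start of the show in UTC (example value: 9.15 pm Paris time, CEST).
    static readonly DateTime ShowStartUtc =
        new DateTime(2020, 6, 21, 19, 15, 0, DateTimeKind.Utc);

    [System.Serializable]
    public class Cue
    {
        public float secondsFromStart;   // when this cue should fire, relative to show start
        public GameObject effectRoot;    // animation / video objects to enable
    }

    public Cue[] cues;

    void Update()
    {
        // Seconds elapsed since the official start time: the same value in every
        // instance, no matter when it was created. Negative before the show begins.
        double elapsed = (DateTime.UtcNow - ShowStartUtc).TotalSeconds;

        foreach (var cue in cues)
        {
            // Fire (or catch up on) every cue whose time has already passed,
            // so instances created late fast-forward automatically.
            if (elapsed >= cue.secondsFromStart && !cue.effectRoot.activeSelf)
                cue.effectRoot.SetActive(true);
        }
    }
}
```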
The best solution that we found was to block access to the world until 9.15 pm: if people joined before, they could only see the first part of the world, with a big message inviting them to connect again at 9.15 pm, solving the problem of early comers the hard way. For all the people arriving from 9.15 on, we scripted some simplified animations triggering at the expected timestamps of the songs, so as to show cool audiovisual effects that would be more or less in sync for the people joining at 9.15 pm, and not quite as in sync for the ones creating instances later, but anyway cool enough to give them a nice VR experience. The animations were scripted with buffered timers, so that after the first person created an instance, the others would follow the same timeline he was seeing: e.g. the first animation started after 30 seconds, the first video after 5 minutes, and so on.
What went wrong?
So why did we have problems? Because the automated videos and animations didn’t trigger for many people. Triggers for “late joiners” are a bit complicated in VRChat: it is easy to make a button that, when pressed, makes a mirror appear for you, and it is easy to make a button that makes a mirror appear for you and the other people in the room. But what should happen when a new person enters the room after you have already made the mirror appear? Should he/she see it or not? Welcome to the world of buffered and unbuffered events, a very common topic in multiplayer games, and something that gives us developers many headaches.
I don’t want to go into too many details on this technical subject, but let me just tell you that it is very complicated per se when you have full control of the source code, and it becomes even more difficult when you use VRChat. Sometimes you instruct the system to behave in a certain way for people who join a world later on, and then it doesn’t work, maybe because of a bug, or maybe because VRChat expects you to do things in a particular way that is not the one you used.
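To give you at least an intuition of the concept, here is a toy console simulation (my own illustration, not the VRChat API): a buffered event is remembered and replayed to whoever joins later, while an unbuffered event is only seen by the people already in the room.

```csharp
using System;
using System.Collections.Generic;

// Toy simulation of buffered vs. unbuffered events, just to illustrate the idea.
class Room
{
    readonly List<string> bufferedEvents = new List<string>();
    readonly List<string> players = new List<string>();

    public void Join(string player)
    {
        players.Add(player);
        // A late joiner replays the buffered history to rebuild the room state.
        foreach (var evt in bufferedEvents)
            Console.WriteLine($"{player} replays: {evt}");
    }

    public void Fire(string evt, bool buffered)
    {
        // Everyone currently in the room sees the event immediately.
        foreach (var p in players)
            Console.WriteLine($"{p} sees: {evt}");

        if (buffered)
            bufferedEvents.Add(evt);   // remembered for future joiners
        // If unbuffered, late joiners will never know this happened.
    }

    static void Main()
    {
        var room = new Room();
        room.Join("Alice");
        room.Fire("show mirror", buffered: true);
        room.Fire("confetti burst", buffered: false);
        room.Join("Bob");   // Bob replays only "show mirror"
    }
}
```

In our concert worlds, the cue timeline relied exactly on that “replay for late joiners” behavior, and that is the part that often didn’t fire as expected.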
In practice, this meant that most probably the first person creating an instance of the concert world could see all the videos and the animations triggered at the right times, but the others joining later more often than not didn’t have the animations triggered. They could only see the hologram of JMJ, one base animation, and that’s it. They got angry because basically they were watching a 2D video in VR and thought that was the whole experience, while it was much better than that. The people who could see all the animations liked the experience in the public worlds, but the many others who saw nothing were pretty pissed off.
Add to that the fact that we started a bit late, so people reloading the world at 9.20 could still see the message inviting them to reload the world at 9.15, and you can understand why we had so many people angry with us.
I’m very sorry for what happened to anyone who didn’t have a good experience. I take full responsibility for the dev problems myself. We are improving and we are learning every day, and for sure next time we will provide you all with a much better experience.
How could you avoid the same error?
The root of all evil remains the short time at our disposal: we had almost no time to test the public worlds, and in the first tests they seemed to work, so we were confident they were ok. More thorough tests would have revealed the problem, so plan ahead so as to have at least one week of testing for your big events.
As for the triggers, we are now discussing with VRChat how this can be solved in the future, and we got the news that with the new SDK the event system will be improved anyway and should have fewer flaws.
In case you do something similar, also consider a plan B. We could have put a big button in the room that triggered random animations locally for every single viewer (so not shared between the inhabitants of the same room), or a personal VJ console so that everyone could trigger the animations for themselves as they wished. This wouldn’t have been as cool as a synchronized experience, but it would have been better than seeing nothing: everyone could have felt like a VJ for Jean-Michel Jarre if the official animations weren’t working.
And regarding the delay… communicate your delays immediately on your social media, and also don’t write exact timings in your world. “Reload at 9.15pm” is a bad choice, because people will expect that at 9.15pm everything will be fine, and even if you have 5 minutes of delay, they will be frustrated.
Outside streaming
The streaming to YouTube/Facebook went very well. Here we followed a quite standard route: in the VIP world, some staff members on PC were placed in strategic positions of the world, and their screens were streamed to the web. A streaming director assembled all the streams and decided what to put live on YouTube at every instant.
If you watch the YouTube video, you’ll notice that it is a great job. Most streaming videos of events are shot from a single camera, and this is boring to watch because it is like watching a TV show recorded from only one viewpoint. With this concert, we had various points of view, shown alternately and also mixed with the suggestive images of Vincent Masson’s videos. The result is something dynamic, fresh, and nice, very pleasant to watch. If you plan to do a YouTube stream, keep in mind to create something that is entertaining for the people watching it, and not just a mediocre plan B for those who don’t have a VR headset.
Graphics optimizations
For this project, big kudos go to Victor Pukhov for having optimized all the graphics of the experience. When you are in VR, you should be able to guarantee 90 frames per second, so you must make a compromise between what you want to show and what you can afford to show. The problem is that if you want to offer a great experience with a great artist like JMJ, you can’t just show two flying cubes; you have to show many visual effects.
What Victor did was optimize all the meshes, reducing the number of polygons to the minimum possible. Then he also took care of the shaders, selecting ones that were not too heavy on the GPU. All the lights, except one, were baked, and so were the reflections. If you don’t start optimizing graphics from day 1, in the end you will find yourself with a very low framerate and too little time to re-design everything.
And besides optimizing the environment, remember to optimize the avatars. Every new user in the room is a new avatar, and so more polygons in the scene. Invite people not to use avatars that are too heavy.
The experience was not available for Quest because, despite all the optimizations, it was not possible to provide a high-quality experience with the limited capabilities of the Quest. This is a decision we did not take lightly, but it was the only possible way. We couldn’t ruin the experience for all the other users because of Quest users, and we couldn’t design a Quest-only parallel world because of the lack of time. So we had to kill the Quest.
Mind the usability, and not only the graphics
One big error that we made was to not make it immediately clear which objects were interactive and what they could provide. For instance, in the public world there was a sphere that could be used to re-sync the video of JMJ’s avatar, but this was absolutely not clear: if you see a blue sphere, you don’t think “oh, that must be a sphere to resync a video!”, you just think “ah, a blue sphere”. So many people who couldn’t see the streaming properly didn’t know they could easily resync it, and kept reloading worlds forever until they found a working one.
VRChat shows you a tooltip when you aim at interactive objects, and we used it, but you should make objects that are self-explanatory: maybe a resync-video object should show something about time, or a video icon. Take care not only to make objects that fit visually into the environment, but also to make them self-explanatory about their purpose.
If you have time, invite people who are not familiar with VRChat into your world in preview and see how they behave and what they do. Remember the old rule: “UX is like a joke: if you have to explain it, then it is not good”.
Trolls will be trolls
In the end, in some of the public instances we had trolls ruining the experience. We couldn’t do anything about that, since the public instances were outside of anyone’s control. And you have to keep in mind that trolls know how to be very creative. In one of the public instances I visited, a guy had an enormous avatar of himself, streaming his own music at high volume while showcasing his own animations. Sometimes he also triggered an animation so heavy that the world crashed for everyone inside. I was pretty shocked: how could he show all these animations on the stage himself? How could he have such a big and dangerous avatar without being blocked by the system?
VRChat is already doing a lot to prevent trolling on its platform, but I’ve learned that whatever you do, there will always be a troll more creative than you. Remember that.
Marketing
We did our best to spread the word about the concert. Urszula Gleisner wrote a press release and spread it everywhere, I contacted all the journalists and influencers I know in the VR space (and got the news published on Road To VR and Upload VR, yeah!), Jean-Michel Jarre rallied his fans, VRrOOm wrote many posts on the topic, VRChat featured the event… and in the end we made a big buzz about it. Remember that marketing is fundamental: if you make a great event but no one knows about it, all your hard work will be useless. Invest a lot in marketing!
Some little curiosities
Working with JMJ
Some fans of Jean-Michel Jarre asked me what it is like to work with him. Honestly, I don’t know: the people who worked with him the most were Louis of VRrOOm, Pierre, and the guys handling his full-body tracking. I have only spoken with him in two Zoom calls and worked with him indirectly. Anyway, he gave me a good impression in our time together.
The secret room
In the environment, there was a secret room that gave access to all the backstage areas. In the secret room, there was a security system protected by a password (I was very proud of having developed it inside VRChat) to activate all the secret passages to the backstage. Anyway, the staff was annoyed at having to type a password every time they had to run tests, so the password was disabled during the whole development stage. In the rush of the moments before publishing the world, we forgot to restore the password protection, but luckily no one found the secret passages, otherwise he/she could have trolled Jean-Michel Jarre 🙂
Sponsorships
At a certain point there was also the idea of adding some kind of ads inside the experience, as happens at real concerts. The idea was dropped later on because it could have hurt the overall perceived quality of the event. Anyway, a poster about Ubisoft was added because of the support they gave us in spreading the word about the event.
The virtual drugs
Some days after starting to develop the experience, someone (maybe Pierre) had the idea of adding some special vision effects: something that you could take to get a different vision of the environment. Technically speaking, we are talking about post-processing effects, and we designed some glasses that could provide this experience: I developed the “code” (not actual code, but a combination of triggers and animations) to support 4 special effects, and to run some tests, I randomly moved the sliders of the Post Process Effect script in Unity to create 4 demo effects. Then I told the artists to design 4 proper special effects for the purpose.
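For the curious, the underlying mechanism is nothing exotic. In a plain Unity project (so, with real code instead of VRCSDK2 triggers and animations), switching between a few pre-made post-processing profiles could look more or less like this sketch; the class and field names are invented for the example:

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Plain-Unity sketch of the "special visions": swapping between a few
// pre-made post-processing profiles. In the actual world this was done
// with VRCSDK2 triggers and animations, not with a script.
public class TrippyVision : MonoBehaviour
{
    public PostProcessVolume volume;          // global volume affecting the camera
    public PostProcessProfile[] effects;      // the 4 "special vision" profiles
    public PostProcessProfile normalVision;   // default look of the world

    // Called when the player grabs one of the pickups (index 0-3).
    public void TakeEffect(int index)
    {
        volume.profile = effects[index];
    }

    // Back to the normal look.
    public void Sober()
    {
        volume.profile = normalVision;
    }
}
```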
Victor and I designed the interaction with four glasses that could be worn to have these special visions, but then most of the other people said that they preferred virtual drugs, which were more in line with the electronic music rave party mood. We made the change, even if I found it a bit bizarre.
The funny thing is that, fixing one bug after the other, we completely forgot about the special effects, which remained the 4 test ones I had made in 3 minutes. When VRPill from VRChat came, he tried our virtual drugs and said “Oh, so cool, I know other worlds in VRChat that do this, and they too spend SO MUCH TIME DESIGNING THESE SPECIAL EFFECTS”. So much time, yes, well… maybe… cough cough… in the end, 3 minutes is a lot of time if compared with 1…
Anyway, the virtual drugs were loved by everyone who tried them. Sometimes we have such high expectations that we don’t realize that even simple effects can be great for people who are not used to them.
The plan B
A rehearsal of the show was pre-recorded entirely on Saturday night in VRChat. We kept this recording as a plan B in case some disaster happened, like the VRChat servers crashing on Sunday night. We didn’t use it in the end, but as a suggestion, always consider a plan B. It has already happened to me and VRrOOm that an event couldn’t be held because VRChat ran a server update at that exact time (really, it was very frustrating).
The final upload
This project gave me lots of anxiety because I was the guy doing the uploads. I received all the (great) material from the other team members, and I had to assemble it in Unity and publish it on VRChat. A great responsibility, also considering that I always received the materials a bit late and then still had to respect the timing. I still remember when Pierre asked me how much time I needed to complete the job, and I said “30 minutes” and he told me “WHAT? YOU HAVE 10”.
Having a full team of people in different countries, plus a legend of electronic music, just waiting for me to finalize their hard work was one of the most stressful moments of my whole working life. With great power comes great responsibility, and also great stress. But I went through it; I had to, because all those people had worked so hard for the purpose, and they trusted me enough to give me that role. But I have probably lost 20 years of my life in the process.
The results
All in all, the concert was a success: all the people in the VIP room had a blast, and we got lots of compliments from them. People in the public rooms were a bit angry about the problems I described above, but anyway we got 2500 visits inside VRChat: it has been one of the most successful events on the platform. And on Facebook/YouTube, in the end, we got more than 600,000 views. After all the pain, the stress, and the lack of sleep, seeing these results made us very happy. Jean-Michel Jarre was also very satisfied with the outcome.
Yes, there were some problems; yes, some people came out unsatisfied, and we have a lot to learn and improve. I am very angry with myself for not having been able to give everyone the experience they deserved. But as someone told me, we were trying to do something new and innovative, and errors, glitches, and bugs are still part of the game. This is virtual reality now: a territory of crazy experiments, where we are all trying to push the technology forward.
And having been part of a project that not only created something new for VR concerts, but also gave visibility to virtual reality to hundreds of thousands of people, makes me happy and makes all of us on the team very proud of what we have done. I personally want to compliment all the other members of the staff of this concert for the great job they have done, notwithstanding the critical situation we were in. And kudos to Louis Cacciuttolo of VRrOOm for having made all of this possible.
And to support the people who had a bad experience in the public worlds, we are thinking about creating a permanent installation inside VRChat with a recording of the full experience, so that everyone can enjoy the concert as it should have been.
We’re sure that next time we’ll give you all a better experience, and that I’ll be able to have fun with you, all together… but alone in our VR headsets 🙂
UPDATE 2020/07/09: Georgiy Molodtsov has written a very insightful comment on this article, highlighting other lessons he himself learned during the organization of this event. I invite you to scroll down and read it 🙂