
Quick Fix: how to use ONLY the trigger to interact with the UI of the Oculus Go/Quest in Unity

Today I want to talk about a quick fix that may interest all the Oculus Go and Oculus Quest developers who use the plain Oculus Unity Plugin from the Asset Store to develop their VR experiences. If you are in this category of people, you may find yourself in the unpleasant situation where buttons other than the trigger interact with your UI elements, and this can drive you crazy. Don't worry, the Ghost comes to your rescue and explains to you how to solve this issue.

This is the typical scenario. You're making an Oculus Go/Quest experience in Unity, and in this experience there are of course various interactive items. Most probably, to handle the input, you have used the prefab UIHelpers in Assets\Oculus\SampleFramework\Core\DebugUI\Prefabs, modified to fit your needs. You want the scene items to be interactable using the index trigger of the controllers, of course, so you modified the OVRInputModule behaviour attached to the EventSystem child of UIHelpers so that JoyPadClickButton is PrimaryIndexTrigger and/or SecondaryIndexTrigger.

You test the experience in the editor, maybe with a Rift attached, and everything is fine. You're happy, so you finally build your experience. And there you notice a problem: on the device, also the touchpad (in the case of Oculus Go) or the thumbstick (in the case of Oculus Quest) can be used to interact with the virtual elements, even if you specified that you want them to be operated only through the main trigger. And no matter what you change in the settings of the OVRInputModule class, this always happens. It seems that your controllers are possessed by an evil force that makes them use other buttons as the trigger! How can you solve that? How can you use only your trigger to interact with the virtual elements? How can you prevent the thumbstick and the touchpad from interacting with the UI elements of your experience?
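As a reference, this is the kind of setup I'm describing: assigning the index triggers as the click buttons of the OVRInputModule. Usually you do this in the inspector, but you can also do it from a little script like the one below. Take it as a minimal sketch: I'm assuming the joyPadClickButton public field that OVRInputModule exposes in recent versions of the plugin, so double-check the field name against your version.

using UnityEngine;

// Attach this to the EventSystem child of UIHelpers so that
// only the index triggers act as click buttons for the UI.
public class TriggerOnlyClick : MonoBehaviour
{
    void Awake()
    {
        OVRInputModule inputModule = GetComponent<OVRInputModule>();

        // joyPadClickButton is a bitmask of OVRInput.Button values
        inputModule.joyPadClickButton = OVRInput.Button.PrimaryIndexTrigger
            | OVRInput.Button.SecondaryIndexTrigger;
    }
}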

Oculus Quest and touch controllers

Well, you have to get your hands dirty and modify the code provided by Oculus, because no parameter in Unity will take you there (at least as far as my knowledge goes). You have to open the file OVRInputModule.cs, which is in Assets\Oculus\VR\Scripts\Util, and make a little modification to the code. Look for the following lines (currently at line 862 of the script, but this may vary a bit depending on the version):

#if UNITY_ANDROID && !UNITY_EDITOR
// On Gear VR the mouse button events correspond to touch pad events. We only use these as gaze pointer clicks
// on Gear VR because on PC the mouse clicks are used for actual mouse pointer interactions.
pressed |= Input.GetMouseButtonDown(0);
released |= Input.GetMouseButtonUp(0);
#endif

And comment them all out by adding a “//” before them. Basically, they have to become like this:

//#if UNITY_ANDROID && !UNITY_EDITOR
// // On Gear VR the mouse button events correspond to touch pad events. We only use these as gaze pointer clicks
// // on Gear VR because on PC the mouse clicks are used for actual mouse pointer interactions.
// pressed |= Input.GetMouseButtonDown(0);
// released |= Input.GetMouseButtonUp(0);
//#endif

As you can read from the comments, these lines were useful to adjust the input for Gear VR, but they create trouble in Go and Quest builds.
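If you prefer not to comment the lines out (for instance, to make merging future versions of the plugin easier), an alternative with the same effect is extending the #if condition with a scripting define symbol of your own, so the Gear VR behaviour can be re-enabled at any time just by adding that symbol in the Player Settings. Something like this, where GEARVR_TOUCHPAD_CLICK is just a hypothetical symbol name of my choice:

#if UNITY_ANDROID && !UNITY_EDITOR && GEARVR_TOUCHPAD_CLICK
// On Gear VR the mouse button events correspond to touch pad events. We only use these as gaze pointer clicks
// on Gear VR because on PC the mouse clicks are used for actual mouse pointer interactions.
pressed |= Input.GetMouseButtonDown(0);
released |= Input.GetMouseButtonUp(0);
#endif

Since the symbol is not defined by default, the block is compiled out exactly as if you had commented it, but your change to the Oculus code stays self-documenting.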

After you apply this fix, save the file and try to build again. At this point, your experience should use only the trigger to interact with elements!

I went crazy tracking down this little issue some months ago, and I discovered that other devs are having problems with it, so I decided to write this little tutorial to help the other XR devs of the world!

If this post has been useful to you, consider subscribing to my newsletter to make me happy as well! Happy coding 😉


