
AWE: Doublepoint transforms your smartwatch into a mouse for XR

At AWE I was able to try a demo of Doublepoint WowMouse, an application for your smartwatch that transforms it into a 3D mouse for XR. It was one of my highlights of the event, so you should absolutely read about it.

Doublepoint

I have known Doublepoint since it was called Port6, when a video about its use of a smartwatch to activate home appliances went a bit viral on LinkedIn. My first call with Ohto Pentikainen, the CEO, and Jamin Hu, the CTO, happened in 2022… this company has been working on creating innovative smartwatch-based interfaces for quite a while now.

My hand wearing a smartwatch running an application developed by Doublepoint

The main idea behind Doublepoint's effort is that modern smartwatches have incredible sensors onboard, and these sensors allow you to do much more than what a smartwatch usually needs. The founders' intuition was to use these sensors to detect the pose and the gestures of the hand. In particular, their first application has been detecting when the index finger and the thumb touch each other.

Jamin explained to me that when the thumb touches the various fingers of the hand, the movement creates vibrations around the wrist, and the vibrations are different for each finger that is touched. So, using the sensor data read by the smartwatch on your wrist and some machine-learning magic, it is possible to detect when your thumb is touching your index fingertip. Basically, you can detect the air-tap gesture (which is used on Vision Pro, Quest, and HoloLens) with just a smartwatch on your wrist, without any need for gloves or external tracking cameras.
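To give a rough idea of the principle, here is a toy sketch I wrote myself. It is NOT Doublepoint's algorithm (which relies on a trained machine-learning model), but it shows how a fingertip tap, which produces a short burst of high-frequency vibration in the wrist accelerometer signal, could be spotted in code. All names, rates, and thresholds are made up:

```csharp
// Toy illustration of vibration-based tap detection (not Doublepoint's code).
// Idea: a fingertip tap creates a short high-frequency burst in the wrist
// accelerometer; a real system feeds such features to a trained ML classifier.
using System;
using System.Collections.Generic;

public class TapDetector
{
    private readonly Queue<double> window = new Queue<double>();
    private const int WindowSize = 20;          // ~100 ms at 200 Hz (assumed rate)
    private const double EnergyThreshold = 4.0; // hypothetical tuned value
    private double lastMagnitude = 9.81;        // gravity baseline

    // Feed one accelerometer sample (m/s^2); returns true on a candidate tap.
    public bool AddSample(double ax, double ay, double az)
    {
        double magnitude = Math.Sqrt(ax * ax + ay * ay + az * az);
        // The sample-to-sample change isolates high-frequency vibration
        // and ignores slow arm movements and the constant pull of gravity.
        double jerk = Math.Abs(magnitude - lastMagnitude);
        lastMagnitude = magnitude;

        window.Enqueue(jerk * jerk);
        if (window.Count > WindowSize) window.Dequeue();

        double energy = 0;
        foreach (double e in window) energy += e;
        return energy > EnergyThreshold;
    }
}
```

A real product would of course add debouncing and, as Jamin described, a classifier that tells which finger was touched from the shape of the vibration.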

WowMouse

Considering that smartwatches have IMUs onboard, you can also detect their orientation in space. So you can pair the tap detection with the orientation of the device and use the hand wearing the smartwatch as a 3DOF mouse. Doublepoint is doing exactly this, and it calls this solution “WowMouse”.

Trailer video for WowMouse

WowMouse is an application for Android smartwatches. After you have installed the application and paired your smartwatch with an XR headset, you can use your hand as a mouse for immersive realities, pointing at UI elements and clicking on them. WowMouse is currently compatible with the Magic Leap 2 and the Quest 2/3 headsets, and the suggested devices to run it on are the Samsung Galaxy Watch 4/5/6 and the Google Pixel Watch 2 smartwatches.

If you have a look at the documentation, you will notice one interesting thing: integrating WowMouse doesn't require installing any package inside your Unity application. The system works by plugging into the Unity Input System, so you just have to define a bunch of Actions and you're good to go. As a developer, I think that's great.
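Just to make the idea concrete, a minimal setup could look like the snippet below. This is my own sketch based on the standard Unity Input System API, and it assumes WowMouse surfaces as an ordinary pointer device; the exact control paths to bind are in Doublepoint's documentation:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class WowMouseInput : MonoBehaviour
{
    private InputAction point;
    private InputAction click;

    void OnEnable()
    {
        // Hypothetical bindings: check Doublepoint's documentation for the
        // real control paths; these are the generic Unity pointer ones.
        point = new InputAction("Point", binding: "<Pointer>/delta");
        click = new InputAction("Click", binding: "<Pointer>/press");
        click.performed += _ => Debug.Log("Air-tap detected!");
        point.Enable();
        click.Enable();
    }

    void Update()
    {
        // Pointer movement since last frame; move your UI cursor by it.
        Vector2 delta = point.ReadValue<Vector2>();
    }

    void OnDisable()
    {
        point.Disable();
        click.Disable();
    }
}
```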

But… why?

I guess many of you are thinking right now: why do we need this, when all the major headsets already have hand tracking? Why bother with this smartwatch approach if we can already use our bare hands as a mouse?

Well, the people from The Big Bang Theory would answer you: “Because we can”.

One of the most iconic moments for us engineers

But a more serious answer is that we are still in the early stages of VR, and it is interesting to explore different approaches to solving the same problem. While hand tracking has the advantage that it just works without any external hardware, the smartwatch approach has its own advantages, too:

  • It can provide haptic feedback to the hand via vibrations
  • It has low power consumption: the watch just takes the input from the hand, analyzes it locally, and sends the result to the headset, while camera-based hand tracking requires the headset to analyze millions of pixels every frame
  • You are not required to keep your hands in front of you to have them tracked: you can click with your hand in any position
  • You can wear industrial gloves and your gestures will still be detected (camera-based hand tracking would fail in this condition)
  • If you add more sensors to the smartwatch, you can use BMI techniques to detect even just the intention of the click, without the user actually performing the action. This is what CTRL-labs was doing before being acquired by Meta

This video by CTRL-labs gets me excited every time I watch it

Furthermore, this approach also works if you are not wearing any XR headset, as we will see in a while. So what Doublepoint is doing is worth keeping an eye on, to see which applications this solution will turn out to be ideal for.

Hands-on WowMouse at AWE

I had the opportunity to try WowMouse on the AWE show floor. There were two demos: one without glasses, dedicated to IoT, and one with glasses, about using the smartwatch as a controller for XR.

After the team at Doublepoint put the smartwatch on my wrist, I first tried the tap-gesture detection on its own: I looked at the smartwatch and started touching my thumb and index fingertips together. Every time I did it, the screen of the device flashed. The detection worked fairly well, catching the gesture every time I performed it properly. I had a few missed detections when I made the gesture weakly, though, so I understood that the gesture has to be performed in a decisive way.


After that, I tried the first demo: I had to stand at an exact spot in the booth, from which I could aim at two lamps placed in two different corners of the room. Pointing at a lamp and tapping my fingers selected it. Once the lamp was selected, I could rotate my palm to change the intensity of its light. And with a double tap, I could deselect the lamp.

The demo was pretty cool to try because it made me feel like I had superpowers: without any headset or remote, I could just point at the lamps to change their intensity. Technically speaking, I guess it was just a combination of IMU orientation detection (to understand which lamp I was aiming at, and then how much my wrist was rotating) and tap gesture detection, but the result was pretty cool anyway. I don't know if this can easily translate into a practical application, though, because to be used in a real home, the system should be able to understand which object I'm pointing at from any position in the room I may be in… but as a prototype, it was pretty cool and worked well.

I’m like Harry Potter, but without the wand and without the scar
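If my guess about how the demo works is right, its core logic could look something like the following sketch. This is entirely my speculation, written in Unity-style C# with made-up names and thresholds, not Doublepoint's code:

```csharp
using UnityEngine;

// Speculative reconstruction of the lamp demo logic: select the lamp
// closest to where the watch points, map wrist roll to brightness,
// deselect on double tap.
public class LampController : MonoBehaviour
{
    public Transform[] lamps;            // known lamp positions in the room
    public Transform watch;              // 3DOF pose from the watch IMU
    private Light selected;
    private float lastTapTime = -1f;
    private const float DoubleTapWindow = 0.4f; // hypothetical value, seconds

    // Called by the tap detector whenever a finger tap is recognized.
    public void OnTap()
    {
        if (selected != null && Time.time - lastTapTime < DoubleTapWindow)
            selected = null;             // double tap: deselect
        else if (selected == null)
            selected = FindAimedLamp();  // single tap: select the aimed lamp
        lastTapTime = Time.time;
    }

    void Update()
    {
        if (selected == null) return;
        // Map wrist roll (rotation around the forearm axis) to intensity.
        float roll = watch.eulerAngles.z;
        selected.intensity = Mathf.InverseLerp(0f, 180f, roll) * 2f;
    }

    private Light FindAimedLamp()
    {
        // Pick the lamp with the smallest angle to the watch's forward ray.
        // This only works from a known standing spot, which would explain
        // the exact position I had to take in the booth.
        Light best = null;
        float bestAngle = 20f;           // ignore lamps outside a 20° cone
        foreach (Transform lamp in lamps)
        {
            float angle = Vector3.Angle(watch.forward, lamp.position - watch.position);
            if (angle < bestAngle) { bestAngle = angle; best = lamp.GetComponent<Light>(); }
        }
        return best;
    }
}
```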

The second demo was made in partnership with Magic Leap and let me use WowMouse with the Magic Leap 2. I tried an application where many bubbles were flying around me, and I could click on them to make them pop. The interaction scheme was similar to the Vision Pro's: I looked at a bubble with my eyes and then tapped my fingers to interact with it. If you have read my review of the Apple Vision Pro, you know that I'm not a big fan of using the eyes as a mouse, but anyway, that was not the point of this demo. The interesting thing was that I could do the tap with my hand totally at rest. With the Vision Pro, or also with Ultraleap tracking, I constantly have to keep my hand slightly in front of me, to make it visible to the tracking cameras. In this case, since the detection happened through the sensors on my wrist, I did not have to do that, so I could keep my arm fully at rest, hanging down along my body. This was simply amazing, because the eye+hand interaction became much more relaxed and comfortable to perform. It was at that moment that I understood the potential of having a sensor on the wrist instead of using camera-based hand tracking.

My tests with WowMouse and Magic Leap 2

With the same application, I could also use WowMouse in a more traditional “mouse” sense. There were a few menus that I could control by using my arm as an “air mouse”: I pointed at the UI items and then clicked on them with the air-tap gesture. Notice that the mouse was only 3DOF, so it didn't work like a VR controller with which I could point exactly at the items I wanted. It was more like detecting whether I was moving my hand left or right and moving the pointer in that direction accordingly, a bit like a mouse moving the 2D pointer on a 2D computer screen. The system worked fairly well, even if I have to honestly say that, being used to exact 6DOF pointing, the 2D-like mouse movement felt a bit less pleasant to use.
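In code terms, my understanding is that the pointer moves by something like the frame-to-frame change in the watch's yaw and pitch angles, the way a desktop mouse reports deltas. Here is a rough sketch of that interpretation (mine, not Doublepoint's implementation, with hypothetical sensitivity values):

```csharp
using UnityEngine;

// Rough sketch of 3DOF "air-mouse" cursor movement: only the change in
// orientation matters, like a desktop mouse reporting position deltas.
public class AirMouseCursor : MonoBehaviour
{
    public RectTransform cursor;         // 2D cursor on a UI canvas
    public float sensitivity = 10f;      // hypothetical pixels per degree
    private Quaternion lastOrientation;

    void Start()
    {
        lastOrientation = GetWatchOrientation();
    }

    void Update()
    {
        Quaternion current = GetWatchOrientation();
        // Rotation since the last frame, relative to the previous pose.
        Quaternion delta = Quaternion.Inverse(lastOrientation) * current;
        Vector3 euler = NormalizeAngles(delta.eulerAngles);

        // Yaw moves the cursor horizontally, pitch vertically.
        cursor.anchoredPosition += new Vector2(euler.y, -euler.x) * sensitivity;
        lastOrientation = current;
    }

    private static Vector3 NormalizeAngles(Vector3 e)
    {
        // Map 0..360 angles to -180..180 so small rotations give small deltas.
        return new Vector3(
            e.x > 180f ? e.x - 360f : e.x,
            e.y > 180f ? e.y - 360f : e.y,
            e.z > 180f ? e.z - 360f : e.z);
    }

    private Quaternion GetWatchOrientation()
    {
        // Placeholder: in reality this would come from the smartwatch IMU
        // via WowMouse; here it is just a stand-in.
        return Quaternion.identity;
    }
}
```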

When I came home from AWE, I noticed that Upload had also written a post about Doublepoint, and that my friend Don Hopper had tried WowMouse at home with the Meta Quest 3, with good results.

Final impressions

I've been following Doublepoint for a while now, and I do it for a reason: they have pretty interesting ideas about using smartwatches to interact with XR headsets and with the environment around us. Their interaction detection is pretty solid, and I'm interested in seeing how these ideas develop, and whether the smartwatch will manage to find its use cases in XR and IoT, or if, in the end, bare hands will win. Only time will tell. In the meantime, good luck to the Doublepoint team!


