As a personal project, I designed and developed a new hand tracking keyboard for Oculus Quest, aiming to improve the hand tracking input experience.
The new design shows strong potential for improving input efficiency: a user test showed a more than 50% increase in typing speed.
Controller-free Hand Tracking is an experimental feature in Oculus Quest. When I first enabled this feature, I was amazed by the new level of immersion; I almost felt illusory sensations from my virtual hands.
However, when I tried to use the built-in keyboard to search for games, I had difficulty typing fluently. I would unintentionally delete what I had typed or input the wrong letters.
The Current Keyboard
The current interaction is similar to the controller keyboard experience: the user sees a target in the direction of their hands and pinches their index finger and thumb to select a letter.
Sometimes the direction of my hands is unclear, and my hand moves when I try to pinch. When I use backspace, I sometimes hold my fingers together too long and my input gets accidentally cleared.
Ideation: VR Keyboard
I want to create an old-fashioned 3D keyboard that mimics a keyboard in real life. By using 3D models and animations, I can give the keyboard materiality. The interaction will be intuitive, since it works the same way as a real keyboard, but I will need to adjust the design to compensate for the lack of physical feedback and the imprecision of tracking.
Hand tracking does not provide physical feedback the way controller buttons do. The original design solves this by introducing the pinch gesture, which gives a physical sensation. However, gesture detection is less precise and less intuitive than using a controller.
I want to create a more intuitive interaction by developing a virtual keyboard in Oculus. The main challenges here are:
- Hands are in the air, there is no physical sensation.
- Hand Tracking can be imprecise and glitchy.
- There is a trade-off between speed and accuracy.
Development & Preparation
I used Unity with the Oculus Quest package to develop the prototype. To have the flexibility to try out different designs, I created scripts and objects that generate a keyboard from a prefab of a single key. With this in place, it was easy to play around with the layouts, shapes, and styles of the keyboard.
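The generator logic itself is simple. Here is a rough sketch of the idea in Python rather than the Unity C# I actually used (names and numbers are illustrative): each row of a QWERTY layout is staggered like a real keyboard, and every key gets a local position where the key prefab would be instantiated.

```python
# Sketch of the keyboard-generation logic (illustrative; the real
# project instantiates a Unity key prefab at each computed position).
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def generate_layout(key_size):
    """Return (char, (x, y)) pairs, staggering rows like a physical keyboard."""
    keys = []
    for r, row in enumerate(ROWS):
        stagger = r * key_size * 0.5  # shift lower rows to the right
        for c, ch in enumerate(row):
            keys.append((ch, (c * key_size + stagger, -r * key_size)))
    return keys
```

Because the layout is data-driven, swapping in a different row list or key size is a one-line change, which is what made experimenting with layouts quick.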
Reflection & The Next Step
1. More tests with real users!
I only tested with a single friend of mine, because it's difficult to find testers during quarantine 😅. However, there are definitely opportunities to reach out to people online and send my app to users around the world. I want to hear what they think, especially from users who have spent time with this feature.
2. Talk to other creators and developers.
I have not seen any demo of keyboard input with hand tracking. Through my exploration, I can definitely see why the Oculus team chose the current design: its physical sensation and accuracy. I want to talk to developers around the world to hear their thoughts on this design and learn from their critical opinions.
3. Refine the experience.
I can continue to refine the experience by adding common features like spelling correction and word prediction. Also, since the typing experience depends on the scenario of each app, the design should work well across different applications.
My first iteration simply puts a trigger on each key and inputs the letter when the key is touched by the hand. I tried out different sizes for each key and different layouts.
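In Unity terms this is just a trigger collider per key. Conceptually (sketched here in Python, with hypothetical names), any contact from any part of the hand fires that key once, and the key re-arms when contact ends:

```python
def on_key_touched(key_char, typed, pressed_keys):
    """First-iteration logic: fire a key as soon as anything touches it.

    `pressed_keys` tracks keys currently in contact so one touch
    doesn't repeat every frame; the key re-arms once contact ends.
    """
    if key_char not in pressed_keys:
        pressed_keys.add(key_char)
        typed.append(key_char)

def on_key_released(key_char, pressed_keys):
    pressed_keys.discard(key_char)
```

Note that any part of the hand counts as a touch here, so brushing past neighboring keys fires them too.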
As you can see in the video, the result is... a disaster 🤯.
The accuracy of this design is really bad; it is rare to even get the correct input.
After several attempts, I gradually learned to control my gestures to avoid accidental touches. Using only two fingers, the input efficiency is pretty good when I manage to touch the correct buttons.
Although this prototype is very bad, it shows that a virtual keyboard with hand tracking has a good potential if I can improve the precision of typing.
I looked into the problems and found that it is usually the buttons below the target button that get touched accidentally.
My first thought was to decrease the sensitivity of these buttons.
In my second iteration, I changed the key feedback: I made the buttons more tolerant of touch, so that grazing contact is less likely to register.
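One way to express "higher tolerance" (a sketch of the idea, not my exact Unity code; the dwell value is illustrative): only register a press once contact has lasted beyond a small dwell time, so brief grazing touches are ignored.

```python
DWELL = 0.08  # seconds of sustained contact before a press counts (tuned by feel)

def should_register(contact_start, now, dwell=DWELL):
    """Return True once a touch has lasted at least `dwell` seconds."""
    return (now - contact_start) >= dwell
```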
However, the problem still persisted; my hand would even trigger buttons distant from my fingertips.
This reminded me of a keyboard in real life, where I rest my palm and idle fingers near other keys. Users carry this habit into virtual space, which results in accidental touches of the keys below and around the target key.
Based on the previous observation, I focused on distinguishing taps from individual fingers and eliminating ambiguous touches. I modified the original hand prefab from the Oculus package: in the new hands, I enlarged the collision radius of the fingertips and distinguished them from the other parts of the hand.
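The key change boils down to a filter: a key press only counts when the colliding object is a fingertip. A sketch of that check (the tag names are hypothetical; in Unity this would be a tag or layer test inside the trigger callback):

```python
# Hypothetical fingertip tags; in Unity these would be tags/layers
# on the enlarged fingertip colliders of the modified hand prefab.
FINGERTIP_TAGS = {"thumb_tip", "index_tip", "middle_tip",
                  "ring_tip", "pinky_tip"}

def register_touch(collider_tag, key_char, typed):
    """Only fingertip colliders may press keys; palm and knuckle contacts are ignored."""
    if collider_tag in FINGERTIP_TAGS:
        typed.append(key_char)
        return True
    return False
```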
In this video, the collision volumes of the fingertips are shown as spheres. This prototype is very successful; the new hands can handle fluent typing even on a relatively small keyboard.
Final Design (with Sound 🔉)
For the final design, I polished the keyboard's visuals and layout to find an optimal size. I also added sound feedback, which makes it feel much more responsive!
Then I asked my friend Helen to test the keyboard. Helen had never used an Oculus before, and I asked her to type the same sentence using both the built-in hand tracking input and my keyboard redesign.
She was asked to correct any misspellings. We repeated the process 3 times and calculated the average input time.
On average, Helen took 77 seconds using the built-in input and 31 seconds using the redesign.
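For reference, here is a quick check of what those numbers work out to:

```python
builtin, redesign = 77, 31  # average seconds per sentence from the test

time_saved = (builtin - redesign) / builtin  # fraction of time saved
speed_ratio = builtin / redesign             # relative typing speed

print(f"{time_saved:.0%} less time, {speed_ratio:.1f}x typing speed")
# → 60% less time, 2.5x typing speed
```

This is a single tester and a single sentence, but it comfortably supports the claim of a more than 50% increase in typing speed.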