User interfaces in virtual reality are a problem yet to be solved.
VR, as its name suggests, further blends the digital and real worlds. The solution to its UI problem lies in the hinterland between digital and real-world interfaces.
I’ve been thinking about this problem for a long time. The first games I played in the current era of VR were small mini-game-like programs that booted up from the desktop and went straight into the gameplay. These games were great experiments in a booming field, but whether the UI problem was not considered or was simply too big to solve, these developers chose to ignore it completely. As virtual reality forges its own path in the modern world, its users expect a far more slick and polished virtual environment. This UI problem won’t go away.
As we look back at previous digital user interfaces, we see buttons everywhere. They are the most common and effective form of interactivity in the history of digital development. In your browser now, you see the refresh button, the back button, the close button – even the bookmarks highlight themselves when the mouse rolls over them and appear to depress while the mouse button is down. There are two reasons this works. The first is that the mouse is represented on the screen as a pointer: a precise, manoeuvrable digital tool. We don’t have this in virtual reality. The second is that the animation of the button matches the action we are performing. As we press down on the mouse button, the digital button is depressed too. This helps us make the connection that we made this happen – it is a crucial piece of feedback.
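That feedback loop – pointer position in, visual state out – can be sketched as a tiny state machine. This is a framework-agnostic Python sketch with illustrative names, not code from any particular browser or engine:

```python
# A minimal hover/press feedback model for a 2D button (illustrative names).
class Button:
    def __init__(self, x, y, width, height):
        self.rect = (x, y, width, height)
        self.state = "idle"  # idle -> hovered -> pressed

    def contains(self, px, py):
        x, y, w, h = self.rect
        return x <= px <= x + w and y <= py <= y + h

    def update(self, pointer_x, pointer_y, mouse_down):
        # The pointer gives a precise position; the button's visual state
        # mirrors the physical action, closing the feedback loop described above.
        if not self.contains(pointer_x, pointer_y):
            self.state = "idle"
        elif mouse_down:
            self.state = "pressed"   # drawn depressed, matching the finger
        else:
            self.state = "hovered"   # drawn highlighted
        return self.state
```

The point of the sketch is what VR lacks: without a pointer there is no `pointer_x, pointer_y` to feed in, so both halves of the loop break at once.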
This takes me on to real-world user interfaces. What are digital user interfaces trying to do? Navigate. Forward and back. Start game and settings. They are helping guide the user to the correct part of the program. How do we do this in the real world? The folder system in computers has converted this to buttons, but its origins are clear: a filing system requires a person to navigate a room by reading signs and understanding their implications – but they must move through real-world space. Perhaps this holds the solution to our UI problem; perhaps a Stanley Parable-like journey through a series of doors and corridors could provide a simple menu where the user is not required to use buttons at all. The movement problem in VR is another problem for another post.
As the demand for virtual reality programs increases, many inventive and creative people have taken it upon themselves to solve this problem. Let’s split their solutions into three types: controller solutions, vehicular solutions and staring solutions.
This is a controller solution: the player controls the spray can and the UI with their bespoke VR controllers (e.g. Oculus Touch). It’s clear that this developer considered what the player would be doing in the game. The game is about painting, so how do painters select which colour they’ll use? They hold a wooden palette in their off hand, carrying the selection of colours they are using for that painting. The developer has taken a digital painting palette and placed it on the player’s off hand. The player can access it easily, but it’s fixed to their hand and, therefore, anchored in the virtual world. This isn’t nearly as disorientating as having the palette fixed to the screen, or having some permanent on-screen button that opens it up.
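Anchoring UI to the off hand boils down to recomputing the palette’s world position from the controller’s pose every frame. A minimal sketch, assuming yaw-only rotation for simplicity (a real engine would use the controller’s full orientation quaternion):

```python
import math

def palette_world_position(hand_pos, hand_yaw, local_offset):
    """Re-anchor a hand-fixed UI element each frame.

    hand_pos:     (x, y, z) controller position in world space
    hand_yaw:     controller rotation about the vertical axis, in radians
    local_offset: where the palette sits relative to the hand
    """
    ox, oy, oz = local_offset
    cos_y, sin_y = math.cos(hand_yaw), math.sin(hand_yaw)
    # Rotate the local offset by the hand's yaw, then translate to the hand,
    # so the palette turns and moves with the controller.
    wx = hand_pos[0] + cos_y * ox + sin_y * oz
    wy = hand_pos[1] + oy
    wz = hand_pos[2] - sin_y * ox + cos_y * oz
    return (wx, wy, wz)
```

Because the palette is re-derived from the hand’s pose rather than from the camera, it stays put in the world as the player looks around – which is exactly why it doesn’t disorient.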
One area where VR particularly excels is vehicular games: games where the player doesn’t move, but controls a vehicle that moves them. People are already familiar with this kind of interaction from driving cars. A popular game that employs this solution is Elite Dangerous.
Here the UI is still anchored in the game world: it exists on the dashboard of whichever spacecraft the player is piloting. To bring up a menu, they look at the side of the dashboard that holds the UI for what they want to do; the game controls then transfer over to UI controls, and the player uses the joystick to scroll through the buttons and select the one they’d like to press, much as they would on a console UI. This is pretty seamless, as people generally look at the thing they’re trying to use. It can be taken too far, however.
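The hand-off from flight controls to UI controls hinges on one question: which panel, if any, is the player currently facing? A hedged sketch of that test – the panel layout and threshold here are hypothetical, not Elite Dangerous’ actual values:

```python
import math

def focused_panel(head_pos, head_forward, panels, threshold_deg=20.0):
    """Return the name of the dashboard panel the head is facing, or None.

    head_forward: unit vector of the head's facing direction
    panels:       maps panel names to world positions (hypothetical layout)
    Control should transfer to a panel's UI only while the gaze angle to
    that panel is within the threshold; otherwise flight controls apply.
    """
    best, best_angle = None, threshold_deg
    for name, pos in panels.items():
        to_panel = tuple(p - h for p, h in zip(pos, head_pos))
        norm = math.sqrt(sum(c * c for c in to_panel)) or 1.0
        dot = sum(f * c for f, c in zip(head_forward, to_panel)) / norm
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```

Returning `None` when no panel is within the threshold is what keeps the mechanic seamless: the joystick only stops flying the ship while the player is clearly looking at a menu.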
This brings me to the final solution. Even if you’ve only played a couple of VR games, it’s likely that you’ve come up against this design. In the history of game development, the gameplay comes first; the UI is made later, to fit around the game. This works for normal digital games, as the computer, console or phone already has good ways to interact with traditional buttons. However, making the UI part of the game world is essential for an immersive UI experience. The staring solution means making a choice by looking directly at it for a specified amount of time. But this is not how people look at things: people use a combination of moving their heads and their eyes. This is what the controller solutions approximate – they track the player’s eyes by using their hands.
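The staring mechanic is usually implemented as dwell-time selection: a per-frame timer that fills while the gaze rests on a choice and resets the moment it leaves. A minimal sketch (illustrative names; the dwell duration is an assumption, not a standard):

```python
class DwellButton:
    """Dwell-time ('staring') selection: the choice fires once the gaze
    has rested on it continuously for `dwell_seconds`."""

    def __init__(self, dwell_seconds=2.0):
        self.dwell_seconds = dwell_seconds
        self.gaze_time = 0.0

    def update(self, is_gazed_at, dt):
        """Call once per frame; returns True on the frame the dwell completes."""
        if not is_gazed_at:
            self.gaze_time = 0.0  # glancing away resets the timer
            return False
        self.gaze_time += dt
        if self.gaze_time >= self.dwell_seconds:
            self.gaze_time = 0.0
            return True
        return False
```

The reset-on-glance line is exactly where the mechanic fights human behaviour: because people naturally flick their eyes and heads around, the timer keeps resetting unless the player holds an unnaturally rigid stare.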
The best UI solution for your game depends on your game. But there are a few general rules: fix your UI elements in the real game world. Whether it’s on the vehicle’s dashboard, on the player’s hand, in the player’s lap – designate a game world space for your UI elements. If you need a main menu, make it similar to the VR world – let players make their choices in a context that fits your VR world. Perhaps you don’t need to have your main menu in VR. They’ve already navigated to your game without VR – maybe a ‘Put your 3D glasses on now’ moment isn’t out of place. Just make it easy.