UI / UX design patterns in virtual reality

UX experts used to design forms, websites and smartphone applications. Now they need to relearn their trade to design interfaces and interactions in virtual reality.

Many traditional UI elements simply do not work in VR, and a significant part of the existing UX toolkit is inappropriate. To begin with, designing for a flat surface - our laptop or phone screens - is very different from designing for a spherical world with the user at its centre. Luckily, people are researching UI and UX in VR, and a few tried and tested patterns have emerged.

In this article we don't touch on motion tracking (that's a whole other huge topic), and mostly cover controls in VR.

Curved design

Take an element as simple as a rectangle, a picture or a video player. Put it into a VR environment and it just doesn't work, especially if it's too wide or tall. The edges of a flat surface sit further away from the user's point of focus, making them blurry and hard to read.

(c) Oculus

The solution is curved design: stop thinking of the canvas as a flat surface and imagine it as the inside of a sphere.
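
For illustration, here is a minimal TypeScript sketch of the idea: instead of one wide flat panel, UI tiles are laid out on the inside of a sphere, all at the same distance from the eyes. The radius and angular spacing are assumed values, not recommendations.

```typescript
// A minimal sketch of a "curved canvas": UI tiles laid out on the inside of a
// sphere around the user instead of on one wide flat plane, so every tile sits
// at the same distance from the eyes. Radius and spacing are assumed values.

interface TilePlacement {
  position: { x: number; y: number; z: number }; // metres, user at the origin
  yawDeg: number; // rotation around the vertical axis so the tile faces the user
}

function layoutOnSphere(
  tileCount: number,
  radius = 2.5,  // assumed comfortable viewing distance in metres
  stepDeg = 20,  // assumed angular gap between neighbouring tiles
): TilePlacement[] {
  const placements: TilePlacement[] = [];
  const startDeg = -((tileCount - 1) / 2) * stepDeg; // centre the row straight ahead

  for (let i = 0; i < tileCount; i++) {
    const yawDeg = startDeg + i * stepDeg;
    const yaw = (yawDeg * Math.PI) / 180;
    placements.push({
      // -z is "straight ahead" in this sketch's convention
      position: { x: radius * Math.sin(yaw), y: 0, z: -radius * Math.cos(yaw) },
      yawDeg: -yawDeg, // turn the tile back towards the viewer
    });
  }
  return placements;
}

// Example: five video thumbnails wrapped around the viewer instead of one wide strip.
console.log(layoutOnSphere(5));
```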

Narrowing the field

In VR, placing any meaningful control element at the periphery of the field of vision is a bad idea. It's simple biology: your vision just isn't sharp enough at the edges.

We are used to thinking in landscape or portrait format on the web and on smartphones. Neither works particularly well in VR, as both force the user to tilt their head too much. Instead, controls should be placed inside a roughly square (1:1) area in front of the user.
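
A rough sketch of how such a comfort check might look in TypeScript; the ±30° half-angles are illustrative assumptions and should be tuned through testing.

```typescript
// A rough "comfort zone" check: is a control inside a roughly square angular
// region in front of the user? The ±30° half-angles are illustrative assumptions.

interface Vec3 { x: number; y: number; z: number }

function isInComfortZone(
  control: Vec3,    // control position, with the user's head at the origin
  maxYawDeg = 30,   // assumed horizontal half-angle
  maxPitchDeg = 30, // assumed vertical half-angle (same as yaw, i.e. a square zone)
): boolean {
  // -z is "straight ahead" in this sketch's convention
  const yawDeg = Math.atan2(control.x, -control.z) * (180 / Math.PI);
  const pitchDeg = Math.atan2(control.y, Math.hypot(control.x, control.z)) * (180 / Math.PI);
  return Math.abs(yawDeg) <= maxYawDeg && Math.abs(pitchDeg) <= maxPitchDeg;
}

// A button slightly to the right and below eye level, two metres away: fine.
console.log(isInComfortZone({ x: 0.5, y: -0.3, z: -2 })); // true
```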

Using Z zones and depth

Objects that are too close to your eyes get blurry. Just raise your hand and start moving it towards your face: you simply can't maintain focus once it's closer than a few inches. Objects, and especially controls, should never be placed too close to the user.

(c) Manuel Clément

On the other hand, people are much better at judging the distance between two nearby objects than between two objects that are far away. If depth matters for the interaction, the objects should be placed reasonably close to the user's eyes.
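
As a sketch, a UI panel's distance could simply be clamped into an assumed comfort band: never so close that focus breaks down, and never so far that depth differences stop being readable. The 0.75 m and 3.5 m limits below are placeholder assumptions, not official guidance.

```typescript
// Clamp a panel's distance into an assumed comfort band: never closer than the
// near limit (focus breaks down), and not so far away that depth differences
// stop being readable. 0.75 m and 3.5 m are placeholder assumptions.

function clampUiDistance(requestedMetres: number, near = 0.75, far = 3.5): number {
  return Math.min(Math.max(requestedMetres, near), far);
}

console.log(clampUiDistance(0.2)); // 0.75 - pushed back out of the "too close" zone
console.log(clampUiDistance(12));  // 3.5  - pulled in so depth cues still work
```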

Motion flow

In virtual reality the camera is mapped to the player's head, with its focus at the centre of the view (unless you use eye tracking). This means you should never move the environment against the user's head movements, or force them to turn their head involuntarily. Both lead to motion sickness and nausea.

Of course, sometimes you want to guide the user through the interface, towards the direction where something is happening. This is where subtle tools like motion flow come into play. Pressing a button can trigger it to extend, or to slowly start moving, gently encouraging the user to turn their head. We are implying a direction, guiding the eye to the next point of interest.
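
A hedged TypeScript sketch of such a cue: after a press, the button drifts a short way towards the next point of interest, hinting at a direction without relocating itself. The duration, travel fraction and timer-based frame loop are assumptions for illustration.

```typescript
// A "motion flow" cue: after a press, drift the button a short way towards the
// next point of interest, implying a direction without forcing a head turn.
// The duration, travel fraction and frame loop below are assumptions.

interface Vec3 { x: number; y: number; z: number }

function lerp(a: Vec3, b: Vec3, t: number): Vec3 {
  return { x: a.x + (b.x - a.x) * t, y: a.y + (b.y - a.y) * t, z: a.z + (b.z - a.z) * t };
}

function motionFlowCue(
  buttonPos: Vec3,
  targetPos: Vec3,               // where we want the user to look next
  onUpdate: (pos: Vec3) => void, // hand the new position to the renderer
  durationMs = 600,              // assumed length of the nudge
  travelFraction = 0.15,         // move only a fraction of the way: a hint, not a relocation
): void {
  const goal = lerp(buttonPos, targetPos, travelFraction);
  const start = Date.now();
  const tick = () => {
    const t = Math.min((Date.now() - start) / durationMs, 1);
    onUpdate(lerp(buttonPos, goal, t));
    if (t < 1) setTimeout(tick, 16); // ~60 fps; use the engine's frame loop in practice
  };
  tick();
}

// Example: nudge a pressed button towards a panel that just opened on the right.
motionFlowCue({ x: 0, y: 0, z: -2 }, { x: 1.5, y: 0, z: -2 }, pos => console.log(pos));
```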

Moving interface

It's the opposite of motion flow: instead of urging the user to look in a specific direction, we move the new elements directly into their field of view. Standing at the centre of a sphere means you don't see, and don't know about, UI elements changing behind your back. Sometimes it's better to put them right in front of the user.
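
A minimal sketch, assuming a simple head-pose structure: a newly spawned panel is placed along the user's current gaze direction at a comfortable distance, rather than at a fixed world position that may be behind them.

```typescript
// A "moving interface" sketch: place a newly spawned panel along the user's
// current gaze direction at a comfortable distance, instead of at a fixed world
// position that may be behind their back. The head-pose shape is an assumption.

interface Vec3 { x: number; y: number; z: number }

interface HeadPose {
  position: Vec3; // head position in world space
  forward: Vec3;  // unit vector of the current gaze direction
}

function placeInFrontOfUser(head: HeadPose, distance = 2): Vec3 {
  // assumed comfortable distance of roughly two metres
  return {
    x: head.position.x + head.forward.x * distance,
    y: head.position.y + head.forward.y * distance,
    z: head.position.z + head.forward.z * distance,
  };
}

// A user looking slightly to the left gets the new panel spawned right there.
console.log(placeInFrontOfUser({
  position: { x: 0, y: 1.6, z: 0 },
  forward: { x: -0.3, y: 0, z: -0.95 }, // roughly unit length, looking ahead and left
}));
```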

Capturing attention

The long evolution of Homo sapiens trained us to pay attention to light and movement - otherwise we probably wouldn't be here, and predators would rule the Earth. This evolutionary skill is perfect for capturing focus in a virtual reality environment.

A great example is Lost, the Oculus Story Studio demo: it uses fireflies in a dark forest to lead the user's eyes. We expect to see more subtle, clever plays on light and shadow in VR applications and games.
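
As a small illustration, an attention cue could slowly pulse in brightness so that peripheral vision picks it up; the resting brightness, amplitude and pulse period below are arbitrary assumptions.

```typescript
// A light-based attention cue: slowly pulse the brightness of a point of
// interest so peripheral vision picks it up. Brightness range and pulse period
// are arbitrary assumptions; fast flashing is uncomfortable in VR.

function pulseBrightness(
  timeMs: number,
  base = 0.2,      // assumed resting brightness (0..1)
  amplitude = 0.6, // how strongly the cue glows on top of the base
  periodMs = 1500, // slow, gentle pulse
): number {
  const phase = (2 * Math.PI * timeMs) / periodMs;
  return base + amplitude * (0.5 + 0.5 * Math.sin(phase));
}

// Feed the result into the emissive intensity of the cue object every frame, e.g.:
// cue.material.emissiveIntensity = pulseBrightness(performance.now());
```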

Anchor objects

Virtual reality causes dizziness for a lot of people. Chances are that if you had a bad experience with VR you won't try it again - and you definitely won't try it a third time.

We are used to standing or sitting still while the world moves around us; driving is a good example. It doesn't cause nausea, because we have a visual anchor: the car's dashboard. Establishing such an anchor object in the virtual space is something interface designers should consider. Research suggests even a virtual nose can help - we all see our own nose all the time anyway, right?
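
A sketch of the idea in TypeScript, assuming a simplified yaw-only head pose: the anchor is re-positioned every frame relative to the head, so, like a car dashboard, it never moves within the user's view.

```typescript
// An anchor object: re-position a small, stable element (a cockpit edge, a
// reticle, a virtual nose) relative to the head pose every frame, so, like a car
// dashboard, it never moves within the user's view. The yaw-only pose below is a
// simplifying assumption; real engines expose a full camera transform.

interface Vec3 { x: number; y: number; z: number }

interface Pose {
  position: Vec3;
  yawDeg: number; // rotation around the vertical axis only, for brevity
}

function updateAnchor(headPose: Pose, anchorOffset: Vec3): Pose {
  const yaw = (headPose.yawDeg * Math.PI) / 180;
  // rotate the offset by the head's yaw so the anchor keeps its place in view
  const x = anchorOffset.x * Math.cos(yaw) - anchorOffset.z * Math.sin(yaw);
  const z = anchorOffset.x * Math.sin(yaw) + anchorOffset.z * Math.cos(yaw);
  return {
    position: {
      x: headPose.position.x + x,
      y: headPose.position.y + anchorOffset.y,
      z: headPose.position.z + z,
    },
    yawDeg: headPose.yawDeg, // the anchor turns with the head
  };
}

// Call once per rendered frame with the latest head pose, e.g.:
// anchor.pose = updateAnchor(camera.pose, { x: 0, y: -0.4, z: -0.8 });
```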

Holophonic sound

Holophonic means 3D sound (the word itself echoes holography). The idea is that, listening through the right equipment, you can tell whether a sound comes from above, below or behind you - not just from the left or right, as with stereo systems.

Holophonic sound is amazing in games, but it could very well be used for VR control interfaces too. Imagine a video starting to play outside your view: you would hear exactly where the sound comes from. We are not quite there yet, but not that far off either.
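
As a sketch of what this could look like on the web today, the Web Audio API's HRTF panner can already position a cue sound in 3D around the listener; the specific positions and the oscillator used as a sound source are just for illustration.

```typescript
// Spatialised audio cue using the Web Audio API's HRTF panner: a sound placed in
// 3D around the listener, so the user can tell where a UI event appeared before
// seeing it. Positions and the oscillator source are just for illustration, and
// browsers require a user gesture before an AudioContext may start.

const ctx = new AudioContext();

function playSpatialCue(source: AudioNode, x: number, y: number, z: number): void {
  const panner = new PannerNode(ctx, {
    panningModel: 'HRTF',   // head-related transfer function: full 3D, not just left/right
    distanceModel: 'inverse',
    positionX: x,
    positionY: y,
    positionZ: z,
  });
  source.connect(panner).connect(ctx.destination);
}

// Example: a short tone from behind the listener's right shoulder, hinting that
// a video has started playing back there (+z is behind the default listener).
const osc = ctx.createOscillator();
playSpatialCue(osc, 1, 0.5, 1);
osc.start();
osc.stop(ctx.currentTime + 0.3);
```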

Want to know more?

Mike Alger's VR Interface Design Manifesto is a great place to start:

Designing for VR: Environments and Interactions from Microsoft's Channel 9 is a good introduction:

Navigating New Worlds: Designing UI and UX in VR from Oculus Connect 2 is a fascinating watch if you have a deeper interest in the subject:

We also quite enjoyed Designing for Virtual Reality by Manuel Clément from Google I/O 2015: