UI/UX: Designing for AR & VR

What AR/VR is, how it works at a high level, what it means for UI/UX designers, and how you can begin to cut your teeth designing for AR and VR applications.

Overview

In the next ten years, especially with Facebook's rebrand to Meta, crypto, NFTs, and online-only content, AR & VR will only become a bigger part of how applications are built and used.

How does AR/VR work?

Both AR and VR leverage a device's computational power to generate dynamic, superimposed 2D/3D user interfaces and experiences in real time.
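
To make that loop concrete, here is a minimal sketch using the browser's WebXR Device API: the device grants an AR session, reports the viewer's pose every frame, and the app draws its UI on top. This is an illustration, not a prescription; the drawOverlay call is a placeholder, and native stacks (ARKit, ARCore, OpenXR) follow the same pattern with different APIs.

```typescript
// Minimal sketch, assuming a browser with WebXR support.
// `drawOverlay` is a placeholder for app-specific rendering, not a real API.
async function startAR(canvas: HTMLCanvasElement): Promise<void> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-ar"))) {
    console.warn("Immersive AR is not supported on this device.");
    return;
  }

  const gl = canvas.getContext("webgl", { xrCompatible: true })!;
  const session = await navigator.xr.requestSession("immersive-ar");
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace("local");

  // Every frame the device reports where the viewer is looking, and the app
  // redraws its 2D/3D UI on top of the real world in real time.
  const onFrame = (_time: DOMHighResTimeStamp, frame: XRFrame) => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // drawOverlay(gl, pose); // placeholder: render UI anchored to the pose
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```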

Zone 3: Rough view

This zone extends from the edge of decent view out to roughly another 10-15° radially. It is a good place for information that is not absolutely necessary but that the user should still be able to glance at when they want to.
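
If you like to think in numbers, a small sketch helps: given a yaw/pitch offset from the central line of sight, you can compute where a glanceable panel should sit in 3D space. The 25° yaw and 1.5 m distance below are illustrative assumptions, not figures from this article.

```typescript
// Sketch: convert an angular offset from the forward (-Z) gaze direction into
// a 3D anchor position. The angle and distance values are illustrative only.
interface Vec3 { x: number; y: number; z: number; }

function placeAtAngle(yawDeg: number, pitchDeg: number, distanceM: number): Vec3 {
  const yaw = (yawDeg * Math.PI) / 180;
  const pitch = (pitchDeg * Math.PI) / 180;
  return {
    x: distanceM * Math.sin(yaw) * Math.cos(pitch),  // positive yaw = to the right
    y: distanceM * Math.sin(pitch),                  // positive pitch = upward
    z: -distanceM * Math.cos(yaw) * Math.cos(pitch), // -Z is straight ahead
  };
}

// A non-essential status panel, nudged outward where a deliberate glance finds it.
const statusPanelPosition = placeAtAngle(25, 0, 1.5);
```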

Approaching the design

Box your AR designs in a way that allows for maximum usability with minimal potential for eye strain.

Bringing it all together

As we march toward a future where AR and VR dominate the cross-platform application marketspace, it is worth learning now how to work with and design for them.

Real-time Shared Experiences

You will need to account for real-time shared experiences, where multiple users see and interact with the same scene at the same time.
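
In practice that usually means each participant streams a lightweight pose update and everyone else renders it. Here is a minimal sketch assuming a WebSocket backend; the URL, the message shape, and the updateRemoteAvatar helper are all made up for illustration.

```typescript
// Sketch of pose sharing over WebSocket. The endpoint, message format, and
// updateRemoteAvatar helper are assumptions, not an API from this article.
interface PoseMessage {
  userId: string;
  position: [number, number, number];
  orientation: [number, number, number, number]; // quaternion (x, y, z, w)
  timestamp: number;
}

const socket = new WebSocket("wss://example.com/shared-session"); // placeholder URL

function broadcastPose(pose: PoseMessage): void {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(pose));
  }
}

socket.addEventListener("message", (event: MessageEvent<string>) => {
  const remote: PoseMessage = JSON.parse(event.data);
  // updateRemoteAvatar(remote); // placeholder: move the other user's avatar
});
```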

What does this mean for UI/UX designers?

AR and VR are VERY different animals: one overlays interfaces on the world the user is already in, while the other replaces that world entirely.

How to start designing for AR/VR

Design for the human eye itself: its field of view and its limited comfortable range of rotation set hard constraints on where your UI can live.

Zone 4: Nope

AVOID putting anything the user needs to see or interact with outside a radius of about 30° from the eye's central line of sight; this is roughly the maximum rotation the average human eye is capable of.
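
That 30° limit is easy to turn into a design-time check. The sketch below computes the angle between the viewer's forward direction and a candidate element position and warns when it falls outside the usable range; the function names are mine, not from the article.

```typescript
// Sketch: flag UI placed beyond ~30° of the eye's central line of sight.
const MAX_USABLE_ANGLE_DEG = 30; // average maximum rotation of the human eye

interface Vec3 { x: number; y: number; z: number; }

function angleBetweenDeg(forward: Vec3, toElement: Vec3): number {
  const dot = forward.x * toElement.x + forward.y * toElement.y + forward.z * toElement.z;
  const len = (v: Vec3) => Math.hypot(v.x, v.y, v.z);
  const cos = dot / (len(forward) * len(toElement));
  // Clamp to avoid NaN from floating-point drift before taking the arccosine.
  return (Math.acos(Math.min(1, Math.max(-1, cos))) * 180) / Math.PI;
}

function warnIfOutOfSight(forward: Vec3, toElement: Vec3): void {
  const offset = angleBetweenDeg(forward, toElement);
  if (offset > MAX_USABLE_ANGLE_DEG) {
    console.warn(`Element sits ${offset.toFixed(1)}° off-axis; users may never see it.`);
  }
}
```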

Surface-less interactions

Interactions in AR/VR will be nearly 100% virtual, and any haptics (physical feedback) they produce will be device-specific and non-standard, so don't rely on them as your only feedback channel.
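
In code terms, that means feature-detecting haptics rather than assuming them. A hedged sketch, assuming a WebXR controller that exposes the experimental haptic actuator API (support varies widely by device and browser):

```typescript
// Sketch: fire a haptic pulse if the controller supports it, otherwise report
// failure so the caller can fall back to a visual or audio cue.
// `pulse` is experimental and non-standard; behaviour differs per device.
function tryHapticPulse(source: XRInputSource, intensity = 0.6, durationMs = 50): boolean {
  const actuator = source.gamepad?.hapticActuators?.[0] as any;
  if (actuator && typeof actuator.pulse === "function") {
    actuator.pulse(intensity, durationMs); // may silently no-op on some hardware
    return true;
  }
  return false; // no haptics available: show a visual/audio confirmation instead
}
```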

Multiple input types

Most interactions will be visual and tracked with eye movement, but expect thumb controls, reach depth, speech recognition, head-tilt gestures, and a whole lot more.
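
One way to stay sane is to route everything through the event model your platform already gives you and treat each input channel as swappable. A sketch assuming WebXR plus the (prefixed, browser-specific) Web Speech API; the commented-out handlers are placeholders:

```typescript
// Sketch: dispatch a WebXR "select" by how it was produced, with voice as a
// separate channel. The handle* calls are placeholders, not real functions.
function wireUpInputs(session: XRSession): void {
  session.addEventListener("select", (event: XRInputSourceEvent) => {
    switch (event.inputSource.targetRayMode) {
      case "gaze":            // eye/head-directed selection
        // handleGazeSelect(event);
        break;
      case "tracked-pointer": // hand or controller ray
        // handlePointerSelect(event);
        break;
      case "screen":          // tap on a handheld AR screen
        // handleScreenTap(event);
        break;
    }
  });

  // Voice is yet another channel; the Web Speech API is one (prefixed) option.
  const SpeechRecognitionCtor =
    (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
  if (SpeechRecognitionCtor) {
    const recognizer = new SpeechRecognitionCtor();
    recognizer.onresult = (e: any) => {
      // handleVoiceCommand(e.results[0][0].transcript);
    };
    recognizer.start();
  }
}
```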
