Notebook of Principles

2021-2023
Product Designer & Prototyper
This is a collection of prototypes reflecting the design principles I follow.

Design Principles in VR Product Design

In the web and mobile product design industry, there are various design systems that designers usually follow: the Fluent Design System by Microsoft, Material Design by Google, the Human Interface Guidelines by Apple, etc. These design systems distill experience gained over many years of design practice; they provide guidelines and tools for designers and thus facilitate the development of future products.

Like web and mobile products, VR products also need to follow certain design principles to deliver the best user experience. However, at this stage there is no well-developed design system dedicated to VR products. The classic web and mobile design systems are still good references, but the dimensionality of the user experience in VR is naturally different, and additional guidelines are required for interactions in the extra dimension.

This is a collection of prototypes I have built over the past several years; they reflect the principles I have learned through my design work.

Spatial Interaction

The basis of spatial element interaction is the working area. Not only should anthropometric data be carefully considered, but the ergonomic conventions developed over many years in the real world also matter in VR. Close and far reach, comfortable viewing distance, typical working surface sizes, etc. are the fundamentals of a good user experience in the VR world.
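As an illustrative sketch (not part of the prototypes themselves), the working area can be expressed as simple distance bands around the user; the numeric values below are assumptions for demonstration, not recommended standards.

```typescript
// Checking whether a spatial element sits inside a comfortable working area.
// All distance values are illustrative assumptions.

type Vec3 = { x: number; y: number; z: number };

const CLOSE_REACH_M = 0.35;   // assumed near edge of the comfortable reach zone
const FAR_REACH_M = 0.65;     // assumed far edge of arm's reach
const MIN_VIEW_DIST_M = 0.5;  // assumed closest comfortable viewing distance
const MAX_VIEW_DIST_M = 2.0;  // assumed farthest distance for readable UI

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// A hand-interactable element should fall inside the reach band;
// a read-only panel only needs to sit inside the viewing band.
function isInWorkingArea(head: Vec3, element: Vec3, interactable: boolean): boolean {
  const d = distance(head, element);
  return interactable
    ? d >= CLOSE_REACH_M && d <= FAR_REACH_M
    : d >= MIN_VIEW_DIST_M && d <= MAX_VIEW_DIST_M;
}
```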

Spatial UI

Instructional Tags

Instruction tags are critical, especially for new users who are not yet sure how to interact with the product; however, most of the time these tags are not essential for the product to function. Therefore, instruction tags need to be visible when they are needed and unobtrusive the rest of the time. They should also be switchable, so they can explain the different interactions of different functionalities.

A set of prototypes testing switchable instruction tags. The thumbstick and buttons have one set of functions when the user is adding new furniture, and another set when the user is adjusting an existing piece of furniture. Instruction tags are arranged so users only see the information they need. Etched tags are used here since they are less distracting, and all buttons remain clearly visible in this case.
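The switching logic can be sketched as a mapping from interaction mode to tag set; the mode names, control names and labels below are hypothetical stand-ins for the furniture prototype described above.

```typescript
// Hypothetical sketch of switchable instruction tags: each interaction mode
// maps the same physical controls to a different set of labels, and tags are
// only rendered while the user appears to need guidance.

type Mode = "addFurniture" | "adjustFurniture";
type Control = "thumbstick" | "buttonA" | "buttonB";

const TAGS: Record<Mode, Partial<Record<Control, string>>> = {
  addFurniture:    { thumbstick: "Browse catalog", buttonA: "Place item", buttonB: "Cancel" },
  adjustFurniture: { thumbstick: "Rotate item",    buttonA: "Confirm",    buttonB: "Delete" },
};

function visibleTags(mode: Mode, userNeedsHelp: boolean): Partial<Record<Control, string>> {
  // Tags stay hidden unless the user seems unsure (e.g. idle for a few seconds).
  return userNeedsHelp ? TAGS[mode] : {};
}
```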

Interaction Feedback

Unlike in web and mobile applications, the spatial relationships between objects in VR are much more complex. Without proper feedback, users can hardly tell which objects are interactable. The interaction feedback mechanism must therefore be carefully designed, not only to indicate which object the user is interacting with, but also to imply whether an object is interactable and how it can be interacted with.
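One way to express this, as a minimal sketch with assumed states and thresholds, is a small feedback state machine that answers three questions: is this interactable, am I about to interact with it, and am I interacting with it right now.

```typescript
// Illustrative feedback states for a spatial object. States and distance
// thresholds are assumptions for demonstration.

type FeedbackState = "idle" | "highlighted" | "hovered" | "grabbed";

interface InteractableObject {
  interactable: boolean;
  distanceToHand: number; // metres from the nearest hand or controller
  isGrabbed: boolean;
}

const HIGHLIGHT_DIST_M = 0.4; // assumed distance at which an object hints it can be grabbed
const HOVER_DIST_M = 0.1;     // assumed distance at which the grab affordance is shown

function feedbackState(obj: InteractableObject): FeedbackState {
  if (!obj.interactable) return "idle";
  if (obj.isGrabbed) return "grabbed";
  if (obj.distanceToHand <= HOVER_DIST_M) return "hovered";
  if (obj.distanceToHand <= HIGHLIGHT_DIST_M) return "highlighted";
  return "idle";
}
```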

Menus & Tools

Hand Menus

Hand menus usually follow the movement of the hands and arms. When people interact with items attached to different parts of their hands or arms, their gestures vary; thus menus attached to different locations also have their own optimal ways of interaction.

Each type of menu has its own applicable typologies, functionalities, animations and feedback mechanisms. They can be used independently or in combination. Here is a set of prototypes testing optimal hand menu interactions and combinations.
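The core behaviour, sketched below in an engine-agnostic way with assumed math helpers, is that the menu panel follows an anchor on the hand while always facing the user's head, so it stays readable wherever the arm moves.

```typescript
// Minimal sketch of a hand-anchored menu pose. Offsets and helpers are assumptions.

type Vec3 = { x: number; y: number; z: number };

function add(a: Vec3, b: Vec3): Vec3 { return { x: a.x + b.x, y: a.y + b.y, z: a.z + b.z }; }
function sub(a: Vec3, b: Vec3): Vec3 { return { x: a.x - b.x, y: a.y - b.y, z: a.z - b.z }; }
function normalize(v: Vec3): Vec3 {
  const len = Math.hypot(v.x, v.y, v.z) || 1;
  return { x: v.x / len, y: v.y / len, z: v.z / len };
}

interface HandMenuPose {
  position: Vec3; // where the menu panel should be placed
  facing: Vec3;   // direction the panel should face (toward the head)
}

// offset is relative to the hand anchor, e.g. a few centimetres above the wrist.
function handMenuPose(hand: Vec3, head: Vec3, offset: Vec3): HandMenuPose {
  const position = add(hand, offset);
  const facing = normalize(sub(head, position)); // billboard toward the user
  return { position, facing };
}
```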

Controller Tip & Radial Side

When a controller is used, there are two other attachment areas: the radial side and the controller tip.
These two areas are not ideal for complex menus, but they are perfect for use as a cursor or a preview medium.
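A cursor anchored to the controller tip can be sketched as a ray cast along the controller's forward direction; the intersection with a surface becomes the cursor or the anchor for a small preview. The plane-intersection example below is a simplified assumption, not the prototypes' actual implementation.

```typescript
// Hypothetical controller-tip cursor: intersect the tip ray with a floor plane.

type Vec3 = { x: number; y: number; z: number };

interface Ray { origin: Vec3; direction: Vec3; } // direction assumed normalized

// Returns the cursor position on a horizontal plane at floorY,
// or null if the ray points away from the plane.
function cursorOnFloor(ray: Ray, floorY = 0): Vec3 | null {
  if (Math.abs(ray.direction.y) < 1e-6) return null;
  const t = (floorY - ray.origin.y) / ray.direction.y;
  if (t < 0) return null;
  return {
    x: ray.origin.x + ray.direction.x * t,
    y: floorY,
    z: ray.origin.z + ray.direction.z * t,
  };
}
```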

UI Element Positioning

Compared to web and mobile applications, VR products have an additional display dimension, which provides substantial visual space to interact with. UI elements should therefore be flexible, and their positioning requires special consideration.

A set of prototypes testing UI element flexibility and spatial constraints - UI elements start at a default location; they can be moved as needed within a certain distance; when they are pulled back toward the original location, they snap into place.
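The constrain-and-snap behaviour above can be sketched with two small functions; the radius values are assumptions for illustration.

```typescript
// Drag a panel within a radius of its default location; snap back when released nearby.

type Vec3 = { x: number; y: number; z: number };

const MAX_DRAG_RADIUS_M = 0.8; // assumed maximum distance from the default location
const SNAP_RADIUS_M = 0.15;    // assumed distance at which the panel snaps back

function length(v: Vec3): number { return Math.hypot(v.x, v.y, v.z); }
function scale(v: Vec3, s: number): Vec3 { return { x: v.x * s, y: v.y * s, z: v.z * s }; }
function sub(a: Vec3, b: Vec3): Vec3 { return { x: a.x - b.x, y: a.y - b.y, z: a.z - b.z }; }
function add(a: Vec3, b: Vec3): Vec3 { return { x: a.x + b.x, y: a.y + b.y, z: a.z + b.z }; }

// Called while dragging: keep the panel within the allowed radius.
function constrainDrag(defaultPos: Vec3, desiredPos: Vec3): Vec3 {
  const offset = sub(desiredPos, defaultPos);
  const d = length(offset);
  if (d <= MAX_DRAG_RADIUS_M) return desiredPos;
  return add(defaultPos, scale(offset, MAX_DRAG_RADIUS_M / d));
}

// Called on release: snap back when the panel is close enough to its default location.
function resolveRelease(defaultPos: Vec3, releasedPos: Vec3): Vec3 {
  return length(sub(releasedPos, defaultPos)) <= SNAP_RADIUS_M ? defaultPos : releasedPos;
}
```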

Not only can the UI itself be spatial, but the elements within the UI can be spatial as well. This gives VR an extra advantage when users inspect products, designs and virtual spaces.

3D display as a UI component - a prototype testing a 3D preview of the selected furniture
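A preview slot like this is often driven by a simple turntable update; the sketch below is a hypothetical illustration of that idea, with an assumed rotation speed.

```typescript
// Hypothetical 3D preview slot inside a UI panel: the selected model slowly
// rotates so the user can inspect it from all sides.

interface PreviewSlot {
  modelId: string | null; // currently previewed furniture
  yawDeg: number;         // current turntable angle
}

const TURNTABLE_SPEED_DEG_PER_S = 30; // assumed rotation speed

function selectFurniture(slot: PreviewSlot, modelId: string): void {
  slot.modelId = modelId;
  slot.yawDeg = 0; // restart the turntable for the new item
}

// Called every frame with the elapsed time in seconds.
function updatePreview(slot: PreviewSlot, dt: number): void {
  if (slot.modelId === null) return;
  slot.yawDeg = (slot.yawDeg + TURNTABLE_SPEED_DEG_PER_S * dt) % 360;
}
```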

System Tool Interaction

System tools are usually independent of the main tasks, so they should always float in front of all other spatial elements and interfaces. While a system tool is turned on, interaction with all other spatial elements should be disabled, yet the tool should still be able to override certain functionalities and settings within the space.

A set of prototypes testing a screenshot function - the camera ratio and FOV can be adjusted, and the reference plane is filtered out of the screenshot view
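A minimal sketch of this modal behaviour, with hypothetical names, is a manager that disables every other interactable while a system tool such as the screenshot tool is open, and restores them when it closes.

```typescript
// Hypothetical system-tool manager: other elements stop accepting input while
// a tool is open, and the tool may override a scene-level setting.

interface SpatialElement { id: string; acceptsInput: boolean; }

interface SceneSettings { referencePlaneVisible: boolean; }

class SystemToolManager {
  private activeTool: string | null = null;

  constructor(private elements: SpatialElement[], private settings: SceneSettings) {}

  isOpen(): boolean { return this.activeTool !== null; }

  open(tool: string): void {
    this.activeTool = tool;
    // Disable every other interactable while the system tool floats in front.
    this.elements.forEach(e => (e.acceptsInput = false));
    if (tool === "screenshot") this.settings.referencePlaneVisible = false; // supersede a scene setting
  }

  close(): void {
    this.activeTool = null;
    this.elements.forEach(e => (e.acceptsInput = true));
    this.settings.referencePlaneVisible = true;
  }
}
```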

Heads-up Display

A heads-up display is a UI element that users don't interact with directly. However, it is essential in some cases, since it sticks to certain positions within the user's viewport, which gives it the ability to display critical information without distracting users from the task itself.

Since it follows the user's view, it is important to consider basic human visual data as well as the FOV of the VR device itself.

A prototype of a HUD indicating the tools currently in use.
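One way to respect the device FOV, sketched below with assumed values, is to clamp the HUD element's angular offset from the view centre to a safe fraction of the field of view so it never gets clipped or pulls the eyes far from the task.

```typescript
// Keep a HUD element pinned inside a safe portion of the viewport.
// FOV and safe-zone values are illustrative assumptions.

const DEVICE_HORIZONTAL_FOV_DEG = 90; // assumed headset FOV
const SAFE_FRACTION = 0.6;            // keep the HUD inside the central 60% of the FOV

// Clamp a requested horizontal offset (degrees from view centre) to the safe zone.
function clampHudOffset(requestedDeg: number): number {
  const maxDeg = (DEVICE_HORIZONTAL_FOV_DEG / 2) * SAFE_FRACTION;
  return Math.max(-maxDeg, Math.min(maxDeg, requestedDeg));
}

// Example: a tool indicator pinned toward the lower-left corner of the view.
console.log(`Tool indicator offset: ${clampHudOffset(-35)} deg`); // clamps to -27
```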

To Be Continued...

More prototypes will be added to this collection; stay tuned...
