Chapter 5 - Interaction Patterns and Techniques
• Different interaction techniques that are similar are grouped under the same interaction pattern.
Credit: An awesome Virtual Reality pic! Would you walk the plank? (pinterest.com)
Credit: Virtual Reality Training Can Aid in Stroke Recovery, GLENN MCDONALD
• There are too many existing interaction techniques to remember, and many more will be developed.
• Organizing interaction techniques under a broader interaction pattern makes them easier to design by focusing on conceptual utility and higher-level design concepts.
• Broader pattern names and concepts make it easier to communicate interaction decisions.
Interaction Patterns
• Selection,
• Manipulation,
• Viewpoint control,
• Indirect control, and
• Compound patterns
• The FIRST FOUR are used sequentially (e.g., a user may travel toward a table, select a tool on the table, and then use that tool to manipulate other objects on the table) or can be integrated into compound patterns.
• Useful interaction techniques are then described within the context of the
broader pattern.
• The intent for describing these patterns and techniques is for readers to:
• use them,
• extend them, and
• be inspired to create new ways of interacting within VR.
Selection
Selection is the specification of one or more objects from a set.
• Just like in the real world, where you might use your hands to grab an object or point at
something, in VR, you have various methods to select and manipulate virtual objects.
• The Go-Go technique expands upon the concept of the non-realistic hand by enabling one to reach far beyond personal space.
• The virtual hand is mapped one-to-one to the physical hand when within 2/3 of the full arm's reach; when extended further, the hand "grows" in a nonlinear manner, enabling the user to reach further into the environment.
• Enables closer objects to be selected (and manipulated) with accuracy while allowing further
objects to be easily reached.
• Physical aspects of arm length and height have been found to be important for the go-go
technique, so measuring arm length should be considered when using this technique.
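The nonlinear mapping described above can be sketched as follows. The 2/3-of-arm's-reach threshold comes from the slide; the quadratic growth term and the gain constant `k` are illustrative assumptions in the spirit of the original Go-Go technique, not the exact published constants.

```python
def gogo_virtual_reach(real_reach: float, arm_length: float, k: float = 1.6) -> float:
    """Map the physical hand's distance from the body to the virtual hand's
    distance (Go-Go technique sketch).

    Within 2/3 of the full arm's reach the mapping is one-to-one, so nearby
    objects can be selected accurately; beyond that threshold the virtual
    hand "grows" quadratically, letting the user reach far into the scene.
    `k` is an illustrative gain constant.
    """
    threshold = (2.0 / 3.0) * arm_length  # per-user: measure arm length (see slide)
    if real_reach < threshold:
        return real_reach                 # linear zone: precise near-field work
    return real_reach + k * (real_reach - threshold) ** 2
```

Because the threshold depends on arm length, measuring each user's arm (as the slide recommends) keeps the linear zone correctly sized.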
Pointing Pattern

Description:
• One of the most fundamental and often-used patterns for selection.
• It extends a ray into the distance, and the first object intersected can be selected via a user-controlled trigger.
• Most typically done with the head, or a hand/finger.

When to use:
• Better for selection than the Hand Selection pattern unless realistic interaction is required: for selection beyond personal space and when small hand motions are required.
• It is faster when speed of remote selection is vital, but often used for precisely selecting close objects, e.g., pointing with the dominant hand to select a component on a panel held in the non-dominant hand.

Limitations:
• Selection by pointing is not appropriate when realistic interaction is required.
• Straightforward implementations result in difficulty selecting small objects in the distance.
• Pointing with the hand can be imprecise due to natural hand tremor.
• Object snapping and precision mode pointing can lessen the problem.
Pointing Pattern
• Hand Pointing ("Raycasting" technique)
• Object Snapping
• Precision mode pointing
• Two-handed pointing
https://umtl.cs.uni-saarland.de/research/projects/selection-based-text-entry-in-virtual-reality.html
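Hand pointing by raycasting reduces to a first-hit query along the pointing ray. A minimal sketch, assuming each selectable object is approximated by a bounding sphere (the object names and geometry below are invented for illustration):

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Distance along a normalized ray to a sphere, or None if missed."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None

def pick_first(origin, direction, spheres):
    """Select the FIRST object intersected by the pointing ray (the pattern's
    rule): among all hits, return the one with the smallest ray distance."""
    best = None
    for name, center, radius in spheres:
        t = ray_sphere_t(origin, direction, center, radius)
        if t is not None and (best is None or t < best[1]):
            best = (name, t)
    return best[0] if best else None
```

The user-controlled trigger simply calls `pick_first` with the current head or hand ray when pressed.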
Pointing Pattern
Examples:
• Hand pointing
• Head pointing
• Eye gaze selection
• Object snapping
• Precision mode pointing
• Two-handed pointing

Eye gaze selection
• Selection with eye tracking: the user looks at the item of interest and then provides a signal to select the object.
• On its own this is not a good technique due to the Midas Touch problem: people expect to look at things without that look 'meaning' something, and they are not normally accustomed to operating a device simply by moving their eyes.

Credit: Patryk Piotrowski and Adam Nowosielski, Gaze-Based Interaction for VR Environments (2019), Advances in Intelligent Systems and Computing book series (AISC, volume 1062)
Pointing Pattern: Object Snapping
• It works by objects having scoring functions that cause the selection ray to snap/bend toward the object with the highest score.
• It works well when selectable objects are small and/or moving.

Credit: Caitlin Webb (2017) - From VR training to broadband on a plane – the future of innovation is in our pockets. https://horizon-magazine.eu/
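One common way to realize such a scoring function is angular proximity: score each candidate by how close it lies to the pointing ray and snap to the best one inside a small cone. The cone size and the angle-based score below are assumptions for illustration; real implementations may also weight by distance or object size.

```python
import math

def snap_target(origin, direction, objects, max_angle_deg=10.0):
    """Object snapping sketch: the object with the highest score wins, where
    the score here is simply smaller angular offset from the (normalized)
    selection ray, restricted to an assumed snapping cone."""
    best_name, best_angle = None, math.radians(max_angle_deg)
    for name, pos in objects:
        to_obj = [p - o for p, o in zip(pos, origin)]
        norm = math.sqrt(sum(v * v for v in to_obj))
        if norm == 0.0:
            continue
        cos_a = sum(d * v for d, v in zip(direction, to_obj)) / norm
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle < best_angle:          # higher score == smaller angle
            best_name, best_angle = name, angle
    return best_name
```

Because the scoring tolerates imprecise rays, small or moving objects remain selectable despite hand tremor.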
Precision Mode Pointing

Image-Plane Selection Pattern
• Head crusher
• Sticky finger
• Lifting palm
• Framing hands

Sticky finger
• It provides an easier gesture: the object underneath the user's finger in the 2D image is selected.

• The movements and actions performed with a hand-held prop control the orientation of a neurological dataset (Hinckley et al. 1994).
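Image-plane techniques such as sticky finger are commonly implemented by casting a ray from the eye through the fingertip: the first object that ray hits is exactly the object under the finger in the 2D image. A minimal sketch of building that ray (the scene's first-hit query is assumed to exist elsewhere):

```python
import math

def sticky_finger_ray(eye, fingertip):
    """Build the selection ray for the sticky finger technique.

    The ray starts at the eye and passes through the tracked fingertip;
    feeding it to any first-hit scene query selects the object underneath
    the finger in the 2D image.
    """
    v = [f - e for f, e in zip(fingertip, eye)]
    n = math.sqrt(sum(c * c for c in v))
    direction = [c / n for c in v]      # normalized eye-to-fingertip direction
    return eye, direction
```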
• For example, users may have access to a virtual paintbrush tool that
allows them to draw or paint directly onto virtual surfaces within the VR
environment.
• Similarly, users may use a virtual sculpting tool to shape and mold virtual
objects as if they were working with clay or other sculpting materials in
the real world.
3D Tool Pattern

Description:
• It enables users to directly manipulate an intermediary 3D tool with their hands that in turn directly manipulates some object in the world.
• An example is a stick to extend one's reach, or a handle on an object that enables the object to be reshaped.

When to use:
• Use to enhance the capability of the hands to manipulate objects. For example, a screwdriver provides precise control of an object by mapping large rotations to small translations along a single axis.

Limitations:
• It can take more effort to use if the user must first travel and maneuver to an appropriate angle in order to apply the tool to an object.
Credit: arm-hud-widget
https://blog.leapmotion.com/hovercast-vr-menu-power-fingertips/arm-hud-widget/
3D Tool Pattern
• Hand-held tools
• Object-attached tools
• Jigs

Object-attached tools
• A manipulated tool that is attached to/co-located with an object.
• It results in a more coupled signifier representing the affordance between the object, tool, and user.
• For example, a color icon might be located on an object; the user simply selects the icon, at which point a color cube appears so the user can choose the color of the object.
• Or if the shape of a box can be changed, then an adjustment tool can be made available on the corners of the box, for example, dragging a vertex.
Tilt Brush enables the user to change the brush they use to paint in their VR environment. It also has overall scene options for saving, copying, etc.
Credit: https://medium.com/@lucycarp/vr-menu-patterns-and-use-cases-f30fd8b5ef36
Jigs

Viewpoint Control
For example:
• Moving the viewpoint to the left is equivalent to moving the world to the right.
Warning:
• Some of these patterns may induce motion sickness and are NOT APPROPRIATE for users new to VR or those who are sensitive to scene motion.
• Sickness can be reduced by integrating the techniques with the suggestions that will be discussed later.
Viewpoint Control Patterns
+ allow users to explore virtual spaces, change their viewing angle, and interact with content from
different perspectives.
+ dictate how users move around and change their perspective within VR. Users can navigate
through the virtual environment, look around, zoom in or out, and change their viewpoint to
interact with objects or explore different areas.
1. Teleportation: Users can select a destination within the VR environment and instantly transport themselves
there, allowing for quick and seamless navigation without physically walking.
2. Free Movement: Users can physically walk or use handheld controllers to move within the VR space, like
walking in the real world. This provides a sense of immersion and freedom of movement.
3. Flying: Users can fly or float through the VR environment, allowing them to explore from a bird's-eye view
or navigate through obstacles with ease.
4. Grab and Drag: Users can grab onto the environment and physically drag themselves to move around. This
method is like pulling oneself along a railing or ledge.
5. Vehicle Controls: Users can pilot virtual vehicles, such as cars, planes, or boats, to navigate through the VR
environment.
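Technique 1 (teleportation) reduces to a discrete viewpoint update. A minimal sketch, assuming a simple viewpoint record with a position and yaw (the field names are made up): the selected floor destination sets the horizontal position while eye height and orientation are preserved, and because no intermediate frames of self-motion are rendered, little vection is shown to the user.

```python
def teleport(viewpoint: dict, destination: tuple) -> dict:
    """Instantly move the viewpoint to the selected floor destination.

    The jump is discrete: the horizontal position changes in a single frame,
    eye height and yaw are kept, and no intermediate scene motion is shown.
    """
    x, _, z = destination                    # destination is a point on the floor
    eye_height = viewpoint["position"][1]    # preserve the user's eye height
    viewpoint["position"] = (x, eye_height, z)
    return viewpoint
```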
Walking Pattern

Description:
• It controls motion of the feet to control the viewpoint.
• Walking includes everything from real walking to mimicking walking by moving the feet when seated.

When to use:
• It matches or mimics real-world locomotion and therefore provides a high degree of interaction fidelity.
• It enhances presence and ease of navigation as well as spatial orientation and movement understanding.
• Ideal for navigating small to medium-size spaces, and such travel results in no motion sickness.

Limitations:
• It is not appropriate when rapid or distant navigation is important.
• True walking across large distances requires a large tracked space, and for wired headsets cable tangling can pull on the headset and be a tripping hazard.
• Fatigue can result with prolonged use.
Walking Pattern
• Real walking
• Redirected walking
• Walking in place
• The human joystick

Real walking
• It matches physical walking with motion in the virtual environment: the mapping from real to virtual is one-to-one.
• An ideal interface for many VR experiences (high fidelity).
• Implementations typically do not measure foot motion directly, but instead track the head.
• It results in less motion sickness due to better matching of visual and vestibular cues.
• Unfortunately, real walking by itself limits travel in the virtual world to the physically tracked space.
https://www.slideshare.net/marknb00/lecture-5-3d-user-interfaces-for-virtual-reality
Steering Pattern
• Navigation by leaning
• Gaze-directed steering
• Torso-directed steering
• One-handed flying
• Two-handed flying
• Dual analog stick steering
• World-grounded steering device
• Virtual steering device

Torso-directed steering
• Also called chair-directed steering when the torso is not tracked. Utilized when traveling over a terrain, it separates the direction of travel from the way one is looking.
• It has more interaction fidelity than gaze-directed steering, since in the real world one does not always walk in the direction the head is pointed.
• When the torso or chair is not tracked, a general forward direction can be assumed.
• This technique can be more nauseating for inexperienced users.
• Visual cues can be provided to help users maintain a sense of the forward direction.

In half of the conditions, leaning (left) was used to control velocity; in the second half, a gamepad (right). In all conditions, the subjects sat in a swivel chair and wore an HTC Vive Pro extended with a wireless adapter as well as an additional torso tracker, which was mounted to the chest using a strap for action cameras.
One-handed flying
• It works by moving the user in the direction the finger or hand is pointing.
• Velocity can be determined by the horizontal distance of the hand from the head.
Two-handed flying
• It works by moving the user in the direction determined by the vector between the two hands; the speed is proportional to the distance between the hands.
• Flying backward is more easily accomplished with two hands than with one-handed flying, which requires an awkward hand or device rotation, by simply swapping the location of the hands.
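Both flying techniques reduce to computing a velocity from tracked poses, as described above. A sketch under those descriptions; the gain constants and the choice of which hand anchors the two-handed vector are assumptions:

```python
import math

def one_handed_flying_velocity(head, hand, pointing_dir, gain=1.0):
    """Move in the direction the hand points; speed comes from the horizontal
    (xz-plane) distance of the hand from the head, per the slide."""
    dx, dz = hand[0] - head[0], hand[2] - head[2]
    speed = gain * math.hypot(dx, dz)
    return [speed * c for c in pointing_dir]

def two_handed_flying_velocity(from_hand, to_hand, gain=1.0):
    """Move along the vector between the two hands; speed is proportional to
    the distance between them, so the raw vector scaled by a gain suffices.
    Swapping which hand is 'from' reverses the vector (backward flight)."""
    return [gain * (t - f) for f, t in zip(from_hand, to_hand)]
```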
Dual analog stick steering
• Also known as joysticks or analog pads.
• Works well for steering over a terrain, that is, forward/back and left/right.
• The standard first-person game controls should be used, where the left stick controls 2D translation.
• It is intuitive and consistent with traditional first-person video games: gamers already understand how to use such controls, so there is little learning curve.
• Virtual rotations can be disorienting and sickness inducing for some people.
• Thus, the designer might design the experience to have the content consistently in the forward direction so that no virtual rotation is required.
• Alternatively, if the system is wireless and the torso or chair is tracked, then there is no need for virtual rotations, since the user can physically rotate 360°.
https://www.pcmag.com/news/beyond-the-gamepad-alternative-controllers-for-your-nintendo-switch-ps4
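The standard left-stick mapping above can be sketched as follows: the stick's y-axis moves along the forward direction, its x-axis strafes, and the result is rotated into world space by the current yaw (which may come from the tracked torso or chair, avoiding virtual rotation). The speed constant is illustrative.

```python
import math

def stick_translation(stick_x, stick_y, forward_yaw_deg, speed=2.0):
    """Standard first-person controls: left stick -> 2D terrain translation.

    stick_y moves along the forward direction, stick_x strafes; the yaw
    (degrees) rotates the result into the world's xz-plane.
    """
    yaw = math.radians(forward_yaw_deg)
    fx, fz = math.sin(yaw), math.cos(yaw)    # forward vector on the terrain
    rx, rz = math.cos(yaw), -math.sin(yaw)   # right vector (perpendicular)
    vx = speed * (stick_y * fx + stick_x * rx)
    vz = speed * (stick_y * fz + stick_x * rz)
    return vx, vz
```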
World-grounded steering devices
• Input devices such as flight sticks or steering wheels are often used to steer through a world.
• Such devices can be quite effective for viewpoint control due to the sense of actively controlling a physical device.
https://www.lavrock.io/vr-training-simulator-software/
Virtual steering devices
• Visual representations of real-world steering devices, although they do not actually physically exist in the experience, that are used to navigate through the environment.
• More flexible than physical devices, as they can be easily changed in software.
• But difficult to control due to having no proprioceptive force feedback.
https://factschronicle.com/microsoft-expands-project-xcloud-plans-to-xbox-vr-and-other-devices-24236.html
3D Multi-Touch Pattern

Description:
• Enables simultaneous modification of the position, orientation, and scale of the world with the use of two hands.
• Translation is obtained by grabbing and moving space with one hand (monomanual interaction) or with both hands (synchronous bimanual interaction).

When to use:
• For non-realistic interactions when creating assets, manipulating abstract data, viewing scientific datasets, or rapidly exploring large and small areas of interest from arbitrary viewpoints.

Limitations:
• Not appropriate when the user is confined to the ground.
• It can be challenging to implement, as small distinctions can affect the usability of the system.
3D Multi-Touch Pattern
• Digital ArtForms' Two-Handed Interface
• The spindle

Digital ArtForms' Two-Handed Interface
• Scale and rotation occur about the middle point between the two hands.
• Rotation of the world is accomplished by grabbing space with both hands and orbiting the hands about the midpoint between the hands, like grabbing a globe on both sides and turning it.
• Once learned, this implementation works well when navigation and selection/manipulation tasks are frequent and interspersed.
• It gives users the ability to place the world and objects of interest into personal space at the most comfortable working pose via position, rotation, and scaling operations.

The figure shows a schematic for manipulating the viewpoint.
https://www.youtube.com/watch?v=Tw1mXjMshJE
The spindle
• It consists of geometry connecting the two hands, called visual integration, along with a visual indication of the center of rotation/scale; this dramatically helps users plan their actions and speeds the training process.
• The yellow dot between the two cursors is the point that is rotated and scaled about.
• Users simply place the center point of the spindle at the point they want to scale and rotate about, push a button in each hand, and pull/scale themselves toward it.
https://www.engadget.com/2016-10-07-why-teleportation-makes-sense-in-virtual-reality.html
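The scale-about-the-midpoint behavior can be sketched as below. The uniform-scale rule (ratio of hand separations, centered on the midpoint) follows the slide's description; the function shape is an illustrative assumption, and rotation about the same center works analogously but is omitted for brevity.

```python
import math

def _dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def _midpoint(a, b):
    return [(x + y) / 2.0 for x, y in zip(a, b)]

def two_hand_scale(point, hands_before, hands_after):
    """Scale a world-space point about the midpoint between the hands.

    The scale factor is the ratio of the new hand separation to the old one,
    so pulling the hands apart enlarges the world about the spindle center.
    """
    s = _dist(*hands_after) / _dist(*hands_before)
    c = _midpoint(*hands_after)
    return [c_i + s * (p_i - c_i) for p_i, c_i in zip(point, c)]
```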
Indirect Control Patterns
• Provide control through an intermediary to modify an object, the environment, or the system.
• Indirect control is ideal when an obvious spatial mapping does not exist, or it is difficult to directly manipulate an aspect of the environment.
• Indirect control typically specifies only what should be done, and the system determines how to do it.
• Example uses: controlling the overall system, issuing commands, changing modes, and modifying non-spatial parameters.

Two patterns:
• Widgets and Panels Pattern
• Non-Spatial Control Pattern
Widgets and Panels Pattern

Description:
• The most common form of VR indirect control; it typically follows 2D desktop widget and panel/window metaphors.
• A widget is a geometric user interface element that provides information to the user or might be directly interacted with by the user.
• The simplest widget is a single label that only provides information. Such a label can also act as a signifier for another widget, for example, a label on a button.
• Panels are container structures that multiple widgets and other panels can be placed upon (placement of panels).

When to use:
• Useful for complex tasks where it is difficult to directly interact with an object; widgets are typically activated via the Pointing Pattern.
• They can also be combined with other selection options.
• More accuracy than directly manipulating objects.

Limitations:
• Not as obvious or intuitive as direct mappings and may take longer to learn. For example:
• If panels are not within personal space or attached to an appropriate reference frame, then the widgets can be difficult to use.
• If the widgets are too high, gorilla arm can result for actions that take longer than a few seconds.
Widgets and Panels Pattern
• 2D desktop integration
• Ring menus
• Pie menus
• Color cube
• Finger menus
• Above-the-head widgets and panels
• Virtual hand-held panels
• Physical panels

2D desktop integration
• An advantage of using desktop metaphors is their familiarity.
https://www.geeks3d.com/hacklab/tag/color/
Finger menus
• Finger menus consist of menu options attached to the fingers.
• A pinch gesture with the thumb touching a finger can be used to select different options.
• Once learned, the user is not required to look at the menus; the thumb simply touches the appropriate finger. This prevents occlusion as well as decreases fatigue.
• The non-dominant hand can select a menu (up to four menus) and the dominant hand can then select one of four items within that menu.
• For complex applications where more options are required, a TULIP menu (Three-Up, Labels In Palm) can be used. The dominant hand contains three menu options at a time and the pinky contains a More option.

TULIP menus on seven fingers and the right palm.
https://www.researchgate.net/figure/TULIP-menus-use-finger-pinches-to-select-menu-items_fig7_35146207
Above-the-head widgets and panels
• They are placed out of the way above the user and accessed by reaching up and pulling down the widget or panel with the non-dominant hand.
• Once the panel is released, it moves up to its former location out of view.
• The panel might be visible above, especially for new users; after learning where the panels are located relative to the body, the panel might be made invisible, since the user can use their sense of proprioception to know where the panels are without looking.
• Users can easily select among three options above their field of view: up to the left, up in the middle, and up to the right.
https://www.dailymotion.com/video/x54el0u
Non-Spatial Control Pattern

Description:
• It provides global action performed through description instead of a spatial relationship.
• It is most commonly implemented through speech or gestures.

When to use:
• Use when options can be visually presented, for example gesture icons or text to speak, and appropriate feedback can be provided.
• It is best used when there are a small number of options to choose from and when a button is available for push-to-talk or push-to-gesture.
• Use voice when moving the hands or the head would interrupt a task.

Limitations:
• Gestures and accents are highly variable from user to user and even for a single user.
• There is often a trade-off of accuracy and generality: the more gestures or words to be recognized, the less accurate the recognition rate.
• System recognition of voice can be problematic when many users are present or there is a lot of noise. For important commands, verification may be required, which can be annoying to users.
Non-Spatial Control Pattern
• Voice menu hierarchies
• Gestures

Voice menu hierarchies
• They are like traditional desktop menus, where submenus are brought up after higher-level menu options are selected.
• Menu options should be visually shown to users, so users explicitly know what options are available.
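Such a hierarchy is naturally a tree keyed by the recognizable words, with the current level's keys being exactly the options to display and listen for. A minimal sketch; the menu contents are invented for illustration:

```python
# A voice menu hierarchy: selecting a higher-level option brings up its
# submenu. The specific options below are hypothetical examples.
MENU = {
    "file": {"open": None, "save": None},
    "edit": {"undo": None, "redo": None},
}

def available_options(path):
    """Return the options to visually show (and listen for) after the
    sequence of spoken selections in `path`; None marks a leaf command."""
    node = MENU
    for word in path:
        node = node[word]
    return sorted(node) if node else []
```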
Non-Spatial Control Pattern

Gestures
• Gestures can work well for non-spatial commands.
• They should be intuitive and easy to remember, for example: a thumbs-up to confirm, raising the index finger to select option 1, or raising the index and middle finger to select option 2.
• Visual signifiers showing the available gesture options should be presented to the user, especially for users who are learning the gestures.
• The system should always provide feedback to the user when a gesture has been recognized.