Chapter 5 - Interaction Pattern and Techniques

Interaction Pattern and Techniques

• Selection Patterns
• Manipulation Patterns
• Viewpoint Control Patterns
• Indirect Control Patterns
• Compound Patterns


VR SYSTEM ARCHITECTURE: 5 Classic Components of a Virtual Reality System

• I/O DEVICES: Important I/O devices used for user input or output.
• VR ENGINE: The special-purpose computer architecture designed to match the high I/O and computation demands of real-time VR simulations.
• SOFTWARE & DATABASE: Software for virtual object modeling, and programming languages to help the VR application developer.
• USER: Human factors issues affecting simulation efficiency, comfort and safety.
• TASK: Traditional VR applications and tools for solving various practical problems in medical care, education, arts & entertainment, and the military.
An interaction pattern is a generalized high-level interaction concept that can be used across different applications to achieve common user goals. Interaction patterns:
• Are described from the user's point of view.
• Are largely implementation independent, and state relationships/interactions between the user and the virtual world along with its perceived objects.

An interaction technique is more specific and more technology dependent than an interaction pattern. Different interaction techniques that are similar are grouped under the same interaction pattern.

Credit: M. Achibet, A. Girard, M. Marchal, A. Lécuyer: PRESENCE: Teleoperators and Virtual Environments, 2016
Credit: An awesome Virtual Reality pic! Would you walk the plank? Awesome ... pinterest.com
Credit: Virtual Reality Training Can Aid in Stroke Recovery - GLENN MCDONALD

• For example, the walking pattern covers several walking interaction techniques ranging from real walking to walking in place.
• The best interaction techniques consist of:
• High-quality affordances,
• Signifiers,
• Feedback, and
• Mappings (immediate result and useful mental model for users).
Why is it important?
Distinguishing between interaction patterns and interaction techniques is important because:

• There are too many existing interaction techniques to remember, and many more will be developed.
• Organizing interaction techniques under a broader interaction pattern makes them easier to design by focusing on conceptual utility and higher-level design decisions.
• Broader pattern names and concepts make it easier to communicate interaction concepts.
• Higher-level groupings enable easier systematic analysis and comparison.
• When a specific technique fails, other techniques can be more easily thought about and explored, resulting in a better understanding of that interaction technique.

Credit: M. Achibet, A. Girard, M. Marchal, A. Lécuyer: IEEE Virtual Reality, 2015
Credit: M. Achibet, G. Casiez, M. Marchal: IEEE Symposium on 3D User Interfaces, 2016
Credit: M. Achibet, B. Le Gouis, M. Marchal, P-A. Leziart, F. Argelaguet, A. Girard, A. Lécuyer, H. Kajimoto: IEEE Symposium on 3D User Interfaces, 2017

• Both patterns and techniques provide conceptual models to experiment with, suggestions and warnings of use, and starting points for innovative new designs.
• Interaction designers should know and understand these patterns and techniques well - have a library of options to choose from depending on their needs.
• Do not fall into the trap of believing there is a single best interaction pattern or technique.
• Each has strengths and weaknesses depending on application goals and users.
• Understanding distinctions and managing trade-offs helps to create high-quality interactive experiences.
VR Interaction Patterns

• Selection: Hand Selection, Pointing, Image Plane, Volume Based
• Manipulation: Direct Hand, Proxy, 3D Tool
• Viewpoint Control: Walking, Steering, 3D Multi-touch, Automated
• Indirect Control: Widgets & Panels, Non-Spatial
• Compound: Pointing Hand, World-in-Miniature, Multimodal

• The FIRST FOUR are used sequentially (e.g., a user may travel toward a table, select a tool on the table, and then use that tool to manipulate other objects on the table) or can be integrated into compound patterns.
• Useful interaction techniques are then described within the context of the broader pattern.
• The intent of describing these patterns and techniques is for readers to:
• use them,
• extend them, and
• use them as inspiration for creating new ways of interacting within VR.
Selection Patterns:
• Hand Selection
• Pointing
• Image Plane
• Volume Based

Selection is the specification of one or more objects from a set in order to:
• Specify an object to which a command will be applied,
• Denote the beginning of a manipulation task, or
• Specify a target to travel toward.

Each has advantages over the others depending on the application and task.

Credit: M. Achibet, G. Casiez, M. Marchal: IEEE Symposium on 3D User Interfaces, 2016


Hand Selection Pattern
• Hand selection patterns refer to the ways users choose and interact with objects or elements
within a VR environment using hand gestures or controllers

• Just like in the real world, where you might use your hands to grab an object or point at
something, in VR, you have various methods to select and manipulate virtual objects.

Example of Hand Selection Patterns:


1. Point and Click: Users point their hand or controller at an object and press a button to select
it, similar to clicking with a mouse.
2. Grip and Grab: Users use a gripping motion with their hand controller to grab and manipulate
objects within reach.
3. Pinch and Drag: Users perform a pinching gesture with their fingers to pick up and move
objects, similar to picking something up between thumb and forefinger.
4. Gesture Recognition: Users can use predefined hand gestures recognized by the VR
system to perform specific actions, such as making a thumbs-up gesture to confirm a
selection or making a fist to grab an object.
Hand Selection Pattern

Description:
• The user directly reaches out the hand to touch some object and triggers a grab (e.g., pushing a button on a controller, making a fist, or uttering a voice command).
• Mimics real-world interactions.

When to use:
• Ideal for realistic interactions within reach (personal space).

Limitations (physiology):
• The arm can only be stretched so far, and the wrist rotated so far.
• Requires users to first travel to place themselves close to the object to be selected.
• Different user heights and arm lengths can make it uncomfortable to select objects that are at the edge of personal space.
• The virtual hand and arm often occlude objects of interest and can be too large to select small items. Non-realistic hand selection techniques are not as limiting.
Hand Selection Pattern: Realistic vs. Non-Realistic Hands

Realistic hands:
• Extremely compelling for providing an illusion of self-embodiment.
• Although the entire arm would ideally be tracked, inverse kinematics can estimate the pose of the arm quite well; users don't notice differences in arm pose.
• Modelling users (measuring arm length) and placing objects within a comfortable range depending on the measured arm length is ideal.

Non-realistic hands:
• Need not look real; trying to make abstract hands and arms realistic can limit interaction.
• Do not try to mimic reality but instead focus on ease of interaction.
• Often hands are used without arms so that reach can be scaled to make the design of interactions easier. Although the lack of an arm can be disturbing, users quickly learn to accept having no arms.
• Users can accept abstract 3D cursors and still feel like they are directly selecting objects.
• Hand cursors and making the hand transparent reduce problems of visual occlusion.
Hand Selection Pattern examples:
• A realistic hand with arm (left), semi-realistic hands with no arms (center), and abstract hands (3D cursors) (right).

Credit: Gallery by Cloudhead Games (left), semi-realistic hands from TrainWreck by NextGen Interactions (center), and non-realistic hands (3D cursors) from Zombie Apocalypse by Digital ArtForms (right).
Hand Selection Pattern: Go-go Technique

• Expands upon the concept of a non-realistic hand by enabling one to reach far beyond personal space.
• The virtual hand is mapped one-to-one to the physical hand within 2/3 of the full arm's reach; when extended further, the hand "grows" in a nonlinear manner, enabling the user to reach further into the environment.
• Enables closer objects to be selected (and manipulated) with accuracy while allowing further
objects to be easily reached.
• Physical aspects of arm length and height have been found to be important for the go-go
technique, so measuring arm length should be considered when using this technique.
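The go-go mapping described above can be sketched in a few lines. This is a minimal illustration; the arm length and nonlinear gain `k` are assumed example values, not parameters from the text:

```python
# Sketch of the go-go nonlinear reach mapping. Assumptions: arm_length and
# gain k are illustrative values chosen for demonstration.

def gogo_reach(r_real: float, arm_length: float = 0.7, k: float = 6.0) -> float:
    """Map the physical hand distance (meters from the torso) to the virtual
    hand distance. Within 2/3 of full arm reach the mapping is one-to-one;
    beyond that threshold the virtual reach grows nonlinearly."""
    threshold = (2.0 / 3.0) * arm_length
    if r_real < threshold:
        return r_real                                   # isomorphic zone: 1:1
    return r_real + k * (r_real - threshold) ** 2       # nonlinear extension
```

Close objects stay under accurate one-to-one control, while a fully extended arm reaches well beyond personal space, which is why measuring the user's arm length matters for this technique.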
Pointing Pattern

Description:
• One of the most fundamental and often-used patterns for selection.
• It extends a ray into the distance, and the first object intersected can be selected via a user-controlled trigger.
• Most typically done with the head, or a hand/finger.

When to use:
• Better for selection than the Hand Selection pattern unless realistic interaction is required: selection beyond personal space and when small hand motions are required.
• It is faster when speed of remote selection is vital, but it is also often used for precisely selecting close objects, e.g., pointing with the dominant hand to select a component on a panel held in the non-dominant hand.

Limitations:
• Selection by pointing is not appropriate when realistic interaction is required.
• Straightforward implementations result in difficulty selecting small objects in the distance.
• Pointing with the hand can be imprecise due to natural hand tremor.
• Object snapping and precision mode pointing can lessen these problems.
Pointing Pattern

Examples:
• Hand pointing
• Head pointing
• Eye gaze selection
• Object snapping
• Precision mode pointing
• Two-handed pointing

Hand Pointing ("Raycasting" technique)
• A ray extends from the hand or finger.
• It is the most common method of selection.
• The user provides a signal to select the item of interest (e.g., button press, gesture with the other hand).
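Raycasting selection can be sketched as a ray-versus-bounds test that returns the first (nearest) object the ray intersects. A minimal sketch, assuming objects are approximated by bounding spheres; names and values are illustrative:

```python
# Minimal ray-casting selection sketch. A ray from the hand is intersected
# with spherical object bounds; the nearest intersected object along the ray
# is the selection candidate.
import math

def raycast_select(origin, direction, objects):
    """origin/direction: 3D tuples (direction need not be normalized).
    objects: list of (name, center, radius). Returns nearest hit name or None."""
    mag = math.sqrt(sum(d * d for d in direction))
    d = tuple(c / mag for c in direction)               # normalized direction
    best = (float("inf"), None)
    for name, center, radius in objects:
        oc = tuple(c - o for c, o in zip(center, origin))
        t = sum(a * b for a, b in zip(oc, d))           # projection onto ray
        if t < 0:
            continue                                    # object is behind hand
        closest2 = sum(c * c for c in oc) - t * t       # squared miss distance
        if closest2 <= radius * radius and t < best[0]:
            best = (t, name)
    return best[1]
```

For example, with a lamp 5 m away and a chair 10 m away along the same ray, the lamp is selected because it is intersected first.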
Pointing Pattern: Head Pointing
• When no hand tracking is available, it becomes the most common form of selection.
• Implemented by drawing a small pointer or reticle at the center of the field of view; the user simply lines up the pointer with the object of interest and then provides a signal to select it (e.g., a button press if available, or dwell selection: holding the pointer on the object for some defined period).
• Dwell is not ideal due to having to wait for objects to be selected and accidental selections when looking at an object of interest.

https://umtl.cs.uni-saarland.de/research/projects/selection-based-text-entry-in-virtual-reality.html
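Dwell selection amounts to a small per-frame state machine: the reticle must stay on one target for a fixed period, and glancing away resets the timer. A sketch under assumed parameter values (the one-second default is illustrative):

```python
# Sketch of dwell selection for head pointing. The reticle must remain on the
# same target for `dwell_time` seconds before the selection fires; looking
# away resets the timer (which is also why accidental selections can occur).

class DwellSelector:
    def __init__(self, dwell_time: float = 1.0):
        self.dwell_time = dwell_time
        self.target = None
        self.elapsed = 0.0

    def update(self, gazed_object, dt: float):
        """Call once per frame with the object under the reticle (or None).
        Returns the selected object once dwell completes, else None."""
        if gazed_object != self.target:
            self.target = gazed_object      # new target: restart dwell timer
            self.elapsed = 0.0
            return None
        if self.target is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_time:
            self.elapsed = 0.0              # reset so it doesn't re-fire each frame
            return self.target
        return None
```

The reset-on-change behavior is the design choice that makes dwell usable at all; without it, accumulated glances would trigger selections the user never intended.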
Pointing Pattern: Eye Gaze Selection
• Selection with eye tracking: the user looks at the item of interest and then provides a signal to select the object.
• This is not a good technique due to the Midas Touch problem: people expect to look at things without that look 'meaning' something. They are normally not accustomed to using a device simply by moving their eyes.

Credit: Patryk Piotrowski, Adam Nowosielski, Gaze-Based Interaction for VR Environments (2019) - Advances in Intelligent Systems and Computing book series (AISC, volume 1062)
Pointing Pattern: Object Snapping
• Objects have scoring functions that cause the selection ray to snap/bend toward the object with the highest score.
• It works well when selectable objects are small and/or moving.

Credit: Caitlin Webb (2017) - From VR training to broadband on a plane – the future of innovation is in our pockets. https://horizon-magazine.eu/
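One simple scoring function scores each object by its angular proximity to the pointing ray and snaps to the highest score. This is a sketch of that idea only; real systems may weight distance, size, or motion as well, and the cone width here is an assumed value:

```python
# Sketch of score-based object snapping. Each object is scored by angular
# closeness to the pointing ray; the ray snaps to the highest-scoring object,
# which makes small or moving targets easier to hit.
import math

def snap_target(origin, direction, objects, max_angle_deg: float = 10.0):
    """objects: list of (name, center). Returns the name of the object whose
    direction is closest in angle to the ray, or None if none fall inside
    the snapping cone."""
    mag = math.sqrt(sum(d * d for d in direction))
    d = tuple(c / mag for c in direction)
    best_score, best_name = 0.0, None
    for name, center in objects:
        to_obj = tuple(c - o for c, o in zip(center, origin))
        dist = math.sqrt(sum(c * c for c in to_obj))
        cos_angle = sum(a * b for a, b in zip(to_obj, d)) / dist
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
        if angle > max_angle_deg:
            continue
        score = 1.0 - angle / max_angle_deg   # higher score = closer to the ray
        if score > best_score:
            best_score, best_name = score, name
    return best_name
```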
Pointing Pattern: Precision Mode Pointing

• A non-isomorphic rotation technique that scales down the rotational mapping from the hand to the pointer, as defined by the control/display (C/D) ratio.
• The result is a slow-motion cursor that enables fine pointer control.
• A zoom lens can also be used that scales the area around the cursor to enable seeing smaller objects, but the zoom should not be affected by head pose unless the zoom area on the display is small.
• The user can control the amount of zoom with a scroll wheel on a hand-held device.

Credit: Topics in 3D User Interfaces, Joseph J. LaViola Jr. https://slideplayer.com/slide/5082977/
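The C/D-ratio mapping itself is a one-line scaling of hand motion. A minimal sketch; the ratio value of 4 is an assumed example, not a recommendation from the text:

```python
# Sketch of a control/display (C/D) ratio mapping for precision mode pointing.
# With a ratio > 1, a physical hand rotation yields a proportionally smaller
# pointer rotation, producing a slow-motion cursor for fine control.

def apply_cd_ratio(hand_delta_deg: float, cd_ratio: float = 4.0) -> float:
    """display motion = control motion / (C/D ratio)."""
    return hand_delta_deg / cd_ratio
```

With a ratio of 4, a 20-degree wrist rotation moves the pointer only 5 degrees.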
Pointing Pattern: Two-Handed Pointing
• It originates the selection ray at the near hand and extends it through the far hand.
• This provides more precision when the hands are further apart, and fast rotations about a full 360° range when the hands are closer together, whereas a full 360° range for a single-hand pointer is difficult due to physical hand constraints.
• The distance between the hands can also be used to control the length of the pointer.

Credit: Vijayakumar Nanjappan et al. (2018). User-elicited dual-hand interactions for manipulating 3D objects in virtual reality environments. https://link.springer.com/
Credit: Gfycat – VRtisan Virtual Reality First-person Architectural Visualisation Technology. https://gfycat.com/adorableconstantgartersnake
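Constructing the two-handed ray is direct vector math: origin at the near hand, direction through the far hand, with hand separation optionally scaling the pointer length. A sketch; the `length_gain` factor is an assumed illustrative parameter:

```python
# Sketch of the two-handed pointing ray. The ray starts at the near hand and
# passes through the far hand; the inter-hand distance scales pointer length.
import math

def two_handed_ray(near_hand, far_hand, length_gain: float = 5.0):
    """Return (origin, endpoint): the ray origin and an endpoint whose
    distance from the origin grows with the hand separation."""
    direction = tuple(f - n for f, n in zip(far_hand, near_hand))
    separation = math.sqrt(sum(d * d for d in direction))
    unit = tuple(d / separation for d in direction)
    reach = separation * length_gain        # wider stance = longer pointer
    endpoint = tuple(n + u * reach for n, u in zip(near_hand, unit))
    return near_hand, endpoint
```

Because orientation comes from the offset between two tracked points rather than one wrist, small hand jitters produce smaller angular error as the hands move apart, which is the precision benefit noted above.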
Image-Plane Selection Pattern

Description:
• It uses a combination of eye position and hand position for selection.
• It can be thought of as the scene and hand being projected onto a 2D image plane in front of the user, or onto the eye.
• The user holds one or two hands between the eye and the desired object and then provides a signal to select the object when the object lines up with the hand and eye.

When to use:
• Image-plane techniques simulate direct touch at a distance, thus are easy to use.
• These techniques work well at any distance as long as the object can be seen.

Limitations:
• It works for a single eye, so users should close one eye while using these techniques, or use a monoscopic display.
• It results in fatigue when used often, due to having to hold the hand up high in front of the eye.
• As in the Hand Selection pattern, the hand often occludes objects if not transparent.
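The "lines up with the hand and eye" test can be sketched as a perspective projection of both hand and object onto a plane in front of the eye, then a 2D proximity check. Assumptions: the camera looks down +z and the tolerance value is illustrative:

```python
# Sketch of image-plane alignment testing. Hand and object positions are
# projected onto a 2D image plane one unit in front of the eye; the object is
# selectable when its projection lies near the hand's projection.

def project_to_image_plane(eye, point):
    """Perspective-project a 3D point onto the plane z = eye_z + 1
    (camera looking down +z). Returns (x, y) image-plane coordinates."""
    dx, dy, dz = (p - e for p, e in zip(point, eye))
    return (dx / dz, dy / dz)

def lines_up(eye, hand, obj, tolerance: float = 0.05) -> bool:
    hx, hy = project_to_image_plane(eye, hand)
    ox, oy = project_to_image_plane(eye, obj)
    return abs(hx - ox) <= tolerance and abs(hy - oy) <= tolerance
```

Note the projection uses a single eye point, which is exactly why these techniques work for one eye only and break under stereoscopic viewing.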
Image-Plane Selection Pattern techniques:
• Head crusher
• Sticky finger
• Lifting palm
• Framing hands

Head crusher
• The user positions his thumb and forefinger around the desired object in the 2D image plane.
• The inset shows the user's view of selecting the chair.
Image-Plane Selection Pattern: Sticky Finger
• It provides an easier gesture: the object underneath the user's finger in the 2D image is selected.

Credit: D. Mendes et al. 2018, A Survey on 3D Virtual Object Manipulation: From the Desktop to Immersive Virtual Environments. https://onlinelibrary.wiley.com/doi/pdf/10.1111/cgf.13390
Image-Plane Selection Pattern: Lifting Palm
• The user selects objects by flattening his outstretched hand and positioning the palm so that it appears to lie below the desired object.

Credit: Jeffrey S. Pierce et al. Image Plane Interaction Techniques in 3D Immersive Environments. http://www.cs.cmu.edu/~stage3/publications/97/conferences/3DSymposium/HeadCrusher/
Image-Plane Selection Pattern: Framing Hands
• A two-handed technique where the hands are positioned to form the two corners of a frame in the 2D image surrounding an object.

Credit: Jeffrey S. Pierce et al. Image Plane Interaction Techniques in 3D Immersive Environments. http://www.cs.cmu.edu/~stage3/publications/97/conferences/3DSymposium/HeadCrusher/
Volume-Based Selection Pattern

• The Volume-Based Selection pattern allows users to interact with virtual objects by moving their body or hand controllers within a predefined area in the VR environment. Instead of directly touching or pointing at objects, users trigger actions or selections based on their proximity or movement within the volume.
Volume-Based Selection Pattern

Description:
• Enables selection of a volume of space (for example, a box, sphere, or cone) and is independent of the type of data being selected.
• Data to be selected can be volumetric (voxels), point clouds, geometric surfaces, or even space containing no data, for example, to follow with filling that space with some new data.

When to use:
• It is appropriate when the user needs to select a not-yet-defined set of data in 3D space or to carve out space within an existing dataset.
• Enables selection of data when there are no geometric surfaces (e.g., medical CT datasets), whereas geometric surfaces are required for implementing many other selection patterns/techniques; for example, pointing requires intersecting a ray with object surfaces.

Limitations:
• Selecting volumetric space can be more challenging than selecting a single object with the other, more common selection patterns.
Volume-Based Selection Pattern: Cone-Casting Flashlight
• Uses pointing, but instead of a ray, a cone is used.
• This results in easier selection of small objects than standard pointing via ray casting.
• If the intent is a single object, then the object closest to the cone's centerline or the object closest to the user can be selected [Liang and Green 1991].
• A modification of this technique is the aperture technique, which enables the user to control the spread of the selection volume by bringing the hand closer or further away.
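Cone-casting can be sketched as an angular containment test, resolving to the object nearest the centerline as in the flashlight technique. The half-angle here is an assumed example value (the aperture technique would make it user-controlled):

```python
# Sketch of cone-casting (flashlight) selection. Objects within the cone's
# angular spread are candidates; ties resolve to the object closest to the
# cone's centerline.
import math

def cone_cast(apex, axis, objects, half_angle_deg: float = 15.0):
    """objects: list of (name, position). Returns the in-cone object whose
    angle from the centerline is smallest, or None if the cone is empty."""
    mag = math.sqrt(sum(a * a for a in axis))
    axis = tuple(a / mag for a in axis)
    best_angle, best_name = half_angle_deg, None
    for name, pos in objects:
        to_obj = tuple(p - a for p, a in zip(pos, apex))
        dist = math.sqrt(sum(c * c for c in to_obj))
        cosang = sum(a * b for a, b in zip(to_obj, axis)) / dist
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cosang))))
        if angle < best_angle:          # inside cone and nearest the centerline
            best_angle, best_name = angle, name
    return best_name
```

Because any object inside the cone is a hit, small targets no longer require the ray to intersect them exactly, which is the advantage over plain ray casting.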
Volume-Based Selection Pattern: Two-Handed Box Selection
• Uses both hands to position, orient, and shape a box via snapping and nudging.
• Snap and nudge are asymmetric techniques where one hand controls the position and orientation of the selection box, and the second hand controls the shape of the box.
• Both snap and nudge mechanisms have two stages of interaction: grab and reshape.
• Grab influences the position and orientation of the box; reshape changes the shape of the box.
Manipulation Patterns:

Manipulation is the modification of attributes of one or more objects, such as position, orientation, scale, shape, color, and texture.
• Typically follows selection, such as the need to first pick up an object before throwing it.
• It includes:
• the Direct Hand Manipulation pattern,
• the Proxy pattern, and
• the 3D Tool pattern.
Direct Hand Manipulation Pattern

Description:
• Corresponds to the way we manipulate objects with our hands in the real world.
• After selecting the object, the object is attached to the hand, moving along with it until released.

When to use:
• Direct positioning and orientation with the hand.
• It has been shown to be more efficient and to result in greater user satisfaction than other manipulation patterns.

Limitations:
• Like the Hand Selection pattern (previous section), a straightforward implementation is limited by the physical reach of the user.

Credit: Welcome To Virtual Reality. http://www.nexgentc.com/
Direct Hand Manipulation Pattern: Non-Isomorphic Rotation

• Some form of clutching is required to rotate beyond certain angles, and clutching can hinder performance due to wasted motion.
• This can be reduced by using non-isomorphic rotations that allow one to control a larger range of 3D rotation with smaller wrist rotations.
• It can also be used to provide precision by mapping large physical rotations to smaller virtual rotations.

A subject rotating the house model to its target orientation.
Credit: An Exploration of Non-Isomorphic 3D Rotation in Surround Screen Virtual Environments - Joseph J. LaViola Jr. and Michael Katzourin. http://www.eecs.ucf.edu/~jjl/pubs/laviola805-final.pdf
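One common way to implement amplified rotation is to raise the hand's rotation quaternion to a power: extract the axis and angle, scale the angle by a gain, and rebuild. A sketch under assumed gain values; gain > 1 amplifies (less clutching), gain < 1 damps (more precision):

```python
# Sketch of non-isomorphic rotation via quaternion axis-angle scaling.
# gain > 1 amplifies wrist rotation (reduces clutching); gain < 1 maps large
# physical rotations to smaller virtual ones (precision).
import math

def quat_from_axis_angle(axis, angle_deg):
    half = math.radians(angle_deg) / 2.0
    s = math.sin(half)
    return (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)

def amplify(q, gain):
    """Raise a unit quaternion (w, x, y, z) to the `gain` power: extract the
    axis-angle form, scale the angle, rebuild the quaternion."""
    w, x, y, z = q
    angle = 2.0 * math.acos(max(-1.0, min(1.0, w)))
    s = math.sqrt(max(0.0, 1.0 - w * w))
    if s < 1e-9:
        return (1.0, 0.0, 0.0, 0.0)         # no rotation to amplify
    axis = (x / s, y / s, z / s)
    return quat_from_axis_angle(axis, math.degrees(angle) * gain)
```

With a gain of 2, a 45-degree physical wrist twist yields a 90-degree virtual rotation about the same axis.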
Direct Hand Manipulation Pattern: Go-go Technique
• The go-go technique (previous section) can be used for manipulation as well as selection, with no mode change.

Credit: Doug Bowman - http://people.cs.vt.edu/~bowman/grab.html
Credit: Andrea Bönsch et al. - Virtuelle und Erweiterte Realität, 8. Workshop der GI-Fachgruppe VR/AR (2011)
Proxy Pattern

Description:
• A proxy is a local object (physical or virtual) that represents and maps directly to a remote object.
• The pattern uses the proxy to manipulate the remote object: as the user directly manipulates the local object(s), the remote object(s) is manipulated in the same way.

When to use:
• Works well when a remote object needs to be intuitively manipulated as if it were in the user's hands, or when viewing and manipulating objects at multiple scales (e.g., the proxy objects can stay the same size relative to the user even as the user scales himself relative to the world and remote object).

Limitations:
• The proxy can be difficult to manipulate as intended when there is a lack of directional compliance, that is, when there is an orientation offset between the proxy and the remote object.

Credit: VR based Architecture. https://www.sciencedirect.com/science/article/pii/S1319157819302320
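The proxy-to-remote mapping can be sketched as replaying the proxy's transform deltas on the remote object, optionally through a scale factor so the two can live at different scales. The class and parameter names are illustrative:

```python
# Sketch of the proxy mapping: motion deltas applied to the local proxy are
# replayed on the remote object, optionally scaled so a small proxy motion
# produces a large remote motion (multi-scale manipulation).

class ProxyLink:
    def __init__(self, remote_position, motion_scale: float = 1.0):
        self.remote_position = list(remote_position)
        self.motion_scale = motion_scale

    def move_proxy(self, delta):
        """The user moves the proxy by `delta`; the remote object follows."""
        for i, d in enumerate(delta):
            self.remote_position[i] += d * self.motion_scale
        return tuple(self.remote_position)
```

Keeping the proxy and remote axes aligned in such a mapping is precisely the directional-compliance issue noted under limitations: an orientation offset between the two makes the replayed deltas feel wrong.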
Proxy Pattern: Tracked Physical Props
• In simple terms, the Proxy pattern can involve using physical objects as proxies to interact with virtual objects in VR.
• For example, users may use handheld controllers or physical props that are tracked in the VR environment to manipulate virtual objects.
• The movements and actions performed with the physical objects are translated into corresponding actions in the virtual world, allowing users to engage with virtual content in a more tactile and intuitive way.

Physical proxy prop used to control the orientation of a neurological dataset (Hinckley et al. 1994).
3D Tool Pattern
• The 3D Tool Pattern provides users with virtual tools or instruments that
can be used to interact with objects in a 3-D space. These tools may
include virtual brushes, pens, sculpting tools, or mechanical devices that
users can manipulate using hand controllers or gestures.

• For example, users may have access to a virtual paintbrush tool that
allows them to draw or paint directly onto virtual surfaces within the VR
environment.

• Similarly, users may use a virtual sculpting tool to shape and mold virtual
objects as if they were working with clay or other sculpting materials in
the real world.
3D Tool Pattern

Description:
• It enables users to directly manipulate an intermediary 3D tool with their hands that in turn directly manipulates some object in the world.
• An example is a stick to extend one's reach, or a handle on an object that enables the object to be reshaped.

When to use:
• Use to enhance the capability of the hands to manipulate objects. For example, a screwdriver provides precise control of an object by mapping large rotations to small translations along a single axis.

Limitations:
• It can take more effort to use if the user must first travel and maneuver to an appropriate angle in order to apply the tool to an object.

Credit: Digital ArtForms and Sixense
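The screwdriver example above is a clean case of a tool mapping: large rotations become small single-axis translations, like a screw thread. A sketch with an assumed thread pitch:

```python
# Sketch of a screwdriver-style tool mapping: large tool rotations map to
# small translations along one axis, like a screw thread. The pitch value
# is an assumed example.

def screw_advance(rotation_deg: float, thread_pitch_mm: float = 1.0) -> float:
    """One full 360-degree turn advances the screw by one thread pitch."""
    return (rotation_deg / 360.0) * thread_pitch_mm
```

Two full turns of the virtual screwdriver drive the screw 2 mm deeper; this many-degrees-to-few-millimeters ratio is what gives the tool its precision.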


3D Tool Pattern techniques:
• Hand-held tools
• Object-attached tools
• Jigs

Hand-held tools
• Virtual objects with geometry and behavior that are attached to/held with a hand.
• Such tools can be used to control objects from afar (like a TV remote control) or to work more directly on an object.
• A paintbrush used to draw on the surface of an object is an example of a hand-held tool.
• Hand-held tools are often easier to use and understand than widgets due to being more direct.

Credit: arm-hud-widget. https://blog.leapmotion.com/hovercast-vr-menu-power-fingertips/arm-hud-widget/
3D Tool Pattern: Object-Attached Tools
• A manipulated tool that is attached to/co-located with an object.
• It results in a more coupled signifier representing the affordance between the object, tool, and user.
• For example, a color icon might be located on an object; the user simply selects the icon, at which point a color cube appears so the user can choose the color of the object.
• Or, if the shape of a box can be changed, then an adjustment tool can be made available on the corners of the box, for example, for dragging a vertex.

Tilt Brush enables the user to change the brush they use to paint in their VR environment. It also has overall scene options for saving, copying, etc.
Credit: https://medium.com/@lucycarp/vr-menu-patterns-and-use-cases-f30fd8b5ef36
3D Tool Pattern: Jigs

• One way to enable precision is to add virtual constraints with jigs (precision is otherwise difficult with a 6-DOF input device).
• Jigs, similar to real-world physical guides used by carpenters and machinists, are grids, rulers, and other references.
• Users adjust the jig parameters (for example, grid spacing) and snap other objects into exact position and orientation.
• Jig kits support the snapping together of multiple jigs (for example, snapping a ruler to a grid) for more complex alignments.

Credit: Digital ArtForms and Sixense
a) The blue 3D crosshairs represent the user's hand. The user drags the lower-left corner of the orange object to a grid point.
b) The user cuts shapes out of a cylinder at 15° angles.
c) The user precisely snaps a wireframe-viewed object onto a grid.
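Jig-style snapping reduces to quantization: positions round to the nearest grid point and rotations to the nearest angular increment (the figure's 15° cuts are exactly this). A sketch; the grid spacing default is an assumed example:

```python
# Sketch of jig-style snapping: positions snap to the nearest grid point and
# rotations snap to the nearest angular increment, mimicking grid jigs and
# fixed-angle jigs.

def snap_to_grid(position, spacing: float = 0.25):
    """Snap each coordinate to the nearest multiple of the grid spacing."""
    return tuple(round(c / spacing) * spacing for c in position)

def snap_angle(angle_deg: float, increment_deg: float = 15.0) -> float:
    """Snap a rotation to the nearest angular increment."""
    return round(angle_deg / increment_deg) * increment_deg
```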
Viewpoint Control Patterns

• Viewpoint control is the task of manipulating one's perspective and can include translation, orientation, and scale. It is similar to moving, rotating, or scaling the world.
• For example, moving the viewpoint to the left is equivalent to moving the world to the right, and scaling oneself to be smaller is equivalent to scaling the world to be larger.
• Thus, users perceive themselves either as moving through the world (self-motion) or as the world moving around them (world motion) as the viewpoint changes.
• Travel is a form of viewpoint control that does NOT ALLOW scaling.

The viewpoint control patterns are:
• Walking Pattern,
• Steering Pattern,
• 3D Multi-Touch Pattern,
• Automated Pattern.

Warning: some of these patterns may induce motion sickness and are NOT APPROPRIATE for users new to VR or those who are sensitive to scene motion. Sickness can be reduced by integrating the techniques with the suggestions that will be discussed later.
Viewpoint Control Patterns
• Allow users to explore virtual spaces, change their viewing angle, and interact with content from different perspectives.
• Dictate how users move around and change their perspective within VR: users can navigate through the virtual environment, look around, zoom in or out, and change their viewpoint to interact with objects or explore different areas.

1. Teleportation: Users can select a destination within the VR environment and instantly transport themselves
there, allowing for quick and seamless navigation without physically walking.
2. Free Movement: Users can physically walk or use handheld controllers to move within the VR space, like
walking in the real world. This provides a sense of immersion and freedom of movement.
3. Flying: Users can fly or float through the VR environment, allowing them to explore from a bird's-eye view
or navigate through obstacles with ease.
4. Grab and Drag: Users can grab onto the environment and physically drag themselves to move around. This
method is like pulling oneself along a railing or ledge.
5. Vehicle Controls: Users can pilot virtual vehicles, such as cars, planes, or boats, to navigate through the VR
environment.
Walking Pattern

Description:
• It uses motion of the feet to control the viewpoint.
• Walking includes everything from real walking to mimicking walking by moving the feet when seated.

When to use:
• It matches or mimics real-world locomotion and therefore provides a high degree of interaction fidelity.
• It enhances presence and ease of navigation, as well as spatial orientation and movement understanding.
• Ideal for navigating small to medium-size spaces; such travel results in no motion sickness.

Limitations:
• It is not appropriate when rapid or distant navigation is important.
• True walking across large distances requires a large tracked space, and for wired headsets cable tangling can pull on the headset and be a tripping hazard.
• Fatigue can result from prolonged use.
Walking Pattern: Real Walking
• It matches physical walking with motion in the virtual environment; the mapping from real to virtual is one-to-one.
• An ideal interface for many VR experiences (high fidelity).
• It does not directly measure foot motion, but instead tracks the head.
• It results in less motion sickness due to better matching of visual and vestibular cues.
• Unfortunately, real walking by itself limits travel in the virtual world to the physically tracked space.

Underwater virtual reality research: creating astronaut training simulations.
Credit: Nicholas Ruggieri - https://www.unr.edu/nevada-today/news/2017/underwater-virtual-reality
Walking Pattern: Redirected Walking
• A technique that allows users to walk in a VR space larger than the physically tracked space.
• It is accomplished by rotation and translation gains that differ from the true motion of the user, directing the user away from the edges of the tracked space or physical obstacles.
• Ideally, the gains are below perceptible thresholds so that the user does not consciously realize he is being redirected.

A user walks in the real environment on a different path, with a different length, in comparison to the perceived path in the virtual world.
Credit: https://www.researchgate.net/publication/38086775_Estimation_of_Detection_Thresholds_for_Redirected_Walking_Techniques
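The core of the gain mechanism can be sketched as a per-frame update in which real translation and rotation are scaled before being applied virtually. The gain values below are illustrative; real systems choose them dynamically and keep them under measured detection thresholds:

```python
# Sketch of applying redirected-walking gains. Each frame, the user's real
# forward step and turn are scaled by gains near 1.0 before being applied to
# the virtual pose, steering the virtual path away from tracked-space edges.
import math

def apply_gains(virtual_pos, virtual_heading_deg,
                real_step_m, real_turn_deg,
                translation_gain: float = 1.1, rotation_gain: float = 1.2):
    """Advance the virtual 2D pose: returns (new_position, new_heading_deg)."""
    virtual_heading_deg += real_turn_deg * rotation_gain     # amplified turn
    heading = math.radians(virtual_heading_deg)
    step = real_step_m * translation_gain                    # amplified step
    x, y = virtual_pos
    new_pos = (x + step * math.sin(heading), y + step * math.cos(heading))
    return new_pos, virtual_heading_deg
```

With these gains, a 1 m real step covers 1.1 m virtually and a 90° real turn becomes 108° virtually; the discrepancy accumulates into a curved real-world path the user does not notice.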
Walking in place
• It consists of making physical walking motions—for example, lifting the legs—while staying in the same physical spot but moving virtually.
• Works well when there is only a small tracked area and when safety is a primary concern.
• The safest form of walking in place is for the user to be seated.
• Users can walk in place for any distance.
• However, travel distances are limited by the physical effort of the users.
• Thus, it works well for small and medium-size spaces where only short durations of travel are required.

Cybershoes enable VR users to physically walk, run, and jump
Credit: https://venturebeat.com/2019/08/20/cybershoes-enable-vr-users-to-physically-walk-run-and-jump-for-400/
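Walking in place is commonly driven by detecting step-like motion from head tracking. A simplified Python sketch of a threshold-based step counter — the resting height and dip threshold are illustrative assumptions, not values from a specific system:

```python
def detect_steps(head_heights, rest=1.70, dip=0.03):
    """Count walking-in-place steps from a stream of head heights.

    A step is registered each time the head dips more than `dip`
    metres below the resting height `rest` and then recovers; each
    detected step would advance the viewpoint by a fixed virtual
    stride.
    """
    steps = 0
    below = False
    for h in head_heights:
        if not below and h < rest - dip:
            below = True          # head dipped: a step has started
        elif below and h >= rest - dip:
            below = False         # head recovered: step completed
            steps += 1
    return steps
```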
Human joystick
• It utilizes the user’s position relative to a central zone to create a 2D vector that defines the horizontal direction and velocity of virtual travel.
• The user steps forward to control speed.
• It has the advantage that only a small amount of tracked space is required—albeit more than walking in place.

Human-Joystick VR Locomotion technique: An empirical approach
Credit: https://www.semanticscholar.org/paper/Human-Joystick-VR-Locomotion-technique%3A-An-approach-Mendoza-Andres/a689aa0cc5c26d5886162932bdf31ac8263ad9b1
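The human-joystick mapping can be sketched as a dead-zone vector controller in Python (zone radius and gain are illustrative assumptions):

```python
import math

def human_joystick(user_xz, center_xz, dead_zone=0.15, gain=2.0):
    """Map the user's offset from a central zone to travel velocity.

    Inside the dead zone no motion occurs; beyond it, the offset
    direction gives the heading and the offset magnitude gives the
    speed, so stepping further from centre moves the user faster.
    """
    dx = user_xz[0] - center_xz[0]
    dz = user_xz[1] - center_xz[1]
    dist = math.hypot(dx, dz)
    if dist < dead_zone:
        return (0.0, 0.0)
    speed = gain * (dist - dead_zone)
    return (speed * dx / dist, speed * dz / dist)
```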
Treadmill walking and running
• Various types of treadmills exist for simulating the physical act of walking and running.
• Though not as realistic as real walking, treadmills can be quite effective for controlling the viewpoint, providing a sense of self-motion, and walking an unlimited distance.
• Techniques should make sure foot direction movement is compliant with forward visual motion.
• Treadmill techniques that lack directional and temporal compliance can be worse.
• Treadmills with safety harnesses are ideal, especially when physical running is required.

Exciting 360 Shooting VR Treadmill Simulations Walk Running Machine VR Exercise Equipment
Credit: https://www.vrmoviepower.com/vr-simulator/vr-walker/exciting-360-shooting-vr-treadmill.html
Steering Pattern
Descriptions:
• It is a continuous control of viewpoint direction that does not involve movement of the feet.
• There is typically no need to control viewpoint pitch with controllers as is done with desktop systems, since users can physically look up and down.
When to use:
• It is appropriate for traveling great distances without the need for physical exertion.
• It should allow continuous control, or at least the ability to interrupt a movement after it has begun.
• It also requires minimal cognitive load so the user can focus on spatial knowledge acquisition and information gathering.
• Works best when travel is constrained to some height above a surface, acceleration/deceleration can be minimized, and real-world stabilized cues can be provided.
Limitations:
• It provides less biomechanical symmetry than the walking pattern.
• Many users report symptoms of motion sickness.
• Virtual turning is more disorienting than physical turning.
Steering Pattern
• Navigation by leaning
• Gaze-directed steering
• Torso-directed steering
• One-handed flying
• Two-handed flying
• Dual analog stick steering
• World-grounded steering device
• Virtual steering device

Navigation by leaning
• Navigation by leaning moves the user in the direction of the lean. The amount of lean typically maps to velocity.
• One advantage of this technique is no requirement for hand tracking.
• Motion sickness can be significant as velocity varies—that is, acceleration.

The different human body movements, poses and gestures that trigger interaction and navigation in VR. https://dx.doi.org/10.1016/j.compenvurbsys.2013.10.003
Gaze-directed steering
• Moves the user in the direction he/she is looking.
• Typically, the starting and stopping motion in the gaze direction is controlled by the user via a hand-held button or joystick.
• It is easy to understand and can work well for novices or for those accustomed to first-person video games where the forward direction is identical to the look direction.
• However, it can also be disorienting as any small head motion changes the direction of travel, and frustrating since users cannot look in one direction while traveling in a different direction.

https://www.slideshare.net/marknb00/lecture-5-3d-user-interfaces-for-virtual-reality
Torso-directed steering
• Also called chair-directed steering when the torso is not tracked—utilized when traveling over a terrain, it separates the direction of travel from the way one is looking.
• It has more interaction fidelity than gaze-directed steering since in the real world one does not always walk in the direction the head is pointed.
• When the torso or chair is not tracked, a general forward direction can be assumed.
• This technique can have more of a nauseating effect on inexperienced users.
• Visual cues can be provided to help users maintain a sense of the forward direction.

In half of the conditions, leaning (left) was used to control velocity, in the second half a gamepad (right). In all conditions, the subjects sat in a swivel chair and wore an HTC Vive Pro extended with a wireless adapter as well as an additional torso tracker, which was mounted to the chest using a strap for action cameras.
One-handed flying
• It works by moving the user in the direction the finger or hand is pointing.
• Velocity can be determined by the horizontal distance of the hand from the head.

Two-handed flying
• It works by moving the user in the direction determined by the vector between the two hands, and the speed is proportional to the distance between the hands.
• Flying backward with two hands is more easily accomplished than one-handed flying—which requires an awkward hand or device rotation—by swapping the location of the hands.
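Two-handed flying reduces to a direction-plus-distance computation. A Python sketch — the left-to-right direction convention, dead zone, and gain are illustrative assumptions:

```python
import math

def two_handed_flying(left, right, gain=2.0, dead_zone=0.1):
    """Fly along the vector between the two hands.

    Direction is taken from the left hand toward the right hand
    (swapping the hands therefore reverses travel), and speed grows
    with hand separation beyond a small dead zone.
    """
    d = [r - l for l, r in zip(left, right)]
    dist = math.sqrt(sum(c * c for c in d))
    if dist < dead_zone:
        return (0.0, 0.0, 0.0)
    speed = gain * (dist - dead_zone)
    return tuple(speed * c / dist for c in d)
```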
Dual analog stick steering
• Also known as joysticks or analog pads.
• Works well for steering over a terrain—that is, forward/back and left/right.
• The standard first-person game controls should be used, where the left stick controls 2D translation.
• It is intuitive and consistent with traditional first-person video games—that is, gamers already understand how to use such controls, so there is little learning curve.
• Virtual rotations can be disorienting and sickness-inducing for some people.
• Thus, the designer might design the experience to have the content consistently in the forward direction so that no virtual rotation is required.
• Alternatively, if the system is wireless and the torso or chair is tracked, then there is no need for virtual rotations since the user can physically rotate 360°.

https://www.pcmag.com/news/beyond-the-gamepad-alternative-controllers-for-your-nintendo-switch-ps4
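The standard mapping can be sketched in Python: the left stick translates in the current heading frame, while the right stick's x-axis applies virtual yaw (the component the text identifies as the main source of discomfort). Speeds are illustrative assumptions:

```python
import math

def stick_motion(left_stick, right_stick_x, heading,
                 move_speed=3.0, turn_speed=math.pi):
    """Standard first-person mapping for one frame.

    left_stick: (x, y) in [-1, 1]; translated in the heading frame.
    right_stick_x: yaw input in [-1, 1]; returns the yaw delta rate.
    """
    lx, ly = left_stick
    # rotate the stick vector into world space by the current heading
    sin_h, cos_h = math.sin(heading), math.cos(heading)
    vx = move_speed * (lx * cos_h + ly * sin_h)
    vz = move_speed * (-lx * sin_h + ly * cos_h)
    dyaw = turn_speed * right_stick_x
    return (vx, vz), dyaw
```

Setting `right_stick_x` permanently to zero — and letting a tracked torso or swivel chair supply all rotation — implements the "no virtual rotation" alternative described above.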
World-grounded steering devices
• Input devices such as flight sticks or steering wheels are often used to steer through a world.
• Such devices can be quite effective for viewpoint control due to the sense of actively controlling a physical device.

https://www.lavrock.io/vr-training-simulator-software/
Virtual steering devices
• This type uses visual representations of real-world steering devices—although they do not actually physically exist in the experience—that are used to navigate through the environment.
• More flexible than physical devices as they can be easily changed in software.
• But difficult to control due to having no proprioceptive force feedback.

https://factschronicle.com/microsoft-expands-project-xcloud-plans-to-xbox-vr-and-other-devices-24236.html
3D Multi-Touch Pattern
Descriptions:
• Enables simultaneous modification of the position, orientation, and scale of the world with the use of two hands.
• Translation is obtained by grabbing and moving space with one hand (monomanual interaction) or with both hands (synchronous bimanual interaction).
When to use:
• For non-realistic interactions when creating assets, manipulating abstract data, viewing scientific datasets, or rapidly exploring large and small areas of interest from arbitrary viewpoints.
Limitations:
• Not appropriate when the user is confined to the ground.
• It can be challenging to implement as small distinctions can affect the usability of the system.
3D Multi-Touch Pattern
• Digital ArtForms’ Two-Handed Interface
• The spindle

Digital ArtForms’ Two-Handed Interface
• Scale and rotation occur about the middle point between the two hands.
• Rotation of the world is accomplished by grabbing space with both hands and orbiting the hands about the midpoint between the hands, like grabbing a globe on both sides and turning it.
• Once learned, this implementation works well when navigation and selection/manipulation tasks are frequent and interspersed.
• Gives users the ability to place the world and objects of interest into personal space at the most comfortable working pose via position, rotation, and scaling operations.

The figure shows a schematic for manipulating the viewpoint.
https://www.youtube.com/watch?v=Tw1mXjMshJE
The spindle
• Consists of geometry connecting the two hands, called visual integration—along with a visual indication of the center of rotation/scale, it dramatically helps users plan their actions and speeds the training process.
• The yellow dot between the two cursors is the point that is rotated and scaled about.
• Users simply place the center point of the spindle at the point they want to scale and rotate about, push a button in each hand, and pull/scale themselves toward it.

Figure—Two hand cursors and their connecting spindle
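The spindle's scale/rotate-about-midpoint behavior can be sketched in 2D Python, with hand positions as (x, z) tuples (the function decomposition is illustrative):

```python
import math

def spindle_transform(l0, r0, l1, r1):
    """Derive scale, rotation, and pivot from a two-handed grab.

    l0/r0 are the hand positions when both buttons were pressed;
    l1/r1 are the current positions. Scale is the ratio of hand
    separations, rotation is the change in the hands' angle, and
    both apply about the midpoint between the hands (the spindle's
    centre dot).
    """
    def sep_angle(a, b):
        dx, dz = b[0] - a[0], b[1] - a[1]
        return math.hypot(dx, dz), math.atan2(dz, dx)

    d0, a0 = sep_angle(l0, r0)
    d1, a1 = sep_angle(l1, r1)
    pivot = ((l1[0] + r1[0]) / 2, (l1[1] + r1[1]) / 2)
    return d1 / d0, a1 - a0, pivot
```

Pulling the hands apart scales the world up about the spindle's centre; orbiting both hands rotates it, matching the "grabbing a globe on both sides" description.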
Automated Pattern
Descriptions:
• Passively changes the user’s viewpoint.
• Common methods of achieving this are by being seated on a moving vehicle controlled by the computer or by teleportation.
When to use:
• When the user is playing the role of a passive observer as a passenger controlled by some other entity, or when free exploration of the environment is not important or not possible—for example, due to limitations of today’s cameras designed for immersive film.
Limitations:
• Can be disorienting and sometimes nauseating.
• It is not meant to be used to completely control the camera independent of user motion.
Automated Pattern
• Reducing motion sickness
• Passive vehicles
• Target-based travel and route planning
• Teleportation

Reducing motion sickness
• Motion sickness can be significantly reduced with this technique by keeping travel speed and direction constant—that is, keep velocity constant—providing world-stabilized cues, and creating a leading indicator so users know what motions to expect.

Passive vehicles
• Passive vehicles are virtual objects users can enter or step onto that transport the user along some path not controlled by the user.
• Passenger trains, cars, airplanes, elevators, escalators, and moving sidewalks are examples of passive vehicles.
Target-based travel and route planning
• Target-based travel gives a user the ability to select a goal or location he wishes to travel to before being passively moved to that location.
• Route planning is the active specification of a path between the current location and the goal before being passively moved.
• It can consist of drawing a path on a map or placing markers that the system uses to create a smooth path.
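Route planning ultimately samples a path through the planner's markers. A Python sketch that interpolates along straight segments between waypoints (a real system would smooth the corners, as the text notes):

```python
import math

def point_on_path(waypoints, s):
    """Return the position a fraction s (0..1) along a marker path.

    The user is moved passively along straight segments between the
    markers at constant speed; s = 0 is the start, s = 1 the goal.
    """
    segs = []
    total = 0.0
    for a, b in zip(waypoints, waypoints[1:]):
        d = math.dist(a, b)
        segs.append((a, b, d))
        total += d
    target = max(0.0, min(1.0, s)) * total
    run = 0.0
    for a, b, d in segs:
        if run + d >= target:
            t = (target - run) / d if d else 0.0
            return tuple(pa + t * (pb - pa) for pa, pb in zip(a, b))
        run += d
    return waypoints[-1]
```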
Teleportation
• Teleportation is relocation to a new location without any motion.
• It is most appropriate when traveling large distances, between worlds, and/or when reducing motion sickness is a primary concern.
• Fading out and then fading in a scene is less startling than an instantaneous change.
• Unfortunately, straightforward teleportation comes at the cost of decreasing spatial orientation—users find it difficult to get their bearings when transported to a new location.

https://www.engadget.com/2016-10-07-why-teleportation-makes-sense-in-virtual-reality.html
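A fade-out/fade-in teleport can be driven by a simple brightness envelope: fade to black, relocate while the screen is dark, then fade back in. A Python sketch (durations are illustrative assumptions):

```python
def fade_teleport_alpha(t, fade_out=0.2, fade_in=0.2):
    """Screen brightness in [0, 1] for a teleport starting at t = 0.

    Fade to black over `fade_out` seconds, relocate the viewpoint
    while fully black, then fade back in over `fade_in` seconds —
    less startling than an instantaneous cut.
    """
    if t < fade_out:
        return 1.0 - t / fade_out          # fading out
    if t < fade_out + fade_in:
        return (t - fade_out) / fade_in    # relocated; fading in
    return 1.0                             # fade complete
```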
Indirect Control Patterns
• Widgets and Panels Pattern
• Non-Spatial Control Pattern.
• Indirect control is ideal when an obvious spatial mapping does not exist, or it is
difficult to directly manipulate an aspect of the environment.
• Indirect control typically specifies only what should be done and the system
determines how to do it.
• Provide control through an intermediary to modify an object, the environment, or
the system.
• Example uses:
• Controlling the overall system, issuing commands, changing modes, and
modifying non-spatial parameters
Widgets and Panels Pattern
Descriptions:
• It is the most common form of VR indirect control, and typically follows 2D desktop widget and panel/window metaphors.
• A widget is a geometric user interface element that provides information to the user or might be directly interacted with by the user.
• The simplest widget is a single label that only provides information.
• Such a label can also act as a signifier for another widget—for example, a label on a button.
• Panels are container structures that multiple widgets and other panels can be placed upon.
When to use:
• They are useful for complex tasks where it is difficult to directly interact with an object, and will be activated via a Pointing Pattern.
• They can also be combined with other selection options.
• When more accuracy is needed than directly manipulating objects provides.
Limitations:
• Not as obvious or intuitive as direct mappings and may take longer to learn.
• If panels are not within personal space or attached to an appropriate reference frame, then the widgets can be difficult to use.
• If the widgets are placed too high, gorilla arm can result for actions that take longer than a few seconds.
Widgets and Panels Pattern
• 2D desktop integration
• Ring menus
• Pie menus
• Color cube
• Finger menus
• Above-the-head widgets and panels
• Virtual hand-held panels
• Physical panels

2D desktop integration
• An advantage of using desktop metaphors is their familiar interaction style; users have an instant understanding of how to use them.
• 2D desktop integration brings existing 2D desktop applications into the environment via texture maps and mouse control with pointing.
• Figure:
• The center panel contains buttons, a dial, and a rotary menu that can be used as both a ring menu and a pie menu.
• The right panel contains a color cube from which the user is selecting a color.

Figure, Three examples of hand-held panels with various widgets
Ring menus
• It is a rotary 1D menu where several options are displayed concentrically about a center point.
• Options are selected by rotating the wrist until the intended option rotates into the center position or a pointer rotates to the intended item, as shown in the center image of the figure.
• Ring menus are useful but can cause wrist discomfort when large rotations are required.
• Non-isomorphic rotations can be used to make small wrist rotations map to a larger menu rotation.
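The non-isomorphic rotation idea can be sketched in Python: wrist roll times a gain selects a ring slot, so a small, comfortable rotation can sweep the whole ring (gain and slot count are illustrative assumptions):

```python
import math

def ring_menu_item(wrist_angle, n_items, gain=2.0):
    """Map wrist roll (radians) to a ring-menu slot index.

    A gain above 1 is the non-isomorphic mapping: the menu rotates
    faster than the wrist, reducing wrist strain for far items.
    """
    menu_angle = (wrist_angle * gain) % (2 * math.pi)
    slot = 2 * math.pi / n_items            # angular width of one item
    return int((menu_angle + slot / 2) // slot) % n_items
```

With `gain=2.0` and eight items, a wrist roll of only π/8 already reaches item 1, where an isomorphic (gain = 1) mapping would require π/4.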
Pie menus
• Also known as marking menus—circular menus with slice-shaped menu entries, where selection is based on direction, not distance.
• A disadvantage of pie menus is they take up more space than traditional menus—although using icons instead of text can help.
• Advantages of pie menus compared to traditional menus are that they are faster, more reliable with less error, and have equal distance for each option.
• The most important advantage, however, may be that commonly used options are embedded into muscle memory as usage increases.

https://blenderartists.org/t/pie-menu-on-snapping/1128348
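Direction-only selection is what distinguishes pie menus. A Python sketch mapping pointer direction to a slice (the slice layout, labels, and dead zone are illustrative assumptions):

```python
import math

def pie_menu_slice(dx, dy, labels, dead_zone=0.1):
    """Pick a pie-menu entry from pointer direction alone.

    (dx, dy) is the pointer offset from the menu centre; distance is
    ignored except for a small dead zone, which is what makes the
    selection fast and easy to commit to muscle memory. Slice 0 is
    centred on 'up' and slices proceed clockwise.
    """
    if math.hypot(dx, dy) < dead_zone:
        return None                          # still in the centre
    angle = math.atan2(dx, dy) % (2 * math.pi)   # 0 = up, clockwise
    slot = 2 * math.pi / len(labels)
    return labels[int((angle + slot / 2) // slot) % len(labels)]
```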
Color cube
• A color cube is a 3D space that users can select colors from.
• In the previous figure, the image on the right shows a 3D color cube widget—the color selection puck can be moved with the hand in 2D about the planar surface, while the planar surface can be moved in and out.

https://www.geeks3d.com/hacklab/tag/color/
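The widget's mapping from puck position to color can be sketched in Python; cube origin and size are illustrative parameters:

```python
def cube_to_rgb(pos, origin, size):
    """Map a 3D hand position inside a colour-cube widget to RGB.

    Each axis of the cube spans one colour channel; values are
    clamped to [0, 1] so the puck cannot pick a colour outside
    the cube.
    """
    return tuple(
        min(1.0, max(0.0, (p - o) / size)) for p, o in zip(pos, origin)
    )
```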
Finger menus
• Finger menus consist of menu options attached to the fingers.
• A pinch gesture with the thumb touching a finger can be used to select different options.
• Once learned, the user is not required to look at the menus; the thumb simply touches the appropriate finger.
• This prevents occlusion as well as decreases fatigue.
• The non-dominant hand can select a menu—up to four menus—and the dominant hand can then select one of four items within that menu.
• For complex applications where more options are required, a TULIP menu (Three-Up, Labels In Palm) can be used.
• The dominant hand contains three menu options at a time and the pinky contains a More option.

TULIP menus on seven fingers and the right palm.
https://www.researchgate.net/figure/TULIP-menus-use-finger-pinches-to-select-menu-items_fig7_35146207
Above-the-head widgets and panels
• Above-the-head widgets and panels are placed out of the way above the user and accessed by reaching up and pulling down the widget or panel with the non-dominant hand.
• Once the panel is released, it moves up to its former location out of view.
• The panel might be visible above, especially for new users, but after learning where the panels are located relative to the body, the panel might be made invisible since the user can use his sense of proprioception to know where the panels are without looking.
• Users could easily select among three options above their field of view—up to the left, up in the middle, and up to the right.

https://www.dailymotion.com/video/x54el0u
https://www.researchgate.net/publication/335633343_Immersive_Virtual_Reality-Based_Interfaces_for_Character_Animation
Virtual hand-held panels
• If a widget or panel is attached somewhere in the environment, then it can be difficult to find.
• One solution is to use virtual hand-held panels, which have the advantage of always being available—as well as able to be turned off—at the click of a button.
• Attaching the panel to the hand greatly diminishes many of the problems of world-spaced panels—for example, panels that are difficult to read or get in the way of other objects can be reoriented and moved in an intuitive way without any cognitive effort.

https://www.pinterest.com/pin/474144667005470779/
Physical panels
• Virtual panels offer no physical feedback, which can make it difficult to make precise movements.
• A physical panel is a real-world tracked surface that the user carries and interacts with via a tracked finger, object, or stylus. Using a physical panel can provide fast and accurate manipulation of widgets due to the surface acting as a physical constraint when touched.
• The disadvantage of a physical panel is that users can become fatigued from carrying it, and it can be misplaced if set down.
• Providing a physical table or other location to set the panel on can help reduce this problem, where the panel still travels with the user when virtually moving.
• Another option is to strap the panel to the forearm.
• Alternatively, the surface of the arm and/or hand can be used in place of a carried panel.
Non-Spatial Control Pattern
Descriptions:
• It provides global action performed through description instead of a spatial relationship.
• It is most often implemented through speech or gestures.
When to use:
• Use when options can be visually presented—for example, gesture icons or text to speak—and appropriate feedback can be provided.
• It is best used when there are a small number of options to choose from and when a button is available to push-to-talk or push-to-gesture.
• Use voice when moving the hands or the head would interrupt a task.
Limitations:
• Gestures and accents are highly variable from user to user and even for a single user.
• There is often a trade-off of accuracy and generality—the more gestures or words to be recognized, the less accurate the recognition rate.
• System recognition of voice can be problematic when many users are present or there is a lot of noise. For important commands, verification may be required and can be annoying to users.
Non-Spatial Control Pattern
• Voice menu hierarchies
• Gestures

Voice menu hierarchies
• They are like traditional desktop menus where submenus are brought up after higher-level menu options are selected.
• Menu options should be visually shown to users, so users explicitly know what options are available.
Gestures
• Gestures can work well for non-spatial commands.
• They should be intuitive and easy to remember.
• For example: a thumbs-up to confirm, raising the index finger to select option 1, or raising the index and middle fingers to select option 2.
• Visual signifiers showing the gesture options available should be presented to the user, especially for users who are learning the gestures.
• The system should always provide feedback to the user when a gesture has been recognized.
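A recognized gesture ultimately maps a hand pose to a command, as in this Python sketch of the examples above (the pose encoding and command names are illustrative assumptions; a real recognizer would work on noisy tracking data):

```python
def recognize_gesture(fingers_up):
    """Map a raised-finger pose to a non-spatial command.

    fingers_up is a tuple of booleans for (thumb, index, middle,
    ring, pinky). Unrecognized poses return None, at which point
    the system should give the user feedback rather than guess.
    """
    poses = {
        (True, False, False, False, False): "confirm",   # thumbs-up
        (False, True, False, False, False): "option_1",  # index only
        (False, True, True, False, False): "option_2",   # index + middle
    }
    return poses.get(tuple(fingers_up))
```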