Human Computer Interaction

Human-computer interaction (HCI) is the study of how people interact with computers, focusing on the design and implementation of user-friendly systems. The document outlines the history, goals, and importance of HCI, emphasizing user-centered design and the need for effective interfaces to enhance user experience. It also discusses principles of user interface design and the evolution of graphical user interfaces, highlighting the significance of clarity, efficiency, and direct manipulation in creating effective interactions.


Human Computer Interaction (HCI)
INTRODUCTION
• Human-computer interaction (HCI), alternatively man-machine interaction
(MMI) or computer-human interaction (CHI), is the study of interaction
between people (users) and computers.

DEFINITION
• "Human-computer interaction is a discipline concerned with the design,
evaluation and implementation of interactive computing systems for
human use and with the study of major phenomena surrounding them."
A BRIEF HISTORY OF THE HUMAN-COMPUTER
INTERFACE
• The need for people to communicate with each other has existed since we
first walked upon this planet.
• The lowest and most common level of the communication modes we share is
movements and gestures. Movements and gestures are language
independent, that is, they permit people who do not speak the same
language to deal with one another.
• The next higher level, in terms of universality and complexity, is spoken
language.
• Most people can speak one language, some two or more. A spoken
language is a very efficient mode of communication if both parties to the
communication understand it.
• At the third and highest level of complexity is written language. While
most people speak, not all can write. But for those who can, writing is still
nowhere near as efficient a means of communication as speaking.
• In modern times, we have the typewriter, another step upward in
communication complexity. Significantly fewer people type than write.
(While a practiced typist can find typing faster and more efficient than
handwriting, the unskilled may not find this the case.)
• Through its first few decades, a computer's ability to deal with human
communication was inversely related to what was easy for people to do.
-- The computer demanded rigid, typed input through a keyboard; people
responded slowly using this device and with varying degrees of skill.
-- The human-computer dialog reflected the computer's preferences, consisting
of one style or a combination of styles using keyboards, commonly referred to as
Command Language, Question and Answer, Menu selection, Function Key
Selection, and Form Fill-In.
• Throughout the computer's history, designers have been developing, with
varying degrees of success, other human-computer interaction methods that
utilize more general, widespread, and easier-to-learn capabilities: voice and
handwriting.
--Systems that recognize human speech and handwriting now exist, although
they still lack the universality and richness of typed input.
GOALS OF HCI
A basic goal of HCI is
• to improve the interactions between users and computers
• by making computers more usable and receptive to the user's needs.
A long term goal of HCI is
• to design systems that minimize the barrier between the human's cognitive
model of what they want to accomplish and the computer's understanding of
the user's task
WHY IS HCI IMPORTANT
• User-centred design is taking on a crucial role!
• It is becoming more important today to increase competitiveness via HCI
studies (Norman, 1990)
• High-cost e-transformation investments
• Users lose time with badly designed products and services
• Users even give up using bad interface
• Ineffective allocation of resources
HCI Life Cycle
• The HCI life cycle refers to the various stages involved in designing and
developing user-friendly computer systems. While specific models may
vary, a typical HCI life cycle includes the following stages.
• Requirements Analysis
• Design
• Implementation
• Evaluation
• Deployment
• Maintenance
DEFINING THE USER INTERFACE
• User interface design is a subset of a field of study called human-computer
interaction (HCI).
• Human-computer interaction is the study, planning, and design of how people
and computers work together so that a person's needs are satisfied in the
most effective way.
• HCI designers must consider a variety of factors:
• what people want and expect, physical limitations and abilities people possess,
• how information processing systems work,
• what people find enjoyable and attractive.
• Technical characteristics and limitations of the computer hardware and
software must also be considered.
• The user interface is:
•the part of a computer and its software that people can
see, hear, touch, talk to, or otherwise understand or
direct.
•The user interface has essentially two components: input
and output.
•Input is how a person communicates his / her needs to
the computer.
•Some common input components are the keyboard,
mouse, trackball, one's finger, and one's voice.
•Output is how the computer conveys the results of its
computations and requirements to the user.
• Today, the most common computer output mechanism is the display
screen, followed by mechanisms that take advantage of a person's auditory
capabilities: voice and sound.
• The use of the human senses of smell and touch as output in interface design
still remains largely unexplored.
• Proper interface design will provide a mix of well-designed input and
output mechanisms that satisfy the user's needs, capabilities, and
limitations in the most effective way possible.
• The best interface is one that is not noticed, one that permits the user to
focus on the information and task at hand, not on the mechanisms used to
present the information and perform the task.
PRINCIPLES OF USER INTERFACE DESIGN
• An interface must really be just an extension of a person. This means that the
system and its software must reflect a person's capabilities and respond to his
or her specific needs.
• It should be useful, accomplishing some business objectives faster and more
efficiently than the previously used method or tool did.
• It must also be easy to learn, for people want to do, not learn to do.
• Finally, the system must be easy and fun to use, evoking a sense of pleasure
and accomplishment not tedium and frustration.
• The interface itself should serve as both a connector and a separator
• a connector in that it ties the user to the power of the computer, and a
separator in that it minimizes the possibility of the participants damaging one
another.
• While the damage the user inflicts on the computer tends to be physical (a
frustrated pounding of the keyboard), the damage caused by the computer is
more psychological.
• Throughout the history of the human-computer interface, various researchers
and writers have attempted to define a set of general principles of interface
design.
• What follows is a compilation of these principles. They reflect not only what
we know today, but also what we think we know today.
• Many are based on research, others on the collective thinking of
behaviourists working with user interfaces.
• These principles will continue to evolve, expand, and be refined as our
experience with GUIs and the Web increases.
THE IMPORTANCE OF THE USER INTERFACE
• A well-designed interface and screen is terribly important to our users. It is
their window to view the capabilities of the system.
• It is also the vehicle through which many critical tasks are presented. These
tasks often have a direct impact on an organization's relations with its
customers, and its profitability.
• A screen's layout and appearance affect a person in a variety of ways. If they
are confusing and inefficient, people will have greater difficulty in doing their
jobs and will make more mistakes.
• Poor design may even chase some people away from a system permanently.
It can also lead to aggravation, frustration, and increased stress.
THE CONCEPT OF DIRECT MANIPULATION
• The system is portrayed as an extension of the real world: It is assumed that a
person is already familiar with the objects and actions in his or her
environment of interest.
• The system simply replicates them and portrays them on a different medium,
the screen.
• A person has the power to access and modify these objects, among which are
windows.
• A person is allowed to work in a familiar environment and in a familiar way,
focusing on the data, not the application and tools.
• The physical organization of the system, which most often is unfamiliar, is
hidden from view and is not a distraction.
• Continuous visibility of objects and actions: Like one's desktop, objects are
continuously visible. Reminders of actions to be performed are also obvious,
labelled buttons replacing complex syntax and command names.
• Cursor action and motion occurs in physically obvious and natural ways. One
problem in direct manipulation, however, is that there is no direct analogy on
the desk for all necessary windowing operations.
• A piece of paper on one's desk maintains a constant size, never shrinking or
growing. Windows can do both. Solving this problem required embedding a
control panel, a familiar concept to most people, in a window's border.
• This control panel is manipulated, not the window itself.
• Actions are rapid and incremental, with visible display of results: the results
of actions are immediately displayed visually on the screen in their new
and current form.
• Auditory feedback may also be provided. The impact of a previous action is
quickly seen, and the evolution of tasks is continuous and effortless.
• Incremental actions are easily reversible.
INDIRECT MANIPULATION
• In practice, direct manipulation of all screen objects and actions may not be
feasible because of the following:
• The operation may be difficult to conceptualize in the graphical system.
• The graphics capability of the system may be limited.
• The amount of space available for placing manipulation controls in the window
border may be limited.
• It may be difficult for people to learn and remember all the necessary
operations and actions.
• When this occurs, indirect manipulation is provided. Indirect manipulation
substitutes words and text, such as pull-down or pop-up menus, for symbols,
and substitutes typing for pointing.
• Most window systems are a combination of both direct and indirect
manipulation.
• A menu may be accessed by pointing at a menu icon and then selecting it (direct
manipulation).
• The menu itself, however, is a textual list of operations (indirect manipulation).
• When an operation is selected from the list, by pointing or typing, the system
executes it as a command.
• Which style of interaction (direct manipulation, indirect manipulation, or a
combination of both) is best, under what conditions and for whom, remains a
question whose answer still eludes us.
The Benefits of Good Design
• If poor clarity forced screen users to spend one extra second per screen,
almost one additional person-year would be required to process all screens.
• Twenty extra seconds in screen usage time adds an additional 14 person-years.
(A rough worked example appears at the end of this section.)
• The benefits of a well designed screen have also been under experimental
scrutiny for many years.
• One researcher, for example, attempted to improve screen clarity and readability by
making screens less crowded.
• Separate items, which had been combined on the same display line to conserve space,
were placed on separate lines instead.
• The result: screen users were about 20 percent more productive with the less
crowded version.
• Proper formatting of information on screens does have a significant positive
effect on performance.
• In recent years, the productivity benefits of well-designed Web pages have also been
scrutinized.
• Training costs are lowered because training time is reduced.
• Support line costs are lowered because fewer assist calls are necessary.
• Employee satisfaction is increased because aggravation and frustration are reduced.
• Ultimately, an organization's customers benefit because of the improved service
they receive.
• Identifying and resolving problems during the design and development process also
has significant economic benefits:
• How many screens are used each day in our technological world?
• How many screens are used each day in your organization? Thousands? Millions?
• Imagine the possible savings. Proper screen design might also, of course, lower the costs of
replacing "broken" PCs.
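A rough worked sketch, in Python, of how per-screen delays scale. The per-screen delays (1 and 20 seconds) come from the claims above; the organization figures (number of users, screens per day, working days) are hypothetical assumptions chosen only to illustrate the scale, not taken from the text.

# How small per-screen delays aggregate into person-years of lost time.
users = 200                       # assumed
screens_per_user_per_day = 150    # assumed
working_days_per_year = 230       # assumed
screens_per_year = users * screens_per_user_per_day * working_days_per_year

hours_per_person_year = working_days_per_year * 8   # one person-year of work

def person_years_lost(extra_seconds_per_screen):
    lost_hours = screens_per_year * extra_seconds_per_screen / 3600
    return lost_hours / hours_per_person_year

print(round(person_years_lost(1), 1))    # about 1.0 person-year for 1 extra second per screen
print(round(person_years_lost(20), 1))   # about 20.8 person-years for 20 extra seconds per screen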
CHARACTERISTICS OF THE GRAPHICAL USER INTERFACE

• A graphical system possesses a set of defining concepts.
• Included are sophisticated visual presentation, pick-and-click interaction, a restricted
set of interface options, visualization, object orientation, extensive use of a person's
recognition memory, and concurrent performance of functions.
Sophisticated Visual Presentation:
• Visual presentation is the visual aspect of the interface. It is what people see
on the screen.
• The sophistication of a graphical system permits displaying lines, including
drawings and icons.
• It also permits the displaying of a variety of character fonts, including
different sizes and styles.
• The display of 16 million or more colours is possible on some screens.
• Graphics also permit animation and the presentation of photographs and
motion video.
• Restricted Set of Interface Options: The array of alternatives available to
the user is what is presented on the screen or may be retrieved through
what is presented on the screen, nothing less, nothing more. This concept
fostered the acronym WYSIWYG.
• Pick-and-Click Interaction: Elements of a graphical screen upon which some
action is to be performed must first be identified.
• The primary mechanism for performing this pick-and-click is most often the
mouse and its buttons.
• The secondary mechanism for performing these selection actions is the
keyboard; most systems permit pick-and-click to be performed using the
keyboard as well.
Visualization
• Visualization is a cognitive process that allows people to understand information
that is difficult to perceive because it is too voluminous or too abstract.
• Presenting specialized graphic portrayals facilitates visualization.
• The best visualization method for an activity depends on what people are
trying to learn from the data.
• The goal is not necessarily to reproduce a realistic graphical image, but to
produce one that conveys the most relevant information.
• Effective visualizations can facilitate mental insights, increase productivity,
and allow faster and more accurate use of data.
Object Orientation
• A graphical system consists of objects and actions. Objects are what
people see on screen. They are manipulated as a single unit.
• Objects can be composed of sub objects. For example, an object may be a
document. The document's sub objects may be a paragraph, sentence,
word, and letter.
• A collection is the simplest relationship: the objects share a common
aspect.
• A collection might be the result of a query or a multiple selection of
objects.
• Operations can be applied to a collection of objects.
• A constraint is a stronger object relationship. Changing an object in a set
affects some other object in the set.
• A document being organized into pages is an example of a constraint. A
composite exists when the relationship between objects becomes so
significant that the aggregation itself can be identified as an object.
• Examples include a range of cells organized into a spreadsheet, or a collection
of words organized into a paragraph.
• A container is an object in which other objects exist. Examples include text in
a document or documents in a folder.
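A minimal sketch, in Python, of the object relationships just described: sub-objects, a composite, a container, and a collection to which an operation is applied. The class and attribute names are illustrative assumptions, not taken from the text.

# Hypothetical sketch of graphical-system object relationships.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Word:                        # lowest-level sub-object
    text: str

@dataclass
class Paragraph:                   # composite: words aggregated into one identifiable object
    words: List[Word] = field(default_factory=list)

@dataclass
class Document:                    # container: other objects (paragraphs) exist inside it
    paragraphs: List[Paragraph] = field(default_factory=list)
    page_length: int = 40          # a constraint: changing one page's content affects where later pages break

# A collection: e.g. the result of a query or a multiple selection of objects;
# an operation can be applied to every member of the collection at once.
selection = [Word("graphical"), Word("interface")]
for w in selection:
    w.text = w.text.upper()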
GUI VERSUS WEB PAGE DESIGN

• GUI and Web interface design do have similarities. Both are software
designs, they are used by people, they are interactive, they are
heavily visual experiences presented through screens, and they are
composed of many similar components.
• Significant differences do exist.
GENERAL PRINCIPLES FOR CREATING A USER INTERFACE
• The design goals in creating a user interface are described below.
• They are fundamental to the design and implementation of all
effective interfaces, including GUI and Web ones.
• These principles are general characteristics of the interface, and they
apply to all aspects.
• Aesthetically Pleasing
• Clarity
• Compatibility
• Configurability
• Comprehensibility
• Consistency
• Control
• Directness
• Flexibility
• Efficiency
• Familiarity
• Forgiveness
• Predictability
• Recovery
• Responsiveness
• Transparency
• Simplicity
Xerox Star
• The workstation from Xerox that introduced the graphical user
interface and desktop metaphor in 1981. The Star was designed to
work in an Ethernet network connected to other workstations along
with a file and print server. Approximately 30,000 Star workstations
were sold.
• Astounding people who were given demos, the Star's user interface
was the inspiration for Xerox's subsequent computers and for Apple's
Lisa and Mac. All graphical user interfaces (GUIs) today owe their
roots to Xerox and to machines like the Alto, the Lisa and the Mac.
PRINCIPLES FOR THE XEROX STAR
• Displayed objects that are selectable and manipulable must be created.
• A design challenge is to invent a set of displayable objects that are
represented meaningfully and appropriately for the intended application.
• It must be clear that these objects can be selected, and how to select
them must be self-evident.
• It should also be obvious when they have been selected, because the selected
object will be the focus of the next action. Standalone icons easily fulfilled
this requirement.
• The handles for windows were placed in the borders.
• Visual order and viewer focus: Attention must be drawn, at the proper
time, to the important and relevant elements of the display.
Theories, Principles, and Guidelines

• Emerging designers follow guidance at three levels:
• high-level theories and models
• middle-level principles
• specific and practical guidelines
• User interface displays could be improved by addressing:
• clutter, complex and tedious procedures, inadequate functionality,
inconsistent sequences of actions, insufficient informative feedback
High Level Theories

• Explanatory
• observe behaviour, conceive designs, compare high-level concepts of
2 designs, describe activity
• Predictive
• compare proposed designs for execution time or error rates; motor-task
predictions (keystroking, pointing times); perceptual theories predict
reading times; predicting performance on cognitive tasks is hard (the
ratio of novice to expert task times may be 100:1)
Taxonomies
• Input devices
• direct vs indirect, linear vs rotary
• Tasks
• structured vs unstructured, controllable vs immutable
• Personality styles
• convergent vs divergent, field dependent vs independent
• Technical aptitudes
• spatial visualization, reasoning
• User experience levels
• novice, knowledgeable, expert
• User-interface styles
• menus, form filling, commands
Many theories...

• compete for attention
• continuous refinements
• extended by critics
• applied by eager and hopeful designers
• implementers must keep up with developments not only in s/w tools
but also in theories
Subjective satisfaction

• researchers in media and advertising recognize the difficulty in predicting
emotional reactions
• theoretical predictions are combined with intuitive judgements + extensive
market testing
• broader theories: small group behaviour, organizational dynamics, sociology
of knowledge
• barriers to new technology may be analysed via social psychology and
anthropology
• coming up with a good theory is not simple
Conceptual, semantic, syntactic and lexical
model [Foley & Van Dam (1970)]

• conceptual: the user's mental model of the interactive system
• e.g. a word processor program may be modelled as a line editor
• semantic: describes the meanings conveyed by the user's command input
and by the computer's display
• syntactic: how the units are assembled into a complete sentence
• lexical: deals with device dependencies + the precise mechanisms by
which a user specifies the syntax
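A small, hypothetical illustration of how a single editing task can be described at each of the four levels. The concrete command and wording are invented for the example, not taken from the text.

# Hypothetical four-level description of one editing task.
levels = {
    "conceptual": "the editor works on a document made of words and lines",
    "semantic":   "the user can delete the word at the cursor position",
    "syntactic":  "a command is a verb followed by a unit, e.g. 'delete word'",
    "lexical":    "the verb 'delete' is bound to a specific key chord on this device",
}
for level, description in levels.items():
    print(f"{level}: {description}")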
Advantages

• the above model is clear since it is top-down
• matches the s/w architecture
• allows useful modularity during design
• designers move from conceptual to lexical and carefully record mappings
between these 2 levels
Keystroke-level model

• attempts to predict performance times for error-free task execution
• different activities are considered: keystroking, pointing, homing, drawing,
thinking, waiting for system response
• the sum of all these activity times gives the predicted performance time
• these models hold for experienced users and error-free performance
• no emphasis on learning, problem solving, error handling,
subjective satisfaction and retention
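A minimal keystroke-level sketch in Python. The operator times are approximate textbook values attributed to Card, Moran and Newell; the example task breakdown is a hypothetical illustration, not from the text.

# Keystroke-level model (KLM) estimate for error-free, expert execution.
OPERATOR_TIMES = {
    "K": 0.28,  # keystroke (average skilled typist)
    "P": 1.10,  # pointing at a target with a mouse
    "H": 0.40,  # homing the hand between keyboard and mouse
    "M": 1.35,  # mental preparation ("thinking")
}

def klm_estimate(operators, response_time=0.0):
    # waiting for the system response is added separately, since it is system-dependent
    return sum(OPERATOR_TIMES[op] for op in operators) + response_time

# Example: home to the mouse (H), point at a field (P), click (treated here as one K),
# think (M), then type a five-letter word (5 K).
print(klm_estimate("HPKM" + "K" * 5))   # about 4.5 seconds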
GOMS: goals, operators, methods & selection rules
[Card, Moran and Newell (1980-83)]
• users formulate goals (edit document) and sub-goals (insert word) and achieve
them by using methods or procedures
• e.g. move the cursor by following a sequence of arrow keys
• operators are elementary perceptual, motor or cognitive acts whose
execution is necessary to change any aspect of the user's mental state or to
affect the task environment
• e.g. press up-arrow key, move hand to mouse, recall file name...
• selection rules are used to choose one possible method of achieving the task
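A minimal, hypothetical GOMS-style decomposition expressed as plain Python data; the goal, methods, steps and selection rule are illustrative only, not from the text.

# Sketch of a GOMS decomposition: a goal, two methods (sequences of operators)
# and a selection rule for choosing between them.
goms = {
    "goal": "edit document",
    "subgoal": "insert word",
    "methods": {
        "arrow-keys": ["press arrow key (repeat)", "type word"],
        "mouse":      ["move hand to mouse", "point at position", "click", "type word"],
    },
    # selection rule: prefer the mouse when the target is far from the cursor
    "selection_rule": lambda distance: "mouse" if distance > 8 else "arrow-keys",
}
print(goms["selection_rule"](12))   # -> "mouse"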
Again on GOMS
[Kieras and Polson (1985)]

• Kieras and Polson formalized GOMS by using production rules
• via these rules, predictions of learning and performance time were given
when interacting with a text editor through 5 different actions:
• insert, delete, copy, move and transpose
Natural GOMS
Kieras (1988)
• "the GOMS analysis did not explain how the notation works, it
is clumsy, detached from the underlying cognitive theory"
• GOMS was refined into Natural GOMS Language (NGOMSL)
• NGOMSL identifies where the task analyst must make a judgement call:
• assumptions about how the users view the system
• bypassing a complex, hard-to-analyze task
• checking for consistency
Method descriptions
Elkerton and Palmiter (1991)
• · They applied NGOMSL to implement on-line Help
•· introduced method descriptions breaking down the actions
necessary to accomplish a goal into steps
• decide, accomplish, report goal accomplished
• · introduced selection rules whenever alternative methods exist
• an empirical evaluation with 28 subjects showed that the NGOMSL version
of help halved the time to complete information searches
Stages of action
Norman (1988)
• · 7 stages of action to model HCI:
• forming the goal
• forming the intention
• specifying the action
• executing the action
• perceiving the system state
• interpreting the system state
• evaluating the outcome
• · similar to Foley and van Dam’s separation of concerns:
• conceptual intention - reformulation into commands
• syntax construction - production of lexical tokens
Norman’s contribution

• · taking care of
• cycles of action
• evaluation
• · the process of action is considered dynamically (i.e. in evolution)
• · alternative models only consider what is in the user’s mind at
execution time
• Two new concepts:
• the gulf of execution, which separates the user's intentions from the allowable actions, plus
• the gulf of evaluation, separating the system's representation from the user's expectations
Four principles

• · Norman’s four principles for good design


• state and action alternatives should be visible
• there should be a good conceptual model with a consistent system image
• the interface should include good mappings that reveal the relationships
between stages
• the user should receive continuous feedback
•· Study errors - they often occur when moving from goals to
intentions to actions and to executions
Exploring the interface
Polson and Lewis (1990)

• when users explore an interface and try to accomplish their goals, they
pin-point 4 critical points where failures may occur:
• users can form an inadequate goal
• users might not find the correct interface object because of an incomprehensible label
or icon
• users may not know how to specify or execute a desired action
• users may receive inappropriate or misleading feedback
Franzke (1995)

• the bottom 3 failures may be prevented by improved design or by
time-consuming experience
Consistency through grammars
• consistency is elusive, with multiple levels
• it may even be positive to be inconsistent!
• a command language or set of actions should be:
• orderly - predictable - describable by a few rules
• easy to learn - easy to retain
• these overlapping concepts are shown by an example with 2 kinds of
inconsistencies...
Consistencies-Inconsistencies

Consistent (+)            Inconsistent A            Inconsistent B
delete/insert character   delete/insert character   delete/insert character
delete/insert word        remove/bring word         remove/insert word
delete/insert line        destroy/create line       delete/insert line
delete/insert paragraph   kill/birth paragraph      delete/insert paragraph

• all actions are the same in the consistent (+) version but vary in version A
• the inconsistent verbs are all acceptable, but their variety suggests they will be more difficult
to learn and to remember, will slow down users, and will be error-prone
• version B is more malicious: only one inconsistency (remove) but it may be easier to
remember...
Task Action Grammar
Payne and Green (1986)
•expanding on Reisner (1981) they addressed:
•multiple levels of consistency (lexical, syntactic and semantic)
•aspects of completeness (complete set of tasks)
•once the full set of task-action mappings is written down, the
grammar of the command language can be tested for
completeness
Command Language Grammar

• e.g. a TAG definition of cursor control:
• move-cursor-one-character-forward [Direction = forward, Unit = char]
• move-cursor-one-character-backward [Direction = backward, Unit = char]
• move-cursor-one-word-forward [Direction = forward, Unit = word]
• move-cursor-one-word-backward [Direction = backward, Unit = word]
High-level rule schemas describe the command syntax.
These schemas generate a consistent grammar:

move cursor one character forward    CTRL-C
move cursor one character backward   ESC-C
move cursor one word forward         CTRL-W
move cursor one word backward        ESC-W
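The same consistency can be shown programmatically. A minimal sketch, in Python, of a TAG-style rule schema that generates the command syntax from the task features Direction and Unit; the key bindings follow the table above.

# TAG-style rule schema: syntax is generated from task features, so the
# resulting command language is consistent by construction.
MODIFIER = {"forward": "CTRL", "backward": "ESC"}   # symbol[Direction]
LETTER = {"character": "C", "word": "W"}            # letter[Unit]

def command(direction, unit):
    return f"{MODIFIER[direction]}-{LETTER[unit]}"

for direction in MODIFIER:
    for unit in LETTER:
        print(f"move cursor one {unit} {direction}: {command(direction, unit)}")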


Again on consistency

• notation and approach (for TAGs) are flexible and extensible
• consistency is subtle, has multiple levels, and may also hinder some
implementation details - Reisner (1990)
• understanding consistency is instrumental for implementors,
researchers and designers - Grudin (1989)
Widget level theories
• a reductionist approach may be fallacious, i.e. details may become
misleading in the evaluation of a GUI
• validity of simple summations of time periods may be questionable
• alternatively, one may use a model based on widgets (interface
components)
• layout appropriateness (frequently used widgets should be adjacent,
left-to-right sequence should be matched to the task-sequence
description,...)
• higher-level patterns appear by widget composition
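A minimal sketch of a layout-appropriateness style measure in Python. The widget positions, transition frequencies and cost function are hypothetical examples; the idea is only that frequent widget-to-widget transitions should cover little distance.

# Weighted travel distance over widget transitions: a lower cost suggests a
# layout better matched to the task sequence.
from math import dist

positions = {               # widget centre points on the screen (x, y in pixels), assumed
    "open": (40, 20),
    "search": (400, 20),
    "save": (80, 20),
}
transition_freq = {         # how often users move from one widget to the next, assumed
    ("open", "search"): 120,
    ("search", "save"): 95,
    ("open", "save"): 10,
}

def layout_cost(positions, transition_freq):
    return sum(freq * dist(positions[a], positions[b])
               for (a, b), freq in transition_freq.items())

print(layout_cost(positions, transition_freq))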
Object-Action Interface Model
• · syntactic-semantic model of human behaviour - Shneiderman
(1980)
• · this model describes programming, database manipulation facilities,
direct manipulation
• · a distinction was made between
• acquired semantic concepts (delete - copy)
• and
• rote-learned syntactic concepts (function keys)
• · the first were stable in memory and well organized, the second
were arbitrary and had to be rehearsed to be maintained
Objects and Actions: differences

• task domain concepts (stock-market portfolios)
• computer domain concepts (folders)
OAI model
• GUIs have replaced command languages substituting complex syntax
with direct manipulation
• emphasis is now on the visual display of the user's task objects and
actions
• stock-market portfolios could be represented by leather folders with
engraved certificates
• actions represented by trashcans, shelf icons,etc
Object-Action design

• action syntax is easier than command language expressions, even if
precedence rules must be known (file/folder to the trashcan and not vice
versa)
• mouse clicking, mouse retention, gestures also obey rules
• design starts with a clear identification of the task/s - including the
universe of real world objects + user intentions + actions required
• high level task objects could be stock-market statistics, a photo library,
a scientific journal
Task and Interface concepts
OAI Model
• the two pictures show
• the objects of the universe (shelves, cupboards, books in a library)
• the actions satisfying the intentions of the user
• the objects of the interface visualizing (through pixels)
• the actions to be performed by the user (via mouse clicks)
• in this way, the interface is easy to learn and to use since it maps the world
domain onto the metaphoric domain
• it focuses on task objects and actions and on interface objects and actions
• OAI reflects design at a high level, as when programmers use widgets in
user-interface-building tools
• standard widgets have a simple syntax: click, double-click, drag, drop
• OAI follows the object-oriented approach
Hierarchies
•when problems are complex: break them down!
•· intentions may be decomposed into smaller action steps
• building: surveying, building the frame, raising the roof
• symphony: movements, measures, notes
• baseball: innings, outs, pitches
•· people learn task objects & actions independently of their
implementation
•· people learn by studying & practicing
Application domains
• designers must learn via
•training courses
•books
•interviews
• designers generate a hierarchy of objects and actions to model the user’s
tasks
• the model is the basis for designing the interface objects and actions + their
representation in pixels on the screen, in physical devices or audio cues
• users must firstly become proficient in their task domain
• next, they may learn the equivalent computer program
Interface objects

• the interface includes hierarchies of objects and actions at high and low levels
• storage is a high level concept: computers store information by means of
the directory and files (objects)
• a directory is made of entries (lower level objects)
• each entry is made of a name, length, creation date, owner, access control,...
• each file has lines, fields, characters, fonts, pointers, binary numbers,...
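A minimal sketch of that storage hierarchy as Python data; the field names and sample values are illustrative assumptions, not from the text.

# A high-level object (a directory) composed of lower-level objects (entries).
from dataclasses import dataclass
from typing import List

@dataclass
class DirectoryEntry:
    name: str
    length: int          # bytes
    creation_date: str
    owner: str
    access_control: str

@dataclass
class Directory:
    entries: List[DirectoryEntry]

home = Directory(entries=[
    DirectoryEntry("report.txt", 2048, "2024-01-15", "alice", "rw-r--r--"),
])
print(home.entries[0].name)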
Interface actions
• both high and low level actions, like
• creating a text data file
• load, insertion, save actions
• storing a file, backing up on one of many disks, applying access control
rights,...
• permissible file types, sizes, error conditions, responses to h/w or s/w
errors and finally...
• clicking on a pull-down menu
Familiar examples
• · a designer will build interface objects and actions based on familiar examples
• · tune those objects and actions to fit the task
• · for a real estate business, geographical maps and houses will be available as
well as their properties, cost, distance, size and location (familiar concepts) will
be mapped on the screen
• · to explain "saving a file", icons representing a disk drive and the directory will
show where the file will be stored
• a demonstration will be performed to give the user a logical understanding
of the process
Metaphors
• · they map a meaning from one known domain to another one
• · they may be abstract, concrete or analogical
• · they are used to avoid long, tiresome training of new concepts
• most icons use representations which are visual metaphors (which, in itself,
is a metaphor)
• · interface objects and actions have a logical structure which is easy to
memorize in a stable way
Bottom-up modelling
• · task objects made explicit
• · user’s task actions laid out clearly
• · next, interface objects are identified
• · and interface actions follow...with the OAI model
• · many years ago users had to remember device dependent details
(format instruction in Fortran, number of i/o device to be deployed,etc.)
• · or, which action deletes a character: delete, backspace, CTRL-H, CTRL-G,
CTRL-D, rightmost mouse button or Escape
• · which action inserts a new line after the third text line: CTRL-I, INSERT KEY,
I3, I 3, 3I,...
Remembering...
• · problem 1: details vary across computer platforms
• · problem 2: arbitrariness of minor design features reduces the effectiveness of
paired-associate learning
• · repeated rehearsals for rote memorization
• · moreover, syntactic knowledge is hampered by the lack of a hierarchical or
modular structure to cope with complexity
• Example
• · within e-mail:
• press RETURN to terminate a paragraph
• CTRL-D to terminate a letter
• Q to quit the e-mail subsystem
• logout to terminate the session
• · for the novice these similar termination commands bear no logical connection
Syntactic knowledge
• · it is system-dependent
• · different
• keyboards, commands, function keys, sequences of actions
• · some overlap may exist (e.g. with arithmetical operations)
• · s for sending a message - s for saving a file - ...
• · to overcome these problems, new interfaces show familiar objects and
actions representing the user’s task objects and actions

• · Standard widgets are easily available
