Notes10a HeuristicEvaluationOverview
Design principles and usability heuristics (II)
Advantages
• It is a “minimalist” approach where using a few general
guidelines grounded in research and experience can help
identify and correct the majority of usability problems.
– Also, the list is reasonably easy to remember and can be applied with
modest effort.
Why is it a “discount usability engineering” approach?
Relative to user-observational studies, this can be cheap, fast,
and relatively easy for trained practitioners,
which can be critical in today’s product cycle…
• There are no special labs or equipment needed.
– For many systems, you can likely run it on your own machine in your office
– Interesting bonus: can even be used on paper prototypes
• This type of evaluation can be done on the order of one day,
where other usability testing could take weeks.
• Once the approach is understood by a team, it can be used in many
scenarios with little additional learning, and the more careful you
are, the better the results get.
Evan Golub / Ben Bederson / Saul Greenberg
Heuristic Evaluation
Developed by Jakob Nielsen (1990)
• Original list of heuristics seems inspired by Shneiderman’s “Eight Golden Rules”
of design.
• Nielsen has had multiple similar lists over the years.
• Jill Gerhardt-Powals has a list as well, but her heuristics have a very different feel.
Heuristic Evaluation Process
Evaluators go through the UI several times
• inspect various dialogue elements
• compare them with a list of usability principles
• consider other principles/results that come to mind
Usability principles
• Nielsen’s “heuristics”
– there are several slightly different sets (we will see one) of heuristics
• supplementary list of category-specific heuristics
– competitive analysis & user testing of existing products
How to Perform Evaluation
At least two passes for each evaluator
• first to get feel for flow and scope of system
• second to focus on specific elements
Note: This is not the only list teams use, but it’s the one on which we will focus.
Severity Rating
Used to allocate resources to fixing problems
Provides an estimate of the need for further usability efforts
Combination of
• frequency
• impact
• persistence (one time or repeating)
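Severity ratings in heuristic evaluation are commonly expressed on Nielsen's 0–4 scale, which the examples below use. A minimal sketch of the scale and the three factors; note that the `rate` combining rule here (rounding the mean of the factor scores) is a hypothetical illustration, not a formula Nielsen prescribes — in practice evaluators assign the overall rating by judgment:

```python
# Nielsen's commonly used 0-4 severity scale.
SEVERITY = {
    0: "not a usability problem",
    1: "cosmetic problem only",
    2: "minor usability problem",
    3: "major usability problem",
    4: "usability catastrophe",
}

def rate(frequency, impact, persistence):
    """Hypothetical combining rule: round the mean of the three
    factor scores (each judged on a 0-4 scale). Real evaluators
    weigh these factors informally rather than by formula."""
    return round((frequency + impact + persistence) / 3)

# Example: a problem that is common (3), has high impact (4),
# and recurs every time it is encountered (3) lands at severity 3.
```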
Examples of individual entries
• Can’t copy info from one window to another.
– violates “Minimize the users’ memory load”
– severity: (3) major
– fix: allow copying
• Typography uses a mix of upper/lower case formats and fonts.
– violates “Consistency and standards”
– slows users down
– probably wouldn’t be found by standard user testing
– severity: (1) cosmetic
– fix: pick a single format for the entire interface
• Green flashing lights mean system settings are being changed; red lights mean normal
functionality is taking place.
– violates “Consistency and standards”
– could confuse new users
– severity: (2) minor issue, cosmetic fix
– fix: reverse the color usage of the lights
Debriefing
Conduct with evaluators, observers, and design/development
team members
Results of Using HE
Discount: benefit-cost ratio of 48 [Nielsen94]
• cost was $10,500 for benefit of $500,000
• value of each problem ~$15K (Nielsen & Landauer)
• how might we calculate this value?
– in-house −> productivity
– open market −> sales
http://www.useit.com/papers/heuristic/heuristic_evaluation.html
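The benefit-cost ratio above can be checked directly from the figures on the slide:

```python
cost = 10_500       # cost of running the heuristic evaluation (USD)
benefit = 500_000   # estimated value of the problems it found (USD)

ratio = benefit / cost
print(round(ratio))  # prints 48, the benefit-cost ratio quoted above
```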
One question that came up was “why not more evaluators?” Would
it help to go up to 10? 20?
– The reality is that adding evaluators costs more (not just scaling for number
of people but also increased time for everyone during the aggregation stage).
– Having that many evaluators won’t identify many more problems in practice.
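The diminishing returns can be seen in the Nielsen & Landauer model, Found(i) = N(1 − (1 − λ)^i), where N is the total number of problems and λ is the probability that a single evaluator finds a given problem (λ ≈ 0.31 was typical in their data; the value used below is that published estimate, not a claim about any particular study):

```python
def fraction_found(i, lam=0.31):
    """Expected fraction of all usability problems found by i
    evaluators under the Nielsen & Landauer model."""
    return 1 - (1 - lam) ** i

for i in (1, 3, 5, 10, 20):
    print(i, round(fraction_found(i), 2))
# Five evaluators already find roughly 84% of the problems;
# going from 10 to 20 evaluators adds only about two percentage
# points while doubling the evaluation cost.
```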
Why Multiple Evaluators (cont)?
(graphs from a specific example study)