
Hall of Shame?

What do you think happened here?

Funimation web page with a large image and an interface to add items to a queue and remove items from a queue

Large red remove from queue button Large red add to queue button

Slide 1 of 38

Thanks to Jeremy Zhang for the find.

Today's goals

Slide 2 of 38

Introducing Heuristic Evaluation

Discount usability engineering methods

-- Jakob Nielsen

Involves a small team of evaluators who judge an interface against recognized usability principles

Heuristics--"rules of thumb"

Slide 3 of 38

"serving to discover or find out," 1821, irregular formation from Gk. heuretikos "inventive," related to heuriskein "to find" (cognate with O.Ir. fuar "I have found"). Heuristics "study of heuristic methods," first recorded 1959.

Introducing Heuristic Evaluation

First introduced in 1990 by Nielsen & Molich

Quick, inexpensive, popular technique

~5 experts find 70-80% of problems

Based on 10 heuristics

Does not require working interface

Slide 4 of 38
Diagram of the typical Heuristic Evaluation process: the designer provides a setting (a description of the interface and a list of tasks); ~5 evaluators try the tasks and record problems; the designer conducts synthesis and analysis; the designer writes a report.
Slide 5 of 38

So what are the heuristics?

Slide 6 of 38

So what are the heuristics?

  • H1: Visibility of system status
  • H2: Match between system and the real world
  • H3: User control and freedom
  • H4: Consistency and standards
  • H5: Error prevention
  • H6: Recognition rather than recall
  • H7: Flexibility and efficiency of use
  • H8: Aesthetic and minimalist design
  • H9: Error recovery
  • H10: Help and documentation
Slide 7 of 38

These should not be hugely surprising after everything we've talked about...

H1: Visibility of System Status

Keep users informed about what is going on

What does this interface tell you?

feedback being shown about how much time is left until a
database search returns

Slide 8 of 38

H1: Visibility of System Status

Keep users informed about what is going on

What does this interface tell you?

feedback being shown about how much time is left until a
database search returns

  • What input has been received--Does the interface above say what the search input was?
  • What processing it is currently doing--Does it say what it is currently doing?
  • What the results of processing are--Does it give the results of processing?

Feedback allows user to monitor progress towards solution of their task, allows the closure of tasks and reduces user anxiety (Lavery et al)
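To make the three questions concrete, here is a toy console sketch (the function and data are invented for illustration) that echoes the input it received, reports what it is currently processing, and reports its results:

```python
import time

def search(query, records):
    """Run a slow scan while keeping the user informed (H1)."""
    # What input has been received: echo it back.
    print(f"Searching for: {query!r}")
    hits = []
    for i, record in enumerate(records, start=1):
        # What processing it is currently doing: show progress.
        print(f"\rScanning record {i}/{len(records)}...", end="")
        if query in record:
            hits.append(record)
        time.sleep(0.01)  # simulate slow I/O
    # What the results of processing are.
    print(f"\nDone: {len(hits)} match(es) found.")
    return hits

search("cat", ["catalog", "dog", "concat"])
```

Even this minimal feedback answers all three questions; a progress indicator with a time estimate would go further.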

Slide 9 of 38

H2: Match between system and real world

Speak the users’ language

Follow real world conventions

two error messages returned by a bank machine-- one in terms the
user can understand relating to dollars available and the other
uninterpretable

  • Use concepts, language and real-world conventions that are familiar to the user.
  • Developers will need to understand the task from the point of view of users.
  • Cultural issues relevant for the design of systems that are expected to be used globally.

A good match minimizes the extra knowledge required to use the system, simplifying all task-action mappings (re-expression of users’ intuitions into system concepts)

Slide 10 of 38

H2: Match between system and real world

Example of a huge violation of this H2

Picture of Macintosh desktop with an old-style floppy disk dragged over the trash to eject it. Possibly the biggest usability problem in the Macintosh. Heuristic violation--people want to get their disk out of the machine, not discard it.
Slide 11 of 38

H2: Match between system and real world

Example of a mismatch depends on knowledge about users

Would an icon with a red flag for new mail be appropriate in all cultures?

Picture of a mailbox with red flag raised

Slide 12 of 38

H3: User Control and Freedom

“Exits” for mistaken choices, undo, redo

Don’t force down fixed paths

Dialog box with lots of exits for mistaken choices: undo, redo; don’t force down fixed paths

Users choose actions by mistake

Slide 13 of 38

H4: Consistency and Standards

Four dialogue boxes with different locations for ok; cancel and help (inconsistent)

Same words, situations, and actions should mean the same thing in similar situations; the same things should look the same and be located in the same place.

Different things should be different

Slide 14 of 38

H4: Consistency and Standards

  • Both H2 (Match between system and the real world) and H4 relate to the user’s prior knowledge. The difference is
    • H2 is knowledge of the world
    • H4 is knowledge of other parts of the application, and of other applications on the same platform.
  • Consistency within an application and within a platform. Developers need to know platform conventions.
  • Consistency with the old interface

Consistency minimizes the new knowledge required to use the system by letting users generalize from existing experience of the system to other systems

Slide 15 of 38

H4: Consistency and Standards

Evidence: Should include at least

  • two inconsistent elements in the same interface, or
  • an element that is inconsistent with a platform guideline

Explanation: What inconsistent element is and what it is inconsistent with

Four dialogue boxes with different locations for ok; cancel and help (inconsistent)

Slide 16 of 38

H5: Error Prevention

Careful design which prevents a problem from occurring in the first place

Picture of a calendar entry interface with text entry (error
prone) vs data selection (less error prone)

  • Help users select among legal actions (e.g., greying out inappropriate buttons) rather than letting them select and then telling them that they have made an error (gotcha!).
  • Subset of H1 (Visibility of system status) but so important it gets a separate heuristic.

Motivation: Errors are a main source of frustration, inefficiency and ineffectiveness during system usage (Lavery et al)

Explain in terms of tasks and system details, such as adjacency of function keys and menu options, or discriminability of icons and labels.
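The "help users select among legal actions" idea can be illustrated with a toy sketch (the names and data are invented): constrained selection makes illegal input unrepresentable, while free-text entry defers the error to a "gotcha" message after the fact:

```python
# Error prevention by construction: offer only legal choices instead of
# validating free text afterwards. (Toy example, not a real UI toolkit.)
LEGAL_MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

def pick(options, index):
    """Selection from a fixed list: an illegal month cannot be entered."""
    return options[index % len(options)]

def parse_free_text(text):
    """Free-text entry: every typo becomes a user-visible error (gotcha!)."""
    if text not in LEGAL_MONTHS:
        raise ValueError(f"{text!r} is not a month")
    return text
```

The selection widget corresponds to the "data selection" side of the calendar example above; the parser corresponds to the error-prone "text entry" side.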

Slide 17 of 38

H6: Recognition Rather than Recall

Make objects, actions and options visible or easily retrievable

Compuserve connect old school interface

  • Classic examples:
    • command line interfaces (rm *)
    • Arrows on keys that people can’t map to functions
  • Much easier for people to remember what to do if there are cues in the environment

Goes into working memory through perceptions

Slide 18 of 38

H7: Flexibility and Efficiency of Use

Accelerators for experts (e.g., gestures, keyboard shortcuts)

Allow users to tailor frequent actions (e.g., macros)

  • Typing single keys is typically faster than continually switching the hand between the keyboard and the mouse to point at things on the screen.
  • Skilled users develop plans of action, which they will want to execute frequently, so tailoring can capture these plans in the interface.
Slide 19 of 38

H8: Aesthetic and Minimalist design

Dialogs should not contain irrelevant or rarely needed information

  • Visual search--the eyes must search through more. Irrelevant information also interferes with Long Term Memory (LTM) retrieval of information that is relevant to the task.
  • Cluttered displays have the effect of increasing search times for commands or users missing features on the screen (Lavery et al)

Chartjunk (Tufte): "The interior decoration of graphics generates a lot of ink that does not tell the viewer anything new."

Slide 20 of 38

H9: Help users recognize, diagnose, and recover from errors

  • Error messages in language user will understand
  • Precisely indicate the problem
  • Constructively suggest a solution
Slide 21 of 38

H10: Help and Documentation

Easy to search

Focused on the user’s task

List concrete steps to carry out

Always available

Allow search by gist--people do not remember exact system terms

Slide 22 of 38

If user ever even knew system terms.

Diagram of the typical Heuristic Evaluation process: the designer provides a setting (a description of the interface and a list of tasks), and ~5 evaluators try the tasks and record problems.

Why 5 or more people?

4 or 5 are recommended by Nielsen (this is a point of contention; I aim for saturation).

  • A single person will not be able to find all usability problems
  • Different people find different usability problems
  • Successful evaluators may find both easy and hard problems
Slide 23 of 38

You can estimate how many you need (see NM book, pp 32-35).

4 or 5 are recommended by Nielsen
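That estimate is commonly done with the Nielsen & Landauer model, in which the proportion of problems found by n evaluators is 1 − (1 − λ)^n, with λ (the chance that a single evaluator finds a given problem) averaging roughly 0.31 across their case studies. A minimal sketch:

```python
def proportion_found(n_evaluators, lam=0.31):
    """Nielsen & Landauer model: share of usability problems found by
    n evaluators, where lam is the probability that a single evaluator
    finds any given problem (~0.31 averaged over their case studies)."""
    return 1 - (1 - lam) ** n_evaluators

for n in (1, 3, 5, 10):
    print(f"{n} evaluators -> {proportion_found(n):.0%} of problems")
```

With λ = 0.31 this gives about 84% at five evaluators, in line with the "~5 experts find 70-80% of problems" rule of thumb; the curve flattens quickly, which is why adding evaluators beyond five pays off less and less.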


How should the Designer "provide a setting"? How should the evaluator evaluate?

Slide 24 of 38

What does the evaluator do?

Designer: Briefing (HE method, Domain, Scenario)

Evaluator:

  • Two passes through interface (video in our case)
  • Inspect flow
  • Inspect each screen, one at a time against heuristics
  • Fill out a Usability Aspect Report (we'll keep this simple in peer review)
Slide 25 of 38

NOT a single-user empirical test. That is, do not say “I tried it and it didn’t work, therefore I’ll search for a heuristic this violates”


Usability Aspect Report

UAR rather than “Problem report” because you report good aspects as well as problems -- you want to preserve them in the next iteration of the system!

  • UAR Identifier (Type-Number): Problem or Good Aspect
  • Describe: Succinct description of the usability aspect
  • Heuristics: What heuristics are violated
  • Evidence: support material for the aspect
  • Explanation: your own interpretation
  • Severity: your reasoning about importance
  • Solution: if the aspect is a problem, include a possible solution and potential trade-offs
  • Relationships: to other usability aspects (if any)
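The UAR fields above can be modeled as a simple record; a sketch in Python (the field names and sample entry are illustrative, not a prescribed schema):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UAR:
    """One Usability Aspect Report entry; fields mirror the slide."""
    identifier: str        # e.g. "HE-1" (Type-Number)
    kind: str              # "Problem" or "Good Aspect"
    description: str       # succinct description of the usability aspect
    heuristics: List[str]  # heuristics violated, e.g. ["H4"]
    evidence: str          # supporting material for the aspect
    explanation: str       # the evaluator's own interpretation
    severity: int          # 0 (not a problem) .. 4 (catastrophe)
    solution: str = ""     # possible fix and trade-offs, if a problem
    relationships: List[str] = field(default_factory=list)  # related UARs

# Hypothetical example entry:
uar = UAR("HE-1", "Problem", "OK/Cancel placement varies across dialogs",
          ["H4"], "Four dialogs, three different button layouts",
          "Inconsistent with the platform guideline", 3)
```

Structuring reports this way makes the later grouping and averaging steps mechanical.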
Slide 26 of 38

We'll ask you to do the things in italics in peer grading


Writing a description

Should be A PROBLEM, not a solution

Don't be misleading (e.g., “User couldn’t find state in the list” when the state wasn’t in the list)

Don't be overly narrow (e.g., “PA not listed” when there is nothing special about PA and other states are not listed)

Don't be too broad, not distinctive (e.g., “User can’t find item”)

Slide 27 of 38

Picking a Heuristic

Ok to list more than one

This is subjective. Use your best judgement

Slide 28 of 38

Deciding on a severity

  • Make a claim about factors and support it with reasons
  • Consider Frequency (e.g., All users would probably experience this problem because…)
  • Consider Impact: Will it be easy or hard to overcome? NOT whether the task the user is doing is important (put the importance of the task in the explanation, and in the justification of weighting, if relevant).
  • Consider Persistence: Once the problem is known, is it a one-time problem or will the user be continually bothered? NOT “low persistence because the user abandons the goal” (that is impact--they can’t overcome it; if they can’t detect it and can’t overcome it, the problem persists).
Slide 29 of 38

Why? Make the claim (All users would PROBABLY experience this problem BECAUSE…) and IMMEDIATELY support it with reasons


Rating Severity

5-point scale

0 - Not a problem at all (or a good feature)

1 - Cosmetic problem only

2 - Minor usability problem (fix with low priority)

3 - Major usability problem (fix with high priority)

4 - Usability catastrophe (imperative to fix before release)

Slide 30 of 38
Diagram of the full Heuristic Evaluation process: the designer provides a setting (a description of the interface and a list of tasks); videos are distributed; 5 evaluators try the tasks and record problems (E1--Problem 1, E1--Problem 2, E2--Problem 3, …); then synthesis and analysis: group like problems, summarize problems, and write the report.

Group like problems

  • Important thing is whether they have similar description
  • This is for you to decide
  • Similarity may be conceptual (e.g. the same problem may show up in multiple parts of your interface)
Slide 31 of 38

Summarize problems

  • Average severities
  • List all relevant heuristics
  • List all areas of website affected
  • Also prioritize at this point
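The grouping and summarizing steps above can be sketched in a few lines; the report data and group keys below are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

# One tuple per evaluator-reported problem:
# (group key chosen while grouping, severity 0-4, heuristic, area of site)
reports = [
    ("state-not-listed", 3, "H5", "signup form"),
    ("state-not-listed", 4, "H5", "signup form"),
    ("ok-button-moves",  2, "H4", "dialogs"),
    ("state-not-listed", 3, "H1", "signup form"),
]

def summarize(reports):
    """Group like problems, average severities, list all relevant
    heuristics and affected areas, and prioritize by average severity."""
    groups = defaultdict(list)
    for key, severity, heuristic, area in reports:
        groups[key].append((severity, heuristic, area))
    summary = [
        {
            "problem": key,
            "avg_severity": mean(s for s, _, _ in items),
            "heuristics": sorted({h for _, h, _ in items}),
            "areas": sorted({a for _, _, a in items}),
        }
        for key, items in groups.items()
    ]
    summary.sort(key=lambda row: row["avg_severity"], reverse=True)
    return summary

summary = summarize(reports)
for row in summary:
    print(f"{row['problem']}: avg severity {row['avg_severity']:.1f}, "
          f"{row['heuristics']}, affects {row['areas']}")
```

The deciding step (which reports share a group key) remains a human judgment, as the slide says; the code only mechanizes what follows that decision.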
Slide 32 of 38

Write report

We've provided a template

Slide 33 of 38

Advantages of HE

“Discount usability engineering”

Intimidation low

Don’t need to identify tasks, activities

Can identify some fairly obvious fixes

Can expose problems user testing doesn’t expose

Provides a language for justifying usability recommendations

Slide 34 of 38

Disadvantages of HE

Un-validated

Unreliable

Should use usability experts

Problems unconnected with tasks

Heuristics may be hard to apply to new technology

Coordination costs

Slide 35 of 38

Summary

Heuristic Evaluation can be used to evaluate & improve user interfaces

10 heuristics

Heuristic Evaluation process

Individual: Flow & screens

Group: Consensus report, severity

Usability Aspect Reports

Structured way to record good & bad

Slide 36 of 38

Hall of Shame?

Cycling back: What would you say in your HE of this interface?

Slide 37 of 38

Hall of Shame?

Cycling back: What would you say in your HE of this interface?

Funimation web page with a large image and an interface to add items to a queue and remove items from a queue

Large red remove from queue button Large red add to queue button

Slide 38 of 38
