
BEHAVIOUR & INFORMATION TECHNOLOGY, JULY–AUGUST 2004, VOL. 23, NO. 4, 273–299

The cognitive task analysis methods for job and task design: review and reappraisal

JUNE WEI† and GAVRIEL SALVENDY‡§
†Department of Management/MIS, College of Business, University of West Florida, Pensacola, FL 32514, USA; e-mail: [email protected]
‡School of Industrial Engineering, Purdue University, West Lafayette, IN 47907, USA
§Department of Industrial Engineering, Tsinghua University, Beijing 100084, P.R. China

Abstract. This paper reviews and reappraises current research on cognitive task analysis methodology for job and task design and analysis. Specifically, it classifies the current cognitive task analysis methods for job or task design and analysis, sorts out commonalities and differences among these methods by conducting pros and cons comparisons, and provides guidelines for selecting cognitive task analysis methods for job and task design and analysis. Moreover, based on the literature review, a validated human-centered information-processing model for cognitive task performance was developed from human information processing theory. This new model focuses on identifying all cognitive aspects of human performance in technical work, with the goal of assisting job (re)design to increase human job performance.

1. Introduction

The purpose of this section is to provide an overview of cognitive task analysis (CTA) methods for job and task design by presenting the concepts of CTA, the history of job design, and the requirements of CTA for job design.

1.1. CTA

Although traditional task analysis techniques have made significant contributions to improving productivity when important task elements are visually observable, their focus on manual task procedures makes them less effective for cognitively oriented activities (Koubek et al. 1994). Recent advances in cognitive science provide new ways of characterizing learning and skill development that are appropriate for complex cognitive tasks. Methods derived from cognitive science have begun to be used to conduct CTA for training research programmes, curriculum redesign, and computer-based training development. CTA has also been used in the cognitive engineering of human–machine systems, intelligent tutoring system development, decision support system design, and knowledge elicitation and acquisition for expert systems (Ryder and Redding 1993). Lehto et al. (1992) used knowledge acquisition methods for task analysis and performance measurement in the cognitive domain for expert system development. Some examples of jobs with strong cognitive components are situation assessment and intelligence analysis, aviation and air traffic control, process control, sensor data interpretation, and equipment maintenance and troubleshooting (Ryder and Redding 1993). Schraagen et al. (2000) identified two areas of CTA application: one for individual training, performance assessment, and selection, and the other for the design of human–system interaction.

1.1.1. Concepts: A cognitive task is defined as a group of related mental activities directed toward a goal (which may not be clear) (Klein and Militello 1998). Cognitive task activities are unobservable. Many of the unobservable activities are mental and are often good candidates for a cognitive task analysis.

Klein and Militello (1998) defined cognitive task analysis (CTA) as the description of the cognitive skills needed to perform a task proficiently. 'CTA is primarily valuable for tasks that depend on cognitive aspects of expertise, such as decision making and problem solving (p. 6).' They noted that there existed a need for an efficient and validated method for identifying cognitive

Behaviour & Information Technology
ISSN 0144-929X print/ISSN 1362-3001 online © 2004 Taylor & Francis Ltd
http://www.tandf.co.uk/journals
DOI: 10.1080/01449290410001673036

task requirements which could be (a) incorporated into other methods to provide a complete picture of performance and (b) did not require extensive training to use or significant resources to implement.

Schraagen et al. (2000) defined cognitive task analysis (CTA) as 'the extension of traditional task analysis techniques to yield information about the knowledge, thought processes, and goal structures that underlie observable task performance'.

CTA provides the tools for understanding the cognitive elements of job performance. This understanding is necessary for designing jobs that support and maximize cognitive skill performance. CTA differs from traditional methods in a number of ways, as outlined in table 1. Table 1 is constructed based on Seamster et al. (1997) and Ryder and Redding (1993).

A traditional task analysis approach usually has an observable process and emphasizes behaviour, whereas CTA has an unobservable process and emphasizes cognition. A traditional task analysis approach emphasizes the target performance desired; in contrast, CTA addresses expertise. Expertise refers to the knowledge structure and information-processing strategies involved in task performance. A traditional task analysis method focuses on identifying the knowledge required for each individual task element, whereas CTA emphasizes the knowledge base for the whole job: its organization and the interrelations among concepts or knowledge elements. The emphasis on the knowledge base for the whole job provides useful information for structuring training to facilitate initial learning as well as progression to the knowledge organization used by experts. Skills are identified for each separate task within a traditional task analysis approach, and for the job as a whole within CTA. CTA includes determination of the mental models used in task performance, whereas traditional methods do not address mental models. Traditional approaches cannot characterize variability in performance within and between individuals. Because a single individual can perform a task in a variety of ways and use different sequences and methods, a traditional task analysis methodology based on a single sequence of behaviours will support only one way of performing the task. In contrast, CTA methods attempt to identify problem-solving strategies that may be manifest in variable sequences of actions depending on the environmental dynamics in each task, and also to identify important individual differences (Ryder and Redding 1993, Neerincx and Griffioen 1996, Seamster et al. 1997).

1.2. Job design history

Koubek et al. (1994) identified four historical phases of job analysis based on three chronological periods taken from Primoff and Fine (1988):

• Phase I (1865–1918), which formed the foundations for job and task analysis when Taylor and the Gilbreths began to produce techniques to identify, measure, and organize the elements of manual work tasks.
• Phase II (1918–1945), which shifted from analysing specific elemental task components of manual jobs to identification of the skills and abilities necessary for successful job performance. Questionnaires instead of visual observation of jobs were developed and used to identify underlying abilities, thus providing opportunities to extend the benefits of job analysis to jobs in which the important tasks were more cognitive in nature, such as jobs performed by a manager.
• Phase III (1945–1990), which focused on consensus-based techniques to analyse underlying components of work based on the activities of McCormick et al. (1972). McCormick and Jeanneret developed the Position Analysis Questionnaire (PAQ) to identify the underlying work components, and the skills and abilities needed to perform these work activities (McCormick 1977, 1979, McCormick et al. 1969a,b,c, 1972, 1989, McCormick et al. 1977, 1989, Jeanneret and McCormick 1969, Mecham and McCormick 1969a,b, Mecham et al.
Table 1. Traditional task analysis and cognitive task analysis comparisons (modified, based on Seamster, Redding and Kaempf (1997), Ryder and Redding (1993)).

Traditional task analysis | Cognitive task analysis (CTA)
Process observed | Process not observed
Behaviour emphasized | Cognition emphasized
Target performance analysed | Expertise analysed
Knowledge for each task evaluated separately | Interrelationship among knowledge elements for whole job evaluated
Segments tasks according to behaviours required | Segments tasks according to cognitive skills required
Mental models not addressed | Mental models addressed
Only one way to perform described | Individual differences accounted for

1977). The PAQ is one of the most widely used standard structured job analysis techniques commercially available in the United States (Jeanneret 1991, 1992a,b, Jeanneret and McPhail 1991, McPhail et al. 1992). The more recently developed Professional and Managerial Position Questionnaire (PMPQ) focused on jobs performed by managers (Mitchell and McCormick 1990). Rohmert and his colleagues (Rohmert 1988, Rohmert and Landau 1979, Kulik and Oldham 1988) also presented a technique for job analysis called the Ergonomic Job Analysis Technique (AET). In addition to the consensus-based techniques, Phase III also advocated a view of validity generalization. Validity generalization emphasizes the generalizability of ability requirements across a number of jobs; therefore, the abilities necessary for successful performance remain fairly consistent across jobs and detailed analyses are less important (Schmidt and Hunter 1977).
• Phase IV (1990–present) focused on cognitive task analysis (CTA) by combining techniques from different disciplines into CTA methodology, such as knowledge elicitation for expert systems (Visser and Morais 1991) and cognitive science (Anderson 1993). Koubek et al. (1994) integrated protocol analysis with McCormick's PAQ to develop a cognitive task analysis methodology, which identifies how operator abilities are used by mapping the decision-making process. Results from the developed cognitive task analysis show that a consensus-based analysis technique can be significantly improved for identifying non-physical task components. This focus complemented the consensus-based approach that addressed what abilities were required for successful task performance.

1.3. Requirements of CTA in job design

There are three major requirements for CTA.

First, the history of job and task analysis showed a trend from analysing specific work elements to identifying the underlying job factors with complex methodologies (Davis and Wacker 1987, Drauden 1988). However, the detailed levels available by measuring actual physical work activities in the traditional methods lack sensitivity to cognitive task components, and therefore produce somewhat unsatisfactory results for cognitively oriented jobs. The traditional consensus-based questionnaire and psychological rating scales can sample some general cognitive activities, but not at a complete or more detailed, cognitively oriented, task-specific level (Koubek et al. 1994). Therefore, the current research focuses on the development of a cognitively oriented consensus-based questionnaire and psychological rating scales to sample at a complete and more detailed, cognitively oriented, task-specific level.

Second, traditional methods for task analysis break down jobs into discrete tasks composed of specific action sequences and identify prerequisite knowledge and skills for each task. Although these methods have been effective for designing jobs for simple procedural skills, they offer little insight for the analysis of jobs involving complex cognitive skills. Because of this, cognitive considerations need to be incorporated into task analysis. Recently, cognitive methods have begun to be used to conduct task analysis for training program development (Love and O'Hara 1987, Polson and Lewis 1990, Wilson and Cole 1991) and human–computer system development (Eberts et al. 1990, Ryder and Redding 1993).

Third, technological development has significantly increased the complexity of job and task designs. The role of cognitively oriented tasks in the workplace continues to increase as automation of physical task components advances. The operators' tasks in highly automated systems contain more and more planning and decision components. The analysis and registration of these components include mental activities in traditional techniques; however, only task-oriented behaviour is analysed, at the level of skill-based and rule-based action patterns (Luczak 1997). The design of a properly functioning job requires a different kind of knowledge to describe the cognitive or mental functions at the knowledge-based information-processing level.

In computerized work, the growth of computer applications has radically changed the nature of job designs in two ways. First, through increased automation, the nature of the human's task has shifted from an emphasis on perceptual-motor skills to an emphasis on cognitive activities, such as problem solving and decision making. Second, through the increasing sophistication of computer applications, job design in computerized work is gradually emphasizing the interaction between two cognitive systems (Hollnagel and Woods 1983, Christal and Weissmuller 1988).

2. Methodology

This paper classifies the current cognitive task analysis methods for job or task design and analysis, and sorts out commonalities and differences among these methods. Guidelines and help are also provided in selecting cognitive task

analysis methods for a variety of task design and analysis problems.

2.1. CTA method classifications

CTA draws upon laboratory and field research for methods and techniques for analysing cognitive tasks that involve complex cognitive skills (Kirwan and Ainsworth 1992, Cooke 1994, Seamster et al. 1997, Schraagen et al. 2000). The classification of CTA methods described in this research is based on the mechanism of the methods themselves. Each CTA method is classified into one of four families based on its formality and analysis mechanism. Each method is compared on its characteristics, inputs, outputs, and processes. A complete comparison of the pros and cons of these methods is also provided. Table 2 illustrates these comparisons.

2.1.1. Family 1 – observations and interviews: In this family, direct methods of watching and talking with subjects are used. The informal methods in this family are well suited to the initial phases of CTA, in which the domain needs to be defined and circumscribed. They may also be useful in achieving subject rapport, because in many cases they seem natural. On the other hand, the results of the techniques in this family are often unwieldy and difficult to interpret, and the specific procedures involved in carrying out task analysis are not well defined. The representative methods are observation (Hoffman 1987, Drury 1990), unstructured interview (Hoffman 1987), and structured interview (Hoffman 1987).

2.1.2. Family 2 – process tracing: Process tracing refers to the tracking of a particular task process. Methods in this family are associated with specific tasks, and generally are used concurrently with task performance. Unlike the informal observations of Family 1, the data recorded in this family are of a pre-specified type (e.g., verbal reports) and are used to make inferences about the cognitive processes or knowledge underlying task performance. In verbal reports, subjects give a running commentary describing what is seen, what is done, and why it is done during task performance. Reports are recorded on audio or video tapes. The input of the verbal report method is the subjects' experience and the specific tasks to be analysed, and the output is the subjects' verbal data. Data analysis is based on procedural skills and decision points (Ericsson and Simon 1984). Because process tracing typically focuses on specific tasks, it is important to select tasks wisely so that they are representative of the actual task scenario. Process tracing methods are more formal than the methods in Family 1. In a sense, they take analysis one step further, in that they explore the cognitive structure and processes underlying task performance. These methods most commonly make use of verbal data, and hence are fraught with the limitations of verbal reports. In addition, these methods result in large and often unmanageable data sets that are often difficult to interpret in a meaningful fashion. On the other hand, the data collection techniques are generally carried out easily, although some practice may be necessary on the part of the job analysers. The representative methods in this family are cognitive walkthrough (Clarke 1987), verbal reports (Ericsson and Simon 1984), and protocol analysis (Sanderson et al. 1989).

2.1.3. Family 3 – conceptual techniques: Conceptual techniques produce representations of domain concepts and their structure or interrelations in order to analyse tasks. The methods in this family tend to be indirect, requiring less introspection and verbalization than interview and verbal report methods, and they handle multiple job analysers better than many of the other methods: the relatedness estimates used as input can be aggregated over multiple job analysers to generate a composite structural representation. On the other hand, these techniques may generate information that is unrelated to task performance. In addition, these methods centre on data reduction, revealing what are hoped to be the most meaningful features of the domain. Thus, they provide a more objective approach to the interpretation of large amounts of data. Their limitations include the tendency to focus on conceptual knowledge at the expense of heuristics, rules, and strategies. The representative methods in this family are conceptual graph analysis (Gordon et al. 1993), consistent component (Fisk and Eggemeier 1988), diagramming (Lesgold et al. 1988), error analysis (Norman 1984, Rasmussen et al. 1987, Reason 1987), psychological scaling – rating and ranking (Ryder and Zachary 1991), repertory grid (Mancuso and Shaw 1988, Boose 1990), sensori-motor process chart (Crossman 1956, Salvendy and Seymour 1973), and sorting (Geiwitz et al. 1990). Questionnaires can also be classified in this family.

2.1.4. Family 4 – formal models: Methods in this family use formal models and simulations to model tasks in cognitive domains. For cognitive modelling, the task analyst does not need a mock-up but can perform the modelling with only a sketchy description of the product. This can save time and money because, before incurring the expense of mock-ups, prototypes, or the actual products, the important aspects of usability
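The aggregation step that the Family 3 techniques rely on — combining relatedness estimates from multiple job analysers into a composite structural representation — can be sketched in a few lines. The code below is not from the paper; it is a minimal, hypothetical illustration (the concept names and the 1–5 rating scale are invented) of averaging pairwise relatedness ratings into the kind of symmetric proximity matrix that scaling techniques then reduce.

```python
from itertools import combinations

def composite_relatedness(ratings, concepts):
    """Average pairwise relatedness ratings over multiple analysers.

    ratings: list of dicts, one per analyser, mapping a (concept, concept)
             pair to a relatedness estimate (e.g. 1 = unrelated .. 5 = related).
    Returns a symmetric matrix (dict of dicts) of mean relatedness.
    """
    matrix = {a: {b: 0.0 for b in concepts} for a in concepts}
    for a, b in combinations(concepts, 2):
        values = [r[(a, b)] for r in ratings]
        mean = sum(values) / len(values)
        matrix[a][b] = matrix[b][a] = mean  # composite structural representation
    return matrix

# Hypothetical ratings of three task concepts by two analysers.
concepts = ["monitor", "diagnose", "repair"]
analyser1 = {("monitor", "diagnose"): 4, ("monitor", "repair"): 2,
             ("diagnose", "repair"): 5}
analyser2 = {("monitor", "diagnose"): 4, ("monitor", "repair"): 3,
             ("diagnose", "repair"): 5}
m = composite_relatedness([analyser1, analyser2], concepts)
print(m["monitor"]["repair"])   # 2.5
print(m["diagnose"]["repair"])  # 5.0
```

As table 2 notes for psychological scaling, collecting full pairwise ratings takes n(n-1)/2 judgements per analyser, which is why a ranking procedure carried out only n times requires less time when n is large.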
Table 2. Cognitive task analysis (CTA) methods (method, reference, characteristic, input, output, process, weakness, advantage).

Family 1 – Observations and interviews

Observation (Drury 1990, Hoffman 1987)
Characteristic: Focuses on implicit activities and verification; used effectively with other methods.
Input: Task implementation.
Output: Observation data.
Process: Record features, actions, and events in written, audio, or video form.
Weaknesses: Hard when it is impossible for the observer to accompany subjects; the presence of the observer influences the subject's behaviour; observation data are hard to interpret; the observer may not recognize important performance elements; detailed observation may be impossible in real-time process control tasks; may not be accurate.
Advantages: Cheap, easy, and requires little time; effective for gathering initial information about a domain or problem, identifying problem-solving strategies that are not consciously accessible, studying motor skills and automatic procedures, identifying the tasks involved in a domain and the constraints on the tasks, identifying the information required for a task, and verifying an expert's description of the task; minimal interference with the task and environment.

Unstructured interview (Hoffman 1987)
Characteristic: Content and order of the interview are not predetermined; direct or indirect; explicit or implied questions; retrospective.
Input: Experience; questions.
Output: Conversation data.
Process: Record the conversation in written, audio, or video form.
Weaknesses: Requires training and skill; produces copious, unwieldy data; subjective; data are hard to analyse; may not be accurate.
Advantages: Gives a broad view of the domain; fast.

Structured interview (Hoffman 1987)
Characteristic: Ranges from high structure (predetermined content and order) to semi-structured (predetermined content, varying order); direct or indirect; explicit or implied, open and closed questions; retrospective.
Input: Experience; questions.
Output: Conversation data.
Process: Record the conversation in written, audio, or video form.
Weaknesses: More preparation time and domain knowledge required of the analyser; subjective; data are hard to analyse; may not be accurate.
Advantages: More systematic and complete coverage of the domain; more comfortable structure; subjects are not required to describe procedures; fast.

Structured interview e.g. 1 – critical decision method (Klein et al. 1989)
Characteristic: Semistructured interview; elicits decision information in highly dynamic settings; focuses on critical decisions and judgements; output serves high-level cognitive task analysis such as decision making, judgments, and problem solving.
Input: Sources of incidents are experiences or planned scenarios.
Output: Extensive data.
Process: Use an incident as the framework to probe decisions, judgments, and problem solving; focus on experience and apply cognitive probes to elicit decision strategies, perceptual discriminations, pattern recognition, expectancies, cues, and errors.
Weaknesses: Considerable expertise required of the analyst; moderate time and cost (6 months of training and practice); the hurdle for exporting it into more general use is training less experienced analysts.
Advantages: Yields richer information in variety, specificity, and quantity than verbal reports.

Structured interview e.g. 2 – full PARI (precursor, action, result, interpretation) (Hall et al. 1994)
Characteristic: Structured interview of problem solving; skill analysis method to elicit a finer level of skill detail.
Input: Multiple experts; a problem set.
Output: Action, precursor, and interpretation data.
Process: Nine stages; multiple experts work on a problem set.
Weaknesses: Time consuming; not feasible in most operational environments.
Advantages: Recognizes characteristics of procedural skills in an operational setting.
Table 2. (continued)

Structured interview e.g. 3 – simplified PARI (Hall et al. 1994)
Characteristic: Structured interview of problem solving; skill analysis method to elicit a finer level of skill detail; studies subtasks based on a single problem.
Input: Pairs of experts (one poses, the other solves, a task-specific problem); a single problem.
Output: Action, precursor, and interpretation data.
Process: Three stages; pairs of experts work on a single problem; break down action, precursor, and interpretation data into cognitive procedures.
Weaknesses: Time required and cost are moderate.
Advantages: Very efficient; recognizes characteristics of procedural skills in an operational setting.

Structured interview e.g. 4 – task diagram interview (Klein and Militello 1998)
Characteristic: Used when you do not already have a roadmap.
Input: Questionnaire.
Output: Task diagram.
Process: Frame the task as a process; be patient to repeat and rephrase questions; keep it simple.
Weaknesses: Not suitable for complex tasks.
Advantages: Provides an overview of the task; identifies the cognitively complex aspects of the task; frames the rest of the cognitive task analysis.

Structured interview e.g. 5 – knowledge audit (Klein and Militello 1998)
Characteristic: Used when you need to quickly identify the key cognitive elements of the job.
Input: Six probes (past and future; big picture; noticing; job smarts; opportunities/improvising; expert/novice differences) and three optional probes (anomalies; equipment difficulties; scenario from hell).
Output: Contrasts of expert and novice performance.
Process: Ask for examples and specifics; write enough, but not everything, so that you will know later what was said; do not rush; check that your subject matter expert understands what you are asking.
Weaknesses: Subjective; may not be accurate.
Advantages: Elicits detailed, specific information, which is a source of interesting incidents.

Structured interview e.g. 6 – simulation interview (Klein and Militello 1998)
Characteristic: Used when you need to understand the process.
Input: Process; errors.
Output: Major events.
Process: Elicit the major events; do not let the subject matter expert's desire to critique overwhelm the interview; write enough, but not everything, so that you will know later what was said; check that your subject matter expert understands what you asked.
Weaknesses: Subjective; may not be accurate.
Advantages: Highlights cognitive elements of the task within the context of a specific incident; shows how experts think about and carry out tasks; identifies potential errors a novice would be likely to make.

Structured interview e.g. 7 – cognitive demand table (Klein and Militello 1998)
Characteristic: Cognitive demand table (difficult cognitive elements, why they are difficult, common errors, cues, and strategies used).
Output: Patterns.
Process: Synthesize data from multiple interviews; organize the data; see patterns in the data.
Weaknesses: Subjective; may not be accurate.
Advantages: Identifies the cognitively demanding elements of the task; makes the cognitive demands context-specific; applies the cognitive demands throughout the design cycle: early conceptual design, specifications, test and evaluation, and redesign.

Family 2 – Process tracing

Cognitive walkthrough (Clarke 1987)
Characteristic: Exploratory learning.
Input: Learning; task implementation.
Output: Success rates; error recovery; verbal data.
Process: Describe thoughts and actions while progressing through tasks; analyse through exploratory learning.
Weaknesses: Incomplete sampling of routine cases; needs task definitions.
Advantages: Low time/cost; effective for evaluating decision alternatives; can be accomplished by novices or experienced performers; obtains verbal protocols; easy; good for exploratory learning.
Table 2. (continued )
Method Reference Characteristic Input Output Process Weakness Advantage

Verbal reports Ericsson and Procedural skills and Experience Verbal data Subject reports a running High time consuming Easy to obtain data
Simon (1984) decision points Task commentary describing what Moderate cost Helpful in developing an initial
Concurrent, retrospective, is see, what is done, and why it Low accuracy understanding of a domain
or prospective is done, which reports are Incomplete
Self-report or shadow recorded on audio or video Concurrent is impossible if a
tapes task involves verbal
communication
Impossible for high cognitive
workload tasks
Subjects may perform a task
Downloaded by [Stockholm University Library] at 17:07 10 August 2015

in a way he can explain rather


than the normal way he
proceeds
Knowledge is compiled or
chunked so that unit that
making up the chunks is not
verbalizable
Qualitative and complex
nature of data
Subjective interpretation

Cognitive task analysis methods


Protocol Sanderson et al. Think aloud and verbalize Cognitive Encoded protocol Verbalize to infer cognitive Moderate time consuming Get process information
analysis (1989) to infer cognitive processes processing processing using coding May be unnatural to use Pinpoints misconceptions
Encode and analyse scheme May be hard to analyser Low cost
collected protocols
e.g. of Protocol Sanderson et al. Team tasks – Statements Coded data based Break down statement into Inaccurate because analysis Improved reliability by multiple
Analysis – (1989) Serve as editors for protocol on protocols – units, classify each unit into depends on verbal iterations
SHAPA analyse to protocol data to Pattern predetermined categories Extremely time consuming Reveal information on protocol such as
(Hemi-Semi- text Resulting patterns make Moderate cost flow of control, repetition of constructs,
Automated Coding of data takes inference on cognitive Hard to standardize among sequence analysis, and reliability
Protocol predicate-argument coding processes different analysts
Analyzer) scheme

Family 3 – Conceptual techniques


Conceptual Gordon et al. Focus on conceptual task Information from A formal Consist of nodes (a concept or Moderate time Low cost
graph analysis (1993) map documents, verbal representational statement categorized as state, Directly captures factual and rules
A representational medium protocols, interviewsyntax termed event, style, goal, or goal/ knowledge and provides a structured
for integrating and probes, and conceptual graph action) and labelled framework for making implicit
organizing information observations of taskstructure directional arcs (relationship) knowledge explicit
Comprise iterations of performance to present information about Provide a framework for knowledge
knowledge acquisition the types of knowledge and elicitation through several knowledge
around conceptual graph their relationships elicitation techniques performed in
structures Arcs capture different types of sequence to shape the form of the
relationships which cluster conceptual graphs from very general and
into three graph high level to very specific level
substructures: taxonomic, Require a moderate level of skill from
spatial, and causal analyst
substructures
Consistent component (Fisk and Eggemeier 1988)
Characteristic: intermediate or final form of analysis in the identification of automated skill components
Input: task; novices and experts
Output: consistent task elements forming the basis of automated skills
Process: break tasks down into main elements; identify high-level skills and decision points to locate the area of automated skill for each element; examine areas where novices have difficulty under higher workload and identify consistent information for the located areas
Weakness: moderate time
Advantage: low cost; identifies the consistent task elements of complex skills that form the basis of automated skills

Table 2. (continued )
Method Reference Characteristic Input Output Process Weakness Advantage

Diagramming (Lesgold et al. 1988)
Characteristic: focuses on knowledge representation
Input: task, structured and response-controlled by a standard format
Output: diagrams
Process: elicit the critical piece of information that ties all the pieces of information together
Weakness: complex for huge amounts of input information
Advantage: low time/cost; a simple solution for a complex problem; an intuitive method for eliciting critical information; leads to more thorough analysis
Error analysis (Rasmussen et al. 1987, Norman 1984, Reason 1987)
Characteristic: focuses on error types and sources
Input: errors
Output: errors mapped to cognitive processing failures
Process: systematic analysis of operational and performance errors to determine the relationship between error types and cognitive processing
Weakness: moderate time
Advantage: low cost; provides insights into decision making, particularly in critical situations, that would not be gained from studying routine, error-free performance; the information obtained is useful in determining thought processes, designing man-machine interfaces, and developing judgement or critical-incident training programmes
Psychological scaling – rating and ranking (Ryder and Zachary 1991)
Characteristic: focuses on relationships of task concepts; based on different criteria such as preference
Input: task concepts and context
Output: rating data; ranking data
Process: identify the relevant domain concepts and context; estimate degrees of relatedness or proximity between two concepts; the rating data lead to matrices of proximity whose rows and columns represent domain concepts; each concept is used as a reference point and the remaining concepts are ranked in order of similarity to this reference concept
Weakness: rating is tied to a specific context
Advantage: low time/cost; although the ranking procedure needs to be carried out n (number of concepts) times to derive relatedness, it requires less time than pairwise comparisons when n is large; best suited for homogeneous concept sets that can be ordered on one or more dimensions

J. Wei and G. Salvendy

Example of rating and ranking – paired comparison
Input: pairs of concepts
Output: rating data
Process: assign a rating of relatedness for each pair of concepts; compare the relatedness of each pair to a standard pair; obtain magnitude estimations of each pair
Weakness: time-consuming if the number of concepts exceeds 25; in some cases relatedness is judged along only a specific dimension
Advantage: the computer version has the benefit of facilitating random presentation of pairs and counterbalancing of items within the pairs
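The workload difference between the two elicitation procedures is simple arithmetic: full pairwise comparison grows quadratically with the number of concepts, while the reference-concept ranking procedure runs once per concept. A quick sketch (function names are my own):

```python
# Number of judgement passes required by the two procedures.
def pairwise_judgements(n):
    return n * (n - 1) // 2   # one rating per unordered pair of concepts

def ranking_passes(n):
    return n                  # each concept serves once as the reference

for n in (10, 25, 50):
    print(n, pairwise_judgements(n), ranking_passes(n))
# At 25 concepts the paired method already needs 300 ratings, which is
# the point at which the row above calls it time-consuming.
```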
Repertory grid (Boose 1990, Mancuso and Shaw 1988)
Characteristic: focuses on concepts and structural knowledge; a specific version of the more general rating procedures
Input: concepts
Output: grid; overall relatedness
Process: elicit ratings on dichotomous constructs to build the model; as the ratings along each construct for each element are elicited, a grid is constructed, plotting constructs against elements; derive overall relatedness from the grid by computing the summed difference (or correlation) between ratings for either the constructs or the elements
Weakness: moderate time; complex; judgements may seem awkward or forced when constructs do not apply equally to all elements; hard to compare when elements and constructs vary in level of abstraction
Advantage: low cost; a popular analytic technique for identifying the concepts underlying subjects' performance and the relationships among those concepts; constructs provide an avenue for distinguishing among domain elements
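The summed-difference computation described above is small enough to sketch directly. The grid contents here are invented for illustration; only the derivation of relatedness from construct ratings follows the procedure in the row.

```python
# Repertory-grid sketch: elements rated along dichotomous constructs;
# overall relatedness of two elements = summed absolute rating difference
# across constructs (smaller = more related).
grid = {  # construct -> rating per element (illustrative data, 1-5 scale)
    "manual/automated":  {"task A": 1, "task B": 2, "task C": 5},
    "routine/novel":     {"task A": 2, "task B": 1, "task C": 4},
    "low/high workload": {"task A": 1, "task B": 2, "task C": 5},
}

def distance(grid, e1, e2):
    return sum(abs(ratings[e1] - ratings[e2]) for ratings in grid.values())

print(distance(grid, "task A", "task B"))  # 3  -> closely related
print(distance(grid, "task A", "task C"))  # 10 -> weakly related
```

The same grid supports the converse analysis — summing differences down columns instead of across rows compares constructs rather than elements, as the row above notes.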
Sensori–motor process chart (Crossman 1956, Salvendy and Seymour 1973)
Characteristic: focuses on mental activities
Input: senses (e.g. visual); task elements
Output: skill analysis chart; observed features of performance
Process: present the five predominantly mental activities or steps in most psychomotor tasks: plan, initiate, control, end, and check
Weakness: relatively few people are experienced in perceptual analysis
Advantage: describes the operator's global task and the elemental motions that compose the overall task; a valuable method of charting the analysis of fine work where the perceptual load is high


Sorting (Geiwitz et al. 1990)
Characteristic: focuses on high-level conceptual structures; derives models of an expert's conceptual framework for a task/job
Input: concepts
Output: sorted piles
Process: sort concepts into piles based on relatedness (one concept can be put in different piles); revisit cards already in a pile without time constraints; label each pile
Weakness: the data generated are not very sensitive to variations; decreased sensitivity to proximity differences
Advantage: low time/cost; quick and easy way to elicit conceptual information; effective for developing coding schemes that can be used reliably to analyse unstructured data such as interview protocols
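A common way to turn sorted piles into usable proximity data is to count, across sorters, how often two concepts land in the same pile. The sorters and concepts below are invented; the co-occurrence computation is a standard reduction, not a procedure specified by Geiwitz et al.

```python
# Card-sort sketch: each analyst's sort is a list of piles (sets of concepts);
# proximity of two concepts = number of sorters who grouped them together.
from itertools import combinations
from collections import Counter

sorts = [  # one list of piles per sorter (illustrative data)
    [{"radar", "display"}, {"checklist", "manual"}],
    [{"radar", "display", "manual"}, {"checklist"}],
]

proximity = Counter()
for piles in sorts:
    for pile in piles:
        for a, b in combinations(sorted(pile), 2):
            proximity[(a, b)] += 1

print(proximity[("display", "radar")])     # 2: grouped by both sorters
print(proximity[("checklist", "manual")])  # 1: grouped by one sorter
```

The resulting proximity counts are exactly the kind of input the scaling methods of Family 4 (e.g. multi-dimensional scaling, below) expect.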

Family 4 – Formal models


Multi-dimensional scaling (Klein et al. 1989)
Characteristic: focuses on knowledge representations; a structural modelling technique using descriptive multivariate statistical techniques
Input: a set of concepts
Output: a set of coordinates corresponding to the location of each item in multidimensional space
Process: provide pairwise proximity estimates for a set of concepts and generate multidimensional spatial layouts of those concepts; the number of dimensions is decided using several statistically based heuristics
Weakness: complex statistical methods; moderate time and cost; the dimensions portrayed in the physical layout of the space do not always correspond to the best-fitting dimensions, so the dimensions may need to be rotated to interpret them
Advantage: successful in a large variety of settings and for a variety of purposes; handles missing data given a sufficient number of subjects, with relatively low error; provides a faithful representation of original data that are unreliable at the unrelated end of the scale; represents perceptual information better than conceptual information
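The core computation can be sketched with classical (Torgerson-style) scaling for a single dimension: double-centre the squared distances and extract the dominant eigenvector. This is a deliberately simplified sketch — real MDS software chooses the number of dimensions with the statistical heuristics mentioned above — and the power-iteration implementation here is mine, not Klein et al.'s.

```python
# Classical MDS sketch, one dimension: B = -1/2 * J D^2 J (double centring),
# dominant eigenvector by power iteration, coordinates scaled by sqrt(eigenvalue).
import math

def mds_1d(d, iters=200):
    n = len(d)
    d2 = [[d[i][j] ** 2 for j in range(n)] for i in range(n)]
    row = [sum(r) / n for r in d2]
    tot = sum(row) / n
    b = [[-0.5 * (d2[i][j] - row[i] - row[j] + tot) for j in range(n)]
         for i in range(n)]
    v = [1.0] + [0.1 * i for i in range(1, n)]   # power-iteration start
    for _ in range(iters):
        w = [sum(b[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    lam = sum(v[i] * sum(b[i][j] * v[j] for j in range(n)) for i in range(n))
    return [x * math.sqrt(lam) for x in v]

# Three concepts lying on a line at positions 0, 1, 3: the pairwise
# distances are recovered exactly from the proximity matrix alone.
d = [[0, 1, 3], [1, 0, 2], [3, 2, 0]]
coords = mds_1d(d)
print(round(abs(coords[0] - coords[1]), 3), round(abs(coords[1] - coords[2]), 3))
# 1.0 2.0
```

The sign of the recovered axis is arbitrary — which is a small-scale instance of the rotation/interpretation weakness noted in the row above.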

ACT model (Anderson 1993)
Characteristic: focuses on problem-solving skills acquisition; the central concept is the production rule, which plays a role similar to the stimulus-response bond in learning theories; distinguishes between declarative and procedural knowledge and uses encoding strength to determine the accessibility of declarative knowledge and the performance of procedural knowledge
Input: problem solving; learning theory
Output: strength of encoding; accessibility of declarative knowledge; performance of procedural knowledge
Process: the interpretative stage requires recalling specific problem-solving examples and interpreting them, using declarative memory retrieval without necessarily involving long-term procedural memory; the procedural stage is knowledge compilation, in which procedural knowledge is encoded in terms of production rules of condition-action pairs
Weakness: only applies to the problem-solving domain; assumes a means-ends problem-solving structure
Advantage: provides a basis for constructing intelligent computer-based tutoring systems; operators are encoded as rules in an abstract form that can apply across a range of situations; knowledge in production form applies much more rapidly and reliably; deals with variability in problem-solving behaviour; brings together ideas from problem-solving theory and learning theory; rapid and important progress in understanding how complex problem-solving skills are learned
ARK (ACT-based representation of knowledge) (Geiwitz et al. 1988)
Characteristic: inspired by ACT; elicits both a network of static knowledge about the domain and a set of procedures performed on that knowledge
Input: knowledge
Output: a network of objects and their relations
Process: similar to goal decomposition, breaking goals into subgoals or actions; generates production rules to represent goal-subgoal and goal-action relations; constructs a network of objects and their relations
Weakness: very time consuming; requires a rich data set that needs extensive reduction and interpretation
Advantage: well suited to planning tasks
Human processor model (Card et al. 1983, 1986)
Characteristic: based on the theory of the human information processor, with information flowing sequentially from one stage to the next
Input: three cycle-time parameters for the three sequential stages; 12 memory parameters, including code, decay time, and capacity for both visual and audio components
Output: total task time
Process: three discrete stages through which information flows in order to be used and processed, following cognitive psychology; quantitatively predicts parameters that can break complex tasks down into relevant timing attributes
Weakness: individual components of the task are at the very atomistic level of cognitive processes, not at the higher level of keystrokes or goals and intentions
Advantage: very accurate at making estimates, especially for simple tasks
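The three-stage structure can be sketched as a sum of cycle times. The values below are the nominal "middle-man" cycle times published by Card et al. (1983) — perceptual ~100 ms, cognitive ~70 ms, motor ~70 ms — but the function and the two-cycle choice example are my own illustration, not a calibrated model.

```python
# Model Human Processor sketch: information passes through perceptual,
# cognitive, and motor processors in sequence; predicted time is the sum
# of processor cycles.  Cycle times: nominal values from Card et al. (1983).
CYCLE_MS = {"perceptual": 100, "cognitive": 70, "motor": 70}

def reaction_time_ms(perceptual=1, cognitive=1, motor=1):
    """Estimate for a task needing the given number of cycles per stage."""
    return (perceptual * CYCLE_MS["perceptual"]
            + cognitive * CYCLE_MS["cognitive"]
            + motor * CYCLE_MS["motor"])

print(reaction_time_ms())             # 240 ms: simple reaction
print(reaction_time_ms(cognitive=2))  # 310 ms: an extra cognitive cycle
```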


GOMS (goals, operators, methods, and selection rules) (Card et al. 1983, Kieras 1988, Carroll and Olson 1988, Elkerton and Palmiter 1991, Gray et al. 1993, Eberts 1997)
Characteristic: assumes that interacting with machines is similar to solving a problem; task modelling and decomposition by considering performance limits; a procedural model based on the theory of the human information processor (human processor model)
Input: goal; task context; action; information; knowledge and process ability limits
Output: behaviour; task time and learning time; error
Process: four basic components (goals, operators, methods, and selection rules); problems are broken down into subproblems, as well as goals
Weakness: inaccurate when the method of operation is not known or when working with novices; difficult and extremely complex; requires significant amounts of time to construct; limited to a specific set of tasks and a specific functionality; moderate cost; vague in applications; only applicable to error-free performance; only good for experts
Advantage: useful in cases where the sequence of events is known; useful for investigating the impact of design alternatives and for understanding critical pathways for task performance; very important in trying to understand how a user interacts cognitively with interactive software and in quantifying aspects of the interaction even before the software is prototyped
GOMS variation 1 – Basic GOMS
Characteristic: outlines the cognitive behaviour of a user by breaking the problem down into subgoal and goal stacks
Input: six operators
Output: predicted time values for observable operations
Process: predict by quantifying observable behaviour
Weakness: removes large chunks of time by not analysing unobservable behaviour, such as the time to acquire goals; depends greatly on the skill of the task analyst (different skills result in different variants of the model); time consuming; assumes error-free performance
Advantage: best for making qualitative predictions about differences between tasks

GOMS variation 2 – keystroke model
Characteristic: enables analysts to construct a linear model of task actions and assign realistic estimates of task duration; breaks tasks down at the keystroke level; identifies observable behaviour; an engineering model, since it represents more of a practical design tool
Output: predicted average times for unit tasks
Process: take a verbal protocol of users; find the times for the unit tasks (e.g. collect keystroke data); determine how many times the unit tasks would occur; determine and predict average times for each unit task through experimentation
Weakness: time consuming; hard to determine unit tasks (subjective)
Advantage: useful for making specific time predictions based on simplifying assumptions; easier for making quantitative predictions; accurate in predicting qualitative relationships among different systems; a good model if interface designs can be specified in terms of keystrokes
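Keystroke-level prediction reduces to summing standard operator times over the action sequence. The durations below are the commonly cited estimates from Card et al. (1983) — K (keystroke) ≈ 0.2 s for a skilled typist, P (point) ≈ 1.1 s, H (home hands) ≈ 0.4 s, M (mental preparation) ≈ 1.35 s — while the function and example sequence are my own sketch.

```python
# Keystroke-Level Model sketch: predicted task time = sum of operator times.
# Operator durations: commonly cited Card et al. (1983) estimates.
OP_SECONDS = {"K": 0.2, "P": 1.1, "H": 0.4, "M": 1.35}

def klm_time(ops):
    return round(sum(OP_SECONDS[op] for op in ops), 2)

# e.g. home hand to mouse, point at a field, home back, think, type "42":
print(klm_time(["H", "P", "H", "M", "K", "K"]))  # 3.65 seconds
```

Placement of the M operators is governed by heuristic rules in the full method; the sketch simply takes the sequence as given.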
GOMS variation 3 – unit task model
Characteristic: breaks tasks down at the unit task level
Input: unit tasks; assumptions for unit tasks
Advantage: easier to make quantitative predictions of time; efficient task analysis for more complicated tasks; appropriate to specify at the unit task level, since specifying keystroke-level details may be inappropriate or not needed
GOMS other variations (Arend 1991)
Characteristic: uses basic building blocks to develop a model of operator performance
Input: goal; task context; operator; information input; knowledge and process ability limits
Advantage: these models can be used to predict operator performance, validate empirical findings, explore theoretical explanations for observed performance, and investigate the impact of various design alternatives

Example 1 of GOMS other variations – NGOMSL (Natural GOMS Language) model (Kieras 1988)
Characteristic: its basic structure is methods; goes beyond GOMS by combining several GOMS models into one integrated model; incorporates the human model processor through the use of analyst-defined operators and specification of the cognitive cycle time as equal to the statement cycle time
Process: describe the interaction of user and computer in a computer-programming-like language; activities of the user are described as subroutines with decision statements (IF-THEN), flow-of-control statements (GOTO), and memory storage and retrieval
Weakness: more cycle times than the goal stack model
Advantage: has a fairly simple, concise structure providing a good overall structure; easily understood statements in a very concise form, useful in the initial task analysis stage; enables task analysis using a GOMS-like model to be more specific; forces the task analysis to be very precise; describes user-computer interaction in a specification language; advantageous for users with experience writing computer programs; a clear mechanism; used by both experts and novices; easy to estimate execution time, learning time, mental workload, and design revisions, and to analyse documents
Example 2 of GOMS other variations – CPM-GOMS (John 1990, John et al. 1994)
Characteristic: integrates the critical path method and the GOMS approach, and analyses concurrent interface activities
Output: execution time
Process: the analyst categorizes operators according to the human resource on which they rely, then uses these categories to build a schedule chart and plot a critical path linking the sequence of operators that represents the greatest total time, determining the overall time for the task
Advantage: extends the method to a wider range of tasks, including those that involve auditory and visual input; good for more detailed analysis in the later task analysis stage; good theoretical basis; easily used in a computerized simulation
Production systems (Newell and Simon 1972, Anderson 1976, Kieras and Polson 1985, Bovair et al. 1990)
Characteristic: contain declarative knowledge and procedural knowledge; for the most part this model is highly equivalent to NGOMSL, so the two can be used interchangeably
Output: execution time; learning time
Process: a design equivalent to NGOMSL, or various combinations of the GOMS model
Weakness: has much more overhead; very hard to develop a model
Advantage: can run on a computer; more appropriate for some estimation than the NGOMSL model, especially expert-novice differences, which are stated more clearly with this model than with NGOMSL
Grammars (Eberts 1997)
Characteristic: describe human-computer interaction tasks in a formal language
Weakness: fine-level cognitive activities cannot easily be modelled; make no predictions about execution time or mental workload
Advantage: an easy method for modelling human-computer interaction languages; most appropriate for complex systems
Example 1 of grammars – cognitive grammar method (Reisner 1981)
Characteristic: borrows many concepts from linguistics by incorporating BNF (Backus-Naur Form); emphasizes the relationships between the syntax and the actions needed to perform the commands
Output: determined and predicted execution times
Process: five components: terminal symbols are the actions that a user has to learn and remember; non-terminal symbols represent sets of similar actions that can be grouped together; the starting symbol is a high-level task to be performed by the user; the metasymbols are the common meanings for 'and', 'or', and 'is composed of'; the rules are constructed for the interaction grammar used
Weakness: only accounts for the executable parts of the task, not for the cognitive processing of the task
Advantage: describes complex languages and operating systems in a relatively small number of statements when applied to human-computer interaction tasks; used to determine the consistency of the design, the simplicity of the interaction, and learnability
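The five components above can be sketched as a small rewrite-rule table: the start symbol is the high-level task, non-terminals group similar actions, and terminals are the actions the user must learn. The grammar contents are invented, and using rule count as a crude complexity/consistency measure is in the spirit of Reisner's analysis rather than her exact metric.

```python
# Sketch of a Reisner-style interaction grammar as rewrite rules.
# Keys are non-terminals (the first is the start symbol); symbols absent
# from the table are terminal user actions.
grammar = {
    "draw line": [["select line tool", "position"]],  # start symbol
    "position":  [["click start", "click end"]],
}

def expand(symbol, g):
    """Leftmost derivation of a symbol down to terminal actions."""
    if symbol not in g:
        return [symbol]                       # terminal action
    first_alternative = g[symbol][0]
    return [t for part in first_alternative for t in expand(part, g)]

print(expand("draw line", grammar))
# ['select line tool', 'click start', 'click end']
print(sum(len(alts) for alts in grammar.values()))  # 2 rules in this design
```

A design whose similar tasks reuse the same non-terminals needs fewer rules, which is the sense in which a small grammar signals a consistent, learnable interface.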


Example 2 of grammars – TAG (task-action grammar) (Payne and Green 1986, 1989)
Characteristic: emphasizes the family resemblance among language elements; designed to make predictions about the relative complexity of designs rather than to provide quantitative measures and predictions of performance — in other words, to generate experimental hypotheses based on the model and test those hypotheses experimentally
Process: identify 'simple tasks' that the user can perform without problem solving and that contain no control structure; describe simple tasks in a dictionary by sets of semantic components reflecting categorizations of the task world; rewrite rules map simple tasks onto action specifications
Weakness: alternative representations make slightly different predictions that are also logical
Advantage: especially useful for evaluating the consistency of an interface design and for offering design revisions based on consistency; determines well-defined categories of tasks with most structural consistency; investigates consistency in more detail; concentrates on the overall structure of the language rather than individual rules
Executive-process interactive control (EPIC) (Meyer and Kieras 1997)
Characteristic: a new theoretical framework
Weakness: complex; time consuming; high cost; no practical usage except for engineering design
Advantage: good for characterizing human performance of concurrent perceptual-motor and cognitive tasks; general enough
Object-based model (Beringer and Wandmacher 1991)
Characteristic: takes advantage of object-oriented technology; integrates procedural models such as GOMS-type models with semantic descriptive-level models; serves as an integrating formalism for representing both the declarative (modelling concepts) and the procedural (modelling procedures) aspects of human-computer interaction and cognitive task analysis
Input: declarative and procedural aspects of a task
Output: well-defined object classes; well-defined tasks
Process: define object classes; define methods and attributes within each object class; present a well-defined set of simple tasks, where each task is defined by a state-oriented definition
Weakness: time required and cost are moderate to high
Advantage: the data-abstraction attribute of object-oriented modelling provides explicit links between the semantic and procedural levels; bridges the gap between the high-level semantic description of a task space and the procedural description of possible activities; whereas a semantic goal structure depends on the given task and a GOMS description is always limited to a specific set of tasks and a specific functionality, this model can derive semantic goal structures for any potentially meaningful task given a specific functionality in a very complex system, and the functionality can easily be varied by adding/removing classes/methods
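The object-based idea can be sketched directly in an object-oriented language: a class's attributes carry the semantic (declarative) level, its methods name the simple tasks at the procedural level, and adding or removing classes and methods varies the functionality. The class and task names below are illustrative, not Beringer and Wandmacher's notation.

```python
# Object-based sketch: attributes = semantic level, public methods =
# the simple tasks a user can perform on objects of the class.
class Document:
    def __init__(self, name):
        self.name = name       # semantic attribute
        self.is_open = False   # state-oriented task definition

    def open(self):            # simple task: open the document
        self.is_open = True

    def close(self):           # simple task: close the document
        self.is_open = False

def simple_tasks(cls):
    """Derive the set of user tasks from a class's public methods."""
    return sorted(name for name, member in vars(cls).items()
                  if callable(member) and not name.startswith("_"))

print(simple_tasks(Document))  # ['close', 'open']
```

Deriving the task set mechanically from the class definition is the explicit semantic-procedural link the row above credits to data abstraction.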
Cognitive environment simulation (CES) (Roth et al. 1992)
Characteristic: built on an artificial intelligence problem-solving system representing models of human cognitive activities; performs diagnostic reasoning for fault management under dynamic conditions
Input: a domain scenario
Output: generated model behaviour
Process: compare model behaviour with observed human behaviour for the same scenario; highlight the emphasis on capturing the demands imposed by the problem-solving environment
Weakness: complex; time consuming; high cost
Advantage: building a runnable computer program forces the modeller to describe mechanisms in great detail; uncovers a variety of consequences of the basic information-processing mechanisms instantiated in the program; running the simulation through a scenario produces specific behaviour that can be analysed and compared to other data; the resulting simulation can be run on a variety of scenarios, including scenarios that were not part of the original design set, therefore capturing human cognitive activities in a wide range of domain-specific circumstances; captures the cognitive demands imposed by dynamic fault-management situations

can be decided. They produce good quantitative prediction. The procedural models are located here. People learn to use products by generating rules for use and then 'run' their models, by interacting with the product, and by sequencing through the set of rules (Fischer 1991, Eberts 1997). There are several limitations. Building models is expensive and time consuming. Most models are theoretical and based on assumptions. If the environment or scenario changes, the model needs to be modified. Moreover, alternative models can produce different results that are also logical (Pashler 1998). The representative methods in this family are multi-dimensional scaling (Klein et al. 1989), the ACT model (Anderson 1993), the ARK model (Geiwitz et al. 1988), the human processor model (Card et al. 1983, 1986), the GOMS model (Card et al. 1983, Kieras 1988, Carroll and Olson 1988, Elkerton and Palmiter 1991, Gray et al. 1993, Eberts 1997), production systems (Newell and Simon 1972, Anderson 1976, Kieras and Polson 1985, Bovair et al. 1990), grammars (Eberts 1997), executive-process interactive control (Meyer and Kieras 1997), and the object-based model (Beringer and Wandmacher 1991).

The four families of CTA are distinguished by the degree to which the methods and analyses are specified. Observations and interviews are relatively informal, with much of the specification of methods and analyses left to intuition. Process tracing methods are somewhat better specified, with some procedures, such as protocol encoding. Conceptual techniques are fairly formal and well specified, with fewer decisions. Formal models are very well defined. The tradeoffs between these four families are described next.

(1) Training requirements: Because of the more active analysis role associated with informal CTA methods, analysers' training requirements for these methods tend to be focused on aspects of interview skills and domain knowledge. On the other hand, the more formal methods tend to require training in the procedural and analytic details of the method. In general, these formal methods are less flexible and require more methodological training, which may explain why they are used less frequently than informal methods. The more structured techniques typically require advance preparation and significant knowledge by the job analysers.

(2) Introspection and verbalization: The less formal direct methods require more introspection and verbalization from the subject, compared to the more formal indirect methods. The indirect nature of the formal methods is typically associated with more time required to prepare the materials.

(3) Interfering with task performance: Because the more formal methods tend to be conducted apart from actual task performance, there is less chance of interfering with performance, but they tend to be artificial or to lack face validity, in that they are unlike any task that is actually performed.

(4) Output data: The data obtained using the formal methods tend to be more quantitative, compared to the more qualitative data obtained using the informal methods. Because quantitative data are easier to interpret than qualitative data (Meyer and Kieras 1997), informal methods tend to be more time consuming than formal methods. Also, because it is easier to summarize quantitative data, the analysis of group data is more straightforward for formal methods. However, the informal methods often generate richer output than the formal methods.

In order to ensure that CTA is thorough and accurate, two or more methods may be combined. Different methods may result in very different models, all of which are good models with respect to very different aspects of the domain knowledge. Therefore, the best way to minimize potential measurement errors and to maximize the scope of domain coverage is to combine multiple methods (Klein and Militello 1998).

There are two types of methods combination. The first is to combine one or more traditional task analysis methods with one or more cognitive task analysis methods. The other is to integrate two or more cognitive task analysis methods from the same or different CTA families classified above. There are some examples in which methods have been combined successfully to accomplish specific goals.

For the first type of methods combination, most researchers considered that cognitive methods supplement rather than replace traditional methods (Ryder and Redding 1993). The advantages of sensitivity and objectivity of the traditional approaches should not necessarily be abandoned when faced with cognitively oriented tasks. However, their reliance on visually observable task components must be overcome. Koubek et al. (1994) made a first attempt at integrating a traditional task analysis method with a CTA. In their research, protocol analysis was combined with a traditional task analysis method, McCormick's PAQ, to determine ability requirements for personnel selection on a computer-based task. The results supported the hypothesis that a consensus-based selection test can be improved beyond the results obtained from the PAQ by including data derived from task analysis techniques adapted for cognitively oriented tasks.
Protocol analysis refers to having persons think aloud while performing or describing a task, and then using the verbalization to infer subjects' cognitive processing (Ryder and Redding 1993). Johnson and Johnson (1987) combined traditional task analysis methods with verbal protocol for troubleshooters. They analysed the performance of troubleshooters using traditional methods to identify tasks done during troubleshooting, and then collected verbal protocols as technicians went about the job and, therefore, improved the performance of troubleshooters. Ryder and Redding (1993) reviewed the recent developments in CTA methods, and developed the integrated task analysis model (ITAM), a framework for integrating cognitive and behavioural task analysis methods within the Instructional Systems Development (ISD) model. They presented ITAM's three analysis stages (progressive cycles of data collection, analysis, and decision making) in three components (skills, knowledge, and mental models). This integrated approach could 'support development of training programs that build a flexible knowledge base, automated skill components for high performance tasks, and efficient mental models for task understanding and decision making. Trainees would be provided with better tools for mastering the complex tasks that are increasingly required of workers today' (Ryder and Redding 1993: 93). Research on real-time, high-performance jobs, such as supervisory control and air traffic control, has shown that both traditional behavioural and cognitive analysis methods are required to understand performance (Schlager et al. 1990).

For the second type of methods combination, Potter et al. (1998) described a framework for integrating different types of specific CTA techniques into software system development. Thordsen and Hutton (1998) presented a method of combining a cognitive function model to identify the components of new systems that have varying degrees of cognitive complexity. The combination better defined the role of the human in complex system design for the engineers and designers, therefore assisting the understanding of the roles of humans in complex system designs.

2.2. CTA method selections

Table 3 summarizes the comparisons of these four CTA method families presented in the current research. Based on Table 3, some guidelines have been developed to help select CTA methods in practical applications.

. Guideline 1. When tasks or jobs do not have a defined domain, especially in the initial phase of CTA where we need to define and circumscribe the domain of tasks or jobs, select CTA methods in Family 1.
. Guideline 2. When the specific procedures involved in carrying out a task are not well defined, select CTA methods in Family 1.
. Guideline 3. When we can easily define a task that is representative of the actual task scenario, and this task has a clear process, select CTA methods in Family 2.
. Guideline 4. When a particular process of a task and its concurrent task performance need to be tracked, select CTA methods in Family 2.
. Guideline 5. When data are easily captured by verbal means and data collection does not affect the task/job incumbent's performance (e.g. through distraction), select CTA methods in Family 2.
. Guideline 6. When domain knowledge, structures, and interrelations of tasks need to be defined and known, select CTA methods in Family 3.
. Guideline 7. When multiple job or task analysers are analysing a task, and the task analysis requires less introspection and verbalization, select CTA methods in Family 3.
. Guideline 8. When a task needs quantitative prediction, and models of the task do not need to change (or change only a little) when the environment or scenario changes, select CTA methods in Family 4.
. Guideline 9. When task performance is easily affected or distracted by any interference, select CTA methods in Families 2, 3, and 4.
. Guideline 10. When job or task analysers do not have significant knowledge of analytical techniques, select CTA methods in Families 1, 2, and 3.
. Guideline 11. When a task is skill-based, select CTA methods in Families 1 and 2; when a task is rule-based, select in Families 2 and 3; and when a task is knowledge-based, select in Families 3 and 4.

Rasmussen (1983) classified tasks into skill-, rule-, and knowledge-based categories of human performance. The classification of human performance into skill-, rule-, and knowledge-based behaviour rests on the role of the information observed from the environment. At the skill-based level, the perceptual motor system acts as a multivariable continuous control system, which synchronizes the physical activity. The sensed information for this control is perceived as time-space signals, indicating the time-space behaviour of the environment. At the rule-based level, the information is perceived as signs. Signs refer to precepts and rules for action. Signs can only be used to select and modify the rules controlling
Table 3. CTA methods comparisons and CTA method selection guidelines.

CTA methods comparisons (Family 1 – Observations and Interviews; Family 2 – Process Tracing; Family 3 – Conceptual Techniques; Family 4 – Formal Models):

Direct levels: Direct (Family 1) grading to Indirect (Family 4)
Formal levels: Informal (Family 1) grading to Formal (Family 4)
Defined levels: intuition (Family 1); somewhat better specified, with some procedures (Family 2); well specified, with fewer decisions (Family 3); very well defined (Family 4)
Training levels (analysers): job analysers need interview skills and domain knowledge (informal families), grading to advanced training and preparation and significant knowledge of analytical techniques, such as modelling and simulation (formal families)
Introspection and verbalization levels (subjects): Maximum (Family 1) grading to Minimum (Family 4)
Interfering with task performance levels: Maximum (Family 1) grading to Minimum (Family 4)
Output data levels: Qualitative (Family 1) grading to Quantitative (Family 4)

CTA selection guidelines: Family 1 – Guidelines 1 and 2; Family 2 – Guidelines 3, 4, and 5; Family 3 – Guidelines 6 and 7; Family 4 – Guideline 8; Guidelines 9, 10, and 11 span multiple families (see the guideline text above).
288 J. Wei and G. Salvendy

skilled routine sequences. At the knowledge-based level, symbols are used for causal functional reasoning in perceiving or explaining unfamiliar behaviour of the environment. Symbols are the basis for reasoning and planning, and refer to concepts tied to functional properties (Rasmussen 1983).

2.3. Human Centered Information Processing (HCIP) model

One of the representative new frameworks of human information processing for cognitive task analysis is the executive process interactive control (EPIC) framework proposed by Meyer and Kieras (1997). This framework characterizes human performance of concurrent perceptual-motor and cognitive tasks and is the basis for formulating computational models that simulate multiple-task performance under a variety of representative circumstances. These models can be applied to characterize skilled human information processing and action performance in multiple tasks under a variety of conditions. As an instructive illustration of how Meyer and Kieras' theoretical framework may be used to model multiple-task performance, one particularly influential experimental paradigm, the psychological refractory period (PRP) procedure, is investigated. For performance under this procedure, Meyer and Kieras proposed an explicit computational model, the strategic response-deferment (SRD) model, which is constructed and tested on the basis of their production-system formalism and the EPIC information-processing architecture. The SRD model accounts for a variety of quantitative results from the PRP procedure and leads to interesting new predictions. The good accounts obtained suggest that the EPIC framework, and models such as SRD built upon it, have merit. This comprehensive theoretical framework concludes that, at a cognitive level, people can apply distinct sets of production rules simultaneously to execute multiple tasks, with individual tasks coordinated through various types of supervisory control and scheduling. A human's information-processing capacity is unlimited when flexible scheduling strategies are used. EPIC includes few constraints; that is, it omits processing-capacity limitation assumptions, imposing no limitations other than physical ones such as the visual system. The EPIC processing units are visual, auditory, and tactile perceptual processors that receive inputs from simulated physical sensors. EPIC has three functionally distinct memories: declarative long-term memory, procedural memory, and working memory. The working memory has no capacity limitation.

However, the framework proposed by Meyer and Kieras (1997) does not cover all aspects of cognitive job and task performance, such as monitoring, motivation, and cognitive environments. The EPIC framework and its models (SRD) are theoretical and complex, with many parameters. Because there is no general agreement consistent with the various data, the EPIC framework is not directly attached to specific tasks; specific models must be developed for specific tasks, which is costly. The detailed models are also complex and have limitations stemming from their assumptions. Moreover, the unlimited-capacity assumption is not practical in job settings.

The ACT model (Anderson 1993) focuses on the acquisition of problem-solving skills. The central concept is the production rule, which plays a role similar to the stimulus-response bond in learning theories. The model distinguishes between declarative and procedural knowledge and uses encoding strength to determine the accessibility of declarative knowledge and the performance of procedural knowledge. The advantages of this method are that it provides a basis for constructing intelligent computer-based tutoring systems; operators are encoded as rules in an abstract form that can apply across a range of situations; knowledge in production form applies much more rapidly and reliably; variability in problem-solving behaviour is dealt with; ideas from problem-solving theory and learning theory are brought together; and rapid and important progress is made in understanding how complex problem-solving skills are learned. However, this method applies only to the problem-solving domain and assumes a means-ends problem-solving structure. The ACT-based representation of knowledge (ARK) model presented by Geiwitz et al. (1988) is inspired by the ACT model and elicits both a network of static knowledge about the domain and a set of procedures performed on that knowledge. This model is very well suited to job or task planning. However, it is very time-consuming, and its implementation requires a rich data set that demands extensive reduction and interpretation. Therefore, these two models are not suitable for complete job and task capability requirement analysis in the cognitive domain.

The object-based model presented by Beringer and Wandmacher (1991) takes advantage of object-oriented technology and integrates procedural models, such as the GOMS-type model (Card et al. 1983), with semantic descriptive-level models. It is an integrating formalism for representing both the declarative (modelling concepts) and the procedural (modelling procedures) aspects of human-computer interaction and cognitive task analysis. Its advantage is that data abstraction attributes from object-oriented modelling provide explicit links between semantic and procedural levels. This method bridges the gap between the high
level semantic description of a task space and the procedural description of possible activities. A semantic goal structure depends on the given task, and a GOMS description is always limited to a specific set of tasks and a specific functionality. In contrast, this model can derive semantic goal structures for any potentially meaningful task given a specific functionality in a very complex system. The functionality can be easily varied by adding or removing classes or methods. However, this method has the disadvantage that the time and cost required are high; hence, it is not suitable for a large variety of jobs in industrial job design.

Cognitive environment simulation (CES), presented by Roth et al. (1992), is built on an artificial-intelligence problem-solving system representing models of human cognitive activities. This method performs diagnostic reasoning for fault management under dynamic conditions. Its advantages are that it builds a runnable computer program, which forces the modeller to describe mechanisms in great detail; it uncovers a variety of consequences of the basic information-processing mechanisms instantiated in the program; the simulation can be run through a scenario to produce specific behaviour that can be analysed and compared to other data; the simulation can be run on a variety of scenarios, including scenarios that were not part of the original design set, thereby capturing human cognitive activities in a wide range of domain-specific circumstances; and it captures the cognitive demands imposed by dynamic fault-management situations. However, it is complex, time-consuming, and costly. Therefore, this method is not suitable for a large variety of jobs in industrial job design.

Wickens and Carswell's resource allocation model (1987, 1997) is a general frame for analysing human information processing with a focus on attention and memory. However, its cognitive stage is not detailed enough for analysing human information processing in the cognitive domain. Therefore, in this research, a new cognitive task performance analysis model, a human-centered information processing (HCIP) model for cognitive task performance, is presented. This model is developed on the basis of human information processing theory and modifies Wickens and Carswell's resource allocation model (1987, 1997) in the breadth and depth of the perception and decision-making stages. One assumption made is that this model does not include the motor stage. The model proposed here also assumes that the human's information-processing capacity is limited, including attention capacity limitation and working memory capacity limitation. This assumption conforms to Pashler's (1998) limited capacity in working memory, perceptual capacity, and central-bottleneck limited capacity. The sensory memory is added to this model on the basis of Salvendy and Knight's information processing model of the human operator (Salvendy and Knight 1988). The objective of this model is to capture a variety of aspects of task performance in the cognitive domain for job design. The model focuses on the level of knowledge-based human performance; skill- and rule-based human performance is not a focus, because those levels do not include many cognitive elements such as decision making and problem solving.

There are two types of job elements: job-oriented and worker-oriented. McCormick et al. (1972) distinguished between these two types. Job-oriented elements are job content descriptions that have a dominant association with the technological aspects of jobs and generally reflect what the worker achieves. Worker-oriented elements are those that tend to characterize the generalized human behaviours and capabilities involved. The contexts of the present discussion are of a worker-oriented nature, because they offer some possibility of serving as bridges across various technologies. 'A worker-oriented element can, in effect, be viewed collectively within the framework of the stimulus-organism-response paradigm or, in more operational terms, information input, mediation processes, and output' (McCormick et al. 1972: 348). The developed conceptual model somewhat fits into this paradigm.

Based on the summarized literature reviews of what and how traditional task analysis and cognitive task analysis can analyse human job and task performance to assist job design, the current methods are found to capture only parts of the human performance aspects in the cognitive domain. Table 4 presents the different aspects of task performance as cognitive attributes and the attributes affecting them. Cognitive attribute here refers to constructs of the various types of human qualities most closely related to human traits in the cognitive domain. Some of these are of an 'aptitude' nature in the cognitive domain; others are of a 'situational' nature, in the sense of imposing a requirement on the individual to adapt to the situation in question in the cognitive domain. The detailed descriptions of these performance aspects in 11 modules are presented as follows.

2.3.1. Information interface module: This module captures the input of information or data for cognitive processing. The information input can be achieved through perceiving stimuli from physical channels such as the visual and auditory channels. There are two classes within this module: the search and receive information class, and the identify objects, actions, and events class. The first class specifies operations such as detect, inspect, observe, read, receive, scan, survey, and listen to
Table 4. Breakdown of cognitive attributes and cognitive affecting attributes.

Functional modules:
1. Information interface
   a. Search and receive information: detect, inspect, observe, read, listen, receive, scan, write.
   b. Identify objects, actions, and events: discriminate, identify, locate, judge, compare.
2. Information handling
   a. Manage existing information: categorize, arrange, classify, code/decode, translate, itemize, tabulate, combine, order, transcribe, analyse.
   b. Predict extra information: imagine, predict.
3. Mental plan and schedule
   a. Plan: understand controlled human information processes, satisfy objectives, set general strategy, decide and test human information processing control actions.
   b. Schedule: identify goals/criteria and constraints, generate optimized sequences to satisfy objectives for the schedule.
4. Mental execution
   a. Generate ideas: produce ideas, generate creative ideas.
   b. Decision making: analyse, calculate, choose, compare, compute, estimate, use rules, etc.
   c. Problem solving: reasoning, such as deductive reasoning and inductive reasoning.
5. Monitor
   a. Sense problems: sense problems by acquiring, calibrating and combining measures of state, estimating state from current and past measures, and evaluating state by diagnosing.
   b. Intervene: correct failures or errors, execute a planned abort or reset on failure, stop by completion at the normal end of information processing, or start to initialize information processing.
6. Communication
   a. Interpersonal communication: advise, negotiate, supervise, coordinate, answer, work communication, direct, indicate, inform, instruct, request, transmit.
   b. Noninterpersonal communication: signal/code transmission and request (by deciding, testing and communicating commands).
7. Learning
   a. Use of learned information: use job-related experience, use mathematics.
   b. Human learning: training; learning by being told, instruction, deduction, induction, or analogy; learning by program and other equipment.

Resource modules:
8. Attention
   a. Visual channel: visual attention.
   b. Audio channel: auditory attention.
   c. Cognitive channel: cognitive attention.
9. Memory
   a. Sensory memory: retrieve, store.
   b. Short-term memory: retrieve, store/retention, transfer.
   c. Long-term memory: retrieve, store/retention, incubation.

Affect modules:
10. Motivation
   a. Job/task attributes: autonomy, task goal clarity, task significance, challenge.
   b. Benefit: achievement, pay adequacy and quick promotion, personal growth, knowing well from job.
   c. Ability/skill: ability/skill level, ability/skill variety.
11. Environment
   a. Physical, physiological and psychological environment: information presentation mode, stress, impairment, boredom.
   b. Social environment: social interaction assistance, personal sacrifice.
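The module/class/attribute breakdown of Table 4 lends itself to a nested data representation in which modules map to classes, and classes to attribute lists. The sketch below transcribes only two modules for illustration; the helper name `attributes_of` is hypothetical.

```python
# A fragment of the Table 4 breakdown as a nested mapping of
# module -> class -> attributes. Only two modules are shown;
# the structure, not the content, is the point of the sketch.

HCIP_ATTRIBUTES = {
    "Information interface": {
        "Search and receive information": [
            "detect", "inspect", "observe", "read", "listen",
            "receive", "scan",
        ],
        "Identify objects, actions, and events": [
            "discriminate", "identify", "locate", "judge", "compare",
        ],
    },
    "Attention": {
        "Visual channel": ["visual attention"],
        "Audio channel": ["auditory attention"],
        "Cognitive channel": ["cognitive attention"],
    },
}

def attributes_of(module):
    """Flatten all attributes listed under the classes of one module."""
    return [a for attrs in HCIP_ATTRIBUTES[module].values() for a in attrs]

print(len(attributes_of("Information interface")))  # 12 attributes in this fragment
```

Encoding the full table this way would let the attribute lists drive questionnaire generation or coverage checks mechanically rather than by hand.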
information or data. The second class recognizes the discrimination, identification, location, judgment, and comparison of objects, actions, and events, such as positions, structures and patterns, colours, shapes, sizes, and speeds.

2.3.2. Information handling module: This module captures human information processing performance related to the handling of information. Information handling refers to aspects of the way in which the worker processes job-related information to perform the job, including managing existing information, such as transcribing (copy/poster), coding and decoding, translating, arranging, classifying, ordering, itemizing, tabulating, combining, and analysing information or data, as well as predicting extra information. Although the work activities considered might include a substantial physical component, this module includes only some level of mental activity. For example, data may be collected from records (transcribed) and standardized (coded). These data can be organized into categories (classified). The organized data may be broken into manageable or comprehensible subsectors (analysed), and subsectors may be combined to relate conceptually to create a testable hypothesis (combined).

2.3.3. Mental plan and schedule module: This module captures human cognitive performance related to mental planning and scheduling. It specifies goals and task performance criteria, such as meeting due dates, maximizing production rate, and maximizing accuracy. It specifies constraints on human cognitive capabilities, generates satisfying or optimized cognitive task execution sequences, and so forth.

2.3.4. Mental execution module: This module captures human cognitive task performance related to the major execution of mental activities, including generating ideas, decision making, and problem solving.

2.3.5. Mental monitor module: This module tracks or monitors the status of the mental activities within the modules. After an abnormal status is detected, it sends feedback to the related modules for corresponding corrections, or intervenes in information processing through abort, reset, or correction of failures or errors. For example, if a function in the execution module, such as decision making, goes wrong, this module sends this information back to the mental plan module for replanning, rescheduling, or simply stopping.

2.3.6. Communication module: This module refers to interpersonal and noninterpersonal communications. Interpersonal communication includes advising, coordinating, negotiating, supervising, and so on. Noninterpersonal communication includes signal or code transmission and requesting (by choosing, testing, and communicating commands).

2.3.7. Learning module: This module includes two classes of learning: the use of learned information, and human learning. The first class includes the use of job-related experience and mathematics. The second class includes learning by being told or by instruction, learning by reasoning (such as inductive and deductive reasoning), and learning by programmes and other equipment (such as recording immediate events when a human intervenes with equipment, analysing cumulative experience, and then updating a model). This module updates working memory and long-term memory based on human learning functions.

2.3.8. Attention module: This module captures human cognitive performance related to the attention resource allocation needed for cognitive tasks. The resources are the limited-capacity inventory that supplies attention resources to the other modules for jobs. Three attention channels are involved: the visual, audio, and cognitive channels.

2.3.9. Memory module: This module captures human cognitive task performance related to the memory resources needed for cognitive tasks. The resources limit the retrievals and stores for the sensory, short-term, and long-term memory demanded by tasks.

2.3.10. Motivation module: This module refers to the job incumbent's and task performer's motivation, which has a major effect on the human cognitive attributes required for human information processing when performing tasks. There are three classes that determine the motivation of job incumbents and task performers: the job/task attributes, benefits, and abilities/skills classes.

2.3.11. Environment module: This module refers to environments that have a major effect on the human cognitive attributes required for human information processing when performing tasks. Two classes are included in this module. One is the physical, physiological, and psychological environment. The physical environment refers to the information presentation mode; for example, the information presentation mode may be defined through letter image preview on the computer screen, or through the specific keying method (Karwowski et al. 1994). The physiological environment of an operator is defined in terms of stress and impairment. The psychological environment includes other goal-directed activities that are more or less congruent with the subject task; the effect of stress on operator performance can be assumed to be similar to that of physiological stresses (Fleishman and Quaintance 1996: 280 – 281). The other class is the social environment class, which refers to social interaction. These two environment classes may have deleterious effects on the operator, and potentially may threaten the effectiveness of task performance.

Figure 1 presents the conceptual model that logically links all the cognitive attributes in Table 4 together. In Figure 1, there are 11 boxes representing the 11 categories of generic cognitive attributes and cognitive affecting attributes in the perception and decision-making stages of dynamic human information processing. These categories are defined as modules. Modules 1 to 7 are functional modules, which represent the major mental activities or functional categories; these cognitive functional relationships are represented using solid arrow lines. Modules 8 and 9 are resource modules specifying the constraints that reflect the resource capacities, such as attention and memory, of the various mental operations involved; the dotted lines represent resource allocation. These capacities are of two generic forms: (a) each operation has limits in the speed of its functioning and in the amount of information that can be processed in a given unit of time; and (b) there are limits on the total attention or memory, 'mental energy', or resources available to the information processing system. These limits are represented by the pool of attention resources and memory resources, including memory retrieval and storage (Sheridan 1997). Modules 10 and 11 specify the attributes affecting cognitive attributes, such as motivations and environments; these effects are represented using the double arrow lines. Each module is further divided into classes, which are templates that hold similar cognitive attributes. Each class can be further broken down into cognitive attributes based on functional decomposition or resource allocation. These cognitive attributes can then be mapped into job elements. The mapped job elements have been presented in the structured form of questionnaires for quantitatively collecting data from surveys for further statistical analysis (Wei and Salvendy 2000, 2003).
Figure 1. Human Centered Information Processing (HCIP) model for cognitive task performance. Note: The solid arrow lines represent the functional relations between functional modules 1 through 7, the dotted arrow lines represent the resource allocations for resource modules 8 and 9, and the double arrow lines represent the effects on cognitive attributes for affect modules 10 and 11.
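The two generic forms of capacity limit described above can be sketched computationally: form (a) as a per-operation processing-rate limit, and form (b) as a shared pool from which functional modules draw attention or memory units. The class, function, and numbers below are hypothetical, for illustration only; they are not part of the HCIP model itself.

```python
# Minimal sketch of the two generic capacity limits:
# (a) a per-operation rate limit, and (b) a shared resource pool
# that modules draw attention/memory units from. All numbers are
# hypothetical, for illustration only.

class ResourcePool:
    def __init__(self, capacity):
        self.capacity = capacity
        self.in_use = 0

    def allocate(self, units):
        """Form (b): grant a request only while total demand stays in the pool."""
        if self.in_use + units > self.capacity:
            return False  # demand exceeds the pool: performance would degrade
        self.in_use += units
        return True

    def release(self, units):
        self.in_use = max(0, self.in_use - units)

def processing_time(items, rate_per_second):
    """Form (a): each operation processes a bounded amount per unit of time."""
    return items / rate_per_second

attention = ResourcePool(capacity=10)
assert attention.allocate(6)      # e.g. a visual search task claims 6 units
assert not attention.allocate(7)  # a concurrent task would exceed the pool
```

The contrast with EPIC's unlimited-capacity assumption, criticized earlier in this section, is that here every allocation request is checked against a finite pool, so concurrent task demands can fail rather than always succeed.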
2.4. HCIP model validation

Table 5 summarizes the capture of the different aspects of human cognitive task performance by different job and task analysis methods (a series of tasks constructs a job). In Table 5, G indicates that the job or task analysis method generally covers the cognitive attributes; N, somewhat covers; and S, extensively covers. A blank means that the method does not cover the cognitive attributes. The numbers 1 – 11 correspond to the module numbering in Table 4, and the letters a and b correspond to the class numbering in Table 4. From Table 5, we see that no method captures all of the aspects of task performance presented in Table 4 in the cognitive domain.

Fleishman and Quaintance (1995, 1996) constructed an ability requirement classification and its evaluations. In their survey, 21 cognitive abilities are presented. Table 6 is a checklist verifying that Table 4 captures all aspects of the cognitive abilities presented by Fleishman (1995). Again, the numbers 1 – 11 and the letters a and b in Table 6 represent module and class numbering in Table 4, respectively. A check mark means that attributes in Table 4 cover those in Fleishman's 21 cognitive abilities.

3. Discussions and conclusions

The current research reviews and reappraises the CTA methods for job and task design and provides guidelines for job and task designers when selecting CTA methods. It differs from other review papers by providing guidelines on how to use the classification of CTA methods in the current paper to select CTA methods for job and task design. Specifically, it reviews the current CTA methods by comparing their advantages and disadvantages; sorts out the commonalities and differences of these methods; and provides guidelines for selecting CTA methods for different job or task designs. The results help new CTA analysers to become familiar with the current CTA methods by providing a whole picture of them; provide guidelines for job or task designers to select the most suitable CTA methods; and help CTA experts to develop further CTA methods, or to combine CTA methods for further development.

The present research also overviews the current research on CTA for human performance. It follows a long line of job analysis research and the development of ability requirement taxonomies such as the PAQ and F-JAQ, and attempts to extend this lineage by building on a novel model of cognitive attributes, an HCIP model for cognitive task performance based on human information processing theory. It aims at capturing the complete cognitive capability requirements for job design.

Human cognitive-oriented work can be viewed from various frames of reference, and thus can be characterized in terms of different classes of job-related variables. This research is based on a probing effort relating to cognitive capability requirements from workers, or what are herein referred to as the worker-oriented aspects of jobs. It would seem that such variables could serve as possible common denominators on which to compare or contrast jobs of different technological areas. This research provides a system to analyse jobs in terms of reasonably discrete, separate job elements of a worker-oriented nature, and serves as the basis for building statistically related groupings of such elements, or cognitive job dimensions, in order to describe the dominant dimensions of jobs.

The current research captured the most complete cognitive capability requirements for job design based on the HCIP model, which has not been done in the existing literature. The HCIP model is developed by modifying the resource allocation model created by Wickens, and captures more cognitive aspects than Wickens' resource allocation model, based on the literature reviews. The literature reviews show that the CTA methods capture only partial cognitive capability requirements. For the cognitive attributes illustrated in the classes of the 11 modules, the 26 representative job and task analysis methods summarized in Table 2 capture only a portion of these cognitive ability requirements. For example, from Table 5, the cognitive attributes for generate ideas (Class 4a), intervene (Class 5b), human learning (Class 7b), cognitive attention (Class 8c), sensory memory (Class 9a), ability and skills (Class 10c), and social environment (Class 11b) are not covered, or are only rarely covered, by these 26 representative methods. Within these 11 modules, the PAQ (McCormick et al. 1972) and the PAQ with Protocol Analysis (Koubek et al. 1994) cover nine modules; the AET (Rohmert and Landau 1979, 1988), the Job Element Analysis (Primoff and Eyde 1988), the TAP (Fogli 1988), the Micro Saint (Laghery and Cooker 1992), and MIDAS (Laghery and Cooker 1992, Tyler et al. 1998) cover eight modules; the MJDQ (Campion and Thayer 1983, Campion 1988, 1989, Campion and Medsker 1992, Medsker and Campion 1997), the CPM-GOMS (John 1990), the GOMS (Card et al. 1983), the NGOMSL (Kieras 1988), the object-based model (Beringer and Wandmacher 1991), the production systems (Kieras and Polson 1985), and the task functions in a generalized information-processing system (Miller 1996) cover six modules; the JCI (Bank 1988), the OAI (Cunningham 1988), the TAI (Luczak 1997), the cognitive grammar (Reisner 1981), the MHP
Table 5. Cognitive attributes and affecting attributes in job or task analysis methods.

Columns: modules 1 – 11 (as in Table 4), each divided into classes a, b (and c where applicable).

Ergonomic job analysis technique (AET) (Rohmert and Landau 1979, Rohmert 1988): G G N G G G G N G
Job diagnostic survey (JDS) (Hackman and Oldham 1980): G
Job component inventory (JCI) (Bank 1988): N N G G N G
Job element analysis (Primoff and Eyde 1988): G G G N G N N N N N G
Multimethod job design questionnaire (MJDQ) (Campion and Thayer 1983, 1985): G G N N N G G
Occupation analysis inventory (OAI) (Cunningham 1988): G G G G G
Position analysis questionnaire (PAQ) (McCormick et al. 1972): G G G N N N G G N G
PAQ with protocol analysis (Koubek et al. 1994): G G G N N N G G N G
Task analysis inventory (TAI) (Luczak 1997): G G N G G N
Task attribute performance (TAP) analysis (Fogli 1988): G G N N N N G N G N G
ACT model (Anderson 1993): S S N
ARK model (Geiwitz et al. 1988): S S N
Cognitive grammar (Reisner 1981): N N N N S
Computer decision aids and mental models (Sheridan 1997): G G G G
Critical path method GOMS (CPM-GOMS) (John 1990): G G G G S S N N
Goals, operators, methods, and selection rules (GOMS) (Card et al. 1983): G G N N S S N N
Model human processor (MHP) (Card et al. 1983, 1986): G G N N G
Information processing model of the human operator (Salvendy and Knight 1988): G G G G G
Micro Saint (Laghery and Cooker 1992): G G N N G N N N G
Man machine integrated design and analysis system (MIDAS) (Laghery and Cooker 1992, Tyler et al. 1998): G G N G G G G N N N
Natural GOMS language (NGOMSL) model (Kieras 1988): G G N N G G N N
Object-based model (Beringer and Wandmacher 1991): G G N N G G G G
Production system (Kieras and Polson 1985): G G N N G G G G
Sensori-motor process chart (Crossman 1956): G G G N
Task-action grammar (TAG) (Payne and Green 1986, 1989): N N N N G
Task functions in a generalized information-processing system (Miller 1996): G G N N N N

G – the job or task analysis method generally covers the cognitive attributes; N – somewhat covers; S – extensively covers; blank – does not cover. The numbers 1 – 11 correspond to the module numbering in Table 4, and a, b and c correspond to the class numbering in Table 4.
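The module counts quoted in the discussion (e.g. the GOMS covering six modules) can be derived mechanically from coverage data of the Table 5 kind, once each mark is keyed by its module/class column. The data below is an illustrative two-method fragment with assumed column assignments, not a transcription of the full table.

```python
# Counting how many of the 11 modules a method covers, given per-class
# coverage marks (G/N/S as in Table 5) keyed by "module + class letter".
# The coverage data is an illustrative fragment with assumed columns.

coverage = {
    "JDS": {"10a": "G"},
    "GOMS": {"1a": "G", "1b": "G", "2a": "N", "3a": "N",
             "4b": "S", "4c": "S", "8a": "N", "9b": "N"},
}

def modules_covered(marks):
    """A module counts as covered if any of its classes carries a mark."""
    return len({key.rstrip("abc") for key in marks})

print(modules_covered(coverage["JDS"]))   # 1
print(modules_covered(coverage["GOMS"]))  # 6
```

With the full table encoded this way, the per-method module counts reported in the Discussions section could be recomputed and checked automatically.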
Table 6. Cognitive attributes and affecting attributes from Table 4 and Fleishman.

Columns: cognitive attributes and cognitive affecting attributes of the HCIP model, modules 1 – 11 with classes a, b and c (as in Table 4).

Cognitive factors from Fleishman:
1. Oral comprehension: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
2. Written comprehension: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
3. Oral expression: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
4. Written expression: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
5. Fluency of ideas: ✓ ✓ ✓ ✓
6. Originality: ✓ ✓ ✓ ✓
7. Memorization: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
8. Problem sensitivity: ✓ ✓ ✓
9. Mathematical reasoning: ✓ ✓ ✓ ✓
10. Number facility: ✓ ✓ ✓ ✓ ✓ ✓ ✓
11. Deductive reasoning: ✓ ✓ ✓ ✓
12. Inductive reasoning: ✓ ✓ ✓ ✓
13. Information ordering: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
14. Category flexibility: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
15. Speed of closure: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
16. Flexibility of closure: ✓ ✓ ✓ ✓ ✓ ✓
17. Spatial orientation: ✓ ✓ ✓ ✓ ✓ ✓ ✓
18. Visualization: ✓ ✓ ✓ ✓ ✓ ✓ ✓
19. Perceptual speed: ✓ ✓ ✓ ✓ ✓ ✓
20. Selective attention: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
21. Time sharing: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓

Note: Numbers 1 – 11 and a, b and c represent module and class numbering, respectively. A check mark (✓) means that attributes in Table 4 cover those in Fleishman's 21 cognitive abilities (Fleishman 1995).
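The cross-check behind Table 6 can also be run in the other direction: once the ability-to-class check marks are encoded, the HCIP classes that no Fleishman ability maps onto are exactly the attributes unique to the HCIP breakdown. The mapping below is an illustrative fragment with assumed class assignments, not a transcription of the full table.

```python
# Reverse check of a Table 6 style mapping: which HCIP classes are
# exercised by none of the listed abilities? The mapping is an
# illustrative fragment with assumed class assignments.

ability_to_classes = {
    "Fluency of ideas": {"4a"},
    "Originality": {"4a"},
    "Deductive reasoning": {"4c"},
    "Memorization": {"9a", "9b", "9c"},
    "Selective attention": {"8a", "8b", "8c"},
}
all_classes = {"4a", "4b", "4c", "8a", "8b", "8c", "9a", "9b", "9c"}

covered = set().union(*ability_to_classes.values())
print(sorted(all_classes - covered))  # ['4b'] : no listed ability maps here
```

In this fragment, class 4b (decision making) is left uncovered, which mirrors the Discussions section's point that some HCIP attributes have no counterpart in existing taxonomies and coverage tables.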
(Card et al. 1983, 1986), the information processing model of the human operator (Salvendy and Knight 1988), and the TAG (Payne and Green 1986, 1989) cover five modules; the computer decision aids and mental models (Sheridan 1997) and the sensori-motor process chart (Crossman 1956) cover four modules; the ACT (Anderson 1993) and the ARK (Geiwitz et al. 1988) models cover three modules; and the JDS (Hackman and Oldham 1975, 1980) covers only one module.

Future research can be conducted by applying object-oriented technology to systematically analyse and design these elements of the HCIP model in a more structured class-object format. The classes developed in the current research can be further described using attributes and member functions conforming to object-oriented concepts. Scenarios of human information processing can be modelled in computers to visually monitor human information processing for different tasks, since the design and development of the modular structure of this HCIP model conform to the theory of object-oriented analysis and design. Object-oriented languages such as C++ and Java may be used to develop a modelling package.

References

CAMPION, M. A. and MEDSKER, G. J. 1992, Job design. In G. Salvendy (ed), Handbook of Human Factors, Chapter 32 (New York: John Wiley & Sons), pp. 645 – 881.
CAMPION, M. A. and THAYER, P. W. 1983, Job design: approaches, outcomes, and trade-offs. Organizational Dynamics, 15(3), 66 – 79.
CAMPION, M. A. and THAYER, P. W. 1985, Development and field evaluation of an interdisciplinary measure of job design. Journal of Applied Psychology, 70(1), 29 – 43.
CARD, S. K., MORAN, T. P. and NEWELL, A. L. 1983, The Psychology of Human-Computer Interaction (Hillsdale, NJ: Lawrence Erlbaum Associates).
CARD, S. K., MORAN, T. P. and NEWELL, A. L. 1986, The model human processor: an engineering model of human performance. In K. R. Boff, L. Kaufman and J. P. Thomas (eds), Handbook of Perception and Human Performance, Vol 2 (New York: John Wiley & Sons).
CARROLL, J. M. and OLSON, J. R. 1988, Mental models in human-computer interaction. In M. Helander (ed), Handbook of Human-Computer Interaction (Amsterdam: Elsevier).
CHRISTAL, R. E. and WEISSMULLER, J. J. 1988, Job-task inventory analysis. In S. Gael (ed), The Job Analysis Handbook for Business, Industry, and Government, Vol II (New York: John Wiley & Sons), pp. 1036 – 1050.
CLARKE, B. 1987, Knowledge acquisition for real-time knowledge-based systems. In Proceedings of the First European Workshop on Knowledge Acquisition for Knowledge Based Systems, 2 – 3 September, Reading University, UK.
COOKE, N. J. 1994, Varieties of knowledge elicitation
techniques. International Journal of Human-Computer stu-
dies, 41, 801 – 849.
CROSSMAN, E. R. F. W. 1956, Perceptual activity in manual
work, Research, 9, 42 – 49.
References CUNNINGHAM, J. W. 1988, Occupation analysis inventory. In S.
Gael (ed), The job analysis handbook for business, industry,
ANDERSON, J. R. 1976, Language, memory, and thought. and government, Vol II (New York: John Wiley & Sons),
(Hillsdale, NJ: Lawrence Erlbaum Associates). pp. 975 – 992.
ANDERSON, J. R. 1993, Problem solving and learning. American DAVIS, L. E. and WACKER, G. L. 1987, Job design. In G.
Psychologist, 48(1), 35 – 44. Salvendy (ed), Handbook of Human Factors (New York:
AREND, U. 1991, Analyzing complex tasks with an extended John Wiley & Sons).
GOMS model. In M. J. Tauber and D. Ackermann (eds), DRAUDEN, G. M. 1988, Task inventory analysis in industry and
Mental models of human-computer interaction 2 (B.V. North- the public sector. In S. Gael (ed), The job analysis handbook
Holland: Elsevier), pp. 115 – 133. for business, industry, and government, Vol II (New York:
BANK, M. H. 1988, Job component inventory. In S. Gael (ed), John Wiley & Sons), pp. 1051 – 1071.
The job analysis handbook for business, industry, and DRURY, C. G. 1990, Methods for direct observation of
government, Vol II (New York: John Wiley & Sons), performance. In J. R. Wilson & E. N. Corlett (eds),
pp. 960 – 974. Evaluation of human work: a practical ergonomics methodol-
BERINGER, J. and WANDMACHER, J. 1991, Object-based action ogy (London: Taylor & Francis), pp. 35 – 57.
planning. In M. J. Tauber and D. Ackermann (eds), Mental EBERTS, R. 1997, Cognitive Modeling. In Gavriel Salvendy
models of human-computer interaction 2 (B.V. North-Hol- (ed.), Handbook of Human Factors & Ergonomics (2nd Ed.),
land: Elsevier), pp. 135 – 155. Chapter 40 (New York: John Wiley & Sons), (pp. 1 – 47).
BOOSE, J. H. 1990, Uses of repertory grid-centered knowledge EBERTS, R., MAJCHRZAK, A., PAYNE, P. and SALVENDY, G. 1990,
acquisition tools for knowledge-based systems. In J. H. Integrating social and cognitive factors in design of human-
Boose and B. R. Gaines (eds), The foundations of knowledge computer interactive communication. International Journal
acquisition (San Diego, CA: Academic Press), pp. 61 – 84. of Human-Computer Interaction, 2(1), 1 – 27.
BOVAIR, S., KIERAS, D. E. and POLSON, P. G. 1990, The ELKERTON, J. and PALMITER, S. L. 1991, Designing help using a
acquisition and performance of text-editing skill; a cognitive GOMS model: an information retrieval evaluation. Human
complexity analysis. Human Computer Interaction, 5, 1 – 48. Factors, 33(2), 185 – 204.
CAMPION, M. 1988, Interdisciplinary approaches to job design: ERICSSON, K. A. and SIMON, H. A. 1984, Protocol analysis:
a constructive replication with extensions. Journal of Applied verbal reports as data (Cambridge, MA: MIT Press).
Psychology, 73(3), 467 – 481. FISCHER, G. 1991, The importance of models in making
CAMPION, M. 1989, Ability requirement implications of job complex systems comprehensible. In M. J. Tauber and D.
design: an interdisciplinary perspective. Personnel Psychol- Ackermann (eds), Mental models of human-computer inter-
ogy, 42, 1 – 24. action 2 (B.V. North-Holland: Elsevier), pp. 3 – 33.
FISK, A. D. and EGGEMEIER, F. T. 1988, Application of automatic/controlled processing theory to training tactical command and control skills: 1. Background and task analytic methodology. In Proceedings of the Human Factors Society 32nd Annual Meeting (Santa Monica, CA: Human Factors Society), pp. 1227 – 1231.
FLEISHMAN, E. A. 1995, Rating scale booklet: Fleishman Job Analysis Survey. Management Research Institute, Inc.
FLEISHMAN, E. A. and QUAINTANCE, M. K. 1996, Taxonomies of human performance: the description of human tasks. Management Research Institute, Inc.
FOGLI, L. 1988, Task attribute performance analysis. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol II (New York: John Wiley & Sons), pp. 1105 – 1119.
GEIWITZ, J., KLATSKY, R. L. and MCCLOSKEY, B. P. 1988, Knowledge acquisition for expert systems: conceptual and empirical comparisons (Santa Barbara, CA: Anacapa Sciences).
GEIWITZ, J., KORNELL, J. and MCCLOSKEY, B. 1990, An expert system for the selection of knowledge acquisition techniques. Technical Report 785(2) (Santa Barbara, CA: Anacapa Sciences).
GORDON, S. E., SCHMIERER, K. A. and GILL, R. T. 1993, Conceptual graph analysis: knowledge acquisition for instructional systems design. Human Factors, 35, 459 – 481.
GRAY, W. D., JOHN, B. E. and ATWOOD, M. E. 1993, Project Ernestine: validating a GOMS analysis for predicting and explaining real-world task performance. Human-Computer Interaction, 8, 237 – 309.
HACKMAN, R. J. and OLDHAM, G. R. 1975, Development of the job diagnostic survey. Journal of Applied Psychology, 60(2), 159 – 170.
HACKMAN, R. J. and OLDHAM, G. R. 1980, Work redesign (Reading, MA: Addison-Wesley).
HALL, E. M., GOTT, S. P. and POKORNY, R. A. 1994, A procedural guide to cognitive task analysis: the PARI methodology. Brooks AFB, TX.
HOFFMAN, R. R. 1987, The problem of extracting the knowledge of experts from the perspective of experimental psychology. AI Magazine, 8, 53 – 67.
HOLLNAGEL, E. and WOODS, D. D. 1983, Cognitive systems engineering: new wine in new bottles. International Journal of Man-Machine Studies, 18, 583 – 600.
JEANNERET, P. R. and MCCORMICK, E. J. 1969, The job dimensions of 'work oriented' job variables and of their attribute profiles as based on data from the position analysis questionnaire, Report No. 2. Occupational Research Center, Purdue University, West Lafayette, IN.
JEANNERET, P. R. 1991, Introductory guide for use with the position analysis questionnaire (Logan, Utah: PAQ Services, Inc).
JEANNERET, P. R. and MCPHAIL, S. M. 1991, Position analysis questionnaire: the standard for job analysis (Logan, Utah: PAQ Services, Inc).
JEANNERET, P. R. 1992a, Job analysis guide: major duties, essential functions, and job requirements (Logan, Utah: PAQ Services, Inc).
JEANNERET, P. R. 1992b, User's manual for PAQ ENTER-ACT: a micro-computer software package for use with the position analysis questionnaire, Version 3.1 (Logan, Utah: PAQ Services, Inc).
JOHN, B. E. 1990, Extension of GOMS analyses to expert performance requiring perception of dynamic auditory and visual information. In Proceedings of CHI '90: Human Factors in Computing Systems (New York: ACM/SIGCHI), pp. 107 – 115.
JOHN, B. E., VERA, A. H. and NEWELL, A. 1994, Towards real-time GOMS: a model of expert behavior in a highly interactive task. Behaviour & Information Technology, 13(4), 255 – 267.
JOHNSON, L. and JOHNSON, N. E. 1987, Knowledge elicitation involving teachback interviewing. In A. C. Kidd (ed), Knowledge acquisition for expert systems: a practical handbook (New York: Plenum Press).
KARWOWSKI, W., EBERTS, R., SALVENDY, G. and NOLAND, S. 1994, The effects of computer interface design on human postural dynamics. Ergonomics, 37(4), 703 – 724.
KIERAS, D. E. 1988, Towards a practical GOMS model methodology for user interface design. In M. Helander (ed), Handbook of human-computer interaction (Amsterdam: Elsevier).
KIERAS, D. E. and POLSON, P. 1985, An approach to the formal analysis of user complexity. International Journal of Man-Machine Studies, 22, 365 – 394.
KIRWAN, B. and AINSWORTH, L. K. (eds) 1992, A guide to task analysis (London: Taylor & Francis).
KLEIN, G. and MILITELLO, L. G. 1998, Cognitive task analysis. In Workshop of the Human Factors and Ergonomics Society 42nd Annual Meeting, No. 12, Chicago, Illinois.
KLEIN, G. A., CALDERWOOD, R. and MACGREGOR, D. 1989, Critical decision method for eliciting knowledge. IEEE Transactions on Systems, Man, and Cybernetics, 19, 462 – 472.
KOUBEK, R. J., SALVENDY, G. and NOLAND, S. 1994, The use of protocol analysis for determining ability requirements for personnel selection on a computer-based task. Ergonomics, 37(11), 1787 – 1800.
KULIK, C. T. and OLDHAM, G. R. 1988, Job diagnostic survey. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol II (New York: John Wiley & Sons), pp. 936 – 959.
LAGHERY, K. R. and COOKER, K. 1992, Computer modeling and simulation. In G. Salvendy (ed), Handbook of human factors and ergonomics, 2nd edn, Chapter 41 (New York: John Wiley & Sons), pp. 1375 – 1408.
LEHTO, M. R., BOOSE, J., SHARIT, J. and SALVENDY, G. 1992, Knowledge acquisition. In G. Salvendy (ed), Handbook of industrial engineering, 2nd edn, Chapter 58 (New York: John Wiley & Sons), pp. 1495 – 1545.
LESGOLD, A., RUBINSON, H., FELTOVICH, P., GLASER, R., KLOPFER, D. and WANG, Y. 1988, Expertise in a complex skill: diagnosing x-ray pictures. In M. T. H. Chi, R. Glaser and M. J. Farr (eds), The nature of expertise (Hillsdale, NJ: Lawrence Erlbaum Associates), pp. 311 – 342.
LOVE, K. G. and O'HARA, K. 1987, Predicting job performance of youth trainees under a Job Training Partnership Act program (JTPA): criterion validation of a behavior-based measure of work maturity. Personnel Psychology, 40, 323 – 340.
LUCZAK, H. 1997, Task analysis. In G. Salvendy (ed), Handbook of human factors and ergonomics, 2nd edn, Chapter 12 (New York: John Wiley & Sons), pp. 340 – 416.
MANCUSO, J. C. and SHAW, M. L. G. 1988, Cognition and personal structure: computer access and analysis (New York: Praeger).
MCCORMICK, E. J. 1977, Job analysis manual for the position analysis questionnaire (Logan, Utah: PAQ Services, Inc).
MCCORMICK, E. J. 1979, Job analysis: methods and applications (New York: AMACOM).
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1969a, A study of job characteristics and job dimensions as based on the position analysis questionnaire, Report No. 6. Occupational Research Center, Purdue University, West Lafayette, IN.
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1969b, Position analysis questionnaire. Occupational Research Center, Purdue University, West Lafayette, IN.
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1969c, The development and background of the position analysis questionnaire, Report No. 5. Occupational Research Center, Purdue University, West Lafayette, IN.
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1972, A study of job characteristics and job dimensions as based on the position analysis questionnaire. Journal of Applied Psychology Monograph, 56(4), 347 – 368.
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1989, Position analysis questionnaire (Palo Alto, CA: Consulting Psychologists Press, Inc).
MCCORMICK, E. J., MECHAM, R. C. and JEANNERET, P. R. 1977, Technical manual for the position analysis questionnaire (System II) (Logan, Utah: PAQ Services, Inc).
MCCORMICK, E. J., MECHAM, R. C. and JEANNERET, P. R. 1989, Technical manual for the position analysis questionnaire, 2nd edn (Logan, Utah: PAQ Services, Inc).
MCPHAIL, S. M., JEANNERET, P. R., MCCORMICK, E. J. and MECHAM, R. C. 1992, Job analysis manual for the position analysis questionnaire, revised edn (Palo Alto, CA: Consulting Psychologists Press, Inc).
MECHAM, R. C. and MCCORMICK, E. J. 1969a, The use in job evaluation of job elements and job dimensions based on the position analysis questionnaire, Report No. 3. Occupational Research Center, Purdue University, West Lafayette, IN.
MECHAM, R. C. and MCCORMICK, E. J. 1969b, The use of data based derived attribute requirements of jobs, Report No. 4. Occupational Research Center, Purdue University, West Lafayette, IN.
MECHAM, R. C., MCCORMICK, E. J. and JEANNERET, P. R. 1977, Users manual for the position analysis questionnaire (System II) (Logan, Utah: PAQ Services, Inc).
MEDSKER, G. J. and CAMPION, M. A. 1997, Job and team design. In G. Salvendy (ed), Handbook of human factors and ergonomics, 2nd edn (New York: John Wiley & Sons), pp. 450 – 489.
MEYER, D. E. and KIERAS, D. E. 1997, A computational theory of executive cognitive processes and multiple-task performance, Part I: basic mechanisms. Psychological Review, 104(1), 3 – 65.
MILLER, R. B. 1996, Development of a taxonomy of human performance: design of a system task vocabulary. American Institutes for Research Technical Report. JSAS Catalog of Selected Documents in Psychology, 3, 29 – 30 (Ms. No. 327).
MITCHELL, J. L. and MCCORMICK, E. J. 1990, Professional and managerial position questionnaire (PMPQ) (Palo Alto, CA: Consulting Psychologists Press, Inc).
NEERINCX, M. A. and GRIFFIOEN, E. 1996, Cognitive task analysis: harmonizing tasks to human capacities. Ergonomics, 39(4), 543 – 561.
NEWELL, A. and SIMON, H. 1972, Human problem solving (Englewood Cliffs, NJ: Prentice Hall).
NORMAN, D. A. 1984, Stages and levels in human-machine interaction. International Journal of Man-Machine Studies, 21, 365 – 370.
PASHLER, H. E. 1998, The psychology of attention (Cambridge, MA: MIT Press).
PAYNE, S. J. and GREEN, T. R. G. 1986, Task-action grammars: a model of the mental representation of task languages. Human-Computer Interaction, 2, 93 – 133.
PAYNE, S. J. and GREEN, T. R. G. 1989, The structure of command languages: an experiment on task-action grammar. International Journal of Man-Machine Studies, 30, 213 – 234.
POLSON, P. G. and LEWIS, C. H. 1990, Theory-based design for easily learned interfaces. Human-Computer Interaction, 5, 191 – 220.
POTTER, S. S., ROTH, E. M., WOODS, D. D. and ELM, W. C. 1998, A framework for integrating cognitive task analysis into the system development process. In Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, pp. 395 – 399.
PRIMOFF, E. S. and EYDE, L. D. 1988, Job element analysis. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol II (New York: John Wiley & Sons), pp. 807 – 824.
PRIMOFF, E. S. and FINE, S. A. 1988, A history of job analysis. In G. Salvendy (ed), Handbook of industrial engineering (New York: John Wiley & Sons), pp. 1415 – 1445.
RASMUSSEN, J. 1983, Skills, rules, and knowledge: signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257 – 266.
RASMUSSEN, J., DUNCAN, K. and LEPLAT, L. 1987, New technology and human error (New York: John Wiley & Sons).
REASON, J. 1987, Generic error modelling system (GEMS): a cognitive framework for locating common error forms. In J. Rasmussen, K. Duncan and L. Leplat (eds), New technology and human error (New York: John Wiley & Sons), pp. 63 – 86.
REISNER, P. 1981, Formal grammar and human factors design of an interactive graphics system. IEEE Transactions on Software Engineering, SE-7, 229 – 240.
ROHMERT, W. and LANDAU, K. 1979, A new technique for job analysis (AET) (New York: Taylor & Francis).
ROHMERT, W. 1988, AET. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol II (New York: John Wiley & Sons), pp. 843 – 859.
ROTH, E. M., WOODS, D. D. and POPLE, H. E. 1992, Cognitive simulation as a tool for cognitive task analysis. Ergonomics, 35(10), 1163 – 1198.
RYDER, J. M. and REDDING, R. 1993, Integrating cognitive task analysis into instructional systems development. Educational Technology Research and Development, 41(2), 75 – 96.
RYDER, J. M. and ZACHARY, W. W. 1991, Experimental validation of the attention switching component of the COGNET framework. In Proceedings of the Human Factors Society 35th Annual Meeting (Santa Monica, CA), pp. 72 – 76.
SALVENDY, G. and SEYMOUR, W. D. 1973, Prediction and development of industrial work performance (New York: John Wiley & Sons).
SALVENDY, G. and KNIGHT, J. J. 1988, Psychomotor performance and information processing. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol I (New York: John Wiley & Sons), pp. 630 – 695.
SANDERSON, P. M., JAMES, J. M. and SEIDLER, K. 1989, SHAPA: an interactive software environment for protocol analysis. Report EPRL-89-08. University of Illinois at Urbana-Champaign, Urbana, IL.
SCHLAGER, M. S., MEANS, B. and ROTH, C. 1990, Cognitive task analysis for the real-time world. In Proceedings of the Human Factors Society 34th Annual Meeting (Santa Monica, CA: Human Factors Society), pp. 1309 – 1313.
SCHMIDT, F. L. and HUNTER, J. E. 1977, Development of a general solution to the problem of validity generalization. Journal of Applied Psychology, 62, 529 – 540.
SCHRAAGEN, J. M., CHIPMAN, S. F. and SHALIN, V. L. 2000, Cognitive task analysis (London: Lawrence Erlbaum Associates).
SEAMSTER, T. L., REDDING, R. E. and KAEMPF, G. L. 1997, Applied cognitive task analysis in aviation (Bodmin: Hartnolls Limited).
SHERIDAN, T. B. 1997, Task analysis, task allocation and supervisory control. In M. Helander, T. K. Landauer and P. Prabhu (eds), Handbook of human-computer interaction, 2nd edn (Amsterdam: Elsevier), pp. 87 – 105.
THORDSEN, M. and HUTTON, R. 1998, The cognitive function model. In Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, pp. 385 – 389.
TYLER, S. W., NEUKOM, C., LOGAN, M. and SHIVELY, J. 1998, MIDAS human performance model. In Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, pp. 320 – 324.
VISSER, W. and MORAIS, A. 1991, Concurrent use of different expertise elicitation methods applied to the study of the programming activity. In M. J. Tauber and D. Ackermann (eds), Mental models of human-computer interaction 2 (Amsterdam: North-Holland), pp. 97 – 113.
WEI, J. and SALVENDY, G. 2000, Development of the Purdue cognitive job analysis methodology. International Journal of Cognitive Ergonomics, 4(4), 277 – 296.
WEI, J. and SALVENDY, G. 2003, The utilization of the Purdue cognitive job analysis methodology. Human Factors and Ergonomics in Manufacturing, 13(1), 59 – 84.
WICKENS, C. D. 1987, Information processing, decision-making, and cognition. In G. Salvendy (ed), Handbook of human factors, Chapter 2.2 (New York: John Wiley & Sons), pp. 72 – 107.
WICKENS, C. D. and CARSWELL, C. M. 1997, Information processing. In G. Salvendy (ed), Handbook of human factors and ergonomics, 2nd edn, Chapter 4 (New York: John Wiley & Sons).
WILSON, B. and COLE, P. 1991, A review of cognitive teaching models. Educational Technology Research and Development, 39(4), 47 – 64.
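The concluding discussion proposes describing the HCIP model's modules as classes with attributes and member functions, so that scenarios of human information processing can be traced in software. A minimal sketch of that idea follows. The module names, attribute lists, and the Scenario class are hypothetical illustrations, not part of the paper's HCIP model; Python is used for brevity, though the same class structure carries over directly to the C++ or Java modelling package the authors suggest.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: module names and attributes below are illustrative,
# not taken from the paper's HCIP model.

@dataclass
class HCIPModule:
    """One module of the information-processing model, expressed as a class
    with attributes (cognitive abilities it draws on) and member functions."""
    name: str
    attributes: List[str] = field(default_factory=list)

    def process(self, signal: str) -> str:
        # Member function: annotate the signal with this module's contribution.
        return f"{signal} -> {self.name}"

@dataclass
class Scenario:
    """A processing scenario chains modules so the flow can be traced."""
    modules: List[HCIPModule]

    def trace(self, stimulus: str) -> str:
        signal = stimulus
        for module in self.modules:
            signal = module.process(signal)
        return signal

# Illustrative modules and a traced scenario.
perception = HCIPModule("perception", ["selective attention", "perceptual speed"])
memory = HCIPModule("working memory", ["memorization", "time sharing"])
decision = HCIPModule("decision making", ["deductive reasoning"])

scenario = Scenario([perception, memory, decision])
print(scenario.trace("stimulus"))
# prints: stimulus -> perception -> working memory -> decision making
```

Because each module is a self-contained class, new modules or alternative class decompositions can be added without changing the scenario-tracing code, which is the kind of structured, object-oriented extensibility the conclusion argues for.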
