The Cognitive Task Analysis Methods For Job and Task Design - Review and Reappraisal
J. Wei and G. Salvendy

4, 273–299

Abstract. This paper reviews and reappraises the current research on cognitive task analysis methodology for job or task design and analysis. Specifically, it classifies the current cognitive task analysis methods for job or task design and analysis, sorts out commonalities and differences among these cognitive task analysis methodologies by conducting pros and cons comparisons, and provides guidelines for selecting cognitive task analysis methods for job and task design and analysis. Moreover, based on the current literature review, a validated human-centered information-processing model for cognitive task performance was developed based on human information processing theory. This new model focuses on identifying all cognitive aspects of human performance in technical work, with the goal of assisting job (re)design to increase human job performance.

1. Introduction

The purpose of this section is to provide an overview of cognitive task analysis (CTA) methods for job and task design by presenting the concepts of CTA, the history of job design, and the requirements of CTA for job design.

1.1. CTA

Although traditional task analysis techniques have made significant contributions to improving productivity when important task elements are visually observable, their focus on manual task procedures makes them less effective for cognitively oriented activities (Koubek et al. 1994). Recent advances in cognitive science provide new ways of characterizing learning and skill development that are appropriate for complex cognitive tasks. Methods derived from cognitive science have begun to be used to conduct CTA for training research programmes, curriculum redesign, and computer-based training development. CTA has also been used in cognitive engineering of human – machine systems, intelligent tutoring system development, decision support system design, and knowledge elicitation and acquisition for expert systems (Ryder and Redding 1993). Lehto et al. (1992) used knowledge acquisition methods on task analysis and performance measurement in the cognitive domain for an expert system development. Some examples of jobs with strong cognitive components are situation assessment and intelligence analysis, aviation and air traffic control, process control, sensor data interpretation, and equipment maintenance and troubleshooting (Ryder and Redding 1993). Schraagen et al. (2000) identified two areas of CTA application: one for individual training, performance assessment, and selection, and the other for the design of human – system interaction.

1.1.1. Concepts: A cognitive task is defined as a group of related mental activities directed toward a goal (which may not be clear) (Klein and Militello 1998). Cognitive task activities are unobservable. Many of these unobservable activities are mental and are often good candidates for a cognitive task analysis.

Klein and Militello (1998) defined cognitive task analysis (CTA) as the description of the cognitive skills needed to perform a task proficiently. ‘CTA is primarily valuable for tasks that depend on cognitive aspects of expertise, such as decision making and problem solving (p. 6).’ They noted that there existed a need for an efficient and validated method for identifying cognitive task requirements which could be (a) incorporated into other methods to provide a complete picture of performance and (b) used without requiring extensive training or significant resources to implement.

Schraagen et al. (2000) defined cognitive task analysis (CTA) as ‘the extension of traditional task analysis techniques to yield information about the knowledge, thought processes, and goal structures that underlie observable task performance’.

CTA provides the tools for understanding the cognitive elements of job performance. This understanding is necessary for designing jobs that support and maximize cognitive skill performance. CTA differs from traditional methods in a number of ways, as outlined in table 1. Table 1 is constructed based on Seamster et al. (1997) and Ryder and Redding (1993).

A traditional task analysis approach usually has an observable process and emphasizes behaviour, whereas CTA has an unobservable process and emphasizes cognition. A traditional task analysis approach emphasizes the target performance desired; in contrast, CTA addresses expertise. Expertise refers to the knowledge structure and information processing strategies involved in task performance. A traditional task analysis method focuses on identifying the knowledge required for each individual task element, whereas CTA emphasizes the knowledge base for the whole job: its organization and the interrelations among concepts or knowledge elements. The emphasis on the knowledge base for the whole job provides useful information for structuring training to facilitate initial learning as well as progression to the knowledge organization used by experts. Skills are identified for each separate task within a traditional task analysis approach, whereas within CTA they are identified for the job as a whole. CTA includes determination of the mental models used in task performance, whereas traditional methods do not address mental models. Traditional approaches cannot characterize variability in performance within and between individuals. Because a single individual can perform a task in a variety of ways and use different sequences and methods, adopting a traditional task analysis methodology based on a single sequence of behaviours will support only one way of performing the task. In contrast, CTA methods attempt to identify problem-solving strategies that may be manifest in variable sequences of actions depending on the environmental dynamics of each task, and also to identify important individual differences (Ryder and Redding 1993, Neerincx and Griffioen 1996, Seamster et al. 1997).

Table 1. Traditional task analysis and cognitive task analysis comparisons (modified, based on Seamster, Redding and Kaempf (1997), Ryder and Redding (1993)).

Traditional task analysis | Cognitive task analysis (CTA)
Process observed | Process not observed
Behaviour emphasized | Cognition emphasized
Target performance analysed | Expertise analysed
Knowledge for each task evaluated separately | Interrelationship among knowledge elements for whole job evaluated
Segments tasks according to behaviours required | Segments tasks according to cognitive skills required
Mental models not addressed | Mental models addressed
Only one way to perform described | Individual differences accounted for

1.2. Job design history

Koubek et al. (1994) identified four historical phases of job analysis based on three chronological periods taken from Primoff and Fine (1988):

. Phase I (1865 – 1918), which formed the foundations for job and task analysis, when Taylor and the Gilbreths began to produce techniques to identify, measure, and organize the elements of manual work tasks.
. Phase II (1918 – 1945), which shifted from analysing specific elemental task components of manual jobs to identification of the skills and abilities necessary for successful job performance. Questionnaires instead of visual observation of jobs were developed and used to identify underlying abilities, thus providing opportunities to extend the benefits of job analysis to jobs in which the important tasks were more cognitive in nature, such as jobs performed by a manager.
. Phase III (1945 – 1990) focused on consensus-based techniques to analyse underlying components of work based on the activities of McCormick et al. (1972). McCormick and Jeanneret developed the Position Analysis Questionnaire (PAQ) to identify the underlying work components, and the skills and abilities to perform these work activities (McCormick 1977, 1979, McCormick et al. 1969a,b,c, 1972, 1977, 1989, Jeanneret and McCormick 1969, Mecham and McCormick 1969a,b, Mecham et al. 1977). The PAQ is one of the most widely used standard structured job analysis techniques commercially available in the United States (Jeanneret 1991, 1992a,b, Jeanneret and McPhail 1991, McPhail et al. 1992). The recently developed job analysis technique, the Professional and Managerial Position Questionnaire (PMPQ), focused on jobs performed by managers (Mitchell and McCormick 1990). Rohmert and his colleagues (Rohmert 1988, Rohmert and Landau 1979, Kulik and Oldham 1988) also presented a technique for job analysis called the Ergonomic Job Analysis Technique (AET). In addition to the consensus-based techniques, Phase III also advocated a view of validity generalization. Validity generalization emphasizes the generalizability of ability requirements across a number of jobs; therefore, abilities necessary for successful performance remain fairly consistent across jobs and detailed analyses are less important (Schmidt and Hunter 1977).
. Phase IV (1990 – present) focused on cognitive task analysis (CTA) by combining techniques from different disciplines into CTA methodology, such as combining knowledge elicitation for expert systems (Visser and Morais 1991) and cognitive science (Anderson 1993) into CTA methodology. Koubek et al. (1994) integrated protocol analysis with McCormick’s PAQ to develop a cognitive task analysis methodology, which identifies how operator abilities are used by mapping the decision-making process. Results from the developed cognitive task analysis show that a consensus-based analysis technique can be significantly improved for identifying non-physical task components. This focus complemented the consensus-based approach that addressed what abilities were required for successful task performance.

1.3. Requirements of CTA in job design

There are three major requirements for CTA. First, the history of job and task analysis showed a trend from analysing specific work elements to identifying the underlying job factors with complex methodologies (Davis and Wacker 1987, Drauden 1988). However, the detailed levels available by measuring actual physical work activities in the traditional methods lack sensitivity to cognitive task components and therefore produce somewhat unsatisfactory results for cognitively oriented jobs. The traditional consensus-based questionnaire and psychological rating scales can sample some general cognitive activities, but not at a complete or more detailed cognitively oriented, task-specific level (Koubek et al. 1994). Therefore, the current research focuses on the development of a cognitively oriented consensus-based questionnaire and psychological rating scales to sample at a complete and more detailed cognitively oriented, task-specific level.

Second, traditional methods for task analysis break down jobs into discrete tasks composed of specific action sequences and identify prerequisite knowledge and skills for each task. Although these methods have been effective for designing jobs for simple procedural skills, they offer little insight for analysis of jobs involving complex cognitive skills. Because of this, cognitive considerations need to be incorporated into task analysis. Recently, cognitive methods have begun to be used to conduct task analysis for training program development (Love and O’Hara 1987, Polson and Lewis 1990, Wilson and Cole 1991) and human – computer system development (Eberts et al. 1990, Ryder and Redding 1993).

Third, technological development has significantly increased the complexity of job and task designs. The role of cognitively oriented tasks in the workplace continues to increase as automation of physical task components advances. The operators’ tasks in highly automated systems contain more and more planning and decision components. The analysis and registration of these components include mental activities in traditional techniques; however, only task-oriented behaviour is analysed at the level of skill-based and rule-based action patterns (Luczak 1997). The design of a properly functioning job requires a different kind of knowledge to describe the cognitive or mental functions at the knowledge-based information processing level.

In computerized work, the growth of computer applications has radically changed the nature of job designs in two aspects. First, through increased automation, the nature of the human’s task has shifted from an emphasis on perceptual motor skills to an emphasis on cognitive activities, such as problem solving and decision making. Second, through the increasing sophistication of computer applications, job design in computerized work is gradually emphasizing the interaction between two cognitive systems (Hollnagel and Woods 1983, Christal and Weissmuller 1988).

2. Methodology

This paper classifies the current cognitive task analysis methods for job or task design and analysis, and sorts out commonalities and differences among these cognitive task analysis methodologies. Guidelines and help are also provided for selecting cognitive task analysis methods for a variety of task design and analysis applications.

2.1. CTA method classifications

CTA draws upon laboratory and field research to obtain methods and techniques for analysing cognitive tasks that involve complex cognitive skills (Kirwan and Ainsworth 1992, Cooke 1994, Seamster et al. 1997, Schraagen et al. 2000). The classification of CTA methods described in this research is based on the mechanism of the methods themselves. Each CTA method is classified into one of four families based on its formality and analysis mechanism. Each method is compared based on characteristics, inputs, outputs,

actual task scenario. Process tracing methods are more formal than the methods in Family 1. In a sense, they take analysis one step further, in that they explore the cognitive structure and processes underlying task performance. These methods most commonly make use of verbal data, and hence are fraught with the limitations of verbal reports. In addition, these methods result in large and often unmanageable data sets that are often difficult to interpret in a meaningful fashion. On the other hand, the data collection techniques are generally carried out easily, although some practice may be necessary on the part of the job analysers. The representative methods in this family are cognitive walkthrough (Clarke 1987), verbal reports (Ericsson and Simon 1984), and protocol analysis (Sanderson et al. 1989).
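To make the process-tracing idea concrete, the sketch below shows one minimal way a think-aloud transcript might be segmented and coded. The transcript, keyword lexicon, and category names are illustrative assumptions only, not part of any published coding scheme.

```python
from collections import Counter

# Hypothetical coding lexicon: keywords taken to signal a cognitive activity.
# Categories and keywords are invented for illustration.
CODES = {
    "goal_setting": ("i want", "i need", "my goal"),
    "hypothesis": ("maybe", "i think", "probably"),
    "evaluation": ("that worked", "wrong", "didn't work"),
}

def code_protocol(utterances):
    """Assign each think-aloud utterance the first matching code (or 'other')."""
    coded = []
    for u in utterances:
        low = u.lower()
        label = next((c for c, kws in CODES.items()
                      if any(k in low for k in kws)), "other")
        coded.append((u, label))
    return coded

transcript = [
    "I want to reset the breaker first",
    "Maybe the fault is in the sensor line",
    "That worked, the alarm cleared",
]
coded = code_protocol(transcript)
counts = Counter(label for _, label in coded)
```

In practice the coding scheme would be derived from the domain and checked for inter-rater reliability; the point here is only that protocol analysis turns free verbal data into countable categories.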
Table 2. (continued)

Method: e.g. 3 of Structured interview – simplified PARI
Reference: Hall et al. (1994)
Characteristic: Structured interview of problem solving; skill analysis method to elicit a finer level of skill details; studies subtasks based on a single problem
Input: Pairs of experts (one poses, the other solves, a task-specific problem); a single problem
Output: Action, precursor and interpretation data
Process: Three stages; pairs of experts work on a single problem; break down action, precursor, and interpretation data into cognitive procedures
Weakness: Time required and cost are moderate
Advantage: Very efficient; recognizes characteristics of procedural skills in operational settings

Method: e.g. 4 of Structured interview
Reference: Klein and Militello (1998)
Characteristic: Used when you do not already have a roadmap
Input: Questionnaire
Output: Task diagram
Process: Frame the task as a process; be patient to repeat, and
Weakness: Not suitable for complex tasks
Advantage: Provides an overview of the task; identifies the cognitively complex

Method: Verbal reports
Reference: Ericsson and Simon (1984)
Characteristic: Procedural skills and decision points; concurrent, retrospective, or prospective; self-report or shadow
Input: Experience; task
Output: Verbal data
Process: Subject reports a running commentary describing what is seen, what is done, and why it is done; reports are recorded on audio or video tapes
Weakness: Highly time consuming; moderate cost; low accuracy; incomplete; concurrent reporting is impossible if a task involves verbal communication; impossible for high cognitive workload tasks; subjects may perform a task
Advantage: Easy to obtain data; helpful in developing an initial understanding of a domain

Method: Diagramming
Reference: Lesgold et al. (1988)
Characteristic: Focus on knowledge representation
Input: Task structured and response controlled by a standard format
Output: Diagrams
Process: Elicit the critical piece of information that ties all the pieces of the information together
Weakness: Complex for huge amounts of input information
Advantage: Low time/cost; simple solution for a complex problem; intuitive method for eliciting critical information; leads to more thorough analysis

Method: Error analysis
Reference: Rasmussen et al. (1987), Norman (1984), Reason (1987)
Characteristic: Focus on error types and sources
Input: Errors
Output: Errors mapped to cognitive processing failures
Process: Systematic analysis of operational and performance errors to determine the relationship between error
Weakness: Moderate time
Advantage: Low cost; provides insights into decision making, particularly in critical situations, which would not be gained through the

Method: Sorting
Reference: Geiwitz et al. (1990)
Characteristic: Focus on high-level conceptual structures; derives models of the expert’s conceptual framework for a task/job
Input: Concepts
Output: Sorted piles
Process: Sort concepts into piles based on relatedness (one concept can be put in different piles); revisit cards already in a pile without time constraints; label each pile
Weakness: Data generated are not very sensitive to variations; decreased sensitivity to proximity differences
Advantage: Low time/cost; quick and easy way to elicit conceptual information; effective for developing coding schemes that can be used reliably to analyse unstructured data such as interview protocols

Method: Multi-dimensional scaling
Reference: Klein et al. (1989)
Characteristic: Focus on knowledge representations; structural modelling technique using descriptive multivariate statistical techniques
Input: A set of concepts
Output: A set of coordinates corresponding to the location of each item in multidimensional space
Process: Provide pairwise proximity estimates for a set of concepts, and generate multidimensional spatial layouts of those concepts; the number of dimensions is decided using several statistically based heuristics
Weakness: Complex statistical methods; moderate time and cost; dimensions portrayed in the physical layout of the space do not always correspond to the best-fitting dimensions, and dimensions may need to be rotated to interpret them
Advantage: Successful in a large variety of settings and for a variety of purposes; handles missing data given a sufficient number of subjects and relatively low error; provides a faithful representation of original data that are unreliable at the unrelated end of the scale; better represents perceptual information than conceptual information

Method: GOMS (goals, operators, methods, and selection rules)
Reference: Card et al. (1983), Kieras (1988), Carroll and Olson (1988), Elkerton and Palmiter (1991), Gray et al. (1993), Eberts (1997)
Characteristic: Assumes that interacting with machines is similar to solving a problem; task modelling and decomposition by considering performance limits; procedural model based on the theory of the human information processor (Human Processor Model)
Input: Goal; task context; action; information; knowledge and process ability
Output: Behaviour; task time and learning time; error
Process: Four basic components (goals, operators, methods and selection rules); problems broken down into subproblems, as well as goals
Weakness: Inaccurate when the method of operation is not known or when working with novices; difficult and extremely complex; requires significant amounts of time to construct; limited to a specific set of tasks and a specific functionality; moderate cost; vague in applications; only applicable to error-free performance; only good for experts
Advantage: Useful in cases where the sequence of events is known; useful for investigating the impact of design alternatives and for understanding critical pathways for task performance; very important in trying to understand how a user interacts cognitively with interactive software and in quantifying aspects of the interaction even before the software is prototyped

Method: GOMS variation 1 – Basic GOMS
Characteristic: Outlines the cognitive behaviour of a user by breaking down the problem into subgoal and goal stacks
Input: Six operators
Output: Predicted times
Process: Predicts by quantifying time values for observable operations
Weakness: Removes large chunks of time by not analysing unobservable behaviour such as the time to acquire goals; time consuming; assumes error-free performance
Advantage: Best for making qualitative predictions about differences between tasks; depends greatly on the skill of the task analyst (different skills result in different variants of models)

Method: e.g. 1 of GOMS other variations – NGOMSL (Natural GOMS language)
Reference: Kieras (1988)
Characteristic: Basic structure is methods; goes beyond GOMS by combining several GOMS models into one integrated model; incorporates the human model processor model through the use of analyst-defined operators
Input: Methods
Process: Describes the interaction of user and computer in a computer programming-like language; activities of the user are described as subroutines with decision statements (IF-THEN), flow-of-control statements (GOTO), memory storage and retrieval, and specification
Weakness: More cycle times than the goal stack model
Advantage: Has a fairly simple, concise structure providing a good overall structure; easily understood statements in a very concise form; useful in the initial task analysis stage; enables task analysis using a GOMS-like model to be more specific; forces task analysis to be very precise; describes user–computer interaction in a

Method: e.g. 2 of Grammar – TAG (task-action grammar)
Reference: Payne and Green (1986, 1989)
Characteristic: Emphasizes the family resemblance among the language elements; designed to make predictions about the relative complexity of designs rather than to provide quantitative measures and predictions
Process: Identify ‘simple tasks’ that the user can perform without problem solving and that contain no control structure; describe simple tasks in a dictionary by sets of semantic components reflecting categorizations of the task world
Weakness: Alternative representations make slightly different predictions that are also logical
Advantage: Especially useful for evaluating the consistency of the interface design and for offering design revisions based on consistency; determines well-defined categories of tasks with the most structural consistency; investigates consistency in more detail; concentrates on the overall structure of the language rather than individual rules
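As an illustration of the kind of quantitative prediction a GOMS-style model yields, the sketch below sums keystroke-level operator times for a task. The operator times are the commonly cited Card, Moran and Newell estimates, and the task sequence itself is invented for illustration.

```python
# Keystroke-Level Model sketch: predicted execution time is the sum of
# primitive operator times. Times (seconds) are the commonly cited
# Card, Moran and Newell estimates; the example task is hypothetical.
OPERATOR_TIME = {
    "K": 0.28,  # press a key or button (average typist)
    "P": 1.10,  # point at a target with a mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(sequence):
    """Predict task execution time for a string of KLM operators."""
    return sum(OPERATOR_TIME[op] for op in sequence)

# Hypothetical "delete a file" method: think, point at the icon,
# click, think, press the delete key.
t = predict_time("MPKMK")  # 1.35 + 1.10 + 0.28 + 1.35 + 0.28 = 4.36 s
```

This illustrates both the appeal noted in the table (quantitative comparison of design alternatives before prototyping) and the limitation that the prediction assumes expert, error-free performance.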
can be decided. They produce good quantitative predictions. The procedural models are located here. People learn to use products by generating rules for use and then ‘run’ their models, by interacting with the product, and by sequencing through the set of rules (Fischer 1991, Eberts 1997). There are several limitations. Building models is expensive and time consuming. Most models are theoretical, based on assumptions. If the environment or scenario changes, the model needs to be modified. Moreover, alternative models can produce different results that are also logical (Pashler 1998). The representative methods in this family are multi-dimensional scaling (Klein et al. 1989), the ACT model (Anderson 1993), the ARK model (Geiwitz et al. 1988), the human processor model (Card et al. 1983, 1986), the GOMS model (Card et al. 1983, Kieras 1988, Carroll and Olson 1988, Elkerton and Palmiter 1991, Gray et al. 1993, Eberts 1997), production systems (Newell and Simon 1972, Anderson 1976, Kieras and Polson 1985, Bovair et al. 1990), grammars (Eberts 1997), executive-process interactive control (Meyer and Kieras 1997), and the object-based model (Beringer and Wandmacher 1991).

The four families of CTA are distinguished by the degree to which the methods and analyses are specified. Observations and interviews are relatively informal, with much of the specification of methods and analyses left to intuition. Process tracing methods are somewhat better specified, with some procedures, such as protocol encoding. Conceptual techniques are fairly formal and well specified, with fewer decisions. Formal models are very well defined. The tradeoffs between these four families are described next.

(1) Training requirements: Because of the more active analysis role associated with informal CTA methods, analysers’ training requirements for these methods tend to be focused on aspects of interview skills and domain knowledge. On the other hand, the more formal methods tend to require training in the procedural and analytic details of the method. In general, these formal methods are less flexible and require more methodological training, which may explain why they are used less frequently than informal methods. The more structured techniques typically require advanced preparation and significant knowledge by the job analysers.

(2) Introspection and verbalization: The less formal direct methods require more introspection and verbalization from the subject, compared to the more formal indirect methods. The indirect nature of the formal methods is typically associated with more time required to prepare the materials.

(3) Interfering with task performance: Because the more formal methods tend to be conducted apart from actual task performance, there is less chance of interfering with performance, but they tend to be artificial or to lack face validity, in that they are unlike any task that is actually performed.

(4) Output data: The data obtained using the formal methods tend to be more quantitative, compared to the more qualitative data obtained using the informal methods. Because quantitative data are easier to interpret than qualitative data (Meyer and Kieras 1997), informal methods tend to be more time consuming than formal methods. Also, because it is easier to summarize quantitative data, the analysis of group data is more straightforward for formal methods. However, the informal methods often generate richer output than the formal methods.

In order to ensure that CTA is thorough and accurate, two or more methods may be combined. Different methods may result in very different models, all of which are good models with respect to very different aspects of the domain knowledge. Therefore, the best way to minimize potential measurement errors and to maximize the scope of domain coverage is to combine multiple methods (Klein and Militello 1998).

There are two types of method combination. The first is to combine one or more traditional task analysis methods with one or more cognitive task analysis methods. The other is to integrate two or more cognitive task analysis methods from the same or different CTA families classified above. There are some examples in which methods have been combined successfully to accomplish specific goals.

For the first type of method combination, most researchers considered that cognitive methods supplement rather than replace traditional methods (Ryder and Redding 1993). The advantages of sensitivity and objectivity of the traditional approaches should not necessarily be abandoned when faced with cognitively oriented tasks. However, their reliance on visually observable task components must be overcome. Koubek et al. (1994) made a first attempt at integrating a traditional task analysis method with a CTA. In their research, protocol analysis was combined with a traditional task analysis method, McCormick’s PAQ, to determine ability requirements for personnel selection on a computer-based task. The results supported the hypothesis that a consensus-based selection test can be improved beyond the results that were obtained from the PAQ by including data derived from task analysis techniques adapted for cognitively oriented tasks.
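The multidimensional scaling named in this family can be sketched with classical (Torgerson) scaling: double-centre the squared dissimilarity matrix and embed items along its leading eigenvectors. The four-concept dissimilarity matrix below is invented for illustration; real input would come from expert proximity judgements.

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) MDS: embed items so that Euclidean
    distances approximate the given dissimilarity matrix d."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    b = -0.5 * j @ (d ** 2) @ j                  # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(b)               # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:k]             # take the k largest
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

# Invented pairwise dissimilarities among four concepts (collinear,
# so one dimension reproduces them exactly).
d = np.array([[0., 1., 2., 3.],
              [1., 0., 1., 2.],
              [2., 1., 0., 1.],
              [3., 2., 1., 0.]])
coords = classical_mds(d, k=1)
```

As the table entry notes, the analyst must still choose the number of dimensions and interpret (and possibly rotate) them; the algebra only supplies the spatial layout.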
Protocol analysis refers to having persons think aloud while performing or describing a task, and then using the verbalization to infer subjects’ cognitive processing (Ryder and Redding 1993). Johnson and Johnson (1987) combined traditional task analysis methods with verbal protocols for troubleshooters. They analysed the performance of troubleshooters using traditional methods to identify tasks done during troubleshooting, and then collected verbal protocols as technicians went about the job, thereby improving the performance of troubleshooters. Ryder and Redding (1993) reviewed the recent developments in CTA methods and developed the integrated task analysis model (ITAM), a framework for integrating cognitive and behavioural task analysis methods within the Instructional Systems Development (ISD) model. They presented ITAM’s three analysis stages (progressive cycles of data collection, analysis, and decision making) in three components (skills, knowledge, and mental models). This integrated approach could ‘support development of training programs that build a flexible knowledge base, automated skill components for high performance tasks, and efficient mental models for task understanding and decision making. Trainees would be provided with better tools for mastering the complex tasks that are increasingly required of workers today (Ryder and Redding 1993: 93).’ Research on real-time, high-performance jobs, such as supervisory control and air traffic control, has shown that both traditional behavioural and cognitive analysis methods are required to understand performance (Schlager et al. 1990).

For the second type of method combination, Potter et al. (1998) described a framework for integrating different types of specific CTA techniques into software system development. Thordsen and Hutton (1998) presented a method of combining a cognitive function model to identify the components of new systems that have varying degrees of cognitive complexity. The combination better defined the role of the human in complex system design for the engineers and designers, thereby assisting the understanding of the roles of the human in complex system designs.

2.2. CTA method selections

Table 3 summarizes the comparisons of the four CTA method families presented in the current research. Based on Table 3, some guidelines have been developed to help select CTA methods in practical applications.

. Guideline 1. When tasks or jobs do not have a defined domain, especially in the initial phase of CTA where we need to define and circumscribe the domain of tasks or jobs, select CTA methods in Family 1.
. Guideline 2. When specific procedures involved in carrying out a task are not well defined, select CTA methods in Family 1.
. Guideline 3. When we can easily define a task that is representative of the actual task scenario, and this task has a clear process, select CTA methods in Family 2.
. Guideline 4. When a particular process of a task and its concurrent task performance need to be tracked, select CTA methods in Family 2.
. Guideline 5. When data are easily captured by verbal means and data collection does not affect the task/job incumbent’s performance (e.g. through distraction), select CTA methods in Family 2.
. Guideline 6. When domain knowledge, structures, and interrelations of tasks need to be defined and known, select CTA methods in Family 3.
. Guideline 7. When multiple job or task analysers are analysing a task, and the task analysis requires less introspection and verbalization, select CTA methods in Family 3.
. Guideline 8. When a task needs quantitative prediction, and models of the task do not need to change (or change only a little) when environments or scenarios change, select CTA methods in Family 4.
. Guideline 9. When task performances are easily affected or distracted by any interference, select CTA methods in Families 2, 3, and 4.
. Guideline 10. When job or task analysers do not have significant knowledge of analytical techniques, select CTA methods in Families 1, 2, and 3.
. Guideline 11. When a task is skill-based, select CTA methods in Families 1 and 2; when a task is rule-based, select in Families 2 and 3; and when a task is knowledge-based, select in Families 3 and 4.

Rasmussen (1983) classified tasks into skill-, rule-, and knowledge-based categories of human performance. The classification of human performance into skill-, rule-, and knowledge-based task behaviour is based on the role of the information observed from the environment. At the skill-based level, the perceptual motor system acts as a multivariable continuous control system, which synchronizes the physical activity. The sensed information for this control is perceived as time-space signals, indicating the time-space behaviour of the environment. At the rule-based level, the information is perceived as signs. Signs refer to precepts and rules for action. Signs can only be used to select and modify rules controlling
Table 3. CTA methods comparisons and CTA method selection guidelines. The four families vary along the following dimensions (Family 1 → Family 4):

- Directness: direct → indirect
- Formality: informal → formal
- Definition of task domain: intuition → somewhat better specified → well specified, with training, preparations, and domain knowledge → very well defined, with significant knowledge of analytical techniques (modelling and simulation techniques)
- Introspection and verbalization (subjects): maximum → minimum
- Interference with task performance: maximum → minimum
- Output data: qualitative → quantitative
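The family-selection guidelines of section 2.2 amount to a small set of decision rules, so they can be sketched as a rule-based lookup. A minimal, hypothetical Python illustration (the boolean task descriptors and their names are assumptions, not part of the original guidelines):

```python
# Hypothetical sketch: encode Guidelines 5-11 as predicates, each of which
# recommends one or more of the four CTA method families.
def recommend_families(task):
    """Return the set of CTA method families (1-4) suggested for a task,
    described by a dict of boolean/string descriptors."""
    families = set()
    if task.get("verbal_data_easily_captured"):          # Guideline 5
        families.add(2)
    if task.get("domain_structure_must_be_defined"):     # Guideline 6
        families.add(3)
    if task.get("multiple_analysers"):                   # Guideline 7
        families.add(3)
    if task.get("needs_quantitative_prediction"):        # Guideline 8
        families.add(4)
    if task.get("easily_distracted"):                    # Guideline 9
        families.update({2, 3, 4})
    if not task.get("analyst_has_analytical_expertise"): # Guideline 10
        families.update({1, 2, 3})
    level = task.get("performance_level")                # Guideline 11 (SRK)
    families.update({"skill": {1, 2}, "rule": {2, 3},
                     "knowledge": {3, 4}}.get(level, set()))
    return families

# Example: a knowledge-based task needing quantitative prediction,
# analysed by an analyst with analytical-technique expertise.
print(recommend_families({"needs_quantitative_prediction": True,
                          "performance_level": "knowledge",
                          "analyst_has_analytical_expertise": True}))
# → {3, 4}
```

In practice Guideline 11 dominates the choice, with the data-collection guidelines (5, 9, 10) pruning families the analysis situation cannot support.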
288 J. Wei and G. Salvendy
skilled routine sequences. At the knowledge-based level, symbols are used for causal functional reasoning in perceiving or explaining unfamiliar behaviour of the environment. Symbols are the basis for reasoning and planning, which refer to concepts related to functional properties (Rasmussen 1983).

2.3. Human Centered Information Processing (HCIP) model

One of the existing representative new frameworks of human information processing for cognitive task analysis is the executive process interactive control (EPIC) framework proposed by Meyer and Kieras (1997). This framework characterizes human performance of concurrent perceptual-motor and cognitive tasks and is the basis for formulating computational models to simulate multiple-task performance under a variety of representative circumstances. These models can be applied to characterize skilled human information processing and action performance in multiple tasks under a variety of conditions. As an instructive illustration of how Meyer and Kieras' theoretical framework may be used to model multiple-task performance, one particularly influential experimental paradigm, the psychological refractory period (PRP) procedure, is investigated. For performance under this procedure, Meyer and Kieras proposed an explicit computational model, the strategic response-deferment (SRD) model, which is constructed and tested based on their production system formalism and the EPIC information processing architecture. The SRD model accounts for a variety of quantitative results from the PRP procedure and leads to interesting new predictions. The good accounts obtained suggest that the EPIC framework and the models (SRD) built upon it have merit. This comprehensive theoretical framework concludes that, at a cognitive level, people can apply distinct sets of production rules simultaneously for executing multiple tasks, with individual tasks coordinated through various types of supervisory control and scheduling. A human's information-processing capacity is unlimited when using flexible scheduling strategies. EPIC includes few physical constraints (no physical limitation except the visual), omitting processing capacity limitation assumptions. The EPIC processing units are visual, auditory, and tactile perceptual processors that receive inputs from simulated physical sensors. EPIC has three functionally distinct memories: declarative long-term memory, procedural memory, and working memory. The working memory does not have a capacity limitation.

However, the framework proposed by Meyer and Kieras (1997) does not cover all aspects of cognitive job and task performance, such as monitoring, motivation, and cognitive environments. The EPIC framework and models (SRD) are theoretical and complex, with many parameters. Because there is no general agreement consistent with the various data, the EPIC framework is not directly tied to specific tasks; specific models must be developed for specific tasks, which is costly. The detailed models are also complex and have limitations based on their assumptions. Moreover, the unlimited-capacity assumption is not practical in job settings.

The ACT model (Anderson 1993) focused on problem-solving skills acquisition. Its central concept is the production rule, which plays a role similar to the stimulus-response bond in learning theories. It distinguishes between declarative and procedural knowledge and uses encoding strength to determine the accessibility of declarative knowledge and the performance of procedural knowledge. The advantages of this method are that it provided a basis for constructing intelligent computer-based tutoring systems; operators are encoded as rules in an abstract form that can apply across a range of situations; knowledge in production form applies much more rapidly and reliably; variability in problem-solving behaviour is dealt with; ideas from problem-solving theory and learning theory are brought together; and rapid and important progress in understanding how complex problem-solving skills are learned is made. However, this method applies only to the problem-solving domain and assumes a means-ends problem-solving structure. The ACT-based representation of knowledge (ARK) model presented by Geiwitz et al. (1988) is inspired by the ACT model to elicit both a network of static knowledge about the domain and a set of procedures performed on that knowledge. This model is very well suited for job or task planning. However, it is very time-consuming, and its implementation requires a rich data set demanding extensive reduction and interpretation. Therefore, these two models are not suitable for complete job and task capability requirement analysis in the cognitive domain.

The object-based model presented by Beringer and Wandmacher (1991) takes advantage of object-oriented technology and integrates procedural models, such as the GOMS-type model (Card et al. 1983), with semantic descriptive-level models. This model is an integrating formalism for representing both the declarative (modelling concepts) and the procedural (modelling procedures) aspects of human–computer interaction and cognitive task analysis. Its advantage is that data abstraction attributes from object-oriented modelling provide explicit links between semantic and procedural levels. This method bridges the gap between the high
level semantic description of a task space and the procedural description of possible activities. A semantic goal structure depends on the given task, and a GOMS description is always limited to a specific set of tasks and a specific functionality. In contrast, this model can derive semantic goal structures for any potentially meaningful task, given a specific functionality, in a very complex system. The functionality can be easily varied by adding or removing classes or methods. However, this method has the disadvantage that the time and cost required are high; hence, it is not suitable for a large variety of jobs in industrial job design.

Cognitive environment simulation (CES), presented by Roth et al. (1992), is built on an artificial intelligence problem-solving system representing models of human cognitive activities. This method performs diagnostic reasoning for fault management under dynamic conditions. The advantages of this method are: it builds a runnable computer program that forces the modeller to describe mechanisms in great detail; it uncovers a variety of consequences of the basic information processing mechanisms instantiated in the program; the simulation can be run through a scenario to produce specific behaviour that can be analysed and compared to other data; the simulation can be run on a variety of scenarios, including scenarios that were not part of the original design set, and therefore captures human cognitive activities in a wide range of domain-specific circumstances; and it captures the cognitive demands imposed by dynamic fault-management situations. However, it is complex, time-consuming, and costly. Therefore, this method is not suitable for a large variety of jobs in industrial job design.

Wickens' and Carswell's resource allocation model (1987, 1997) is a general framework for analysing human information processing, with a focus on attention and memory. However, its cognitive stage is not detailed enough for analysing human information processing in the cognitive domain. Therefore, in this research, a new cognitive task performance analysis model, a human-centered information processing (HCIP) model for cognitive task performance, is presented. This model is developed based on human information processing theory and modifies Wickens' and Carswell's resource allocation model (1987, 1997) in the breadth and depth of the perception and decision-making stages. One assumption made is that this model does not include the motor stage. The model proposed here also assumes that the human's information-processing capacity is limited, including attention capacity limitation and working memory capacity limitation. This assumption conforms to Pashler's (1998) limited capacity in working memory, perceptual capacity, and central bottleneck limited capacity. The sensory memory is added to this model based on Salvendy and Knight's information processing model of the human operator (Salvendy and Knight 1988). The objective of this model is to capture a variety of aspects of task performance in the cognitive domain for job design. This model is focused at the level of knowledge-based human performance. Skill- and rule-based human performance is not a focus, because it does not include many cognitive elements such as decision making and problem solving.

There are two types of job elements: job-oriented and worker-oriented. McCormick et al. (1972) distinguished between these two types of job elements. Job-oriented elements are job content descriptions that have a dominant association with the technological aspects of jobs and generally reflect what the worker achieves. Worker-oriented elements are those that tend to characterize the generalized human behaviours and capabilities involved. The contexts of the present discussion are those of a worker-oriented nature, because they offer some possibility of serving as bridges across various technologies. 'A worker-oriented element can, in effect, be viewed collectively within the framework of the stimulus – organism – response paradigm or, in more operational terms, information input, mediation processes, and output' (McCormick et al. 1972: 348). The developed conceptual model somewhat fits into this paradigm.

Based on the summarized literature reviews of what and how traditional task analysis and cognitive task analysis can analyse in human job and task performances to assist job design, the current methods are found to capture only parts of the human performance aspects in the cognitive domain.

Table 4 presents different aspects of task performances on cognitive attributes and the effects on cognitive attributes. Cognitive attribute here refers to constructs of various types of human qualities most closely related to human traits in the cognitive domain. Some of these are of an 'aptitude' nature in the cognitive domain; others are of a 'situational' nature, in the sense of imposing a requirement on the individual to adapt to the situation in question in the cognitive domain. The detailed descriptions of these performance aspects in 11 modules are presented as follows.

2.3.1. Information interface module: This module captures the input of information or data for cognitive processing. The information input can be achieved through perceiving stimuli from physical channels such as visual and auditory channels. There are two classes within this module: the search and receive information class, and the identify objects, actions, and events class. The first class specifies operations such as detect, inspect, observe, read, receive, scan, survey, and listen to
Table 4. Breakdown of cognitive attributes and cognitive affecting attributes.
information or data. The second class recognizes discrimination, identification, location, judgment, and comparison of objects, actions, and events, such as position, structures and patterns, colours, shapes, sizes, and speeds.

2.3.2. Information handling module: This module captures human information processing performances related to the handling of information. This information handling refers to some aspects of the way in which the worker processes job-related information to perform the job, including managing existing information, such as transcribe (copy/poster), code and decode, translate, arrange, classify, order, itemize, tabulate, combine, analyse information or data, and predicting extra information. Although the work activities considered …

2.3.3. Mental plan and schedule module: This module captures the human cognitive performance related to the mental plan and schedule. It specifies goals and task performance criteria, such as meeting due dates, maximizing production rate, and maximizing accuracy. It specifies constraints on human cognitive capabilities, generating satisfied or optimized cognitive task execution sequences, and so forth.

2.3.4. Mental execution module: This module captures human cognitive task performance related to the major execution of mental activities, including generating ideas, decision making, and problem solving.

2.3.5. Mental monitor module: This module tracks or monitors the status of the mental activities within the modules. After an abnormal status is detected, it sends feedback to the related modules for corresponding corrections, or intervenes in information processing based on abort, reset, or correction of failures or errors. For example, after a function in the execution module, such as decision making, goes wrong, this module sends this information back to the mental plan module for replanning, rescheduling, or simply stopping.

2.3.6. Communication module: This module refers to interpersonal communications and noninterpersonal communications. Interpersonal communication includes advising, coordinating, negotiating, and supervising, and so on. Noninterpersonal communication includes signal or code transmission and requesting (by choosing, testing, and communicating commands).

2.3.7. Learning module: This module includes two classes of learning: the use of learned information and human learning. The first class includes the use of job-related experience and mathematics. The second class includes learning by being told or by instruction, by reasoning (such as inductive and deductive reasoning), and also learning by programmes and other equipment (such as recording immediate events when a human intervenes with equipment, analysing cumulative experience, then updating a model). It updates working memory and long-term memory based on human …

2.3.9. Memory module: This module captures human cognitive task performance related to the memory resources needed for cognitive tasks. The resources limit retrievals and stores for the sensory, short-term, and long-term memory demanded by tasks.

2.3.10. Motivation module: This module refers to the job incumbent's and task performer's motivation, which has a major effect on the human cognitive attributes required for human information processing when performing tasks. There are three classes that determine the motivation of job incumbents and task performers: the job/task attributes, benefits, and abilities/skills classes.

2.3.11. Environment module: This module refers to environments that have a major effect on the human cognitive attributes required for human information processing when performing tasks. Two classes are included in this module. One is the physical, physiological, and psychological environment. The physical environment refers to the information presentation mode; for example, the information presentation mode is defined through letter image preview on the computer screen, or through the specific keying method (Karwowski et al. 1994). The physiological environment of an operator is defined in terms of stress and impairment. The psychological environment includes other goal-directed activities that are more or less congruent with the subject task. The effects of stress on operator performance can be assumed to be similar to those of physiological stresses (Fleishman and Quaintance 1996: 280 – 281). The other class is the social environment class, which refers to social interaction. These two environment classes may have deleterious effects on the operator, and potentially may threaten the effectiveness of task performance.

Figure 1 presents the conceptual model that logically links all the cognitive attributes in Table 4 together.

In Figure 1, there are 11 boxes representing 11 categories of generic cognitive attributes and cognitive affecting attributes in the perception and decision-making stages of dynamic human information processing. These categories are defined as modules. Modules 1 to 7 are functional modules, which represent the major mental activities or functional categories; these cognitive functional relationships are represented using solid arrow lines. Modules 8 and 9 are resource modules specifying the constraints that reflect the resource capacities, such as attention and memory, of the various mental operations involved; the dotted lines represent resource allocation. These capacities are of two generic forms: (a) each operation has limits in the speed of its functioning and in the amount of information that can be processed in a given unit of time; and (b) there are limits on the total attention or memory, 'mental energy,' or resources available to the information processing system. These limits are represented by the pools of attention resources and memory resources, including memory retrieval and storage (Sheridan 1997). Modules 10 and 11 are modules that specify the attributes affecting cognitive attributes, such as motivations and environments; these effects are represented using double arrow lines. Each module is further divided into classes, which are templates that hold similar cognitive attributes. Each class can be further broken down into cognitive attributes based on functional decomposition or resource allocation. These cognitive attributes can then be mapped into job elements. These mapped job elements have been presented in the structured form of questionnaires for quantitatively collecting data from the survey for further statistical analysis (Wei and Salvendy 2000, 2003).
Figure 1. Human Centered Information Processing (HCIP) model for cognitive task performance. Note: The solid arrow lines represent the functional relations between functional modules 1 through 7, the dotted arrow lines the resource allocations for resource modules 8 and 9, and the double arrow lines the effects on cognitive attributes for affecting modules 10 and 11.
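The module/class/attribute decomposition and the three link types of Figure 1 can be sketched as a small data structure. A hypothetical Python illustration (module names follow Table 4 and sections 2.3.1–2.3.11; the name of module 8 is an assumption, since its description does not survive in this text):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the HCIP model's structure: 11 modules, each
# holding classes of cognitive attributes, grouped by the three link
# types of Figure 1 (functional, resource-allocation, affecting).
@dataclass
class Module:
    number: int
    name: str
    kind: str                      # 'functional', 'resource', 'affecting'
    classes: list = field(default_factory=list)

modules = [
    Module(1, "Information interface", "functional",
           ["search and receive information",
            "identify objects, actions, and events"]),
    Module(2, "Information handling", "functional"),
    Module(3, "Mental plan and schedule", "functional"),
    Module(4, "Mental execution", "functional"),
    Module(5, "Mental monitor", "functional"),
    Module(6, "Communication", "functional"),
    Module(7, "Learning", "functional"),
    Module(8, "Attention", "resource"),   # assumed name; module 8 text lost
    Module(9, "Memory", "resource"),
    Module(10, "Motivation", "affecting"),
    Module(11, "Environment", "affecting"),
]

# Group module numbers by link type, mirroring the solid, dotted, and
# double arrow lines of Figure 1.
by_kind = {}
for m in modules:
    by_kind.setdefault(m.kind, []).append(m.number)
print(by_kind)
# → {'functional': [1, 2, 3, 4, 5, 6, 7], 'resource': [8, 9], 'affecting': [10, 11]}
```

Each class would, in turn, hold the cognitive attributes that are mapped onto worker-oriented job elements for the questionnaire-based data collection described above.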
2.4. HCIP model validation

Table 5 summarizes the capture of different aspects of human cognitive task performances by different job and task analysis methods. A series of tasks constructs a job. In Table 5, G indicates that the job or task analysis method generally covered the cognitive attributes; N, somewhat covered; and S, extensively covered. A blank means that the job or task analysis method does not cover the cognitive attributes. The numbers 1 – 11 correspond to the module numbering in Table 4, and the letters a and b correspond to the class numbering in Table 4. From Table 5, we see that no method captures all aspects of task performance presented in Table 4 in the cognitive domain.

Fleishman and Quaintance (1995, 1996) constructed …

… capturing the complete cognitive capability requirements for job design.

Human cognitive-oriented work can be viewed from various frames of reference, and thus can be characterized in terms of different classes of job-related variables. This research is based on a probing effort relating to cognitive capability requirements from workers, or what are herein referred to as the worker-oriented aspects of jobs. It would seem that such variables could serve as possible common denominators on which to compare or contrast jobs of different technological areas. This research provides a system to analyse jobs in terms of reasonably discrete, separate job elements of a worker-oriented nature and serves as the basis for building statistically related groupings of such elements, or cognitive job dimensions, in order to describe the …
Table 5. Capture of different aspects of human cognitive task performances (modules 1 – 11, classes a – c, of Table 4) by different job and task analysis methods:

- Ergonomic job analysis technique (Rohmert and Landau 1979, Rohmert 1988): G G N G G G G N G
- Job diagnostic survey (JDS) (Hackman and Oldham 1980): G
- Job component inventory (JCI) (Bank 1988): N N G G N G
- Job element analysis (Primoff and Eyde 1988): G G G N G N N N N N G
- Multimethod job design questionnaire (MJDQ) (Campion and Thayer 1983, 1985): G G N N N G G
- Occupation analysis inventory (OAI) (Cunningham 1988): G G G G G
- Position analysis questionnaire (PAQ) (McCormick et al. 1972): G G G N N N G G N G
- PAQ with protocol analysis (Koubek et al. 1994): G G G N N N G G N G

G, the job or task analysis method generally covered the cognitive attributes; N, somewhat covered; S, extensively covered; blank, not covered. The numbers 1 – 11 correspond to the module numbering in Table 4, and a and b correspond to the class numbering in Table 4.
Table 6. Cognitive attributes and cognitive affecting attributes of the HCIP model (from Table 2 and Fleishman).
(Card et al. 1983, 1986), the information processing model of the human operator (Salvendy and Knight 1988), and the TAG (Payne and Green 1986, 1989) cover five modules; the computer decision aids and mental models (Sheridan 1997) and the sensori-motor process chart (Crossman 1956) cover four modules; the ACT (Anderson 1993) and the ARK (Geiwitz et al. 1988) models cover three modules; and the JDS (Hackman and Oldham 1975, 1980) covers only one module. Future research can be conducted by applying object-oriented technology to systematically analyse and design these elements in the HCIP model in a more structured class-object format. The classes developed in the current research can be further described using attributes and member functions conforming to object-oriented concepts. Scenarios of the human information processing …

References

CAMPION, M. A. and MEDSKER, G. J. 1992, Job design. In G. Salvendy (ed), Handbook of human factors, Chapter 32 (New York: John Wiley & Sons), pp. 645 – 881.
CAMPION, M. A. and THAYER, P. W. 1983, Job design: approaches, outcomes, and trade-offs. Organizational Dynamics, 15(3), 66 – 79.
CAMPION, M. A. and THAYER, P. W. 1985, Development and field evaluation of an interdisciplinary measure of job design. Journal of Applied Psychology, 70(1), 29 – 43.
CARD, S. K., MORAN, T. P. and NEWELL, A. 1983, The psychology of human-computer interaction. (Hillsdale, NJ: Lawrence Erlbaum Associates).
CARD, S. K., MORAN, T. P. and NEWELL, A. 1986, The model human processor: an engineering model of human performance. In K. R. Boff, L. Kaufman and J. P. Thomas (eds), Handbook of perception and human performance, Vol 2 (New York: John Wiley & Sons).
CARROLL, J. M. and OLSON, J. R. 1988, Mental models in human-computer interaction. In M. Helander (ed), Handbook of human-computer interaction. (Amsterdam: Elsevier).
Downloaded by [Stockholm University Library] at 17:07 10 August 2015
FISK, A. D. and EGGEMEIER, F. T. 1988, Application of automatic/controlled processing theory to training tactical command and control skills: 1. Background and task analytic methodology. In Proceedings of the Human Factors Society 32nd Annual Meeting (Santa Monica, CA: Human Factors Society), pp. 1227 – 1231.
FLEISHMAN, E. A. 1995, Rating scale booklet: Fleishman Job Analysis Survey. Management Research Institute, Inc.
FLEISHMAN, E. A. and QUAINTANCE, M. K. 1996, Taxonomies of human performance: the description of human tasks. Management Research Institute, Inc.
FOGLI, L. 1988, Task attribute performance analysis. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol II (New York: John Wiley & Sons), pp. 1105 – 1119.
GEIWITZ, J., KLATSKY, R. L. and MCCLOSKEY, B. P. 1988, Knowledge acquisition for expert systems: conceptual and empirical comparisons. (Santa Barbara, CA: Anacapa Sciences).
GEIWITZ, J., KORNELL, J. and MCCLOSKEY, B. 1990, An expert system for the selection of knowledge acquisition techniques. Technical Report 785(2). (Santa Barbara, CA: Anacapa Sciences).
GORDON, S. E., SCHMIERER, K. A. and GILL, R. T. 1993, Conceptual graph analysis: knowledge acquisition for instructional systems design. Human Factors, 35, 459 – 481.
GRAY, W. D., JOHN, B. E. and ATWOOD, M. E. 1993, Project Ernestine: validating a GOMS analysis for predicting and explaining real-world task performance. Human-Computer Interaction, 8, 237 – 309.
HACKMAN, R. J. and OLDHAM, G. R. 1975, Development of the job diagnostic survey. Journal of Applied Psychology, 60(2), 159 – 170.
HACKMAN, R. J. and OLDHAM, G. R. 1980, Work redesign. (Reading, MA: Addison-Wesley).
HALL, E. M., GOTT, S. P. and POKORNY, R. A. 1994, A procedural guide to cognitive task analysis: the PARI methodology. Brooks AFB, TX.
HOLLNAGEL, E. and WOODS, D. D. 1983, Cognitive systems engineering: new wine in new bottles. International Journal of Man-Machine Studies, 18, 583 – 600.
HOFFMAN, R. R. 1987, The problem of extracting the knowledge of experts from the perspective of experimental psychology. AI Magazine, 8, 53 – 67.
JEANNERET, P. R. and MCCORMICK, E. J. 1969, The job dimensions of 'work oriented' job variables and of their attribute profiles as based on data from the position analysis questionnaire, Report 2. Occupational Research Center, Purdue University, West Lafayette, IN 47907.
JEANNERET, P. R. 1991, Introductory guide for use with the position analysis questionnaire. (Logan, Utah: PAQ Services, Inc).
JEANNERET, P. R. and MCPHAIL, S. M. 1991, Position analysis questionnaire: the standard for job analysis. (Logan, Utah: PAQ Services, Inc).
JEANNERET, P. R. 1992a, Job analysis guide: major duties, essential functions, and job requirements. (Logan, Utah: PAQ Services, Inc).
JEANNERET, P. R. 1992b, User's manual for PAQ ENTER-ACT: a micro-computer software package for use with the position analysis questionnaire, Version 3.1. (Logan, Utah: PAQ Services, Inc).
JOHN, B. E. 1990, Extension of GOMS analyses to expert performance requiring perception of dynamic auditory and visual information. In Proceedings of CHI '90: Human Factors in Computing Systems (New York: ACM/SIGCHI), pp. 107 – 115.
JOHN, B. E., VERA, A. H. and NEWELL, A. 1994, Towards real-time GOMS: a model of expert behavior in a highly interactive task. Behaviour & Information Technology, 13(4), 255 – 267.
JOHNSON, L. and JOHNSON, N. E. 1987, Knowledge elicitation involving teachback interviewing. In A. C. Kidd (ed), Knowledge acquisition for expert systems: a practical handbook. (New York: Plenum Press).
KARWOWSKI, W., EBERTS, R., SALVENDY, G. and NOLAND, S. 1994, The effects of computer interface design on human postural dynamics. Ergonomics, 37(4), 703 – 724.
KIERAS, D. E. 1988, Towards a practical GOMS model methodology for user interface design. In M. Helander (ed), Handbook of human-computer interaction. (Amsterdam: Elsevier).
KIERAS, D. E. and POLSON, P. 1985, An approach to the formal analysis of user complexity. International Journal of Man-Machine Studies, 22, 365 – 394.
KIRWAN, B. and AINSWORTH, L. K. (eds). 1992, A guide to task analysis. (London, Washington, DC: Taylor & Francis Ltd).
KLEIN, G. and MILITELLO, L. G. 1998, Cognitive task analysis. In Workshop of the Human Factors and Ergonomics Society 42nd Annual Meeting, No. 12. Chicago, Illinois.
KLEIN, G. A., CALDERWOOD, R. and MACGREGOR, D. 1989, Critical decision method for eliciting knowledge. IEEE Transactions on Systems, Man, & Cybernetics, 19, 462 – 472.
KOUBEK, R. J., SALVENDY, G. and NOLAND, S. 1994, The use of protocol analysis for determining ability requirements for personnel selection on a computer-based task. Ergonomics, 37(11), 1787 – 1800.
KULIK, C. T. and OLDHAM, G. R. 1988, Job diagnosis survey. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol II (New York: John Wiley & Sons), pp. 936 – 959.
LAUGHERY, K. R. and CORKER, K. 1992, Computer modeling and simulation. In G. Salvendy (ed), Handbook of human factors & ergonomics (2nd ed), Chapter 41 (New York: John Wiley & Sons), pp. 1375 – 1408.
LEHTO, M. R., BOOSE, J., SHARIT, J. and SALVENDY, G. 1992, Knowledge acquisition. In G. Salvendy (ed), Handbook of industrial engineering (2nd ed), Chapter 58 (New York: John Wiley & Sons), pp. 1495 – 1545.
LESGOLD, A., RUBINSON, H., FELTOVICH, P., GLASER, R., KLOPFER, D. and WANG, Y. 1988, Expertise in a complex skill: diagnosing x-ray pictures. In M. T. H. Chi, R. Glaser and M. J. Farr (eds), The nature of expertise (Hillsdale, NJ: Lawrence Erlbaum Associates, Publishers), pp. 311 – 342.
LOVE, K. G. and O'HARA, K. 1987, Predicting job performance of youth trainees under a Job Training Partnership Act program (JTPA): criterion validation of a behavior-based measure of work maturity. Personnel Psychology, 40, 323 – 340.
LUCZAK, H. 1997, Task analysis. In G. Salvendy (ed), Handbook of human factors and ergonomics (3rd ed), Chapter 12 (New York: John Wiley & Sons), pp. 340 – 416.
298 J. Wei and G. Salvendy
MANCUSO, J. C. and SHAW, M. L. G. 1988, Cognition and personal structure: computer access and analysis. (New York: Praeger).
MCCORMICK, E. J. 1977, Job analysis manual for the position analysis questionnaire. (Logan, Utah: PAQ Services, Inc).
MCCORMICK, E. J. 1979, Job analysis: methods and applications. (New York: AMACOM).
MCCORMICK, E. J., MECHAM, R. C. and JEANNERET, P. R. 1977, Technical manual for the position analysis questionnaire (System II). (Logan, Utah: PAQ Services, Inc).
MCCORMICK, E. J., MECHAM, R. C. and JEANNERET, P. R. 1989, Technical manual for the position analysis questionnaire, 2nd edn. (Logan, Utah: PAQ Services, Inc).
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1969a, A study of job characteristics and job dimensions as based on the position analysis questionnaire, Report No. 6. Occupational Research Center, Purdue University, West Lafayette, IN.
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1969b, Position analysis questionnaire. Occupational Research Center, Purdue University, West Lafayette, IN.
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1969c, The development and background of the position analysis questionnaire, Report No. 5. Occupational Research Center, Purdue University, West Lafayette, IN.
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1972, A study of job characteristics and job dimensions as based on the position analysis questionnaire. Journal of Applied Psychology Monograph, 56(4), 347 – 368.
MCCORMICK, E. J., JEANNERET, P. R. and MECHAM, R. C. 1989, Position analysis questionnaire. (Palo Alto, CA: Consulting Psychologists Press, Inc).
MCPHAIL, S. M., JEANNERET, P. R., MCCORMICK, E. J. and MECHAM, R. C. 1992, Job analysis manual for the position analysis questionnaire, revised edn. (Palo Alto, CA: Consulting Psychologists Press, Inc).
MECHAM, R. C., MCCORMICK, E. J. and JEANNERET, P. R. 1977, Users manual for the position analysis questionnaire (System II). (Logan, Utah: PAQ Services, Inc).
MECHAM, R. C. and MCCORMICK, E. J. 1969a, The use in job evaluation of job elements and job dimensions based on the position analysis questionnaire, Report No. 3. Occupational Research Center, Purdue University, West Lafayette, IN.
MECHAM, R. C. and MCCORMICK, E. J. 1969b, The use of data based derived attribute requirements of jobs, Report No. 4. Occupational Research Center, Purdue University, West Lafayette, IN.
MEDSKER, G. J. and CAMPION, M. A. 1997, Job and team design. In G. Salvendy (ed), Handbook of human factors and ergonomics (3rd edn) (New York: John Wiley & Sons), pp. 450 – 489.
MEYER, D. E. and KIERAS, D. E. 1997, A computational theory of executive cognitive processes and multiple-task performance, Part 1: basic mechanisms. Psychological Review, 104(1), 3 – 65.
MILLER, R. B. 1996, Development of a taxonomy of human performance: design of a system task vocabulary. American Institutes for Research Tech. Rep., JSAS Catalog of Selected Documents in Psychology, 3, 29 – 30 (Ms. No. 327).
MITCHELL, J. L. and MCCORMICK, E. J. 1990, Professional and managerial position questionnaire (PMPQ). (Palo Alto, CA: Consulting Psychologists Press, Inc).
NEERINCX, M. A. and GRIFFIOEN, E. 1996, Cognitive task analysis: harmonizing tasks to human capacities. Ergonomics, 39(4), 543 – 561.
NEWELL, A. and SIMON, H. 1972, Human problem solving. (Englewood Cliffs, NJ: Prentice Hall).
NORMAN, D. A. 1984, Stages and levels in human-machine interaction. International Journal of Man-Machine Studies, 21, 365 – 370.
PASHLER, H. E. 1998, The psychology of attention. (Cambridge, MA: The MIT Press).
PAYNE, S. J. and GREEN, T. R. G. 1986, Task-action grammars: a model of the mental representation of task languages. Human-Computer Interaction, 2, 93 – 133.
PAYNE, S. J. and GREEN, T. R. G. 1989, The structure of command languages: an experiment on task-action grammar. International Journal of Man-Machine Studies, 30, 213 – 234.
POLSON, P. G. and LEWIS, C. H. 1990, Theory-based design for easily learned interfaces. Human-Computer Interaction, 5, 191 – 220.
POTTER, S. S., ROTH, E. M., WOODS, D. D. and ELM, W. C. 1998, A framework for integrating cognitive task analysis into the system development process. In Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, pp. 395 – 399.
PRIMOFF, E. S. and FINE, S. A. 1988, A history of job analysis. In G. Salvendy (ed), Handbook of industrial engineering (New York: John Wiley & Sons), pp. 1415 – 1445.
PRIMOFF, E. S. and EYDE, L. D. 1988, Job element analysis. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol. II (New York: John Wiley & Sons), pp. 807 – 824.
RASMUSSEN, J. 1983, Skills, rules, and knowledge: signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257 – 266.
RASMUSSEN, J., DUNCAN, K. and LEPLAT, J. 1987, New technology and human error. (New York: John Wiley & Sons).
REASON, J. 1987, Generic error-modelling system (GEMS): a cognitive framework for locating common error forms. In J. Rasmussen, K. Duncan and J. Leplat (eds), New technology and human error (New York: John Wiley & Sons), pp. 63 – 86.
REISNER, P. 1981, Formal grammar and human factors design of an interactive graphics system. IEEE Transactions on Software Engineering, SE-7, 229 – 240.
ROHMERT, W. and LANDAU, K. 1979, A new technique for job analysis (AET). (New York: Taylor & Francis).
ROHMERT, W. 1988, AET. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol. II (New York: John Wiley & Sons), pp. 843 – 859.
ROTH, E. M., WOODS, D. D. and POPLE, H. E. 1992, Cognitive simulation as a tool for cognitive task analysis. Ergonomics, 35(10), 1163 – 1198.
RYDER, J. M. and REDDING, R. 1993, Integrating cognitive task analysis into instructional systems development. Educational Technology Research and Development, 41(2), 75 – 96.
RYDER, J. M. and ZACHARY, W. W. 1991, Experimental validation of the attention switching component of the COGNET framework. In Proceedings of the Human Factors Society 35th Annual Meeting (Santa Monica, CA), pp. 72 – 76.
SALVENDY, G. and SEYMOUR, W. D. 1973, Prediction and development of industrial work performance. (New York: John Wiley & Sons).
SALVENDY, G. and KNIGHT, J. J. 1988, Psychomotor performance and information processing. In S. Gael (ed), The job analysis handbook for business, industry, and government, Vol. I (New York: John Wiley & Sons), pp. 630 – 695.
SANDERSON, P. M., JAMES, J. M. and SEIDLER, K. 1989, SHAPA: an interactive software environment for protocol analysis. EPRL-89-08. University of Illinois at Urbana-Champaign, Urbana, IL.
SCHLAGER, M. S., MEANS, B. and ROTH, C. 1990, Cognitive task analysis for the real-time world. In Proceedings of the Human Factors Society 34th Annual Meeting (Santa Monica, CA: Human Factors Society), pp. 1309 – 1313.
SCHMIDT, F. L. and HUNTER, J. E. 1977, Development of a general solution to the problem of validity generalization. Journal of Applied Psychology, 62, 529 – 540.
SCHRAAGEN, J. M., CHIPMAN, S. F. and SHALIN, V. L. 2000, Cognitive task analysis. (London: Lawrence Erlbaum Associates).
TYLER, S. W., NEUKOM, C., LOGAN, M. and SHIVELY, J. 1998, MIDAS human performance model. In Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, pp. 320 – 324.
VISSER, W. and MORAIS, A. 1991, Concurrent use of different expertise elicitation methods applied to the study of the programming activity. In M. J. Tauber and D. Ackermann (eds), Mental models of human-computer interaction 2 (North-Holland: Elsevier B.V.), pp. 97 – 113.
WEI, J. and SALVENDY, G. 2000, Development of the Purdue Cognitive Job Analysis Methodology. International Journal of Cognitive Ergonomics, 4(4), 277 – 296.
WEI, J. and SALVENDY, G. 2003, The utilization of the Purdue Cognitive Job Analysis Methodology. Human Factors and Ergonomics in Manufacturing, 13(1), 59 – 84.
WICKENS, C. D. 1987, Information processing, decision-making, and cognition. In G. Salvendy (ed), Handbook of human factors, Chapter 2.2 (New York: John Wiley & Sons),