Tips for Creating a Safety Culture in Organizations

ESTABLISHING, SUPPORTING, COMMUNICATING, AND IMPLEMENTING CLEAR SAFETY POLICIES WILL ALWAYS BENEFIT ANY ORGANIZATION.

BY KATHERINE A. WILSON-DONNELLY, HEATHER A. PRIEST, C. SHAWN BURKE, & EDUARDO SALAS

We live in a world filled with complexity and errors. Echoing the title of a book by Gene Kranz (2000), and as we reflect on the wake of recent tragedies, "Failure is not an option." The need to develop a safety culture in technology-advanced environments is crucial to an organization's success. In this article, we present tips to help organizations take a macrolevel approach to safety through establishing a safety culture surrounding the implementation and use of new technology.

Technology and Human Error

Consider the following examples . . .

An employee at a manufacturing company is monitoring a new robotic system that carries sheet metal from one workstation to the next, when the system jams and stops. The employee has the choice to either turn off the system, which can take several minutes to power off and then back on, or enter the robot's work area and free the clogged system while the machine is still operational. What does he do – risk losing at least 10 minutes waiting to power back up or clear the jam in the system while it's still on? After all, he'll be able to move out of the way before the robotic arm moves back in his direction. As he's done at least a dozen times before without incident, the employee enters the work area without powering down the system. As he clears the jam, the robotic arm knocks him to the ground, fracturing his arm.

Or consider the "macho" aviator with thousands of flight hours, attempting to land in dense fog. As he approaches the minimum descent altitude for making a missed approach, he knows (well, he thinks he knows) that the aircraft will break out of the fog layer any minute, the runway will be visible, and he'll make a perfect landing, upholding the company's motto to "get passengers to their destinations safely and on time." The copilot has seen him do this numerous times and agrees that the runway will be in sight any minute. Within the next minute the aircraft does break out of the fog layer, but the pilots have misjudged the approach and are slightly off course. The pilot spools up the engines, making a missed approach and barely missing a building near the airport.

These examples are hypothetical, but some argue that the implementation of technology, such as robots in the manufacturing industry, may be advancing too quickly for operators to keep up (e.g., Jiang & Gainer, 1987). The literature is filled with evidence that human error in technology-driven environments such as manufacturing and aviation contributes to accidents and incidents more than two-thirds of the time (e.g., Decker, 2001; Helmreich, in press). If the manufacturing employee in the earlier example had turned the system off prior to entering the work area, the incident would have been prevented. But he didn't, as is the case when employees are faced with similar situations (e.g., Järvinen & Karwowski, 1995).

In an effort to protect humans and minimize the risk of errors, laws, regulations, and governing agencies have been developed (e.g., Federal Aviation Administration regulations, Occupational Safety and Health Administration guidelines). However, despite these safety laws and regulations, a significant number of accidents and incidents continue to occur each year. For example, OSHA estimated that in 2002, approximately 4.7 million worker-related injuries/illnesses (Bureau of Labor Statistics, 2003a) and 5500 deaths occurred in private-sector firms (Bureau of Labor Statistics, 2003b). Using the low-end estimation of the percentage of those accidents that are attributable to human error (60%), it can be estimated that 2.82 million injuries and 3300 deaths are the result of human error. These figures are astonishing.
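The back-of-envelope arithmetic above can be reproduced directly. The sketch below assumes only the figures quoted in the text (4.7 million injuries/illnesses, 5,500 deaths, and the 60% low-end human-error share):

```python
# Back-of-envelope estimate: apply the low-end (60%) human-error share
# to the 2002 totals cited in the text.
injuries_2002 = 4_700_000     # worker-related injuries/illnesses (BLS, 2003a)
deaths_2002 = 5_500           # private-sector deaths (BLS, 2003b)
human_error_share = 0.60      # low-end estimate of accidents due to human error

injuries_from_error = injuries_2002 * human_error_share
deaths_from_error = deaths_2002 * human_error_share

print(f"{injuries_from_error:,.0f} injuries")   # 2,820,000 injuries
print(f"{deaths_from_error:,.0f} deaths")       # 3,300 deaths
```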
FALL 2004 • ERGONOMICS IN DESIGN 25
So how are organizations addressing this issue? Many have turned to training employees to manage errors before they become detrimental (e.g., resource management training; see Salas, Bowers, & Edens, 2001). These organizations believe that by addressing the micro levels of the organization (e.g., providing minimal training to workers), human error will be reduced and safety improved. We argue that there is more than meets the eye when it comes to safety and that training alone may not be enough.

As the complexity of new technologies increases, we believe that organizations must address safety at higher levels – that is, institute a "safety culture" – to ensure that technology is implemented and used properly. This is not to say that training for safety is not important; safety training is still necessary and should be incorporated into this safety culture.

10 Tips for Developing a Safety Culture in Organizations

In preparing this article, we drew on the available literature to determine which factors must be considered when developing a positive safety culture. Admittedly, there will always be the risk of error when human operators are involved ("to err is human"), but organizations can take steps to reduce dangerous, sometimes lethal incidents involving technology. From the literature we developed 10 tips that organizations can use now to help minimize future errors caused by technology overload, mistrust, or simple human error when interacting with complex systems.

These suggestions are meant to encourage organizations to approach safety from a macrolevel perspective that involves the entire organization and not just individual workers. We need to point out, however, that this is not an attempt to oversimplify the complexities inherent in the development of a safety culture. It is likely not as easy as it looks. We hope that what we present here will provide organizations with a starting point from which to build. We recommend that organizations take great care when incorporating these tips into current practices and examine their meaning at a deeper level.

Additionally, although some of the information presented next may seem like old news, through the literature and our experiences, we have found that research findings are not being implemented in the real world. Initially, the cost and time that an organization must spend to implement a safety culture is great and may deter some, but the outcome – improved safety – will far outweigh the investment.

We also noted the absence of a synthesis of the literature that could offer a complete picture of the macro- and microlevel factors that must be considered when implementing technology into an organization to ensure safety. Therefore, we present the tips along with findings in the literature to support them in a summary table (next page).

Tip 1. Send appropriate signals that safety matters . . . clearly and precisely communicate them. It is not enough to say that safety is important. Organizational leaders must promote it through their actions and by putting it in writing. Underlying an effective safety culture are documented safety policies and procedures. Safety policies set by managers provide employees with a broad description of what is expected in terms of safe attitudes and behaviors, and procedures provide employees with guidance on how to meet those expectations (Degani & Wiener, 1997).

Research by Diaz and Cabrera (1997) suggested that safety policies likely influence workers' behaviors, their perceptions, and the overall safety climate. We argue that in order to ensure that safety policies and procedures are adhered to, organizations must do two things. First, they must get employees involved (e.g., participatory ergonomics) to ensure that workers will accept the implementation of new technologies. Research suggests that involving employees throughout the development and implementation process of the new technology will give them a feeling of knowledge and power, thereby increasing their acceptance of the technology and motivation to perform the desired behaviors (e.g., Wilson & Haines, 1997). For example, this has proven successful in one organization specializing in the production of nuclear weapons (Caccamise, 1995).

Second, organizations must avoid normalization of deviance (avoid letting unsafe practices become the norm) by discouraging employees from cutting corners that may jeopardize safety (Vaughan, 1996). Normalization of deviance has been cited as a contributing factor in the space shuttle Challenger accident in 1986 (Vaughan, 1996).

Tip 2. Make people believe in and support safety . . . starting at the top. It has been suggested that a significant precursor to accidents may be employees' safety attitudes (including attitudes toward the use of technology) and that some management practices may influence safety attitudes in organizations, such as employment security or wages based on occupational safety (e.g., increased pay for working under hazardous conditions; Barling & Zacharatos, 1999). Therefore, it is important to create positive safety attitudes that express care and concern for errors and hazards and that show concern about the impact that errors and hazards have on all people at all levels in the organization (Barling & Zacharatos, 1999; Pidgeon, 1998).

To accomplish this, organizations must get a commitment from upper-level managers that supports and encourages safety policies and procedures. This commitment will help to ensure that normalization of deviance does not occur (see Tip 1). In addition, managers must provide feedback to employees on their safety performance. Without support from those said to be enforcing safety, employees will have
TIPS AND THEIR EXPLANATIONS FOR DEVELOPING A SAFETY CULTURE

1. Send appropriate signals that safety matters . . . clearly and precisely communicate them.
   ● Create written policies and procedures for safety (Degani & Wiener, 1997).
   ● Get employees involved (e.g., participatory ergonomics; Wilson & Haines, 1997).
   ● Avoid normalization of deviance (Vaughan, 1996).

2. Make people believe in safety . . . start at the top.
   ● Get a commitment to safety from upper-level managers (Pidgeon, 1998).
   ● Encourage management to openly demonstrate their commitment (Barling & Zacharatos, 1999).
   ● Provide feedback to employees so they will know how they are performing (Barling & Zacharatos, 1999).

3. Promote error checking . . . encourage continuous learning.
   ● Develop a continuous learning climate (Hofmann & Stetzer, 1998).
   ● Encourage employees to routinely check for errors (Helmreich et al., 1999).
   ● Encourage employees to learn from their mistakes (Pidgeon & O'Leary, 1994).

4. Open communication is a must . . . encourage it.
   ● Have good information flow throughout all levels of the organization (e.g., workers and managers; Barling & Zacharatos, 1999).
   ● Encourage employees to speak up (Jentsch & Smith-Jentsch, 2001).
   ● Encourage managers to share information with employees.

5. Search for solutions . . . examine all levels and promote different methods.
   ● Explore solutions to errors from many different angles (i.e., macro and micro).
   ● Use an existing accident investigation technique to explore errors (e.g., Haddon matrix, Haddon, 1980; HFACS, Wiegmann & Shappell, 2003).

6. Encourage documentation of errors . . . create an error-reporting system.
   ● Develop a voluntary, nonpunitive error-reporting system (Pidgeon & O'Leary, 1994).
   ● Encourage employees to report errors that went undetected by management (Barling & Zacharatos, 1999).

7. Prepare people through training . . . provide the competencies needed.
   ● Follow eight key steps to systematically design, implement, and evaluate a training program (Salas & Cannon-Bowers, 2000, 2001):
     ■ Conduct a training needs analysis.
     ■ Consider external factors.
     ■ Establish measurable and task-relevant training objectives.
     ■ Determine what methods to use.
     ■ Determine what instructional strategies to use.
     ■ Develop realistic training scenarios.
     ■ Evaluate training.
     ■ Ensure transfer of training back to the job.

8. If you don't know it's broken, you can't fix it . . . measure/assess safe behaviors.
   ● Continuously examine ongoing behaviors to determine if trained behaviors are being applied on the job.
   ● Examine safety at multiple levels (Kirkpatrick, 1976).

9. You get what you ask for . . . reward the right behaviors.
   ● Avoid encouraging and supporting behaviors that the safety culture is trying to discourage (Kerr, 1995).

10. Effective coordination and communication is a must . . . promote teamwork.
   ● Promote interdependencies among team members.
   ● Encourage members to coordinate and communicate at the team level (Salas & Cannon-Bowers, 2001).
little motivation to adhere to safety policies and procedures. The more that workers see and believe management's commitment to safety, the more likely they will be to develop positive safety attitudes and improve performance. Research by Zohar (1980) showed that management support was important in industrial settings.

Tip 3. Promote error checking . . . encourage continuous learning. The purpose of a continuous learning climate is to encourage employees to learn from their mistakes, not to hide or cover them up. Therefore, employees must routinely check for errors in order to avoid, trap, or mitigate the consequences of those errors before a serious accident occurs (Helmreich, Merritt, & Wilhelm, 1999).

Additionally, the learning climate must be encouraged and supported by managers at all costs. A learning climate will be hindered by a punitive climate in which fingers are pointed when an error occurs (Hofmann & Stetzer, 1998). As such, blame should not be placed on those who err (Westrum, 1987, as cited in Pidgeon & O'Leary, 1994). Rather, the cause of the error should be investigated (not just the outcome of the incident), and when its cause is determined, the entire organization should learn from it (Barling & Zacharatos, 1999).

Tip 4. Open communication is a must . . . encourage it. Organizations can promote a learning climate (see Tip 3) by having good information flow between upper management and workers (Barling & Zacharatos, 1999). It is important that employees feel comfortable communicating their ideas and opinions about new technologies and the policies and procedures developed to accommodate them. For example, employees at all levels of the organization should feel comfortable asserting themselves even when their concerns may be in conflict with the ideas of management (e.g., Jentsch & Smith-Jentsch, 2001). Employees must believe that mistakes happen and that learning from them is encouraged and supported regardless of cost.

Tip 5. Search for solutions . . . examine all levels and promote different methods. Many accident investigation techniques in organizations focus on searching for answers at the micro or individual level – that is, placing blame on and correcting the deviant behavior of the worker. Yet we have found a number of studies suggesting that accidents may be the result of factors beyond the human operator's control that lie latent within the organization (e.g., safety culture) and that need to be addressed. Typically in complex systems, there is a chain of events that leads to the user. Any breakdown in the chain can lead to an error. Therefore, when investigating accidents, organizations must address problems at all levels of the chain.

For instance, managers' negative attitudes toward technology can filter down to the worker, leading to errors. In this case, blaming the worker will not correct the problem. The primary source of the errors needs to be investigated – the attitudes of management. To help examine errors from a macrolevel perspective, we encourage organizations to use an existing approach if a system is not already in place. For example, the Haddon matrix, first developed and explored in the field of epidemiology (Haddon, 1980), today provides organizations with a three-dimensional approach to investigating accidents and incidents (see the figure below).

Tip 6. Encourage documentation of errors . . . create an error-reporting system. Organizations should encourage employees to document errors by developing an error-reporting system to track incidents that were known to the worker but went unnoticed by managers. A voluntary, nonpunitive system, such as that implemented in aviation (Aviation Safety Reporting System; see Orlady & Orlady, 1999), would allow workers to report an error without the fear of blame and retribution (Westrum, 1987, as cited in Pidgeon & O'Leary, 1994). This system would help managers to determine what caused the error or incident and would enable the entire organization to learn from it (Barling & Zacharatos, 1999). A punitive climate, on the other hand, will encourage employees to cover up mistakes and not discuss them openly (Hofmann & Stetzer, 1996).
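The reporting-system design described in Tip 6 can be made concrete with a small sketch. The record type and field names below are hypothetical, not drawn from the ASRS or any real system; the point is simply that the record captures what happened and why, but deliberately carries no fields identifying the reporter:

```python
from dataclasses import dataclass, field

@dataclass
class ErrorReport:
    """A voluntary, nonpunitive error report (illustrative sketch only).

    Note what is *not* here: no reporter name, employee ID, or contact
    details, so a report cannot be traced back for blame or retribution.
    """
    date: str                 # when the incident occurred
    task: str                 # what the worker was doing at the time
    description: str          # what went wrong and how it was caught
    contributing_factors: list = field(default_factory=list)
    suggested_fix: str = ""   # optional: how recurrence might be prevented

# The robot-jam incident from the opening example, as a worker might report it.
report = ErrorReport(
    date="2002-06-01",
    task="Clearing a jam in the sheet-metal handling robot",
    description="Entered the work area without powering down the system",
    contributing_factors=["time pressure", "power-down cycle takes ~10 minutes"],
)
```

Omitting identifying fields is itself a design decision with consequences discussed under Tip 9: requiring contact information for follow-up tends to discourage reporting.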
[Figure: The Haddon matrix for evaluating incidents (adapted from Runyan, 1998). The matrix crosses three phases (pre-event, event, post-event) with three factors (human system, physical environment, social environment) and, in its third dimension, decision criteria (cost, effectiveness, feasibility).]
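The three dimensions named in the figure can be sketched as a simple data structure. The labels follow the figure as reproduced here; the single cell entry is our own illustrative example, not from Haddon (1980) or Runyan (1998):

```python
from itertools import product

# The three dimensions of the Haddon matrix as adapted in the figure.
phases = ("pre-event", "event", "post-event")
factors = ("human system", "physical environment", "social environment")
criteria = ("cost", "effectiveness", "feasibility")  # the third dimension

# Each (phase, factor) cell poses a question about the incident; candidate
# countermeasures in each cell are then weighed against every criterion.
cells = list(product(phases, factors))

# One illustrative entry for the robot-jam incident from the text:
matrix = {
    ("pre-event", "human system"):
        "Worker routinely enters the work area without powering down",
}
```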
Tip 7. Prepare people through training . . . provide the competencies needed. For about two decades, researchers have argued that training is necessary to ensure the proper use of technology, to encourage safe behaviors, and to reduce human error (see Jiang & Gainer, 1987; Ziskovsky, 1984). Yet many training programs fall short, for two reasons. First, many programs only provide the basic training needed for workers to interact with technology, usually the minimum required by law. Time constraints and costs are often cited as reasons why only minimum training is given. Second, there is a "science of training" that organizations do not follow – that is, they fail to design, develop, implement, and evaluate training systematically (see Salas & Cannon-Bowers, 2000, 2001).

In addition, there are many myths about training to which training developers can fall prey (see Salas, Cannon-Bowers, Rhodenizer, & Bowers, 1999; Salas, Wilson, Burke, & Bowers, 2002). Resource management training has been successfully provided in many domains (e.g., aviation, nuclear) as a means of improving safety (Salas et al., 2001). Considering the extent to which training significantly affects a worker's performance and efficiency, it should be based on what is learned from accident investigations, be developed to correct deviant behaviors, and encourage the safety culture desired by the organization (Harvey, Bolam, Gregory, & Erdos, 2001).

There are eight primary factors to be considered when designing and developing a training program. Space limitations do not allow a thorough discussion of these factors here, but they are noted in the summary table, and we encourage readers to see Salas and Cannon-Bowers (2000) and Wilson, Priest, Salas, and Burke (in press).

Tip 8. If you don't know it's broken, you can't fix it . . . measure and assess safe behaviors. Following safety training, it is important that organizations assess whether the appropriate safe behaviors were learned and transferred to the actual task environment. Furthermore, organizations must continuously examine ongoing behaviors to determine if trained behaviors are being applied on the job over time. If they are not, and unsafe behaviors are uncovered, corrective measures need to be implemented.

Kirkpatrick (1976) developed a four-level typology for evaluating training (reactions, learning, behaviors, and organizational impact). A recent paper by Salas et al. (2002) emphasized the need to evaluate training at multiple levels; these researchers suggested that positive reactions to training do not guarantee learning, and, additionally, learning does not guarantee that trained behaviors will be applied on the job. Although many organizations assess trainees' reactions (because it is the easiest to measure!), they must also assess other levels to ensure that the trained safe behaviors are applied on the job and that they last over time. By using methods of behavioral measurement, organizations can pinpoint problems and incorporate them into future training programs.

Tip 9. You get what you ask for . . . reward the right behaviors. One of the biggest follies in organizations is that the wrong behaviors are often rewarded (Kerr, 1995). In an attempt to reward safe behaviors, organizations will actually encourage and support those behaviors that the safety culture is trying to discourage.

For example, consider the development of a voluntary error-reporting system. The purpose of the system is to learn from errors and mistakes, but the requirement of some systems that employees submit identifying information with the report, so that they may be contacted at a later date for more information, will likely discourage them from reporting. Therefore, an unsuccessful safety culture may be attributable to something that the organization least expects. Organizations must be cognizant that corrective measures need to be taken to reward the right behaviors.

Tip 10. Effective coordination and communication is a must . . . promote teamwork. Many organizations rely on teams to accomplish their goals. For example, the medical community promotes teamwork as a means for improving safety (e.g., Small, 1998). Teamwork is characterized by a set of flexible and adaptive behaviors (what team members do), cognitions (what team members think), and attitudes (what team members feel) – in other words, competencies (Salas & Cannon-Bowers, 2001).

Organizations must encourage members to work together by coordinating and communicating at the team level. The synchronized collective action of team members requires a collection of processes, strategies, and actions that allow team members to effectively and efficiently perform interdependently. Teamwork competencies naturally lend themselves to safer work environments, so teamwork must be promoted and supported by managers in order to be successful.

In Closing

With increasing technology comes a greater likelihood of errors and a greater need for an integrated, macrolevel approach to safety. We hope we have succeeded in promoting an understanding of issues that influence safe practices in organizations through the foregoing tips that organizations can use to develop a positive safety culture that embraces technology.

We and other researchers have been arguing for some time that organizations should take a macrolevel approach to improving safety (e.g., Imada & Nagamachi, 1990). If so, why has there been so little implementation of this approach? Are organizations afraid of what they may find? The threat of error will always be a possibility as humans interact with technology, so why not approach error proactively? Developing a positive safety culture is a welcome start
in the right direction. We put forth this challenge to organizations – to critically evaluate themselves to ensure that their workforces remain safe as technology is integrated into their work environments.

References

Barling, J., & Zacharatos, A. (1999). High performance safety systems: Ten management practices to create safe organizations. Paper presented at the Academy of Management meeting, Chicago.

Bureau of Labor Statistics. (2003a). BLS industry injury and illness data. Retrieved October 21, 2004, from http://www.bls.gov/iif/oshwc/osh/os/ostb1246.pdf

Bureau of Labor Statistics. (2003b). Census of fatal occupational injuries. Retrieved October 21, 2004, from http://stats.bls.gov/iif/oshwc/cfoi/cfch0002.pdf

Caccamise, D. J. (1995). Implementation of a team approach to nuclear criticality safety: The use of participatory methods in macroergonomics. International Journal of Industrial Ergonomics, 15, 397–409.

Decker, S. (2001). The field guide to human error investigations. Aldershot, England: Ashgate.

Degani, A., & Wiener, E. L. (1997). Philosophy, policies, procedures and practices: The four P's of flight deck operations. In N. Johnston, N. McDonald, & R. Fuller (Eds.), Aviation psychology in practice (pp. 44–67). Aldershot, England: Avebury.

Diaz, R. I., & Cabrera, D. D. (1997). Safety climate and attitudes as evaluation measures of organizational safety. Accident Analysis & Prevention, 29(5), 643–650.

Haddon, W., Jr. (1980). Advances in the epidemiology of injuries as a basis for public policy. Public Health Reports, 95, 411–421.

Harvey, J., Bolam, H., Gregory, D., & Erdos, G. (2001). The effectiveness of training to change safety culture and attitudes within a highly regulated environment. Personnel Review, 30, 615–636.

Helmreich, R. L. (in press). Culture, threat, and error: Assessing system safety. In Safety in aviation: The management commitment: Proceedings of a conference. London: Royal Aeronautical Society.

Helmreich, R. L., Merritt, A. C., & Wilhelm, J. A. (1999). The evolution of crew resource management training in commercial aviation. International Journal of Aviation Psychology, 9(1), 19–32.

Hofmann, D. A., & Stetzer, A. (1996). A cross-level investigation of factors influencing unsafe behaviors and accidents. Personnel Psychology, 49, 307–339.

Hofmann, D. A., & Stetzer, A. (1998). The role of safety climate and communication in accident interpretation: Implications for learning from negative events. Academy of Management Journal, 41(6), 644–657.

Imada, A. S., & Nagamachi, M. (1990). Improving occupational safety and health: Non-traditional organizational design and management approaches. In K. Noro & O. Brown, Jr. (Eds.), Human factors in organizational design and management – III (pp. 483–486). Amsterdam: North-Holland.

Järvinen, J., & Karwowski, W. (1995). Analysis of self-reported accidents attributed to advanced manufacturing systems. International Journal of Human Factors in Manufacturing, 5, 251–266.

Jentsch, F., & Smith-Jentsch, K. A. (2001). Assertiveness and team performance: More than "just say no." In E. Salas, C. A. Bowers, & E. Edens (Eds.), Improving teamwork in organizations: Applications of resource management training (pp. 73–94). Mahwah, NJ: Erlbaum.

Jiang, B. C., & Gainer, C. A., Jr. (1987). A cause-and-effect analysis of robot accidents. Journal of Occupational Accidents, 9, 27–45.

Kerr, S. (1995). On the folly of rewarding A, while hoping for B. Academy of Management Executive, 9(1), 7–14.

Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and development handbook: A guide to human resource development (2nd ed., pp. 1–26). New York: McGraw-Hill.

Kranz, G. (2000). Failure is not an option: Mission control from Mercury to Apollo 13. New York: Simon & Schuster.

Orlady, H. W., & Orlady, L. M. (1999). Human factors in multi-crew flight operations. Aldershot, England: Ashgate.

Pidgeon, N. F. (1998). Safety culture: Key theoretical issues. Work & Stress, 12, 202–216.

Pidgeon, N. F., & O'Leary, M. (1994). Organizational safety culture: Implications for aviation practice. In N. A. Johnston, N. McDonald, & R. Fuller (Eds.), Aviation psychology in practice (pp. 21–43). Aldershot, England: Avebury.

Runyan, C. (1998). Using the Haddon matrix: Introducing the third dimension. Injury Prevention, 4, 302–307.

Salas, E., Bowers, C. A., & Edens, E. (Eds.). (2001). Improving teamwork in organizations: Applications of resource management training. Mahwah, NJ: Erlbaum.

Salas, E., & Cannon-Bowers, J. A. (2000). Designing training systems systematically. In E. A. Locke (Ed.), The Blackwell handbook of principles of organizational behavior (pp. 43–59). Malden, MA: Blackwell.

Salas, E., & Cannon-Bowers, J. A. (2001). The science of training: A decade of progress. Annual Review of Psychology, 52, 471–499.

Salas, E., Cannon-Bowers, J. A., Rhodenizer, L., & Bowers, C. A. (1999). Training in organizations: Myths, misconceptions, and mistaken assumptions. Personnel and Human Resources Management, 17, 123–161.

Salas, E., Wilson, K. A., Burke, C. S., & Bowers, C. A. (2002). Myths about crew resource management training. Ergonomics in Design, 10(4), 20–24.

Small, S. D. (1998). What participants learn from anesthesia crisis resource management training. Anesthesiology, 89, U153.

Vaughan, D. (1996). The Challenger launch decision. Chicago: University of Chicago Press.

Wiegmann, D., & Shappell, S. (2003). A human error approach to aviation accident analysis: The human factors analysis and classification system. Aldershot, England: Ashgate.

Wilson, J. R., & Haines, H. M. (1997). Participatory ergonomics. In G. Salvendy (Ed.), Handbook of human factors and ergonomics (pp. 490–513). New York: Wiley.

Wilson, K. A., Priest, H. A., Salas, E., & Burke, C. S. (in press). Can training for safe practices reduce the risk of organizational liability? In I. Noy & W. Karwowski (Eds.), Handbook of human factors in litigation. London: Taylor & Francis.

Ziskovsky, J. P. (1984). Risk analysis and the R3 factor. In Proceedings of the 8th Robotic Conference (pp. 15.9–15.21). Dearborn, MI: Robotics International of SME.

Zohar, D. (1980). Safety climate in industrial organizations: Theoretical and applied implications. Journal of Applied Psychology, 65(1), 96–102.

Katherine A. Wilson-Donnelly is a doctoral candidate in the applied experimental and human factors psychology program at the University of Central Florida, Institute for Simulation & Training, 3280 Progress Dr., Orlando, FL 32826, 407/882-1408, [email protected]. Her research interests include aviation and medical safety, team training, and team performance.

Heather A. Priest is a graduate student in the applied experimental and human factors psychology program at the University of Central Florida. Her research interests include team training and team performance measurement, patient safety and human factors in medicine, and organizational safety.

C. Shawn Burke is a research associate at the Institute for Simulation & Training, University of Central Florida. His primary research interests include teams, patient safety, team leadership, team adaptability, team training and measurement, and team effectiveness.

Eduardo Salas is a professor in the Department of Psychology, University of Central Florida, where he also holds an appointment as program director for the Human Systems Integration Research Department at the Institute for Simulation & Training. His research interests include team training, medical safety, and team adaptability.

Portions of this paper were presented at the Human Factors and Ergonomics Society 47th Annual Meeting in Denver, Colorado, October 13–17, 2003.