
The Structured Interview

An Alternative To The
Assessment Center?

This article discusses how to improve the validity and reliability of structured interviews. A
framework for the structured interview is suggested. The framework is based on the founda­
tions laid by various researchers, as well as the guidelines for assessment centers. The
proposed framework was used to structure an interview used in a selection test. The results
suggest that this kind of structured interview may be a valid and less costly alternative to the
assessment center. Additional research to refine and build on the framework is suggested.

By Phillip E. Lowry

Personnel selection for managerial and supervisory positions is important for the efficient and effective conduct of business in both the private and public sectors.

The most widely used personnel selection process today is the interview. Dipboye reports that over 70% of organizations in the United States use the unstructured interview in promotion decisions. In Europe the percentages are even higher, with over 90% of British and 94% of French employers reporting the use of interviews for managerial selection.¹

The validity and reliability of the unstructured interview have been shown to be relatively low. Several procedures, such as adding structure to the process and establishing standards, have been suggested for improving the interview process. These preliminary efforts have markedly improved the reliability and validity of the interview process.²

Purpose

The purpose of this article is to build a framework of suggested procedures for the structured interview based on both the foundation laid by previous researchers and the guidelines in use for assessment centers. This framework provides practitioners and researchers with a new starting point for the design and conduct of structured interviews and also defines nascent standards for the structured interview.

Phillip E. Lowry is the Chair of the Department of Public Administration and is concurrently an associate professor of management and public administration at the University of Nevada, Las Vegas. Dr. Lowry teaches and conducts research in the personnel assessment and selection process. His latest article, "The Assessment Center: Effects of Varying Consensus Procedures," appeared in Public Personnel Management, Vol. 21, No. 2, Summer, 1992. Support for this research was provided in part by a grant from the First Interstate Bank.

Public Personnel Management Vol. 23 No.2 (Summer 1994) 201

Downloaded from ppm.sagepub.com at RYERSON UNIV on September 13, 2015

The Assessment Center and the Structured Interview

The assessment center method, while not used as extensively as the interview, has been receiving increasing attention. It is particularly important in the managerial selection process for fire and police departments. In 1982, over 44 percent of 156 United States federal, state, and local governments used the assessment center.³ As of 1984, 32 of 73 metropolitan United States fire departments used the assessment center, especially for promotion to supervisory and managerial positions.⁴

Meta-analytic studies of both the assessment center and interview methods reveal that structured interviews and assessment centers have similar validities. Wiesner & Cronshaw reported that the 95% confidence interval for the validity coefficient for structured employment interviews was .34 - .86 using the criterion of potential job success. By comparison, Gaugler, Rosenthal, Thornton III, & Bentson reported the assessment center 95% confidence interval was .15 - .91 for the similar criterion of management potential.⁵

While the validities for assessment centers and structured interviews are similar, the direct and indirect costs are not. Typically, assessment centers use three or more situational simulations requiring direct observation. A structured interview, on the other hand, may be conducted with only one situation requiring direct observation. Thus the time required to test using a structured interview could be reduced by as much as 25% - 50% of that required for an assessment center, with a concomitant reduction in cost.

One of the most important strengths of the assessment center process is the defined set of standards for the design and conduct of an assessment center.⁶ While these standards are not complete and are still evolving, they do at least provide a fairly definitive set of suggested ways to design and conduct assessment centers. Conversely, one of the most striking deficiencies in the interview method today is the lack of such procedures and standards.

The need for definitive standards or guidelines for interviews has long been recognized. As early as 1982, Arvey and Campion suggested that such guidelines were needed, but that they should be based on research results rather than "...intuition, beliefs, and what seems more comfortable."⁷ More recently, researchers such as Daniel & Valencia; Dipboye; Eder; Wiesner & Cronshaw; and especially Campion, et al. have provided important insights that are leading toward these standards.⁸

The Framework

For the purposes of this article, I define a structured interview to mean "holding all interviewers to the same questions," "providing job information and a clear set of specifications for requirements," and providing behavior-defined and anchored rating scales.⁹ This definition includes 'situational' interviews.

Primarily, structured interviews, like all tests, must be based generally on the standards and guidelines contained in the Validation Principles and, at least in the United States, on the Uniform Guidelines.¹⁰ Some of the critical characteristics that should be included as parts of the framework for a valid structured interview are: the interview questions; how to deal with the candidate's prior training and experience, including performance; the consistency of the process; the number of interviewers, their selection criteria, and their training; how to determine scores; obligations to the candidates; and documentation requirements. Each of these characteristics is addressed below.

Interview Questions

Base interview questions on a job analysis. This is probably the single most important characteristic of any valid selection test, including the structured interview. Next, rank the output of the job analysis (the tasks or lists of knowledge, skills, and abilities, or KSAs) based on some measure of importance. Third, use a panel of subject matter experts (SMEs) to derive the questions or situational simulations from the ranked tasks or KSAs. The questions or simulations should measure only critical aspects of the job required at entry.¹¹

Questions may include situational aspects, such as requiring the candidate to perform a task. For example, assume the job analysis showed that the job incumbent must be able, at the time of assignment to the job, to properly complete a report. Relevant actions and questions could include a requirement, before the interview, to complete a report based on a relevant hypothetical situation. During the interview, one or more questions could be directed toward the action. For example, one such question could be "What factors do you consider to be the most important in this situation?," followed by "Why did you select those factors?," etc. Questions can also include references to experience. For example, "What do you typically do when you are in this kind of situation?"¹²

Try to use a different SME panel that includes members of protected groups to develop the answers to the questions. In any event, check the answers for accuracy as well as for potential bias or misinterpretation.¹³



Training and Experience

Information can be gathered and evaluated concerning the candidate's training, education, performance, etc.¹⁴

Hinrichs suggested that the evaluation of personnel records, if used in conjunction with "an extensive personal interview," might be even more effective than assessment centers.¹⁵

I used a panel of judges, other than the assessors, to evaluate candidate records in seven assessment centers with 55 candidates. The judges gave a score based on how well they believed the candidate's training, experience, and performance prepared them for the management job under consideration. These results were compared with the criterion (supervisor estimates of management potential). The relationship was significant: the records evaluation score provided additional significant information about the management potential of the candidate.¹⁶

If such evaluations are used, exercise care to ensure this kind of information does not improperly influence the interviewers.¹⁷ Such evaluations should probably be done outside the interview, preferably by other judges.

Consistency of the Process

The entire process must be consistent from candidate to candidate. The candidates must all be exposed to the same situations and questions. There should be no ad-libbing, follow-up, or prompting.¹⁸ For example, the questions can include logical follow-up questions given to all the candidates, regardless of how they answered the first question. Such questions as "Why did you select that choice?" are useful.

It has been suggested that, "The same panel member should conduct all interviews and ask all questions."¹⁹ Such a procedure ensures consistency. Consistency is also ensured when each panel member is assigned a given set of questions. This reduces interviewer fatigue, provides some variety in the process, and better distributes the work.

Another alternative that might be useful is to give the candidate a list of the questions at the beginning of the interview, with instructions to answer each in order.²⁰

If a situational exercise is used, consistency and accuracy can be enhanced by using one or more role players. Use of role players reduces the need for interviewer interaction with the candidates. In assessment centers such interaction can make it difficult for the interviewer to observe and record all the important behaviors, especially in a fast-moving or highly emotionally charged situation. It is also difficult for interviewers to remain emotionally detached and objective when they are playing a role in the exercise, such as "the Chief," "the disgruntled employee," etc. Finally, it appears to be easier for a role player to be consistent in dealing with the candidates as the situation develops. The interviewers are able to objectively evaluate the role player's performance and suggest changes to ensure consistency. This is something they seem to have difficulty doing for themselves.

Interviewers

Use multiple interviewers. At least two and up to five interviewers appear to be a reasonable number.²¹ Wiesner & Cronshaw found that the mean reliability coefficient was significantly higher for multiple interviewers than for a single interviewer. Campion, et al. suggested that multiple interviewers will reduce the impact of possible idiosyncratic biases.²²

Probably the most important criteria for interviewer selection are: (a)
they should have had extensive experience in observing and evaluating
behaviors on the job, and (b) they should not know the candidate.

The current assessment center guidelines contain information about the training of assessors that is relevant for interviewers. I have recently completed a study of the effect of assessor characteristics on scores. The results of this study, and the assessment center guidelines, suggest that the key to good interviewer training is to ensure the interviewers receive intensive training in the simulations, interview procedures, and rating criteria that will be used.²³

Scoring Procedures

Develop a carefully constructed behavior-anchored observation and rating scale with examples of good and bad answers.²⁴ The interviewers should receive extensive instruction in the use of these checklists.

The actual procedure for determining the candidate's score in a structured interview has not been widely reported. There is a large body of literature on the scoring procedures used in assessment centers, but the processes suggested may or may not be appropriate for structured interviews.

The assessment center guidelines allow the use of either an integrating discussion or a validated statistical process to determine the final score for a candidate. Given that there is much reported research that supports either process, there seems to be no strong evidence that reliability or validity is significantly different if either technique is properly used.

I asked 54 assessors in nine assessment centers for their opinion about the importance of an integrating discussion at the end of each exercise. Three stated that they felt no need for such a discussion, while 51 stated the discussion was important to ensure they did not miss an important behavior. Even though validities may not differ, it appears that interviewers would be more comfortable if there were an integrating discussion at the end of each phase of the interview process.

If an integrating discussion is used, take care to reduce the possibility of inter-interviewer influence. Lowry addressed this issue in the assessment center process. He suggested that the content of the integrating discussion be restricted to observed behaviors only, with no evaluative comments or divulging of scores allowed. He reported that there was no evidence of inter-assessor influence when these procedures were followed.²⁷

Obligations to the Candidates

Make the situation as non-stressful as possible. Inform the candidates of the elements of the process, the restrictions on the organization about divulging information from the process, and other relevant information. Give the candidates feedback about their performance, and let them know when and in what form the feedback will be presented.²⁸

Documentation

Document the entire process. This is to ensure the process follows appropriate testing guidelines, and so it can be defended if necessary.²⁹

Method

The model structured interview described above was used for selecting police officers for the position of police agent (detective) in a medium-sized Western city. Police officer candidates who met certain prerequisites
for length of service, etc., took a written screening examination. The results
were used in the final selection decision. Those who passed the screening
examination were allowed to participate in the structured interview.



Twenty-one candidates successfully completed the screening examination and began the interview process. Two dropped out before completing the process.

The Interview

Job Analysis

I performed a job analysis using ten incumbent detectives and three supervisors to analyze the tasks and the overall performance dimensions. They concluded that the most important tasks that a newly selected detective must be able to perform on entry to the position were to conduct an interrogation and to prepare a report. In addition, they determined that the new agent must know the law and department policies concerning arrest, the rights of the accused, and related information.

The Scenario

Based on the job analysis, the SME panel developed a scenario for the interview process. The scenario involved a male victim of an assault with a deadly weapon. The chief of detectives provided a list of key issues to be evaluated, together with a set of questions for the candidates. These were integrated into the behavior-anchored observation and rating forms.

The Interview Process

The interview process included three phases. In Phase 1, the candidate interrogated an assault victim (who was a role player). A panel of interviewers observed the interrogation but did not interact with the candidate. In Phase 2, the candidate prepared a written report of the interview. In Phase 3, the interviewer panel questioned the candidate about the written report prepared in Phase 2 and about matters related to the situation that unfolded in Phase 1.

Two interviewers observed Phase 1, and the remaining two conducted the interviews in Phase 3.

The Phase 3 interviewer panel evaluated the written report and questioned the candidates on related issues. For example, the panel determined whether the candidate wrote a legible, understandable, and complete report, and whether all essential elements were listed. Each of the panel members for Phase 3 was assigned nine questions that were related either to the written report or to the overall issues. Some examples were: "What was the type of crime and why?," "What would you have done if the victim had been a juvenile?," "How should a line-up be conducted?"



At the conclusion of Phases 1 and 3, the respective panels engaged in an integrating discussion to determine the candidate's score and principal strengths and weaknesses. The integrating discussion included only non-evaluative comments about observed behaviors. No scores were discussed at any time.

The interviewers were told to score the candidate based on the global
standard of the interviewer, and not to score based on a comparison of one
candidate with another. The scores assigned were based on a five point
scale that ranged from "very much below" to "very much above" the global
standard of the interviewer.

The Interviewers and the Role Player

The local government selected the interviewers and role player from
local police departments in the geographic area. The interviewers were
experienced police detectives, and the role player was a senior police
officer. None of the interviewers or the role player knew any of the
candidates. Each of the interviewers was well versed in the law and in the
general policies and procedures that were appropriate in this case.

Training

Approximately two weeks before the interview, I provided a comprehensive instruction manual for the interviewers and role player. This manual covered the procedures that would be used in the process. On the first day of the process, the interviewers and role player were given the actual scenarios, the instructions to the candidates and interviewers, the behavior-anchored observation and rating forms, the questions that were to be asked, and the issues to be considered. They had the opportunity to discuss the entire process with the department's chief of detectives. As a result of this exchange, several of the previously provided issues and questions were revised.

The interviewers and the role player then went through a series of trial runs, using each other and observers as stand-in candidates. At the end of the training program, each interviewer and the role player knew precisely what was to be done.

Evaluation of Training and Experience

The organization did not evaluate training and experience. It is moving toward that process, but at the time of this selection process, the procedures for records verification and other details had not been completed.



Obligations to the Candidates

Approximately one month before the interview process, I provided the candidates with a comprehensive manual that explained the process they would go through. This manual was similar to the interviewers' manual and was designed to reduce anxiety.

On the first day of the process, the candidates met all the interviewers and the role player in an attempt to allay any concerns. They were told about the purpose of the process and were informed that their performance would be treated as confidential information. Finally, they were told that they would receive written feedback on the interviewers' comments about their strengths and weaknesses.

Results

This was the first time the candidates had ever been exposed to a
structured interview. Many had taken "oral boards" in the past, but the
process used in this situation was new to them. Seventeen of the nineteen
candidates expressed a strong belief that the process was both fair and job
related.

The interviewers were questioned about their perceptions concerning the relevancy and fairness of the process. They were unanimous in their belief that the process was both fair and relevant. They suggested that the process was at least equivalent to an assessment center.

There were two panels of interviewers. Because there was no interaction between the panels during the process, inter-rater reliability was measured for each panel separately. The intra-class correlation for both panels was quite high: Panel 1, ICC(2,2) = .88; Panel 2, ICC(2,2) = .89.³⁰
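The ICC(2,2) statistic reported here follows the Shrout and Fleiss convention: a two-way random-effects intraclass correlation for the average of k raters, computed from the mean squares of a two-way ANOVA. As an illustration only (the function name and example ratings below are hypothetical, not the study's data), the computation can be sketched in Python with NumPy:

```python
import numpy as np

def icc_2k(ratings):
    """ICC(2,k): two-way random effects, average of k raters (Shrout & Fleiss).

    `ratings` is an n x k array: one row per candidate, one column per rater.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    # Sums of squares from the two-way ANOVA decomposition
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()  # candidates
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()  # raters
    ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                  # mean square, candidates
    msc = ss_cols / (k - 1)                  # mean square, raters
    mse = ss_err / ((n - 1) * (k - 1))       # residual mean square
    return (msr - mse) / (msr + (msc - mse) / n)

# Two raters who rank four candidates identically, one scoring a point higher:
print(round(icc_2k([[1, 2], [2, 3], [3, 4], [4, 5]]), 4))  # 0.8696
```

Note that perfectly parallel raters do not yield an ICC of exactly 1.0 under this model, because the constant difference between raters is itself treated as a random source of variance.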

Inter-rater agreement was also relatively high. Assuming that a difference of one rating-scale point signified agreement, the inter-rater agreement for each panel was: Panel 1, T = 0.89; Panel 2, T = 1.00.³¹ The percentage of agreement was 95% and 100% for each panel, respectively.
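The percentage-of-agreement figure is a direct tally of how often the two raters on a panel differed by at most one point on the five-point scale. A minimal illustrative sketch (the function name and the example ratings are mine, not the study's data):

```python
import numpy as np

def pct_agreement_within_one(rater_a, rater_b):
    """Percentage of candidates whose two ratings differ by at most one scale point."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    return float(np.mean(np.abs(a - b) <= 1) * 100)

# Four candidates; the raters disagree by two points on the last one:
print(pct_agreement_within_one([3, 4, 2, 5], [4, 4, 1, 3]))  # 75.0
```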

Discussion and Conclusions

This article proposes a framework for the structured interview that includes the major components suggested by several researchers. The primary differences between the procedures used in this structured interview and those proposed by others are (1) the inclusion of a relatively complex situational simulation, (2) the use of a role player, and (3) the use of the integrating procedure at the end of each phase.

The results of this process are acceptable not only to the candidates but also to the interviewers. There is a very high degree of both inter-rater agreement and reliability. Finally, and perhaps most important, management has accepted the process. The local government intends to use the same process for its next selection test for detectives.

The results clearly support the total proposed framework, including the three additional procedures. Although the sample size is relatively small, the number of subjects is consistent with what is typically found in local government selection processes.

Campion, et al. used a situational element in the structured interview they reported on. This element was, however, quite simple: it required only the reading aloud of a set of instructions.³² Thus far I have not seen a report of anyone using a role player or an integrating discussion such as that used here.

The time for developing this structured interview was about the same
as that required for an assessment center. However, the entire process was
completed in three days, one day less than would have been required for
an assessment center using the same number of interviewers. The cost was
less, making this an attractive alternative.

Today's assessment center method is based on a set of guidelines that have been evolving.³³ However, there is still no "typical" assessment center. For example, one can measure performance dimensions, task competence, or both in an assessment center. One can use a consensus process or a statistical integration technique to arrive at a final score. The final score itself can be a pooled result from the exercises, or an overall assessment rating (OAR) that represents the assessor's best judgment after all the exercises are completed. Hence any discussion of the validity of an assessment center must include a careful disclosure of precisely how the center was conducted.

The problem of process variation in structured interviews is even more difficult. There are an infinite number of ways the interview process can be conducted. Therefore, it is important that standards be specified.

The framework for a model structured interview discussed in this article is an extension of the processes suggested by the various researchers cited herein. These processes are conditioned by the special circumstances of the structured interview. For example, structured interviews usually include only one appearance of the candidate before the interviewers. One or more situations may be addressed and many questions posed. The assessment center typically requires the candidate to make multiple appearances in multiple situations. Hence, it is especially important that the situation(s) and the associated questions used in the structured interview represent the most important task(s) or performance dimensions. While this is important in an assessment center, it is critically important in a structured interview.

The described model structured interview can include a procedure to give candidates the opportunity to directly reveal information about past accomplishments, assignments, education, etc. While this was not directly tested in this field study, there is ample evidence that supports the use of such information in the total selection process. The primary concern, however, is that revealing past accomplishments to the same interviewers or assessors who will also evaluate the candidate's performance on some relevant task may bias the evaluation. This can be dealt with rather easily by using a different group of judges to evaluate such past accomplishments.

The structured interview is beginning to receive more support as a valid selection tool. Its primary advantage is that it is derived directly from the most widely used selection tool in use today, the unstructured interview. It is rather simple, especially when compared with an assessment center. It is less costly than an assessment center. It is generally as valid as an assessment center.

On the other hand, the major disadvantages of the structured interview appear to be the lack of defined standards and the fact that a candidate is usually seen only once. The decisions made during that one appearance must be correct; there is no second chance. The assessment center provides additional rating opportunities beyond one appearance, which could make a difference.

Structured interviews do offer some advantages, especially when coupled with some form of records evaluation. The addition of a separate records evaluation process could add a scored dimension that might overcome, at least partially, the lack of observation opportunity in a structured interview.

There is strong evidence that the model structured interview described in this article offers a reasonable alternative to the assessment center. More specifically, if organizations wish either to move away from assessment centers or to continue to use interviews, they should consider the model structured interview described here.

Empirical research in the form of experiments and field studies should be conducted to assess the predictive validity, as well as to determine the most appropriate form, for the proposed model structured interview process. The process detailed here is a start toward a fully tested and validated selection tool.

Notes

Robert L. Dipboye, Selection Interviews: Process Perspectives (Cincinnati: Southwestern Pub­


lishing Co, 1992), 5-6.

2
Michael A. Campion, Elliott D. Pursell, and Barbara K. Brown, "Structured Interviewing:
Raising the Psychometric Properties of the Employment Interview," Personnel Psychology 41
(1988): 25-42.

3
Louise F. Fitzgerald, and Marilyn K. Quaintance, "Survey of Assessment Center Use in State
and Local Government," Journal of Assessment Center Technology 5 (1982): 9-21.

4
Samuel J. Yeager, "Use of Assessment Centers by Metropolitan Fire Departments in North
America," Public Personnel Management 15 Spring (1986): 51-64.

5
Willi H. Wiesner and Steven F. Cronshaw, "A Meta-Analytic Investigation of Interview
Format and Degree of Structure on the Validity of the Employment Interview," Journal of
Occupational Psychology 61 (1988): 275-290; Barbara D. Gaugler, Douglas B. Rosenthal, George
C. Thornton ΠΙ, and Cynthia Bentson, "Meta-Analysis of Assessment Center Validity," Journal
of Applied Psychology 72 (1987): 493-511. A point estimate of assessment center validity of 0.53
for management potential was also reported by Gaugler, et al. (p. 503). A point estimate of
interview validities of 0.39 was reported by Patrick M. Wright, Philip A. Lichtenfells, and
Elliott D. Pursell, "The Structured Interview: Additional Studies and a Meta-Analysis," Journal
of Occupational Psychology 62 (1989): 191.

6
Task Force on Assessment Center Guidelines, "Guidelines and Ethical Considerations for
Assessment Center Operations," Public Personnel Management 18 Winter (1989): 457-470. Note
the use of the term "Guidelines" in the title. According to Dennis Joiner, the Task Force chose
to use that term so as to allow more flexibility in the application of the assessment center
process. [Dennis Joiner, "Assessment Centers and Job Simulation Exercises: Professional and
Ethical Issues," PTC/SC Newsletter 3 , 1 1 (November, 1991)].

7
Richard D. Arvey, and James E. Campion. The Employment Interview: A Summary and
Review of Recent Research. Personnel Psychology 35 (1982): 317.

8
Campion, et al., 25-42; Christopher Daniel, and Sergio Valencia, "Structured Interviewing
Simplified," Public Personnel Management 20 (1991): 127-134; Dipboye; Robert W. Eder, "Con­
textual Effects on Interview Decisions," in The Employment Interview: Theory, Research, and
Practice, ed. Robert W. Eder and Gerald R. Ferris (Newbury Park, CA: Sage Publications, Inc.,
1989), 113-126; Wiesner, et al., 275-290.

9
Dipboye, 158.

10
Equal Employment Opportunity Commission, Civil Service Commission, Department of Labor, and Department of Justice, "Adoption by Four Agencies of Uniform Guidelines on Employee Selection Procedures," Federal Register 43 (1978): 38290-38315; Society for Industrial and Organizational Psychology, Inc., Principles for the Validation and Use of Personnel Selection Procedures, 3d ed. (College Park, MD: Society for Industrial and Organizational Psychology, Inc., 1987).

11
Campion, et al., 27-8; Task Force, 461; Hubert S. Feild and Robert D. Gatewood, "Contextual Effects on Interview Decisions," in The Employment Interview: Theory, Research, and Practice, ed. Robert W. Eder and Gerald R. Ferris (Newbury Park, CA: Sage Publications, 1989), 145-57.

212 Public Personnel Management Vol. 23 No.2 (Summer 1994)

Downloaded from ppm.sagepub.com at RYERSON UNIV on September 13, 2015


12
Campion, et al., 28; Gary P. Latham, Lise M. Saari, Elliott D. Pursell, and Michael A. Campion, "The Situational Interview," Journal of Applied Psychology 65, no. 4 (1980): 423-26.

13
Campion, et al., 28.

14
Dipboye, 56-7.

15
John R. Hinrichs, "Comparison of 'Real Life' Assessments of Management Potential with Situational Exercises, Paper-and-Pencil Ability Tests, and Personality Inventories," Journal of Applied Psychology 53, no. 5 (1969): 431.

16
The full results of these analyses are being prepared for publication.

17
Dipboye, 27; Campion, et al., 29.

18
Campion, et al., 28-9.

19
Campion, et al., 29.

20
D. E. Lovelace, personal communication, November 8, 1991.

21
Campion, et al., 29; Dipboye; Daniel & Valencia, 129.

22
Wiesner & Cronshaw, 285; Campion, et al., 29.

23
Task Force, 465-470. A complete discussion of my suggested model for selecting and training assessors is contained in an article accepted for publication by Public Personnel Management.

24
Campion, et al., 28; Daniel & Valencia, 130; Richard R. Reilly, Sarah Henry, and James W.
Smither, "An Examination of the Effects of Using Behavior Checklists on the Construct Validity
of Assessment Center Dimensions," Personnel Psychology 43 (1990): 71.

25
Task Force, 463.

26
George C. Thornton, Assessment Centers in Human Resource Management (Reading, Massachusetts: Addison-Wesley Publishing Company, 1992).

27
Phillip E. Lowry, "The Assessment Center: Reducing Interassessor Influence," Public Personnel Management 20 (Spring 1991): 19-26; idem, "The Assessment Center: Effects of Varying Consensus Procedures," Public Personnel Management 21 (Summer 1992): 171-83.

28
Campion, et al., 29; Task Force, 468-470.

29
Campion, et al., 29.

30
Patrick E. Shrout and Joseph L. Fleiss, "Intraclass Correlations: Uses in Assessing Rater Reliability," Psychological Bulletin 86, no. 2 (1979): 425-6.

31
Howard E. Tinsley and David J. Weiss, "Interrater Reliability and Agreement of Subjective
Judgments," Journal of Counseling Psychology 22, no. 4 (1975): 367.

32
Campion, et al., 31.

33
Gaugler, et al., 494.



References

Arvey, R. D., and J. E. Campion. 1982. The employment interview: A summary and review of recent research. Personnel Psychology 35: 281-323.

Campion, M. A., Pursell, E. D., & Brown, B. K. 1988. Structured interviewing: Raising the psychometric properties of the employment interview. Personnel Psychology 41: 25-42.

Daniel, C., & Valencia, S. 1991. Structured interviewing simplified. Public Personnel Management 20: 127-134.

Dipboye, R. L. 1992. Selection interviews: Process perspectives. Cincinnati: Southwestern Publishing Co.

Eder, R. W. 1989. Contextual effects on interview decisions. In The employment interview: Theory, research, and practice, ed. Robert W. Eder and Gerald R. Ferris, 113-126. Newbury Park, CA: Sage Publications.

Equal Employment Opportunity Commission, Civil Service Commission, Department of Labor, & Department of Justice. 1978. Adoption by four agencies of uniform guidelines on employee selection procedures. Federal Register 43: 38290-38315.

Feild, H. S. and R. D. Gatewood. 1989. Contextual effects on interview decisions. In The employment interview: Theory, research, and practice, ed. Robert W. Eder and Gerald R. Ferris, 145-57. Newbury Park, CA: Sage Publications.

Fitzgerald, L. F., & Quaintance, M. K. 1982. Survey of assessment center use in state and local government. Journal of Assessment Center Technology 5: 9-21.

Gaugler, B. D., Rosenthal, D. B., Thornton III, G. C., & Bentson, C. 1987. Meta-analysis of assessment center validity. Journal of Applied Psychology 72: 493-511.

Hinrichs, J. R. 1969. Comparison of "real life" assessments of management potential with situational exercises, paper-and-pencil ability tests, and personality inventories. Journal of Applied Psychology 53: 596-601.

Joiner, D. 1991. Assessment centers and job simulation exercises: Professional and ethical issues. P(ersonnel) T(esting) C(ouncil)/S(outhern) C(alifornia) Newsletter 3, no. 11 (November).

Latham, G. P., Saari, L. M., Pursell, E. D., & Campion, M. A. 1980. The situational interview. Journal of Applied Psychology 65: 422-427.

Lowry, P. E. 1991. The assessment center: Reducing interassessor influence. Public Personnel
Management 20 (Spring): 19-26.

________. 1992. The assessment center: Effects of varying consensus procedures. Public Personnel Management 21 (Summer): 171-183.

Reilly, R. R., Henry, S., & Smither, J. W. 1990. An examination of the effects of using behavior checklists on the construct validity of assessment center dimensions. Personnel Psychology 43: 71-84.

Shrout, P. E. and J. L. Fleiss. 1979. Intraclass correlations: Uses in assessing rater reliability. Psychological Bulletin 86: 420-428.

Society for Industrial and Organizational Psychology, Inc. 1987. Principles for the validation and use of personnel selection procedures. 3d ed. College Park, MD: Author.

Task Force on Assessment Center Guidelines. 1989. Guidelines and ethical considerations for assessment center operations. Public Personnel Management 18: 457-470.



Thornton III, G. C. & Byham, W. C. 1982. Assessment centers and managerial performance. New York: Academic Press.

Thornton, G. C. 1992. Assessment centers in human resource management. Reading, Massachusetts: Addison-Wesley Publishing Company.

Tinsley, H. E. A. and D. J. Weiss. 1975. Interrater reliability and agreement of subjective judgments. Journal of Counseling Psychology 22: 358-376.

Wiesner, W. H., & Cronshaw, S. F. 1988. A meta-analytic investigation of interview format and
degree of structure on the validity of the employment interview. Journal of Occupational
Psychology 61:275-290.

Wright, P. M., P. A. Lichtenfels, and E. D. Pursell. 1989. The structured interview: Additional
studies and a meta-analysis. Journal of Occupational Psychology 62:191-99.

Yeager, S. J. 1986. Use of assessment centers by metropolitan fire departments in North America. Public Personnel Management 15 (Spring): 51-64.

