
Journal of Organizational Behavior Management

ISSN: 0160-8061 (Print) 1540-8604 (Online) Journal homepage: https://www.tandfonline.com/loi/worg20

Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

Kimberly L. Mayer & Florence D. DiGennaro Reed

To cite this article: Kimberly L. Mayer & Florence D. DiGennaro Reed (2013) Effects of a
Training Package to Improve the Accuracy of Descriptive Analysis Data Recording, Journal of
Organizational Behavior Management, 33:4, 226-243, DOI: 10.1080/01608061.2013.843431

To link to this article: https://doi.org/10.1080/01608061.2013.843431

Published online: 15 Nov 2013.

Journal of Organizational Behavior Management, 33:226–243, 2013
Copyright © Taylor & Francis Group, LLC
ISSN: 0160-8061 print/1540-8604 online
DOI: 10.1080/01608061.2013.843431

RESEARCH ARTICLE

Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

KIMBERLY L. MAYER
Melmark New England, Andover, Massachusetts, USA

FLORENCE D. DIGENNARO REED
University of Kansas, Lawrence, Kansas, USA

Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive
analysis, a type of functional behavior assessment, is effective in
informing intervention design only if the gathered data accu-
rately capture relevant events and behaviors. We investigated a
training procedure to improve the accuracy of data recording by
direct service personnel. Findings suggest that the training package
was effective and acceptable to participants. Implications of these
findings are discussed.

KEYWORDS functional assessment, functional behavior assessment, performance management, data collection

Functional behavior assessment (FBA) consists of a range of activities to identify the environmental determinants of problem behavior in order to
develop a function-based intervention plan. A body of literature supports
the effectiveness of FBA to aid in the selection of intervention techniques for
children with developmental disabilities (e.g., Mace, 1994) and emotional or
behavioral disorders (e.g., Scott & Kamps, 2007). Descriptive analysis refers

The authors would like to thank Tiffaney Esposito, Dan Almeida, Brian Liu-Constant,
Patricia Finney, Joanne Coughlin, and Stefanie Doucette for their assistance with this research.
Address correspondence to Kimberly L. Mayer, Melmark New England, 461 River Road,
Andover, MA 01810, USA. E-mail: [email protected]

to a set of procedures within FBA that involve direct observation of the environmental events (antecedents and consequences) surrounding a behavior's occurrence in the natural environment (Allday, Nelson, & Russel, 2011).
Although research suggests that descriptive analysis can effectively identify
the function of problem behavior (e.g., Kern, Gallagher, Starosta, Hickman, &
George, 2006; Scott & Kamps, 2007), erroneous conclusions about behavioral
function are sometimes made using descriptive analysis compared to func-
tional analysis, an experimentally rigorous approach (e.g., Lerman & Iwata,
1993; Pence, Roscoe, Bourret, & Ahearn, 2009; Thompson & Iwata, 2007).
Because of the limited availability of trained professionals, costs, and
convenience, direct methods of functional assessment, such as descriptive
analysis, are often preferred over functional analysis techniques in applied
settings (Alter, Conroy, Mancil, & Haydon, 2008; Ellingson, Miltenberger,
Stricker, Galensky, & Garlinghouse, 2000). However, others advocate that
both direct (e.g., descriptive analysis) and indirect (e.g., informant-based
checklists) methods should supplement experimental analysis to promote
the best outcomes (e.g., Doggett, Edwards, Moore, Tingstrom, & Wilczynski,
2001). Regardless of one’s particular stance on the appropriateness of
descriptive analysis alone or in combination with functional analysis, it is
a technique commonly used in applied settings as part of FBA. Moreover,
direct service personnel are often asked to be involved in descriptive analysis
data recording (Lerman, Hovanetz, Strobel, & Tetreault, 2009). Unfortunately,
direct service personnel may have less-than-ideal training and experience
in functional assessment and the development of behavioral interventions.
To improve the likelihood that the FBA process will lead to the identi-
fication of the appropriate reinforcer—and, presumably, a function-based
intervention—direct service personnel must reliably and accurately collect
these data.
Training participants to accurately perform skills has been a focus of
research for decades (e.g., Bass, 1987; Herbert & Baer, 1972). Antecedent
training tactics such as didactic instruction, the provision of information,
modeling, and rehearsal are commonly used to train newly hired direct
service personnel and educators (Jahr, 1998). Although effective, these
techniques are often insufficient to maintain performance over time, and
follow-up strategies, such as performance feedback, may be necessary
(DiGennaro Reed & Codding, 2011; Reid & Parsons, 2002). With respect
to FBA techniques, researchers have successfully trained teachers (e.g.,
Ellingson et al., 2000; Maag & Larson, 2004; Sasso et al., 1992) and par-
ents (e.g., Arndorfer, Miltenberger, Woster, Rortvedt, & Gaffaney, 1994; Frea
& Hepburn, 1999; McNeill, Watson, Henington, & Meeks, 2002) to imple-
ment FBA procedures using a variety of training techniques. For example,
Maag and Larson (2004) taught a general education teacher how to con-
duct a functional assessment and design an intervention after nearly 6 hr
of instruction and practice. Although the intervention reduced the problem
behaviors of two students, the teacher did not record antecedents and conse-
quences associated with each behavioral occurrence in real time. Instead, this
information was generated by completing a questionnaire on the Functional
Assessment Hypothesis Formulation Protocol (Larson & Maag, 1998). These
findings are informative; however, they provide little guidance on ways to
improve real-time data recording.
In their attempt to identify an effective and efficient training proce-
dure, Ellingson and colleagues (2000) successfully documented agreement
between teachers (without specialized behavior analysis training) and grad-
uate students (with 2 years of training in behavior analysis) about the
hypothesized functions of problem behaviors of students with developmental
disabilities. After completing a questionnaire and participating in a behavioral
interview, teachers received a brief training in how to implement a struc-
tured checklist to record descriptive analysis data. The training consisted of
descriptions of operational definitions of all variables, an explanation about
how to use the data sheet, and feedback on their first 30-min observation
only. Their findings are promising and suggest that inexperienced teachers
can reliably collect functional assessment data with a limited amount of train-
ing. However, high agreement between teachers and graduate students may
have been a result of the functional assessment instruments adopted in the
study (Lerman et al., 2009). For example, the narrowly focused structured
checklist was developed by the researchers and relied on the informa-
tion generated by the questionnaire and interview. As a result, potential
antecedent and consequence events may not have been included on the
checklist. This important methodological limitation impacts the inferences
researchers can make about the effectiveness of the training procedure in
promoting reliable data recording.
Lerman and colleagues (2009) examined the effects of FBA data record-
ing format on the accuracy with which special education teachers and
paraprofessionals collected antecedent and consequence information. Their
participants received customary training often provided to educators in pub-
lic school settings (e.g., a 60-min in-service), viewed videos of interactions
between actors, and completed both narrative and structured A-B-C data
sheets. Narrative recording required participants to write a brief description
of events that occurred before and after behavior. Structured recording relied
on a form that listed potential antecedents and consequences and required
participants to mark a check next to those that were observed during the
video observation. Lerman et al. found slightly higher accuracy when par-
ticipants completed structured A-B-C data recording forms; however, neither
format was associated with acceptable levels of accuracy. The potential impli-
cations of these findings are alarming given common functional assessment
training practices in public school settings.
Taken together, these data suggest a need for additional research to
identify resource-efficient FBA training procedures that may be adopted in
educational settings with limited resources. In our view, resource-efficient
training procedures are those that rely on as few personnel resources as
possible within a limited amount of training time. In the present study we
adapted the approach used by Lerman and colleagues (2009) and relied on
videotape interactions; however, unlike in their study our videos consisted
of brief, real-life interactions between individuals with disabilities and their
educators. Given the lower accuracy associated with the narrative A-B-C
format in Lerman et al., our study focused on improving the accuracy of data
recording using narrative A-B-C recording, a descriptive analysis technique
recommended for use in applied settings (Smith, Vollmer, & St. Peter Pipkin,
2007).

METHOD
Participants and Setting
Five direct service personnel of a nonprofit agency that provides educational
and residential services to students with autism, brain injury, and other devel-
opmental disorders participated in the study. All participants worked in a
residential setting operated by the agency. Jamie was a 22-year-old woman
with 9 months of experience. Trish was a 23-year-old woman with 3 months
of experience and David was a 23-year-old man with 9 months of expe-
rience working with individuals with disabilities. Julie was a 36-year-old
woman with 10 months of experience working with individuals with dis-
abilities within a residential setting. Liz was a newly employed 21-year-old
woman with 2 months of experience.
All of the participants had bachelor’s degrees in psychology and limited
exposure to descriptive analysis prior to the start of the study. Their only
exposure consisted of one 60-min training on FBA during the agency ori-
entation upon initial hire. Although they viewed various data sheets during
this training, none of the participants had applied experience using these
tools. All training and procedures occurred during the evening hours at a
community-based group home operated by the agency.

Materials
Before the start of the study, we created 14 unique video clips containing
natural interactions of students in their home with staff and roommates. This
was accomplished by recording students participating in a variety of rou-
tines in their home over several observation periods. These recordings were
reduced in length to 5 min to capture relevant antecedent and consequence
events for appropriate and inappropriate behaviors. Both inappropriate
and appropriate behaviors were included because, in the present setting,
behavior analysts and staff sometimes assess the conditional probabilities
of consequence events given both appropriate and inappropriate behavior
using a contingency space analysis (e.g., Eckert, Martens, & DiGennaro,
2005; Martens, DiGennaro, Reed, Szczech, & Rosenthal, 2008). The clips
were a mechanism for participants to record antecedents, behaviors, and
consequences displayed on screen as part of the experimental sessions
(i.e., the clips were not used as part of video modeling). The audible
sounds of the natural environment were not edited. Thus, viewers could
hear conversations, movement, and other sounds (e.g., dishes clanking,
footsteps). No additional sounds were added to the video; however, the
date and time stamp remained on the screen to facilitate the collection
of this information. Nine of the 14 video clips depicted situations during
which a student displayed problem behavior (e.g., noncompliance, climb-
ing on furniture, self-injury, shirt biting, skin picking, spitting). Five of the
videos depicted situations during which appropriate behavior occurred (e.g.,
functional communication, task initiation, eating independently, appropriate
use of behavioral relaxation). Note that some of the video clips contained
different topographies of similar behaviors across students (e.g., functional
communication, self-injury, and noncompliance). We did not include a measure of difficulty across video clips except to standardize the length of the videos and limit the number of behavior occurrences to one, consistent with the operational definitions (note that some behavior classes, such as tantrum,
had more than one topography). The type and number of antecedents and
consequences varied across video clips to mimic events that take place in
the natural environment (see Table 1).
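The contingency space analysis mentioned above compares, for each consequence event, the conditional probability of that consequence given the target behavior with its probability given the behavior's absence. A minimal sketch of that computation follows; the `conditional_probs` helper and all counts are hypothetical illustrations, not data from the study.

```python
def conditional_probs(events):
    """Estimate P(consequence | behavior) and P(consequence | no behavior)
    from (behavior_occurred, consequence_followed) observations."""
    b_and_c = sum(1 for b, c in events if b and c)
    b_total = sum(1 for b, _ in events if b)
    nb_and_c = sum(1 for b, c in events if not b and c)
    nb_total = sum(1 for b, _ in events if not b)
    p_given_b = b_and_c / b_total if b_total else 0.0
    p_given_nb = nb_and_c / nb_total if nb_total else 0.0
    return p_given_b, p_given_nb

# Hypothetical observation intervals: (behavior occurred?, attention delivered?)
obs = [(True, True), (True, True), (True, False), (False, True),
       (False, False), (False, False), (False, False), (False, False)]
p_b, p_nb = conditional_probs(obs)
```

When P(consequence | behavior) clearly exceeds P(consequence | no behavior), the consequence is being delivered contingent on the behavior, which is the kind of relation the analysis is meant to reveal.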
Participants used a narrative Antecedent–Behavior–Consequence (A-B-
C) Chart (Evans & Meyer, 1985) for data recording. The A-B-C Chart provided
space to indicate the name of the individual being observed, the resi-
dence in which he or she lived, and the target behavior and its operational
definition. The A-B-C Chart also consisted of sections to briefly note setting
information (e.g., date/time/location), antecedent and consequence events,

TABLE 1 Type and Number of Antecedents and Consequences Depicted Across All Video
Samples

Event True instances

Antecedents 20
Instruction delivered/prompt provided 6
Student on a break 4
Denied access to a desired item/activity 3
Extinction procedure implemented 3
Student on task 3
Loud environment 1
Consequences 19
Social praise 7
Planned ignoring 4
Maintained demand/prompting hierarchy implemented 5
Other verbal interaction 3
and the behavior of the individual being observed (see Appendix A). The
participants used a standard form developed by the agency that contained
two rows on each page and received a new form for each session. We did
not provide instruction about completing one or both rows of the form for
each session.
Participants received two training documents to assist them with data
recording: (a) a detailed outline of relevant information contained within sec-
tions of the A-B-C Chart (see Appendix B) and (b) an A-B-C Chart containing
22 bulleted points to consider when recording data. The training document
in Appendix B specifies the type of information required for each section of
the narrative recording form. Five pieces of information were expected for
setting information (date, time, location, setting where behavior occurred,
and staff initials). The antecedent section denotes events that precede the
occurrence of the target behavior. In the present study, participants were
expected to record data for nine unique events. The content provided in
Appendix B specifies generally what information could be recorded but
does not include an exhaustive list of all possible examples. For exam-
ple, the training document specifies that participants should “indicate any
demands placed within 5 min of behavior occurrence” but does not provide
all possible demands that could be presented. In other places, examples of
antecedents are provided (e.g., possible sources of noise). In the behavior
section, participants were expected to specify the class of behavior and spe-
cific topographies because the video depicted a single occurrence of each
target behavior. However, if participants provided an incorrect count of the
behavior, this was marked as an error. The consequence section denotes
events that follow the occurrence of the target behavior. Six consequence
events are detailed in Appendix B, with possible staff member responses
delineated further. The other training document that participants received
looked similar to Appendix A (i.e., the A-B-C Chart) with abbreviated por-
tions of the outline in Appendix B bulleted within each column. For example,
under the date/time/location section five bullets indicating each of the fol-
lowing were provided: (a) date, (b) time, (c) location, (d) specific setting,
and (e) staff initials. This training document was meant to serve as a quick
reference tool.

Response Measurement and Interobserver Agreement


Before the start of the study, five Board Certified Behavior Analysts®
completed A-B-C Charts (see Appendix A) while viewing 100% of the videos.
One of the behavior analysts had a doctoral degree in school psychology,
two had master’s degrees in applied behavior analysis, and two had obtained
master’s degrees in special education. The behavior analysts had an average
of 11 years of experience (range = 6–15 years) across multiple employment
settings and supervisory roles. A scoring template was generated using the
events (i.e., setting information, antecedents, behavior, and consequences)
identified most often by the Board Certified Behavior Analysts®. The first
author, also a Board Certified Behavior Analyst®, conducted a final review of
the scoring template while watching the videos to verify that the scoring tem-
plate reflected actual events contained within the video clips. The dependent
variable was the accuracy with which participants completed the A-B-C Chart
upon viewing a 5-min video clip of a student engaged in various routines
within his or her home with the support of staff. Multiple pieces of informa-
tion were required for each section of the A-B-C Chart (setting information
= 5, antecedents = 9, behavior = 2, consequences = 6) consistent with
details provided in Appendix B. If the video clip did not depict one of the
many details indicated in Appendix B, an accurate response would indicate
“not applicable” or “unknown” (e.g., peer reaction unknown). The A-B-C
Charts completed by participants were compared against the scoring tem-
plate informed by the responses of the Board Certified Behavior Analysts®.
Accuracy was calculated by dividing the number of events within each sec-
tion of the A-B-C Chart (setting information, antecedents, behaviors, and
consequences) marked similarly to the scoring template by the total number
of events in each section, multiplied by 100. Note that exact phrasing was
not required, just that the written response captured similar events.
An independent second scorer collected data on participant scoring of
the A-B-C Charts during 57%, 20%, 50%, 100%, and 43% of sessions for
Jamie, Trish, David, Julie, and Liz, respectively. An agreement was scored
when both individuals recorded each unit of information similarly (i.e., set-
ting information, stimulus events [i.e., antecedents and consequences], and
behavior occurrence). Identical phrasing was not required; however, the
recorded response had to capture similar events. Agreement was calculated
as the number of agreements divided by agreements plus disagreements
and converted to a percentage. Mean percent agreement for Jamie was 93%
(range = 86%–100%), for Trish was 93% (range = 86%–100%), for David was
95% (range = 86%–100%), for Julie was 93% (range = 78%–100%), and for
Liz was 97% (range = 80%–100%).
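The agreement calculation reduces to the standard point-by-point formula, with each participant's result summarized as a mean and range across scored sessions. A minimal sketch under those assumptions (the helper names and session counts are hypothetical):

```python
def session_agreement(agreements, disagreements):
    """Point-by-point agreement for one session: agreements divided by
    agreements plus disagreements, converted to a percentage."""
    return 100.0 * agreements / (agreements + disagreements)

def summarize(session_percents):
    """Mean and range of per-session agreement, as reported per participant."""
    mean = sum(session_percents) / len(session_percents)
    return mean, (min(session_percents), max(session_percents))

# Hypothetical second-scorer sessions for one participant
percents = [session_agreement(19, 3), session_agreement(22, 0),
            session_agreement(18, 3)]
mean_pct, pct_range = summarize(percents)
```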
In addition, Julie and Liz completed an 18-item questionnaire assessing
the overall acceptability of the interventions used with them to improve
descriptive analysis data recording. This questionnaire was adapted from the
Intervention Rating Profile–15 (IRP; Martens, Witt, Elliot, & Darveaux, 1985)
and is available from the first author. The modified IRP items were rated on
a scale ranging from 1 (strongly disagree) to 6 (strongly agree), with higher
scores representing treatment acceptability. Total scores could range from
18 to 108.
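Scoring the modified IRP amounts to summing the 18 item ratings, which yields the 18 to 108 range, and dividing by the number of items for a mean item rating. A small sketch of that arithmetic (the `irp_scores` helper and the ratings shown are hypothetical, not participant data):

```python
def irp_scores(item_ratings):
    """Total score (18-108 for 18 items rated 1-6) and mean item rating."""
    if len(item_ratings) != 18:
        raise ValueError("the modified IRP used here has 18 items")
    if not all(1 <= r <= 6 for r in item_ratings):
        raise ValueError("items are rated on a 1-6 scale")
    total = sum(item_ratings)
    return total, total / len(item_ratings)

# Hypothetical respondent agreeing or strongly agreeing with every item
total, mean_item = irp_scores([6] * 10 + [5] * 8)
```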

Experimental Design and Procedure


A concurrent multiple baseline design across participants was used to
evaluate the effects of training (i.e., written task clarification) with and
without feedback on the accuracy with which participants completed the A-B-C Chart.

BASELINE
All participants had received a brief in-service training on the topic of FBA
upon initial hire but otherwise were not trained to implement the techniques
addressed in this study. Before each session, participants were informed
of the behavior depicted in the video as well as its operational definition.
We provided this information before video viewing because direct service
personnel typically have this information prior to data collection in actual
service delivery and practice. Video clips included scenes depicting both
appropriate and inappropriate behavior during various routines in the home.
Videos were viewed one time only and were not repeated across experimen-
tal sessions. Upon viewing the clip, participants recorded data on the A-B-C
Chart. We did not specify whether data recording should take place during
or after video viewing, and participants were free to record data at either
time. Questions were not answered and feedback was not provided during
baseline sessions.

TASK CLARIFICATION + FEEDBACK

The purpose of this phase was to examine the accuracy of participants’ (i.e.,
Jamie, Trish, and David) data recording following the provision of written
task clarification documents and verbal performance feedback. Participants
were provided with the two training documents to assist them with data
recording (i.e., an A-B-C Chart with bullet points and an outline). Before par-
ticipants viewed the video, a 10-min training was provided during which the
training documents were reviewed and participant questions were answered.
However, practice scoring did not take place. The training documents were
present while the participants watched the video and recorded data. Within
3 hr of viewing one video clip, participants received verbal feedback on
the accuracy of their data recording. We corrected both commission and
omission errors and provided a rationale for the correct response. We also
delivered vocal praise for correctly identifying relevant events. Feedback
review lasted less than 10 min on average. A performance criterion of 90% accuracy across three consecutive sessions was required for mastery.

TASK CLARIFICATION

The purpose of this phase was to evaluate the accuracy of participants’ (i.e.,
Julie and Liz) data recording following task clarification without feedback.
Procedures were identical to those described in “Task Clarification +
Feedback” with one exception: Feedback was not provided to participants
in this phase.

RESULTS

The accuracy with which participants completed data recording using the
A-B-C Chart is presented in Figures 1 (with feedback) and 2 (without
feedback). As shown in Figure 1, during baseline Jamie, Trish, and David
displayed low accuracy for antecedents, consequences, and setting informa-
tion. Relative to their performance on other sections of the A-B-C Chart,
they demonstrated higher data recording accuracy for behavior, although
some variability was evident for Trish and David. For all three participants,
an immediate level change was observed in their accuracy following task
clarification plus performance feedback for all sections of the A-B-C Chart.
Variability was observed for antecedent and consequence recording, though
performance was above 80% for all participants. As shown in Figure 2, Julie
and Liz demonstrated low performance for recording antecedents and con-
sequences during baseline. Variable performance was noted for Julie for
both setting information and behavior. Liz displayed perfect data recording
accuracy for behavior across six sessions. An immediate level change was
observed in performance following training for both participants. Although
some variability was observed for Julie, by the end of this condition her per-
formance was high and stable. Liz’s performance was high and stable across
all sections of the A-B-C Chart.
Ratings from the modified IRP ranged from 98 to 102 (M = 100), which
indicates general intervention acceptability for the procedures used with Julie
and Liz. The mean item rating was 5.56 out of 6, with both participants
slightly to strongly agreeing with each item (ratings of 4, 5, or 6). Julie and
Liz also slightly to strongly agreed that the interventions implemented were
effective, would not result in negative side effects, and would be beneficial
overall for staff.

DISCUSSION

The purpose of this study was to investigate the impact of a resource-efficient intervention on the accuracy of descriptive analysis data recording
by direct service personnel. Our findings showed that generic informa-
tion about FBA provided in a didactic in-service does not translate into
accurate narrative A-B-C data recording. Although the column labels of the
A-B-C Chart specify in simple terms the information/data requested, par-
ticipants performed poorly during baseline and required more specialized
training. An intervention package (i.e., brief training, task clarification, and performance feedback) produced robust changes in data recording accuracy, providing support for its effectiveness. Our findings further documented systematic improvements in performance without the use of performance feedback for two participants. That is, a brief training combined with task clarification was sufficient to increase the accuracy with which two direct service personnel recorded descriptive analysis data during video viewing of natural interactions. Moreover, participants rated these procedures as acceptable and beneficial.

FIGURE 1 Percent accuracy during data recording across baseline and treatment (task clarification + feedback) conditions for each participant. [Figure not reproduced; panels (Jamie, Trish, David) plot percent accuracy for date/time/location, behavior, antecedents, and consequences across consecutive video samples.]

FIGURE 2 Percent accuracy during data recording across baseline and treatment (task clarification) conditions for both participants. [Figure not reproduced; panels (Julie, Liz) plot the same four measures across consecutive video samples.]

It is interesting that participants showed generally high accuracy in recording the occurrence of behavior, even during baseline. This finding
may be due to practice they had engaging in this behavior as part of their
regular job duties. In this setting, direct service personnel regularly record
data on skill acquisition procedures (e.g., functional communication training)
and problem behavior so that the effects of teaching and behavior support
plans can be assessed. Their relatively high performance on the behavior sec-
tion of the A-B-C Chart may be a function of their prior experience with this
skill or the fact that the videos predictably included one behavior occurrence
per session. We expected and observed low performance on the antecedent
and consequence sections of the A-B-C Chart given how challenging this
task may be using narrative forms (vs. descriptive forms on which users
may select from a sample of available options). Task clarification via writ-
ten training tools was effective in producing changes in performance on
these variables, though a delay to acquisition was observed for one par-
ticipant (Julie) when feedback was not provided. It may be that feedback, though requiring resources, produces more rapid changes in performance than antecedent-only training techniques.
These findings are especially important in light of the amount of time
required to produce performance improvement. For Julie and Liz, the inter-
vention package was effective without the continued presence of a live
trainer or frequent performance feedback, providing evidence of its effi-
ciency and potential transportability. The latter advantage may be especially
important in settings where educators or direct service personnel are not
housed in a single building (e.g., multiple schools within a district, nonprofit
agencies with varied service delivery settings). These effects are also advan-
tageous in situations in which the number of new staff requiring training
exceeds the number of direct training opportunities; antecedent-only training
techniques, such as task clarification, might be appropriate for low-risk pro-
cedures including descriptive analysis data recording using the A-B-C Chart.
We did not assess the effects of these training techniques with moderate- or
high-risk procedures and therefore caution against using them in situations
other than those evaluated in the current study. Although feedback was not
needed to improve performance for Julie and Liz, task clarification alone
may not be effective with other participants.
We also did not evaluate long-term performance, and it may be the
case that a high level of performance is unsustainable with the present inter-
vention package. We suspect that baseline performance was due to a skill
deficit, which is supported by the beneficial effects of task clarification both
with and without feedback. If this hypothesis is correct, we caution readers
against eliminating performance feedback altogether given the robust liter-
ature documenting its effectiveness (Alvero, Bucklin, & Austin, 2001) and
the fact that managerial feedback after training can help to resolve per-
formance deficits (i.e., motivation problems). Moreover, staff may require
ongoing direct observation and feedback to capture and prevent observer
drift. Although descriptive analysis data recording improved during the task
clarification–only condition, we have not documented maintenance over
time or generalization across job responsibilities and employment contexts or
to real-time in vivo data recording. Thus, we cannot speak to the long-term
effectiveness or external validity of our procedures and discourage readers
from withholding performance feedback until future research has evaluated
these variables.
This study did not assess data recording accuracy in the absence of a
traditional FBA in-service. As a result, the current study procedures should
be adopted as a supplement to, not in lieu of, customary training (i.e., 60-
min didactic in-service). We envision educators receiving sample data sheets
and a DVD at the conclusion of an in-service so that they may practice data
recording as a follow-up assignment to didactic instruction and submit these
materials to the trainers. Once reliability is scored, follow-up strategies could
be provided. Praise, acknowledgment, or some other reward could be pro-
vided to educators who meet criterion and could be delivered via technology
such as e-mail to limit resource expenditures. Although we did not evaluate
this explicit strategy in the present study, some mechanism for maintain-
ing high levels of performance would be preferred. In addition, face-to-face
meetings could be arranged for those educators who require corrective feed-
back such as that provided in this study. This training sequence allocates
resources to those educators or direct service personnel most in need of
support and is consistent with our goal of maximizing outcomes with limited
resource expenditures. Future research could evaluate the effectiveness of
this particular training sequence.
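The proposed follow-up workflow could be partially automated. The sketch below is purely illustrative (the study scored reliability by hand; the field names and the 90% criterion are our assumptions, not the study's): it computes item-by-item agreement between an educator's submitted A-B-C entries and a trainer-scored key, then routes the educator to e-mailed praise or face-to-face corrective feedback.

```python
# Hypothetical sketch: score agreement between an educator's A-B-C entries
# and a trainer-scored key, then route follow-up accordingly.
# Field names and the 90% criterion are illustrative, not from the study.

FIELDS = ("antecedent", "behavior", "consequence")

def agreement(educator_entries, key_entries):
    """Percentage of A-B-C items scored identically to the trainer's key."""
    matches = 0
    total = 0
    for educator, key in zip(educator_entries, key_entries):
        for field in FIELDS:
            total += 1
            if educator.get(field) == key.get(field):
                matches += 1
    return 100.0 * matches / total if total else 0.0

def follow_up(score, criterion=90.0):
    """E-mailed praise at criterion; otherwise a face-to-face meeting."""
    return "send praise e-mail" if score >= criterion else "arrange face-to-face feedback"

educator = [{"antecedent": "demand placed", "behavior": "aggression",
             "consequence": "task removed"}]
key = [{"antecedent": "demand placed", "behavior": "aggression",
        "consequence": "task removed"}]
score = agreement(educator, key)  # 100.0
```

Routing only below-criterion educators to face-to-face meetings is consistent with the goal stated above of reserving trainer time for those most in need of support.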
The mechanism by which task clarification produced changes in perfor-
mance is unknown. The printed training materials could have functioned as
prompts to evoke accurate data recording during video viewing. We did
not evaluate the extent to which participants used these materials over
time, and it may be the case that they relied on them less throughout
the intervention (i.e., prompt fading). Alternatively, the printed materials
might have facilitated self-managed feedback, allowing participants to mon-
itor their own behavior without the immediate oversight of a supervisor.
Either explanation is plausible and suggests that future research is neces-
sary to better understand the mechanism by which changes in performance
took place. This information could inform ways to enhance training materi-
als and evaluate their efficacy with other types of data collection activities
(e.g., closed-ended descriptive data recording, recording frequency or duration
of target behavior), employment settings (e.g., single building with
on-site supervisor vs. multiple buildings), employment conditions (e.g., high
vs. low supervisor-to-staff ratios), and other variables.
Despite these findings, a number of limitations exist and should be
addressed in future research. Although all of the participants had similar
background experiences with functional assessment, the length of time
between their participation in didactic instruction and the start of the study
varied. Didactic training took place during the participants’ new-hire orien-
tation. Two of the participants had received their initial training within 2 to
3 months of the study, and the remaining participants had been employed for
9 to 10 months. Although baseline performance was similar across all of the
participants, these time differences may have influenced outcomes in ways
we did not directly measure. In addition, our study focused exclusively on
narrative data recording and did not assess the accuracy of structured data
recording formats. The latter are commonly used in educational and clini-
cal settings, and, unfortunately, the generalizability of the present findings
to structured forms is limited. An important area of study is extending these
findings to other types of data collection techniques and FBA procedures that
direct service personnel complete. Finally, this study did not assess the per-
formance of direct service personnel during live observation while managing
other responsibilities. We attempted to replicate and extend a portion of the
study procedures of Lerman and colleagues (2009), who maximized control
by collecting data under ideal conditions. Similar to their study, our findings
represent optimal accuracy and might overestimate what is attainable in real-world
settings. Thus, future research might wish to examine the generalization of
gains from ideal research conditions to applied educational settings.
Notwithstanding these limitations, the present findings offer a viable,
resource-efficient procedure to train educators and direct service personnel
to complete descriptive analysis data collection using narrative recording
formats. These results are promising in light of the role these individuals
frequently have in the functional assessment process. Descriptive analysis,
a type of FBA, is effective in informing intervention design only if the
gathered data accurately capture relevant events and behaviors. Efficient yet
effective techniques to improve the performance of direct service personnel
are valuable for settings with limited resources and represent a worthwhile
area of study.

REFERENCES

Allday, R. A., Nelson, J. R., & Russel, C. S. (2011). Classroom-based functional


behavioral assessment: Does the literature support high fidelity implementation?
Journal of Disability Policy Studies, 22, 140–149. doi:10.1177/1044207311399380
Alter, P. J., Conroy, M. A., Mancil, G. R., & Haydon, T. (2008). A comparison of
functional behavior assessment methodologies with young children: Descriptive
methods and functional analysis. Journal of Behavioral Education, 17, 200–219.
doi:10.1007/s10864-008-9064-3
Alvero, A. M., Bucklin, B. R., & Austin, J. (2001). An objective review of the effectiveness
and essential characteristics of performance feedback in organizational
settings (1985-1998). Journal of Organizational Behavior Management, 21,
3–29. doi:10.1300/J075v21n01_02
Arndorfer, R. E., Miltenberger, R. G., Woster, S. H., Rortvedt, A. K., & Gaffaney,
T. (1994). Home-based descriptive and experimental analysis of problem
behaviors in children. Topics in Early Childhood Special Education, 14, 64–87.
Bass, R. F. (1987). Computer-assisted observer training. Journal of Applied Behavior
Analysis, 20, 83–88. doi:10.1901/jaba.1987.20-83
DiGennaro Reed, F. D., & Codding, R. S. (2011). Intervention integrity assessment.
In J. Luiselli (Ed.), Teaching and behavior support for children and adults with
autism spectrum disorder: A “how to” practitioner’s guide (pp. 38–47). New
York, NY: Oxford University Press.
Doggett, A. R., Edwards, R. P., Moore, J. W., Tingstrom, D. H., & Wilczynski, S. M.
(2001). An approach to functional assessment in general education classroom
settings. School Psychology Review, 30, 313–328.
Eckert, T. L., Martens, B. K., & DiGennaro, F. D. (2005). Describing antecedent-
behavior-consequence relations using conditional probabilities and the general
operant contingency space: A preliminary investigation. School Psychology
Review, 34, 520–528.
Ellingson, S. A., Miltenberger, R. G., Stricker, J., Galensky, T. L., & Garlinghouse,
M. (2000). Functional assessment and intervention for challenging behaviors
in the classroom by general classroom teachers. Journal of Positive Behavior
Interventions, 2, 85–97. doi:10.1177/109830070000200202
Evans, I. M., & Meyer, L. H. (1985). An educative approach to behavior problems: A
practical decision model for interventions with severely handicapped learners.
Baltimore, MD: Brookes.
Frea, W. D., & Hepburn, S. L. (1999). Teaching parents of children with autism
to perform functional assessments to plan interventions for extremely dis-
ruptive behaviors. Journal of Positive Behavior Interventions, 1, 112–122.
doi:10.1177/109830079900100205
Herbert, E. W., & Baer, D. M. (1972). Training parents as behavior modifiers: Self-
recording of contingent attention. Journal of Applied Behavior Analysis, 5,
139–149.
Jahr, E. (1998). Current issues in staff training. Research in Developmental
Disabilities, 19, 73–87. doi:10.1016/S0891-4222(97)00030-9
Kern, L., Gallagher, P., Starosta, K., Hickman, W., & George, M. (2006). Longitudinal
outcomes of functional behavioral assessment-based intervention. Journal of
Positive Behavior Interventions, 8, 67–78. doi:10.1177/10983007060080020501
Larson, P. J., & Maag, J. W. (1998). Applying functional assessment in general
education classrooms: Issues and recommendations. Remedial and Special Education,
19, 338–349.
Lerman, D. C., Hovanetz, A., Strobel, M., & Tetreault, A. (2009). Accuracy of
teacher-collected descriptive analysis data: A comparison of narrative and
structured recording formats. Journal of Behavioral Education, 18, 157–172.
doi:10.1007/s10864-009-9084-7
Lerman, D. C., & Iwata, B. A. (1993). Descriptive and experimental analysis of vari-
ables maintaining self-injurious behavior. Journal of Applied Behavior Analysis,
26, 293–319.
Maag, J. W., & Larson, P. J. (2004). Training a general education teacher to apply
functional assessment. Education and Treatment of Children, 27, 26–36.
Mace, F. C. (1994). The significance and future of functional analysis methodologies.
Journal of Applied Behavior Analysis, 27, 385–392. doi:10.1901/jaba.1994.27-
385
Martens, B. K., DiGennaro, F. D., Reed, D. D., Szczech, F. M., & Rosenthal, B. D.
(2008). Contingency space analysis: An alternative method for identifying con-
tingent relations from observational data. Journal of Applied Behavior Analysis,
41, 69–81. doi:10.1901/jaba.2008.41-69
Martens, B. K., Witt, J. C., Elliott, S. N., & Darveaux, D. X. (1985). Teacher
judgments concerning the acceptability of school-based interventions. Professional
Psychology: Research & Practice, 16, 191–198.
McNeill, S. L., Watson, T. S., Henington, C., & Meeks, C. (2002). The effects of
training parents in functional behavior assessment on problem identification,
problem analysis, and intervention design. Behavior Modification, 26, 499–515.
doi:10.1177/0145445502026004004
Pence, S. T., Roscoe, E. M., Bourret, J. C., & Ahearn, W. H. (2009). Relative con-
tributions of three descriptive methods: Implications for behavioral assessment.
Journal of Applied Behavior Analysis, 42, 425–446. doi:10.1901/jaba.2009.42-425
Reid, D. H., & Parsons, M. B. (2002). Working with staff to overcome challenging
behavior among people who have severe disabilities: A guide for getting support
plans carried out. Morganton, NC: Habilitative Management Consultants.
Sasso, G. M., Reimers, T. M., Cooper, L. J., Wacker, D. P., Berg, W., Steege, M., . . .
Allaire, A. (1992). Use of descriptive and experimental analyses to identify the
functional properties of aberrant behavior in school settings. Journal of Applied
Behavior Analysis, 25, 809–821. doi:10.1901/jaba.1992.25-809
Scott, T. M., & Kamps, D. M. (2007). The future of functional behavioral assessment
in school settings. Behavioral Disorders, 32, 146–157.
Smith, R. G., Vollmer, T. R., & St. Peter Pipkin, C. (2007). Functional approaches
to assessment and treatment of problem behavior in persons with autism and
related disabilities. In P. Sturmey & A. Fitzer (Eds.), Autism spectrum disorders:
Applied behavior analysis, evidence, and practice (pp. 187–234). Austin, TX:
PRO-ED.
Thompson, R. H., & Iwata, B. A. (2007). A comparison of outcomes from descrip-
tive and functional analyses of problem behavior. Journal of Applied Behavior
Analysis, 40, 333–338. doi:10.1901/jaba.2007.56-06
APPENDIX A

Narrative Recording Form (A-B-C Chart)


Student: __________________ Residence: __________________
Target behavior and operational definition:

Date/time/location Antecedent Behavior Consequences

APPENDIX B
Supplemental Training Materials for Narrative Recording Form (A-B-C
Chart)
Date, Time, Location
● Write the date
● Indicate the time (within ±10 minutes of the behavior occurring)
● Write the location
● Specify where the behavior occurred (setting)
◦ Examples: captain’s seat of Smith Road van; upstairs living room at Brick
Road Residence; Shaws in Saugus
◦ Nonexamples: Smith Road van; Brick Road residence
● Write staff initials

Antecedent/Stimulus Events
● Include the staff-to-student ratio in place prior to the student emitting the
behavior
● Include what activity the student was working on prior to emitting the
behavior. This must include whether:
◦ the student was on a break or
◦ the student was working on his or her activity schedule
● Specify the particular task, break activity, or IEP objective that was being
completed
● Indicate any demands placed within 5 min of behavior occurrence
● Indicate if the student was denied access to tangible items of interest within
5 min of behavior occurrence
● Specify environmental variables regarding noise occurring prior to the
student emitting the behavior (include noises being made by the TV, radio,
people talking)
● Specify environmental variables regarding other students’ behavior occur-
ring prior to the student emitting behavior
● If peers were exhibiting problem behaviors, write their initials
● Write the initials of any peers present

Behavior

● Write the general class of behavior (aggression, communication)


● Write the topography of behavior based on the operational definition
(scratch, vocal statement “Break please”)

Consequence

● Write staff member responses to behavior occurrence. Be sure to indicate:


◦ Any verbal interaction that staff had with the student following the
behavior, including any programmatic SDs delivered
◦ Any physical guidance or other interaction that staff had with the student
◦ Whether planned ignoring was instituted. Be sure to note whether or not
any staff looked at the student.
◦ Other responses or interventions delivered by other staff. If this took
place, write their initials and the specific response.
◦ Be sure to indicate any consequence provided even if it is not in the
behavior support plan
● Write whether other staff intervened, describe their response, and indicate
their initials
● Write any verbal comments made by peers and specify the peer’s initials
● Specify if peers looked at the student and indicate the peer’s initials
● Specify if peers emitted problem behaviors in response to student’s
behavior and indicate the peer’s initials
● Write the particular activity student was redirected to complete following
behavior

Note. IEP = individualized education plan. SDs = discriminative stimuli.
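As a hypothetical illustration only (the field names are our invention, not part of the published form), the completeness requirements above could be expressed as a simple check that flags missing components of an A-B-C Chart entry before reliability scoring:

```python
# Hypothetical sketch: flag missing components of an A-B-C Chart entry,
# following the categories in Appendix B. Field names are illustrative.

REQUIRED = {
    "date", "time", "location", "staff_initials",
    "antecedent", "behavior_class", "behavior_topography", "consequence",
}

def missing_components(entry):
    """Return the required components that are absent or blank in an entry."""
    return sorted(f for f in REQUIRED if not str(entry.get(f, "")).strip())

entry = {
    "date": "3/14", "time": "10:05",
    "location": "upstairs living room at Brick Road Residence",
    "staff_initials": "KM", "antecedent": "demand placed during IEP task",
    "behavior_class": "aggression", "behavior_topography": "scratch",
    "consequence": "planned ignoring; no staff eye contact",
}
missing_components(entry)  # [] (entry is complete)
```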
