Measurement, 9: 173–206, 2011
Copyright © Taylor & Francis Group, LLC
ISSN: 1536-6367 print / 1536-6359 online
DOI: 10.1080/15366367.2011.626729

FOCUS ARTICLE

Research on Data Use: A Framework and Analysis


Cynthia E. Coburn
Policy, Organizations, Measurement, and Evaluation, University of California, Berkeley

Erica O. Turner
Educational Policy Studies, University of Wisconsin, Madison

One of the central lessons from research on data use in schools and school districts is that assess-
ments, student tests, and other forms of data are only as good as how they are used. But what
influences how they are used? This relatively straightforward question turns out to be fairly com-
plex to answer. Data use implicates a number of processes, conditions, and contexts. It involves
interpretive processes, as using data requires that the user interpret the data and construct implica-
tions for next steps. It implicates social and organizational conditions, since the data use unfolds in
the context of a multileveled organizational system. And, because data can be a source of power,
particularly in the current accountability environment, data use also involves power relations. In this
article, we put forward a framework for understanding the phenomenon of data use in the context of
data use interventions. We draw on existing research and theory to identify key dimensions of data
use that we should attend to and offer a way to understand how these dimensions might interact. In so
doing, we provide guidance for studying the pathways between data use interventions and various
outcomes of value.
Keywords: conceptual framework, data use, interpretation, power, social context, social interaction

Correspondence should be addressed to Cynthia E. Coburn, 3643 Tolman Hall, Berkeley, CA 94703. E-mail: [email protected]

One of the central lessons from research on data use in schools and school districts is that assess-
ments, student tests, and other forms of data are only as good as how they are used. But what
influences how they are used? Existing research on data use points to a broad array of dimen-
sions that matter for how data use unfolds, ranging from individual factors such as beliefs and
knowledge to organizational and even political ones. To complicate matters further, assessments
and test scores are rarely used in isolation. Increasingly, they arrive in schools as part of mul-
tipronged interventions promoting data use and data driven decision making (Young & Kim,
2010). These interventions vary in scope and complexity, from the provision of professional
development to accompany a new assessment, on the one hand, to comprehensive initiatives, on
the other, that include new assessments, new technology for organizing and displaying results,
the institution of regular data conversations structured by protocols and guided by facilitation,
and rewards and sanctions for performance.
In this article, we put forward a framework for understanding the phenomenon of data use
in the context of data use interventions. Our goal is to provide a way to understand how pieces
of the data use puzzle fit together in order to better illuminate what is involved in data use and
provide conceptual guidance for how we might study it. To that end, we identify key dimensions
of data use that should be attended to and offer a way to understand how these dimensions
might interact. Our framework acknowledges that data use implicates a number of processes,
conditions, and contexts. It involves interpretive processes, as using data requires that the user
interpret the data and construct implications for next steps (Coburn, Toure, & Yamashita, 2009;
Moss, 2007; Spillane & Miele, 2007). It implicates social and organizational conditions, since
the data use unfolds in the context of a multileveled organizational system called public schools,
which enable and constrain the dynamics of interpretation and action (Honig & Venkateswaran,
in press; Little, in press; Spillane, in press). And, because data is so often tightly intertwined
with power, particularly in the current accountability environment, data use involves power and
politics as well (Coburn, Honig, & Stein, 2009; Henig, in press; Stone, 2002).
To build the framework, we draw on existing research and theory to describe the underly-
ing interpretive processes that link data to action and discuss the interrelated organizational and
political contexts within which these processes unfold.1 We then show how data use interventions
interact with these contexts to shape the underlying data use processes in ways that have conse-
quences for student learning and other outcomes. In so doing, we provide guidance for studying
the potential pathways between data use interventions and various outcomes of value. And, we
seek to stimulate discussion between the different research communities that must work together
if we are to develop new knowledge in this area.

1 This article is meant to be a conceptual essay, not a comprehensive literature review. However, we did review a great deal of research on data use. We were aided in our efforts to identify relevant literature by the fact that the production of this essay happened in the context of a series of meetings sponsored by the Spencer Foundation that sought to characterize the existing research base on data use. As part of this initiative, the Spencer Foundation commissioned comprehensive literature reviews by 15 scholars on different aspects of the data use puzzle. We drew on these literature reviews (especially, Henig, in press; Honig & Venkateswaran, in press; Jennings, in press; Koretz & Jennings, 2010; Little, in press; Marsh, in press; Spillane, in press; and Supovitz, 2011), as well as a comprehensive literature review we had recently completed ourselves on data use at the district level (Coburn, Honig, & Stein, 2009) as a starting point. We read and reviewed articles identified by these manuscripts, as well as additional articles found in the reference lists of the articles we read. We also read and reviewed several recent edited volumes on data use (Bransford, Stipek, Vye, Gomez, & Lam, 2009; Earl & Timperley, 2009; Kowalski & Lasley, 2009; Mandinach & Honey, 2008; Moss, 2007) as well as a recent comprehensive review commissioned by the Institute of Education Sciences (Hamilton et al., 2009). These volumes were sources of scholarship on data use to review, and their reference lists provided insight into further resources to pursue. All told, we read and reviewed 161 articles, books, book chapters, and reports. However, while our review of the research was extensive, we do not claim that it was comprehensive. Indeed, it is possible that we missed some articles, especially given the burgeoning interest in this topic. Therefore, claims that we make about the state of the research literature on data use should be viewed with caution.
We begin by providing an overview of our framework. In subsequent sections, we discuss each
component of the framework in greater depth, describing the research and theory upon which it
is based. We close by discussing the implications of this framework for future research.

ORGANIZING THE RESEARCH ON DATA USE: A FRAMEWORK

As data use interventions have proliferated across the country, so too has research on data use.
Yet, the research base is somewhat disorganized. Researchers from disparate traditions focus on
different aspects of the data use phenomenon. They draw on different concepts and language,
sometimes to discuss the same thing. And, they do not always reach out across disciplinary
boundaries and research communities to attend to findings from those in other traditions. As a
result, the growing body of research has identified a large number of factors that influence
data use (see, for example, recent reviews by Jennings, in press; Marsh, in press; Henig, in
press; Honig & Venkateswaran, in press), but provides limited guidance about how these fac-
tors interact. Thus, we know relatively little about the pathways between data use interventions
and outcomes.
Here, we take a first step in addressing this state of affairs. We put forward a framework for
organizing research on data use. The framework, depicted in Figure 1, is intended to identify
key dimensions that we should attend to if we want to understand the process and outcomes of
data use in the context of data use interventions, and provide a way to understand how these
dimensions might interact. We discuss the framework in 2 parts. In this section, we provide a
broad overview of the major components of the framework. We then provide details on each
component in subsequent sections of the article.
At the center of our framework is the process of data use. We define the process of data
use as what actually happens when individuals interact with assessments, test scores, and other
forms of data in the course of their ongoing work.2 Existing research in cognitive and social
psychology suggests that data use is, at the root, an interpretive process that involves noticing
data in the first place, making meaning of it, and constructing implications for action. The process
is fundamentally interactive, influenced by characteristics of the individuals involved and the
dynamics of the social interaction. Throughout our discussion, we are interested in the full range
of data users—not only teachers but also school leaders and district administrators. Data use is a
multilevel phenomenon in education. In order to understand how data use unfolds, it is important
to investigate the process as experienced by individuals at multiple levels of the system.

2 In other work, we have referred to this construct as “the practice of data use” (Coburn & Turner, in press). Here, we refer to it as “the process of data use” to help distinguish individuals’ engagement with data from the change in practice that may result.
The process of data use is shaped in significant ways by the organizational and political
context of schools and districts, represented by the outer circle in Figure 1. We identify key
dimensions of context that span from proximal to distal. At the most proximal level, data use
routines structure who teachers and others interact with, around what data, in what ways. These
routines are influenced by the configuration of time, access to data, and organizational and
occupational norms that guide interaction. Leadership plays a role in all these organizational
dimensions. Finally, these dimensions of context are intertwined and influenced by relations of
power and authority.

FIGURE 1 Framework for data use.
Interventions to promote data use interact with dimensions of the organizational and political
context as they attempt to alter teachers’, school leaders’, or district leaders’ use of data in their
on-going work.3 As shown at the top left of Figure 1, we identify 3 categories of interventions:
(a) tools to promote data use, such as protocols for data analysis, processes for collecting obser-
vational data, and formative assessment systems; (b) comprehensive data initiatives that bring
together multiple tools, processes, and technology and strive for systemic improvement; and (c)
high-profile policy initiatives that promote data use, most notably state and federal accountability
policies. Features of these interventions interact with contexts and shape the process of data use.

3 In focusing on data use interventions, this article focuses on planned or intentional change efforts. We acknowledge, however, that some data use may be unplanned or emergent.
The final component of the framework is potential outcomes, represented at the bottom right
of Figure 1. Advocates promoting data use tout a number of benefits that schools and school dis-
tricts will realize if they engage in data use activities. Researchers, in turn, investigate the impact
of data processes and data interventions on a similarly wide range of outcomes. Here, we identify
3 potentially interrelated outcomes of data use: (a) outcomes related to student learning; (b) those
related to changes in teacher and administrative practice; and, (c) those related to organizational
or systemic change.
In the text below, we discuss each component of the framework in more detail. We draw on
existing research and theory to provide definitions of the key dimensions we discuss and suggest
ways to understand the connections between them. In so doing, we begin to outline potential
pathways between intervention and outcomes. We close with a discussion of the implications of
our analysis for future research on data use.
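
As an illustrative aside that is not part of the original article, the framework’s major components can be sketched as a minimal data model. All class and field names below are hypothetical labels of our own for the dimensions the text names: the three categories of interventions, the dimensions of organizational and political context, the interpretive moves at the heart of the data use process, and the three categories of outcomes.

```python
# Illustrative sketch only: a minimal data model of the framework's components.
# All names are hypothetical labels, not terms from Coburn and Turner's article.
from dataclasses import dataclass
from enum import Enum
from typing import List

class InterventionType(Enum):
    TOOL = "tool"                                   # e.g., protocols, formative assessments
    COMPREHENSIVE_INITIATIVE = "initiative"         # multiple tools + PD + technology
    ACCOUNTABILITY_POLICY = "policy"                # rewards and sanctions tied to data

class InterpretiveMove(Enum):
    NOTICING = "noticing data"
    INTERPRETING = "making meaning of data"
    CONSTRUCTING_IMPLICATIONS = "constructing implications for action"

@dataclass
class Context:
    """Organizational and political context (the outer circle of Figure 1)."""
    data_use_routines: List[str]        # who interacts with whom, around what data
    time_allocated: bool                # whether time for data use is configured
    accessible_data: List[str]          # which data reach which people, and when
    norms_of_interaction: List[str]     # occupational and local norms
    leadership_roles: List[str]         # designing routines, allocating time, etc.
    power_authority_relations: List[str]

@dataclass
class DataUseProcess:
    """What happens when individuals interact with data in their ongoing work."""
    participants: List[str]             # teachers, school leaders, district staff
    data_in_focus: List[str]
    moves: List[InterpretiveMove]       # shaped by beliefs, knowledge, motivation,
                                        # and by patterns of social interaction

class OutcomeCategory(Enum):
    STUDENT_LEARNING = "student learning"
    PRACTICE_CHANGE = "teacher and administrative practice"
    ORGANIZATIONAL_CHANGE = "organizational or systemic change"
```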

THE PROCESS OF DATA USE

One of the main lessons from research on the process of data use is the central role of interpreta-
tion. Data does not speak for itself. Rather, people must actively make meaning of the data and
construct implications for action. In order to understand how test scores, assessments, and other
forms of data are used, it is necessary to understand these interpretative processes. In this section,
we turn to the center of Figure 1 to investigate what we know about data use processes. We ask,
What is the nature of data use processes?
We argue that interpretation is a central part of the data use process, playing a role in
how individuals notice data in the first place, how they make meaning of it, and how they
come to understandings about the implications of the data for action. It is these understand-
ings about implications for action that, once deliberated and debated, guide decision making and
action. Interpretive processes—noticing, interpreting, and constructing implications for action—
are shaped by individual beliefs, knowledge, and motivation and are influenced by the nature and
patterns of social interaction.

Noticing, Interpreting, and Constructing Implications for Action

The process of data use, and the role of interpretation, begins as individuals or groups notice the
data or patterns in the data in the first place. Existing research suggests that individuals routinely
fail to attend to key pieces of information or major patterns in data. Attention is partial and filtered
in significant ways (Spillane & Miele, 2007). People tend to search for and see aspects of the data
that support their beliefs, assumptions, and experiences and do not even notice data that might
contradict or challenge these beliefs (Bickel & Cooley, 1985; David, 1981; Hannaway, 1989;
Ingram, Louis, & Schroeder, 2004; Kennedy, 1982; Young & Kim, 2010). This phenomenon may
be exacerbated during conditions of data overload that many schools and districts are currently
experiencing. Under these conditions, individuals often narrow the range of information they
search for and pay attention to because they simply cannot attend to it all given real limits of
their time and attention (Honig, 2003).
But noticing data is only a first step. Individuals must also interpret the test scores or assess-
ment results; that is, they must construct an understanding of what the data mean. Is the test
score high or low? What is the evidence of student learning represented in the work? To what
do we attribute this performance? Is the data, assessment, or test score valid? Individuals make
these interpretations by fitting new information into preexisting beliefs or cognitive frameworks
(Spillane & Miele, 2007; Weick, 1995). New information is always understood through the lens
of what we already know and believe (Greeno, Collins, & Resnick, 1996), influencing how data
is encoded, organized, and interpreted (Spillane & Miele, 2007). Cognitive psychologists remind
us that it is far more likely that we will assimilate information into pre-existing ways of seeing
the world than engage with data in ways that cause us to reconfigure or “accommodate” existing
cognitive frameworks in light of new information (see Spillane & Miele, 2007, for a review of
this literature). Indeed, there are ample instances in the research on data use where individuals
interpret test scores or assessments as confirming pre-existing beliefs and discounting the data
when they challenge these beliefs (Coburn, 2001; Coburn et al., 2009; David, 1981; Young &
Kim, 2010). For example, in her study of the use of Title I evaluations in 15 districts, David
(1981) found that district administrators consistently discounted evaluations that challenged their
perceptions of the programs, questioning their validity, the appropriateness of the methodology
and measures, and the degree to which the evaluations measured valued outcomes.
Interpretive processes are also front and center as individuals construct implications for
action. When teachers decide the implication of the data is that they should modify their instruc-
tion, regroup students based on assessments, or not change anything because the problem has
to do with the construction of the test or children’s home life, interpretation plays a central role.
When school or district administrators decide the data imply that they should allocate resources in
a new way, develop new policies or programs, or maintain the current course of action, interpretation
also plays a central role. In connecting the data with a response, individuals link together a series
of premises into an argument for a particular direction to pursue (Phillips, 2007). Bridging the
space between data and a response involves a series of assumptions, conjectures, and judgments
(Coburn et al., 2009; Kennedy, 1982; Phillips, 2007). These assumptions and judgments tend to
be rooted in one’s prior beliefs and experiences. For example, in a longitudinal study of data use
at the district central office, Coburn and colleagues (2009) found that those in the district who
had beliefs about mathematics consistent with the standards-based curriculum attributed low test
scores to lack of professional development for teachers and argued for increased resources for
teacher professional development. Those in the district who favored more traditional approaches
to mathematics instruction attributed the low test scores to a lack of attention to basic skills in the
curriculum and argued that the district should adopt a supplementary curriculum to provide stu-
dents practice with math facts. It is individuals’ construction of implications for action that then
informs what they do in response to data. These implications also become fodder for deliberation,
negotiation, and debate at the heart of decision making.

Beliefs, Knowledge, and Motivation

As should be clear from the foregoing discussion, individuals’ beliefs play an important role in
data use processes. There is quite a bit of evidence that beliefs play a central role in noticing data
(Bickel & Cooley, 1985; David, 1981; Hannaway, 1989; Kennedy, 1982, 1984; West & Rhoton,
1994), interpreting data (Bickel & Cooley, 1985; Coburn et al., 2009; Cromey, 2000; David,
1981; Hallett, 2010; Herman & Gribbons, 2001; Ikemoto & Marsh, 2007; Ingram et al., 2004;
Kerr, Marsh, Ikemoto, & Barney, 2006; Mintrop, 2004; Young & Kim, 2010), and in constructing
implications for action (Coburn et al., 2009; Kennedy, 1982; Spillane, in press). For example,
teachers often perceive standardized test data or interim assessments as lacking either validity
or usefulness for making decisions about student learning or teacher effectiveness. These beliefs
shape what data teachers seek out and notice when they make instructional decisions, for exam-
ple, attending more closely to student work or student behavior as indicators of student learning
(Cromey, 2000; Ingram et al., 2004; Kerr et al., 2006).
Knowledge may also play a role. Teachers’ and district leaders’ interpretation of data can be
problematic when they lack substantive knowledge of the subject matter relevant to the decision
(Coburn et al., 2009; Hubbard, 2010; Little, in press; Little & Curry, 2009; Timperley, 2009).
For example, in a study of 7 schools involved in a major reading initiative, Timperley (2009)
analyzed transcripts of teachers’ data meetings and found that teachers lacked sufficient peda-
gogical content knowledge to draw strong inferences from test score data and identify potential
instructional choices in a rigorous way. Knowledge of data analysis is also important, as it has
the potential to help data users identify needed data and draw inferences with data in appro-
priate ways. Yet many researchers report that teachers, school administrators, and others have
limited knowledge of the mechanics of data analysis, including how to ask questions, select data
to answer these questions, use technology to manipulate data, and draw valid interpretations of
the data (Cromey, 2000; Feldman & Tung, 2001; Kerr et al., 2006; Mason, 2002; Marsh, Pane, &
Hamilton, 2006; Means, Padilla, DeBarger, & Bakia, 2009; Supovitz & Klein, 2003).
Finally, theoretical work suggests that motivation influences how individuals engage with and
interpret data. Research in psychology suggests that individuals have strong motivation to
maintain a positive self-image. This self-affirmation bias may lead to a tendency to discount
evidence that raises questions about the efficacy of past practices or performance. At the same
time, Edwards and Smith (1975) argue that motivation to reach certain goals may lead to greater
efforts to puzzle through undesirable evidence, rather than the more typical response of dis-
counting it (as cited in Spillane, Reiser, & Reimer, 2002). In his now-classic study, Lortie (1975)
argues that teachers’ inclination to attend selectively to evidence of effectiveness (for exam-
ple, the turn-around student) rather than whole-class patterns enables them to preserve the
“psychic rewards” of teaching.

Social Interaction

Finally, data use in schools and school systems is rarely an individual endeavor. Rather, it tends
to happen in social interaction and negotiation with colleagues (Halverson, Grigg, Prichett, &
Thomas, 2007; Means, Gallagher, & Padilla, 2007; Means et al., 2009; Spillane, in press). Even
teachers, who typically work alone in their classrooms, interact with children, their colleagues,
and coaches and school leaders around data in ongoing ways. This means that interpretive
processes—noticing data, interpreting it, and constructing implications for action—are typically
influenced by interaction and negotiation with others.
Who one interacts with matters. Individuals come to the table with a variety of beliefs, knowl-
edge, or motivations (Coburn & Talbert, 2006; Coburn et al., 2009; Spillane, in press). They
also bring different ideas and information to inform deliberation and debate (Honig, Copland,
Rainey, Lorton, & Newton, 2010). While interactions may lead to the development of shared
understandings (Kennedy, 1982), groups made up of individuals with contrasting beliefs and
knowledge can also notice different data (Spillane, in press), come to different interpretations of
the same data (Coburn, 2001; Coburn et al., 2009; Hallett, 2010), or construct different implica-
tions for action (Coburn et al., 2009; Spillane, in press). For example, Spillane (in press) draws
on data on principal-teacher interaction in elementary schools in Chicago to demonstrate how
the principal had a substantively different interpretation of the meaning and implications of test
score data than teachers, leading to conflict and debate. In fact, settings where individuals interact
across departments or roles are more likely to involve conflicting ideas about appropriate inter-
pretations of data and about appropriate responses than settings where individuals interact within
departments or roles (Coburn et al., 2009).
Taken together, this research suggests that while assessments, tests, observations, evalua-
tions, and numerous other sorts of data provide information to people at various levels of
the system, how these individuals use this information depends centrally on how they notice,
interpret, and construct an understanding of the implications of data for action. Interpretive
processes—noticing, interpreting, and constructing implications—are influenced by individual
beliefs, knowledge, and (at least theoretically) motivation. But they are also influenced by
patterns of social interaction.

ORGANIZATIONAL AND POLITICAL CONTEXT OF DATA USE

A second key lesson from the research on data use is that the process of data use is shaped
in significant ways by the organizational and political contexts in which it takes place. The
organizational and political contexts for public schools are quite complex. Public schools are
a multilevel system, with the expectation that data processes unfold at multiple levels simultane-
ously (Honig & Venkateswaran, in press). Data processes also exist in, and are intertwined with,
a highly politicized environment, with multiple constituencies to serve and multiple goals. Here,
we return to Figure 1, focusing attention on the outer circle. We ask, What are the organizational
and political contexts that matter for data use? How do these contexts influence how the process
of data use unfolds?
Existing research identifies numerous dimensions of the organizational and political context
that matter for data use. Here, our goal is not to be comprehensive but rather to illuminate how
some of these different dimensions interact to shape the process of data use. We also seek to
move beyond a list of contextual conditions, to begin to specify the relationship between these
contextual conditions on the one hand and the process of data use on the other.
We begin at the most proximal level, discussing how organizational routines guide who inter-
acts with whom around what data in what ways. We then show how the configuration of time,
access to data, and organizational and occupational norms influence what data people even notice
and the dynamics by which they interact in data use routines. We argue that school and district
leadership plays a role by influencing each of these other dimensions of context. Finally, we
illustrate how these organizational dynamics are intertwined with and influenced by relations of
power and authority.4

4 Although we do not discuss these here, other aspects of the organizational and political context that may influence data use include the following: formal positions and roles, such as data coaches or reading coaches (Lachat & Smith, 2005; Marsh et al., 2009); the hierarchical and differentiated organization of the school district central office and overall schooling system (Coburn et al., 2009; Thorn, Meyer, & Gamoran, 2007); rules and policies, such as specifications of how teachers use their non-instructional time in union contracts (Marsh et al., 2006); external organizations and actors (Burch & Hayes, 2009; Datnow & Honig, 2008); and trust (Herman & Gribbons, 2001; Ikemoto & Marsh, 2007; Ingram et al., 2004).

Data Use Routines

Though often taken for granted, data use routines can play a significant but subtle role in how the
process of data use unfolds. An organizational routine is a “repetitive, recognizable pattern of
interdependent actions, involving multiple actors” (Feldman & Pentland, 2003, p. 95). We define
“routines for data use” as the modal ways that people interact with data and each other in the
course of their ongoing work. Data use routines may be informal such as when a superintendent
regularly asks for reports from the director of assessment that she then peruses with members of
her cabinet or when principals draw on spending data in their quarterly meetings with the school
site council. Or they can be highly designed and structured, as is sometimes the case with grade-
level meetings for teachers that are guided by protocols and facilitated by a school coach or the
principal. Data use routines can be designed or naturally occurring and evolving. The defining
criterion for a data use routine is that it is a recurrent and patterned interaction that guides how
people engage with each other and data in the course of their work.
Existing research suggests that data use routines are a key context for data use because they
“fram[e] and focus interactions among school staff” (Spillane, in press, p. 4). They do so by
bringing a particular configuration of people together around a particular set of data and structuring
their interactions in specific ways. First, how a routine is configured, whether by design or in its
naturally occurring form, organizes who is in the room for data conversations. In some schools,
teachers primarily look at data in their grade-level groups or departments. In others, data use
routines bring teachers together as a whole school. At the central office, data use routines may
involve the superintendent and her cabinet or may mainly happen within units in the district
office. Some data use routines, such as those documented by Honig and her colleagues (2010) and
by Supovitz (2006), bring together individuals from the central office with people from schools.
The configuration of people matters because, as we have discussed, different people come
to the table with different beliefs and knowledge, which shapes how they interpret data and the
level and kind of negotiations they have over the implications of the data for action (Coburn
et al., 2009; Spillane, Parise, & Sherer, 2011). Thus, to the degree that routines influence patterns
of interaction, they are likely to influence the interpretive process—noticing data, interpreting it,
and constructing implications for action—as well.
Second, routines focus attention and thus shape what people notice and discuss. Routines are often
centered on a specific kind of data. For example, many data use routines bring teachers or others
together to examine standardized test scores (Marsh, in press). However, it is possible that data
use routines can focus on other forms of data, such as student work (Gearhart & Osmundson,
2009; Gearhart et al., 2006; Little, Gearhart, Curry, & Kafka, 2003), records of practice (Horn &
Little, 2010), or evidence from observations or experience (Honig et al., 2010; Ikemoto & Honig,
2010). Depending upon how a routine is configured, participants spend their time looking at
some data and not others, about some subjects and not others, related to some aspects of student
learning and not others (Spillane et al., 2011).

Routines also influence how teachers and others talk with one another in social interaction.
They can alternatively open up or close down opportunities for learning, shaping opportunities
to notice data and the nature of joint interpretation (Horn & Little, 2010). For example, Horn
and Little (2010) document how a routine of “normalizing,” or defining a classroom problem as
normal, closed off conversation, preventing teachers from delving deeper into the causes of the
issue raised by evidence from the classroom. As this example suggests, while conversations in
data use routines can spur action or change, groups may also come to interpret data and implica-
tions for action in ways that maintain the status quo. Ultimately, in bringing together people and
focusing and framing their attention, routines for data use are a consequential context for how
the process of data use unfolds.

Time, Access to Data, and Norms Influence Routines

Other dimensions of the organizational context, in turn, influence how data use routines unfold.
Here, we argue that the configuration of time, access to data, and norms of interaction influence
what data people even notice and the dynamics by which they interact in data use routines.

Time

Time is a central element in how interaction around data is organized. It takes time to collect
and analyze data and collectively debate implications for decision making. For teachers, prin-
cipals, and district leaders, time for data use is in short supply (Honig et al., 2010; Ikemoto &
Marsh, 2007; Ingram et al., 2004; Little et al., 2003; Marsh et al., 2006; Means et al., 2007,
2009; Weinbaum, 2009). Theoretical work and some preliminary research at the district level
suggest that the quality of decisions degrades as resources to support decision making decline.
In the absence of time to debate conflicting interpretations of data and search for and evaluate
different solutions, decision making gets increasingly drawn out, unresolved, and conservative
(Coburn et al., 2009; Cohen, March, & Olsen, 1988).

Availability of Data

As discussed earlier, the configuration of data routines draws teachers’ attention to some
data and not other data. But, this is predicated on systems and structures that bring data into
conversations in the first place. Thus, the availability of data matters for how routines unfold.
Organizations collect certain kinds of data and not others. This data is available to some peo-
ple and not to others. Data is available on a range of different time scales—some immediately,
some not until months later. As scholars point out, what data is available to whom and when is
partially a function of the technological infrastructure for data collection, storage, and retrieval
(Lachat & Smith, 2005; Marsh et al., 2006; Means et al., 2009; Means, Padilla, & Gallagher,
2010; Thorn, 2001; Wayman, 2007; Wayman, Conoly, Gasko, & Stringfield, 2008; Wayman,
Stringfield & Yakimowski, 2004). But it is also a function of the human infrastructure: How
individuals in different parts of the organization are connected to each other shapes the flow
of information (Coburn, 2010; Daly & Finnigan, in press; Honig, 2006). For example, Honig
(2006) shows that one of the main roles of district administrators who were working directly
with school-community partnership sites was to bring information about these sites’ needs and
implementation efforts to the attention of district leaders in decision making roles.

Norms of Interaction

Occupational and organization-specific norms further guide interaction within data use rou-
tines. At the most macro level, occupational norms of privacy in teaching work against teachers
sharing their practice with their colleagues (Little, 1990; Lortie, 1975; Marsh et al., 2006). Even
as more and more intentional routines are designed to bring teachers together to share their prac-
tice in discussions of data, norms of privacy leave the conversation at the level of the superficial,
such that it is unusual for teachers to talk in depth about their practice and share evidence of
student learning with their colleagues (Little, 2007; Little et al., 2003). Some schools do develop
local norms of inquiry or collaboration. In these schools, teachers are more likely to use data to
support joint problem solving (Ikemoto & Marsh, 2007; Little, 2007; Symonds, 2004; Young,
2008). Schools with norms that enable teachers to share data about their classroom practice
openly, critique one another, or ask challenging questions are more likely to have conversa-
tions that delve more deeply into issues of instruction and student learning (Little et al., 2003;
McDougall, Saunders, & Goldenberg, 2007; Timperley, 2009).

Leadership

School and district leaders play a role by influencing each of these aspects of context: designing
routines in the first place, allocating time, creating access to data, fostering norms of interaction,
and participating themselves in data use routines. School or district leaders may select or design
data use routines (Honig et al., 2010; Spillane et al., 2011; Supovitz & Klein, 2003). The choices
they make about how the routine is designed have consequences for who is involved, how and
how often they interact, and around what data (Sherer & Spillane, 2011; Spillane et al., 2011).
For example, in their report on data use in 5 schools engaged in comprehensive school reform,
Supovitz and Klein (2003) found that school principals and other leaders created a number of
“innovative activities” that guided teachers’ engagement with data and each other. One princi-
pal developed a routine of meeting individually with 4th grade teachers to plan how to move
students to the next level in the high-stakes tests. In preparation, the principal assembled the
previous year’s test results and used this data to determine the instructional efforts she believed
each teacher should take. The principal and each teacher then used this data and the principal’s
analyses to plan lesson sequences for the year. In developing this routine, this principal made
important choices about what data to use, how some of the data would be analyzed, who would
participate in the discussions, and what role the teachers would play in this data use routine.
School and district leaders also configure time, enabling or constraining teachers’ and others’
ability to engage in data use routines regularly or for extended periods of time (Coburn & Russell,
2008; Halverson et al., 2007; Young, 2008). Leaders, especially at the district level, make deci-
sions about who gets access to what data. They filter large masses of data, selecting what data
gets sent to schools and sometimes presenting the data in particular formats. This serves to focus
attention, guiding conversation and debate (Halverson et al., 2007; Marsh et al., 2006; Thorn,
Meyer, & Gamoran, 2007). School and district leaders also play a key role in establishing norms
of interaction. They can create a climate of trust and risk taking in schools, which enables teach-
ers and others to share more freely and take the risks necessary to change their practice (Bryk
& Schneider, 2002; Copland, 2003; Ikemoto & Marsh, 2007; Wayman & Stringfield, 2006).
However, they can also use data to create a climate of fear and turmoil (Hallett, 2010). Leaders
also may foster norms that establish data use as part of “the way we do things” at a school or dis-
trict, leading to the development of more frequent or widespread data routines (Lachat & Smith,
2005).
Finally, school and district leaders play a particularly important role when they participate
in data use routines themselves. School leaders’ questions, guidance, and statements can focus
discussions about data in important ways, shaping how others notice and interpret, as well as
the substance of the debate (Copland, Knapp, & Swinnerton, 2009; Earl, 2009; Halverson et al.,
2007; Lasky, Schaffer, & Hopkins, 2009; Spillane, in press; Symonds, 2004; Young, 2008). For
example, Lasky and her colleagues (2009) found that in the context of data routines, school lead-
ers’ prompts pointed teachers to procedural rather than substantive issues and, at times, diverted
teachers’ attention from the data altogether. But, other studies document school leaders’ efforts
to keep teachers focused on student learning through repeated questioning and facilitation in the
context of data use routines (Earl, 2009; Symonds, 2004). For example, Symonds (2004) found
that school leaders in schools that were successful at closing the achievement gap between White
and Asian students on the one hand and African American and Latino students on the other were
more likely to focus teachers’ attention on the achievement gap than leaders in schools where the
achievement gap remained stable or increased.

Relations of Power and Authority

Finally, relations of power and authority—between schools and communities, schools and dis-
tricts, and teachers and administrators—play a role in how data use processes unfold. Power, and
the political pressure it feeds, is a near-omnipresent characteristic of the context of public schools.
Public schools are in the public domain. Multiple interest groups inside and outside the district
with different stakes and, at times, different values pressure district and school administrators to
pay attention to certain data and to make particular decisions. In spite of the fact that data use is
often positioned as the antidote to overly politicized decision making at the school and district
level, there is evidence that politics is deeply intertwined with data use processes.
As Henig (in press) reminds us, information is power. The public release of data is intended to
and often does reshape power relations between schools and their communities (Henig, in press;
McDonnell, 2004). Community actors can and often do use performance data to push for changes
at the school and district level, creating greater power for their positions and for themselves.
Data use can also reshape power relations within schools and districts. Indeed, one purpose of
data use, especially as part of accountability policy, is to create better monitoring over classroom
instruction. Thus, data use routines—between districts and schools and within schools—may be,
in part, mechanisms of managerial control (Hallett, 2010; Henig, in press; Spillane, in press).
While data can influence power relations, power relations can also influence data use (Henig,
in press). More specifically, power relations can influence what data one notices as the very
decision to seek further data can emerge in the midst of controversial issues or from political
motivations (Englert, Kean, & Scribner, 1977; Kennedy, 1982, 1984). For example, Kennedy
(1982) recounts how political controversy relating to personnel matters brought a long-standing
program to the attention of district staff in 1 of the 16 districts in her study. In the course of
addressing the personnel issue, staff noticed and attended to previously “dormant” data. Thus,
political processes shifted notions of what was important to pay attention to, which in turn raised
the profile of certain kinds of data and not others.
Finally, relations of authority matter as well. Authority is power that comes with a particular
role or position in an organization and can be exercised by any person holding that position (Scott
& Davis, 2007). Research on data use suggests that people with different levels of authority
have differential influence in the negotiation about the meaning and implications of data. For
example, in a study of instructional decision making among district administrators, Coburn and
her colleagues found that when there were differences in how individuals interpreted the data
and its implications for action, those with authority nearly always prevailed. This suggests that
authority plays an important role in the interpretive process and, thus, in how data use unfolds in
social interaction (Coburn, 2005; Coburn, Bae, & Turner, 2008; Spillane, in press).
Taken together, the existing research provides evidence that interpretive processes unfold in
and are influenced by a multilevel organizational and political context. At the most proximal
level are data use routines that guide who interacts with whom around what data in what ways.
Data use routines, in turn, are influenced by the configuration of time, access to data, norms
of interaction, and school and district leadership. Finally, relations of power and authority are
important as well, shaping the dynamics of interaction within which interpretation, deliberation,
and debate unfold.

INTERVENTIONS TO PROMOTE DATA USE

A third key lesson from research on data use is that the nature of the intervention matters for
how it interacts with contexts and shapes interpretive processes. There are currently a significant
number of interventions to promote data use in schools and districts across the country. These
interventions vary substantially. They can be as modest as a single protocol to guide conversa-
tion or as elaborate as a system of regular, interim assessments, supported by new technology to
promote access to data, professional development to support interpretation, and requirements for
weekly or biweekly data conversations among teachers and other staff. Further, accountability
policy plays an increasingly important role, adding rewards, sanctions, and a lot of public atten-
tion to data into the mix. What these diverse interventions share is the intent to alter teachers’,
school leaders’, and/or district leaders’ use of data in their ongoing work.
Here, we turn to the upper left corner of Figure 1, asking, How do interventions promoting
data use interact with the organizational and political contexts? How do they shape the process of
data use? We draw on existing research to illustrate how features of data use interventions shape
political and organizational contexts and the process of data use in intentional and unintentional
ways. We argue that understanding these linkages is critical as it can provide a foundation for
understanding the mechanisms by which interventions produce outcomes that matter, a topic to
which we return in the final section.
We begin by outlining 3 different types of data use interventions that we attend to in this
review. We then review how different features of interventions influence the context and process
of data use.

Types of Interventions

We identify 3 categories of data use interventions that move from targeted to multifaceted.
First, there exists a raft of tools intended to foster data use. Tools are “externalized represen-
tations of ideas used by practitioners in their practice (Norman, 1988), which serve as mediating
devices that are used to shape action in certain ways” (Sherer & Spillane, 2011, p. 616). Rather
than dictating what people should do, Smagorinsky and colleagues argue that tools create the
“potential for different kinds of action that may be realized in different ways by different partic-
ipants” (as cited in Honig, 2008, p. 638). Tools intended to foster data use include protocols
for examining data, software systems that organize and create reports of data (e.g., Quality
School Portfolio [Chen, Heritage, & Lee, 2005] or Grow Reports [Light et al., 2005]), new
formative assessments, processes for collecting and analyzing observational data (e.g., the
LearningWalk from the Institute for Learning [Ikemoto & Honig, 2010] or the Snapshot in Duval
County [Supovitz & Weathers, 2004]), among others. There are also targeted supports for data
use, including the development of data coaches or facilitators (Marsh, McCombs, & Martorell,
2009).
Second, in recent years, school districts and external organizations have developed a range
of comprehensive initiatives to foster data use in schools. These initiatives often incorporate
multiple tools along with professional development and new technology. They include such
diverse strategies as district initiatives that couple interim assessments linked to pacing guides
and curriculum standards that teachers are required to administer and discuss in teacher teams
(Christman et al., 2009; Clune & White, 2008; Goertz, Nabors Oláh, & Riggan, 2010); school-
level inquiry projects that tend to focus on a broad range of data, use protocols to guide
data discussions, and frequently involve trained facilitators and/or professional development
(Copland, 2003; Gallimore, Ermeling, Saunders, & Goldenberg, 2009; McDougall et al., 2007;
Porter & Snipes, 2006; Saunders, Goldenberg, & Gallimore, 2009); and district data use initia-
tives that engage individuals at multiple levels of the system in data routines, use technological
tools and protocols, and involve professional development (Ikemoto & Honig, 2010; Kerr et al.,
2006; Marsh et al., 2006; Supovitz, 2006) among others.
Third, data use has been heavily promoted by district, state, and federal accountability policy.
In accountability policies, data is the main way to evaluate progress and is linked to incentives
for teachers and others to change their practice (Stecher, Hamilton, & Gonzalez, 2003). This
approach is based on the assumption that the sanctions and rewards linked to data will (a) focus
greater attention on student performance, thus increasing data use and (b) leverage the findings
from the data to motivate educators to make instructional change to improve that performance.
Of course, these 3 categories of interventions are not mutually exclusive. Comprehensive data
initiatives are composed of combinations of tools. Accountability policy sometimes triggers the
development or adoption of individual tools or comprehensive initiatives. In addition, interven-
tions can either be locally developed, and thus emerge from inside a school or organization, or
adopted or imposed from the outside.
However, whether an individual tool, a comprehensive initiative, or accountability policy,
whether coming from outside or emerging from within, we must understand how the intervention
interacts with the existing organizational and political contexts of a setting and how it influences
underlying data use processes if we are to understand the consequences of data use interventions.
Empirical and theoretical work suggests that how interventions shape these processes depends
upon the features of the intervention itself.

Features of Interventions

Most studies of data use interventions tend to be descriptive (Knapp, Swinnerton, Copland, &
Monpas-Huber, 2006). That is, they focus on describing the nature of the activities or strate-
gies involved, without attention to either outcomes or the process by which these outcomes are
achieved (Coburn & Turner, in press; Jennings, in press; Marsh, in press). Here, we draw on
research that investigates the relationship between data use interventions and the context and
process of data use. We identify 6 features of data use interventions that interact with the political
and organizational contexts and influence the process of data use: designed routines, technolog-
ical tools, protocols to guide interaction, professional development, sanctions and rewards, and
systems of meaning. This list is not meant to be comprehensive. Rather, we emphasize features
for which there exists empirical literature that helps explain how the feature interacts with either
the context or process of data use.

Designed Routines

One of the central ways that data use interventions attempt to shift data use processes and out-
comes is by introducing designed data use routines into schools and districts (Sherer & Spillane,
2011). Not all data use interventions employ this feature, but the creation of professional learn-
ing communities or inquiry teams that encourage teachers and others to work together to discuss
data in structured and patterned ways appears to be increasingly common (Little, in press). These
designed routines shape existing contexts by interacting with and potentially altering preexist-
ing or naturally occurring data use routines in schools and districts. In so doing, they have the
potential to (a) shape what teachers or others notice; (b) alter patterns of interaction in ways that
influence how people interpret and construct implications for action; and (c) influence individual
and shared beliefs.
First, designed routines can influence data use processes by focusing attention on some data
and not other data, thus shaping what participants notice and attend to (Ikemoto & Honig, 2010;
Sherer & Spillane, 2011; Spillane, in press; Spillane et al., 2011). For example, Spillane and his
colleagues (2011) compared routines for data use designed and instituted by school adminis-
trators in 3 different Chicago elementary schools. They show that the design of these routines
led teachers to focus on different kinds of information in different schools. While the designed
routine in 1 school focused on benchmark assessments linked to the state standardized tests,
data routines in other schools involved a broader range of data, including classroom assessments,
information from teacher surveys, and classroom observations. Thus, when teachers interacted in
data use routines, their conversations were centered on different aspects of and different evidence
of student learning.
Second, designed routines often bring people together in new and different combinations
(Coburn & Russell, 2008; Honig et al., 2010; Supovitz, 2006). By altering patterns of interaction,
these designed routines may also influence the dynamics by which teachers and others interpret
data and construct implications for action. For example, Supovitz (2006) documents how
designed data use routines in Duval County, Florida brought together individuals from the central
office with school principals in regular, patterned ways to discuss data on the implementation of
districtwide reforms. This routine served to bring school leaders into district-level conversations
about the strategic direction of the district.
We know that who is at the table for data use conversations is critical, because the range of
beliefs and knowledge present as well as the configuration of authority relations shapes what
interpretations are brought to the table, as well as the negotiation over implications for action
(Coburn et al., 2008, 2009). Indeed, several studies find that when designed routines brought
school administrators together with teachers in new ways, administrators played an increased
role in what teachers noticed about the data, how they interpreted it, and how they constructed
implications for action (McDougall et al., 2007; Spillane et al., 2011). Thus, to the degree that
interventions shape patterns of interaction through designed routines, they are likely to influence
the dynamics of interpretation as well.
Third, when accompanied by the provision of adequate time, designed routines may also
shape individual and collective beliefs. McDougall and colleagues (2007) provide evidence that
teachers who participated in inquiry teams that included release time during the day to discuss
student work changed their expectations for students compared with teachers in the same inter-
vention without adequate time. The researchers argue that in-depth, open discussion of student
achievement in the designed routines, enabled by sufficient time, brought teacher expectations to
the surface and prompted individual and collective reexamination.

Technological Tools

Many interventions involve new technological tools. Means and her colleagues (2009) report
that almost all school districts surveyed in a nationally representative sample had student data
information systems in place, and more than 75 percent had systems for analysis and organization
of benchmark assessments and data warehouses with current and historical data on students.
Technological tools can shape access to data (Kerr et al., 2006; Means et al., 2009; Wayman &
Stringfield, 2006), which, in turn, has the potential to influence what teachers and others notice
and talk about in data use routines. Which data are available, and in what form, depends in part on the design
of the system’s technological infrastructure (Supovitz, 2006; Thorn, 2001; Wayman et al., 2008)
and the configuration of data reports (Brunner et al., 2005; Thorn et al., 2007). For example, in
their study of the use of Grow Reports in New York City, Brunner and colleagues (2005) found
that the data reports that sorted students into levels of proficiency (far below, far above, etc.) for
each state standard facilitated teachers’ and others’ attention to those students who were on the
cusp of proficiency, or “bubble kids” (see also Supovitz, 2006).
But existing research suggests that some technological tools fall short when they create
access to data that teachers do not find useful or relevant to their instructional decisions (Goertz
et al., 2010; Means et al., 2009; Wayman et al., 2008; Young & Kim, 2009), limiting the
degree to which they influence teachers’ data use processes (Goertz et al., 2010). In addi-
tion, technological tools may be more effective in shaping interpretive processes when they
are accompanied by training on how to use the system, something that is often lacking (Means
et al., 2009).
RESEARCH ON DATA USE 189

Protocols and Skilled Facilitation

Some interventions provide explicit protocols, at times accompanied by skilled facilitation, to structure interaction in data use routines. Protocols are “procedural steps and guidelines . . .
to organize discussion and structure participation” (Little et al., 2003), and existing research
suggests that they have the potential to (a) shape data use routines in ways that influence
interpretive processes and (b) alter norms of interaction.
Even though individuals rarely follow protocols completely, protocols nevertheless focus con-
versation in important ways (Earl, 2009; Horn & Little, 2010; Ikemoto & Honig, 2010; Lasky
et al., 2009; Little & Curry, 2009; Timperley, 2009). While some protocols prompt teachers and
others to talk about specific evidence of student learning or specific instructional strategies, oth-
ers prompt more general conversation or even a tendency to “turn away” from data (Earl, 2009;
Lasky et al., 2009; Little et al., 2003; Timperley, 2009). This, in turn, can shape what teachers
notice because the protocol and conversation that ensues can focus teachers’ and others’ atten-
tion on some data and not other data, and can sometimes divert attention from data completely
(Ikemoto & Honig, 2010; Sherer & Spillane, 2011; Spillane, in press; Spillane et al., 2011).
Protocols in combination with skilled facilitation—especially by school administrators—may be
more likely to focus conversation on the data itself and implications for practice (McDougall
et al., 2007; Supovitz, 2006) than protocols alone.
Protocols may also influence norms of interaction. Well-structured protocols can create a safe
space for conversation by guiding who talks, how much, and in what ways (Marsh, in press;
Murnane, Sharkey, & Boudett, 2005; Nelson & Slavit, 2007), preventing any one person from
dominating conversations and allowing differences of opinion to come to light. Yet, protocols
alone may not be sufficient for changing long-standing occupational norms of privacy in teach-
ing. Because norms of non-intervention and maintaining harmony often prevent teachers from
engaging in challenging investigation of teaching and their own classroom practice, skilled facil-
itation may be important for productive social interactions around data use (Little et al., 2003;
Nelson & Slavit, 2007).

Professional Development

Professional development on either the mechanics of data use or subject matter content has
the potential to shape interpretive processes indirectly by influencing the knowledge and beliefs
teachers and others draw upon as they notice data, interpret it, and construct implications for
action. When interventions provide professional development or coaching on the mechanics of
data use, teachers’ knowledge about asking questions, selecting appropriate data, and drawing
appropriate inferences increases (Armstrong & Anthes, 2001; Chen et al., 2005; Datnow, Park,
& Wohlstetter, 2007; Fuchs, Fuchs, Karns, Hamlett, & Katz, 1999; Ikemoto & Marsh, 2007;
Supovitz, 2006), although knowledge may increase in some respects but not in others (Gearhart
& Osmundson, 2009) and not all professional development results in such growth (Weinbaum,
2009). Similarly, although some studies suggest that professional development can influence
subject matter knowledge (Ikemoto & Honig, 2010), others report that it does not always offer
sufficient support or help teachers develop the knowledge necessary to translate interpretations of data into implications for their own specific teaching practice (Gearhart & Osmundson, 2009;
Massell & Goertz, 2002; Means et al., 2010).

Sanctions and Rewards

In this era of accountability, it is increasingly common for data use interventions to involve
sanctions and rewards for performance, with data as the arbiter of performance (Jennings, in
press). Linking data use with sanctions and rewards, especially when attributions of success and failure are publicized widely, may alter relations of power in ways that
shape interpretation and action (Henig, in press; McDonnell, 2004). Indeed, there is evidence
that accountability policy reshapes power relations between schools and communities, causing
school leaders to feel greater pressure from the district and from local communities to improve
test scores in order to maintain public support and funding (McDonnell, 2004). This, in turn,
shapes their construction of implications for action: their sense of what they must do to promote
data use and instructional change in their school (Fairman & Firestone, 2001).
Administrators at the school and district levels have responded to increased accountability
pressure with increased monitoring. In so doing, they have designed data use routines that serve
as a form of surveillance (Goertz et al., 2010; Hallett, 2010; Honig et al., 2010; McDougall et al.,
2007; Spillane et al., 2011). For example, Hallett (2010) shows how one principal in Chicago
developed a routine by which she inspected teachers’ grades and evidence from student work as
a mechanism to see what was going on in teachers’ classrooms and ensure that teachers were
teaching to the standards. At the district level, Goertz and her colleagues (2010) document a
similar phenomenon. District leaders held meetings with principals to publicly share and discuss
benchmark assessment data reports for each school in what was intended to be a supportive dis-
cussion but that principals came to experience as a form of evaluation. Thus, pressures associated
with sanctions, rewards, and the public nature of data altered the function of data use routines in
ways that shifted power relations between participants (Hallett, 2010).

Systems of Meaning

Finally, a handful of studies highlights a much more subtle way that interventions influence
interpretive processes at the center of data use: by providing systems of meaning—including cat-
egories, classification systems, and logics of action—that become embodied in data use routines
(Little, in press; Sauder & Espeland, 2009; Spillane, in press) and shape interpretive processes
in important ways (Little, in press). We know the most about the role of systems of meaning in
accountability policy, but it is conceivable that other kinds of interventions have a similar effect.
For example, No Child Left Behind (NCLB) put forth a bevy of categories for understanding
school and district performance, including the now-iconic categories of below basic, basic, profi-
cient, and advanced as well as the categories for the key subgroups. These classification systems
have become embedded in the way that data are collected and presented to teachers and others
(e.g., in Grow Reports, discussed above). They also influence how protocols are structured to
guide data use routines. Close-in studies of conversations in data use routines provide evidence
that teachers and others invoke these categories as they interpret data and discuss implications
for action (Little, in press; Spillane, in press). In reviewing the research on teachers’ talk in data
use routines, Little (in press) argues that “classificatory talk” pervades teachers’ discussion of
data. Teachers draw on key categories to “assign various meanings to data, make inferences from
data, create explanations for observed patterns, or imagine appropriate responses to the patterns
they detect” (p. 28; see also Blanc et al., 2010). Thus, categorization systems that are promoted
by policies such as NCLB can influence not only how teachers, school leaders, and district personnel look at, analyze, and make meaning of data, but also how they organize instructional responses (Coburn & Turner, in press).
Accountability policy also influences data use routines by providing “logics” of action, or
organizing principles that specify both goals and the appropriate means for achieving the goals
(Friedland & Alford, 1991). These logics can become bound up in the very design of data use
interventions (Hallett, 2010; Spillane, in press). For example, Spillane (in press) argues that the
data use routines he documented in his study of elementary schools in Chicago embodied logics
promoted by the accountability movement. Spillane shows that in spite of the different ways that
principals designed data use routines across the 3 schools in his study, all the routines were guided
by an accountability logic involving curricular standardization, the primacy of student achievement
tests as measures of progress, and a focus on making classroom practice more transparent. Thus,
Spillane argues, the data use routines served to bring these new ideas about the social organiza-
tion of schooling firmly into the school, reshaping teacher and administrator roles and the power
relations between them.
Taken together, this analysis highlights a number of features of data use interventions that alter
political and organizational context and the process of data use in schools and districts. Several
features of data use interventions—designed routines, technological tools, protocols and skilled
facilitation, sanctions and rewards, systems of meaning—can play an important role in how
data use routines unfold in schools and districts, shaping administrator roles, patterns of interac-
tion, and underlying interpretive processes in consequential ways. Other features can influence
norms of interaction, including protocols and skilled facilitators. Still others—like professional
development—influence the beliefs and knowledge that individuals and groups draw upon as
they notice data, interpret it, and construct implications for action.
By highlighting the research that draws links between data use interventions and the contexts
and process of data use, we provide a way of understanding how research on interventions can
intersect with research on organizational and political contexts on the one hand and research on
the underlying interpretive processes on the other. In so doing, we begin to lay the foundation
for understanding the mechanism by which interventions produce data use outcomes of value.
However, at the same time, it is clear that not all interventions interact with contexts and shape data use processes all the time or in the same way. Understanding when, under what conditions, and in what ways a given feature of a data use intervention interacts with context and shapes the process of data use is an area that is ripe for future investigation.

OUTCOMES OF DATA USE

The degree to which interventions interact with political and organizational contexts to shape
interpretive processes is important because it has consequences for diverse outcomes. Those pro-
moting interventions for data use make various claims about the outcomes: increased student
learning, improved test scores, educators’ changed attitudes about student success, improved
practice, greater efficiency, school improvement, organizational learning, and organizational
transformation. Part of the challenge of sorting out these outcomes is that different scholars focus
on outcomes at different levels of the system (classroom, school, and school district) and units
of analysis (individuals, groups, and organizations as a whole). Furthermore, different scholars
conceptualize outcomes at a given level in different ways, creating more complexity.
In this section, we return to Figure 1, focusing on the dimensions in the bottom right corner.
We ask, What are the outcomes of data use? What are the pathways between intervention and
outcomes? We put forth one way to think about and organize the potential outcomes of data use.
We start by discussing organizational change, arguably the least familiar and certainly least stud-
ied outcome. We then move on to changes in practice and, finally, student learning. Throughout,
we draw on existing literature to provide insight into possible pathways by which interventions
interact with political and organizational context and shape data use processes to influence these
various outcomes. And, we highlight the ways in which these outcomes, in turn, may be linked
to one another.

Organizational Change

Scholars of organizations insist that organizations are more than the sum of the individuals that
populate them (Scott & Davis, 2007). Thus, it is also possible to conceive of change in orga-
nizations that is more than the sum of change in individuals’ practices. Although less common
in studies of data use in public schools, some scholars have focused attention on these sorts of
organizational outcomes, including changes in policy or strategic direction (Coburn et al., 2009b;
Supovitz, 2006), changes in organizational structure (Thorn et al., 2007), and changes in the way
work and work roles are organized (Honig, 2008; Honig et al., 2010; Sherer & Spillane, 2011;
Spillane et al., 2011; Supovitz, 2006). It is possible to study organizational change at the school
level (e.g., Sherer & Spillane, 2011), the school district level (e.g., Honig, 2008; Honig et al., 2010; Supovitz, 2006), or the system of public schooling as a whole (e.g., Henig, in press).
What is important here is that these are changes that go beyond individual practice and persist in
the face of the turnover of individual personnel (Sherer & Spillane, 2011).
Organizational learning theorists remind us that change does not always equal improvement
(Levitt & March, 1988). Organizations can learn in a way that reinforces existing practice, lead-
ing to stability rather than change (Argote, 1999; Feldman & Pentland, 2003; Glynn, Lant, & Milliken, 1994; Levitt & March, 1988). Changes in policy, structure, or the organization of work may
also produce what some perceive to be negative outcomes.
To date, few studies have addressed the impact of data use interventions on organiza-
tional change. Yet those that do provide a remarkably consistent, if general, portrait of the
pathways between interventions and organizational change. These studies suggest that organi-
zational change can result when groups or individuals engage in an iterative process of noticing,
interpreting, and constructing implications for action in the context of data routines. When orga-
nized strategically, data use conversations, and the incremental decisions that result, can add up
to substantial changes in policy, the organization of work, and work practices themselves (Honig,
2008; Honig et al., 2010; Sherer & Spillane, 2011; Supovitz, 2006; Supovitz & Weathers, 2004).
For example, Honig and colleagues’ (2010) study of 3 reforming school districts shows that
central office leaders’ ongoing data routines were central to their ability to transform the way
central office work was organized. These routines, and the degree to which they focused on
data that showed what was actually happening rather than impressions, fostered norms of self-
reflection and openness to data. Iterative data use routines also provided information that enabled
continued adaptation of their efforts, leading to substantial organizational change over the long
term.
Organizational change may be more likely when data use routines are designed to be interlock-
ing and stretch across multiple aspects of the district, as was the case in Duval County, Florida
(Supovitz, 2006). In this case, the overlapping, interlocking nature of the data use routines led
to the development of shared understandings that subsequently guided interpretation, leading to
more coordinated action systemwide. Furthermore, changes in the organization of work and work
practices achieved through iterative conversation in data use routines can be sustained in the face
of changes of individual personnel, even as the routines themselves evolve over time (Sherer &
Spillane, 2011).

Change in Practice

The theory of action underlying many data use interventions is that teachers, school leaders, and
district administrators will examine data and adjust their practices to support student learning.
As such, teacher and administrator practice is a key interim outcome for data use interventions.
“Practice” can be understood as “the coordinated activities of individuals and groups in doing
their ‘real work’ as it is informed by particular organizational or group context” (Cook & Brown,
1999, pp. 386–387). For teachers, changing practice in response to data may mean altering
instructional strategies, grouping, instructional materials, or other dimensions of the classroom.
It may also mean changes in the ways that they interact with one another or shifts in the roles that
they take on in schools and districts. For school leaders, changing practice may mean new roles
and responsibilities. It may also mean shifting the way they interact with teachers, parents, and
students. Finally, for district administrators, changing practice may mean altering the ways they
go about the task of making decisions or setting new policies, but it may also mean changing the
way that they work in relation to each other and those in schools.5
Like organizational change, it is important to remember that change in practice may not
always be positive, as when teachers and administrators game the system (Booher-Jennings,
2005; Heilig & Darling-Hammond, 2008), take measures to narrow the curriculum (Diamond
& Cooper, 2007; Hoffman, Assaf, & Paris, 2001; Marsh, Hamilton, & Gill, 2008; Ogawa,
Sandholtz, & Scribner, 2004; Sloan, 2006; Wright & Choi, 2006), or make short-term, superficial
changes in practice (Diamond & Cooper, 2007).
Existing research has begun to provide insight into the ways in which interventions shape
the context and process of data use to influence teacher and administrator practice. In terms
of administrator practice, we know that some interventions institute new data use routines that
bring administrators together with others—teachers, coaches, and school and district leaders—in
new ways (Honig et al., 2010; McDougall et al., 2007; Spillane et al., 2011; Supovitz, 2006). Participation in these new routines can influence how and how often administrators give feedback to teachers and others (McDougall et al., 2007; Spillane et al., 2011), the content of that feedback (McDougall et al., 2007), and administrators’ broader strategies for providing support and supervision (Honig et al., 2010; Ikemoto & Honig, 2010). For example, in their study of 15 Title I schools—9 involved in an initiative focused on developing inquiry teams and 6 control schools—McDougall and his colleagues (2007) found that principals in the treatment schools were more likely to participate in teacher inquiry meetings and professional development. When they participated, they were more likely to provide feedback that focused teachers’ attention on data on student learning during data deliberations.

5 In this section, we are focused on change in practice that results from engagement with data. However, it is important to note that data use is itself a form of practice in the Cook and Brown (1999) sense. Please see Coburn and Turner (in press) for a fuller treatment of the practice of data use, both conceptually and methodologically.

Changes in administrator practice can have important consequences for teacher practice.
Recall that school and district administrators can play an important role in the interpretive pro-
cesses at the center of data use when they participate with others in data use conversations.
Because of the authority relations involved, principals’ interpretation of data and construction
of implications for action may be quite influential. Indeed, when administrators respond to test
score data in the context of accountability policy with increased calls for test preparation, narrow-
ing curriculum, or focusing on children at the margins of proficiency (bubble kids), we are more
likely to see these practices on the part of teachers (Booher-Jennings, 2005; Bulkley, Fairman,
Martinez, & Hicks, 2004; Christman et al., 2009). Furthermore, studies of comprehensive initia-
tives suggest that when administrators participate with teachers in data use routines, it can create
stronger linkages between administrators’ actions and teacher practice (McDougall et al., 2007;
Spillane et al., 2011), although the impact on teachers’ practice that results depends upon what
school leaders emphasize and, perhaps, the nature of the data use routine itself. For example,
Spillane and his colleagues report that data use routines that drew on an accountability logic
emphasizing monitoring and surveillance led to increased standardization across classrooms and
greater coupling between teachers’ classroom practice and the environment (Spillane, in press;
Spillane et al., 2011).

Student Learning

Student learning is perhaps the most oft-cited of outcomes in the data use literature. For educators
and researchers alike, student learning is the bottom line. However, there is considerable debate
among those in the measurement and assessment community about what constitutes valid mea-
sures of student learning and the relationship between achievement on tests and student learning
(Baron & Wolf, 1996; Herman & Haertel, 2005; Ryan & Shepard, 2008). And, indeed, perhaps
reflecting the multiple viewpoints in this debate, studies of data use rely upon a wide variety
of measures, including classroom assessments, performance assessments, and, most frequently,
large-scale standardized tests to measure student learning (Black & Wiliam, 1998; Young & Kim,
2010). With the advent of No Child Left Behind, scholars (and those in schools and districts)
are beginning to be attend to relative achievement between students of different racial and ethnic
groups as well as special education and language status. Thus, studies increasingly investigate the
degree to which data use interventions influence the long-standing “achievement gap” between
White and Asian students on the one hand and African American, Latino, and Pacific Islander
students on the other (e.g., Snipes, Doolittle, & Herlihy, 2002; Symonds, 2004).
The pathways between interventions to promote data use and these student outcomes are less
clear. While there is an accumulating body of research on the impact of data use interventions
on student outcomes, especially related to the introduction of new assessments (for reviews,
see Black & Wiliam, 1998; Fuchs & Fuchs, 1986; Jennings, in press; Marsh, in press; National
Research Council, 2011; Young & Kim, 2009), this research rarely includes an investigation of
the process by which these outcomes are achieved (Coburn & Turner, in press; Jennings, in press; Marsh, in press). However, a handful of studies do link interventions to context, data use
process, and outcomes, providing insight into at least a few possible pathways from intervention
to student learning. These studies tend to emphasize either the ways that interventions shape
deliberation and discussion through protocols, facilitation, and administrative involvement in
ways that influence student learning or the degree to which they influence teachers’ individual
knowledge through professional development.
One set of studies focuses on the degree to which interventions influence student learning by
shaping the nature of teachers’ interaction in data use routines. In their study of a comprehensive
initiative focused on the implementation of grade-level inquiry teams, Saunders and his col-
leagues (Gallimore et al., 2009; McDougall et al., 2007; Saunders et al., 2009) provide evidence
that schools with strong implementation of the intervention significantly outperformed control
schools on the SAT-9 achievement test (conservative effect size of 0.8). Drawing on qualitative
work, they argue that this result is related to differences in the ways that teachers interacted in
data use routines. Because their study was longitudinal, they are able to show that the nature
of this conversation—and student learning outcomes—changed when the model added profes-
sional development, skilled facilitation, and more active involvement of school leaders in data
conversations. These features, along with release time for teachers, shifted the nature of conver-
sation in data use routines in ways that shaped what teachers noticed and how they constructed
implications for action. For example, a comparison of teacher talk in data use routines in exper-
imental versus control schools shows that teachers in the treatment schools were more likely to
attribute student achievement to specific instructional actions while teachers in control schools
were more likely to attribute it to student traits or non-instructional explanations. These differ-
ences, in turn, were associated with increases in student learning as measured by standardized
achievement tests.
A set of studies of the introduction and use of benchmark assessments in Philadelphia pro-
vides additional evidence that school leaders can be influential in data use routines. Christman
and her colleagues (Blanc et al., 2010; Christman et al., 2009) found that using benchmark assess-
ments and other features of managed instruction did not produce student learning gains unless
it was accompanied by strong instructional leadership in the school, a statistically significant
predictor of learning growth (effect sizes ranged from 0.11 to 0.17) (Christman et al., 2009).
Drawing on their qualitative data, the researchers show that teachers and school leaders in most
schools focused on short-term solutions, like test-taking strategies, in data use routines (Blanc
et al., 2010; Christman et al., 2009). That is, teachers rarely addressed issues in their own instruc-
tion. However, school leaders with strong instructional leadership were able to focus teachers’
attention on the implications of data for their classroom instruction. In one case-study school,
the school principal structured grade-level discussions around analysis of data and instructional
issues, encouraged teachers to connect benchmark data with instructional tools like the curricu-
lum standards, and hired teacher leaders who worked with teachers to interpret and connect
benchmark assessments with instructional strategies (Blanc et al., 2010). Administrators with
strong instructional leadership also used data to inform their own thinking on priorities for
teacher professional development, shaping what resources they provided teachers to support data
use (Christman et al., 2009).
In contrast, rather than focusing on teacher interaction in data use routines, Fuchs and her
colleagues (1999) focus on how interventions shape teacher knowledge through the provision
of professional development. They draw on data from an experimental study of an intervention
that involved the introduction of new performance assessments in mathematics and professional
development to support their use. They found that teachers in the treatment group had increased
knowledge of the merits of performance assessments and how to use them in their classrooms (effect size = 1.70) and change in self-reported mathematics practice (effect sizes ranging from
.62 to 1.51, depending upon the dimension of practice measured). They also report that students
in treatment classes showed growth in various aspects of mathematical problem solving, although
there were stronger effects for students whom teachers identified as above grade level (effect size
ranging from .93 to 1.47, depending upon the dimension of problem solving measured) than for
those at grade level (effect size ranging from .30 to 1.15) or those below grade level (effect size
ranging from −.28 to .60).
Taken together, studies that attend to pathways from intervention to student learning, change
in practice, or organizational change begin to provide guidance on possible mechanisms by which
data use interventions produce the outcomes we care about. Studies attending to student learning
identify 2 key pathways: (a) influencing the nature of teachers’ conversations about data via new
data routines, protocols, and active participation of school administrators and (b) influencing
teachers’ knowledge via the provision of professional development. Both of these approaches
highlight the importance of teachers’ underlying interpretive processes, showing how changing
the dynamics of social interaction, on the one hand, or the knowledge teachers draw upon when
they make attributions about assessments and draw implications for action, on the other, matters
for the level of student learning in their classrooms.
However, the handful of studies that attend to the pathway between interventions and student learning pay only limited attention to the political and organizational contexts of data use.
Studies that attend to change in administrator and teacher practice broaden our understanding
here, suggesting that interventions can shape access to data and norms of interaction in ways
that influence deliberation in data use routines, with consequences for administrator and teacher
practice. Further, interventions provide systems of meaning that foster new modes of interaction
between administrators and teachers, shifting the relations of power between the two in ways that
impact classroom practice.
Finally, research on organizational change reminds us of the importance of attending to data
use processes over time. Iterative engagement with data in data use routines—especially to the
degree that interpretations and implications for action are revisited, adjusted, and adapted over time
and across venues—may add up to changes in policy, the social organization of work roles,
and levels of coordination across different parts of the system. These organizational changes
may, in turn, influence individual teacher and administrator practice with implications for student
learning, although this conjecture awaits empirical investigation.

DISCUSSION

Data use interventions are everywhere in public schools and districts. Tools to promote data use,
comprehensive data use initiatives, and accountability policies with data use at their center seek
to improve public school performance. In this article, we argue that in order to understand how
these data use interventions might influence teacher and administrator practice, affect student
learning, and lead to organizational change, we must understand how they interact with existing
organizational and political contexts of schools and shape the underlying process of data use.
To that end, we have put forth a framework for understanding data use in the context of
data use interventions. We draw on research and theory to identify key dimensions of the data
use phenomenon and provide a way to understand how these dimensions might interact. More
specifically, we argue that how individuals use assessments, test scores, or other data at the core
of data use initiatives depends on how they notice, interpret, and construct an understanding of
the implications of data for action. These interpretive processes are influenced by the complex
and multilayered contexts of schools and districts, including data use routines, norms of interac-
tion, and relations of power and authority, among others. We then show how data use interventions
interact with these contexts to shape the underlying interpretive processes in ways that have
consequences—both good and bad—for student learning and other outcomes.
This framework contributes to research on data use by highlighting key pathways by which
interventions shape, or fail to shape, data use outcomes. To date, few studies of data use that
attend to outcomes also attend to the process by which these outcomes are produced (Coburn &
Turner, in press; Jennings, in press; Marsh, in press). Similarly, few studies that attend to the
underlying interpretive processes of data use or the role of context attend to student learning (Henig,
in press; Little, in press; Spillane, in press). The lack of connection between intervention, context,
process, and outcomes is unfortunate. Absent information about the process of data use, we can
know something leads to a given outcome but not know how or why. Absent information about
context, we cannot explain why the same tools or initiatives foster positive outcomes in some
settings and not others. At the same time, information about context and process in the absence of
information about outcomes does little to help policy makers and school and district leaders make
informed decisions about whether particular data use interventions are worth the investment of
their efforts and scarce resources.
In this article, we review research that investigates the connections between context, interpre-
tive processes, and outcomes to illustrate the ways that these pieces of the data use puzzle may
implicate one another. Studies that explicitly investigate the relationship between interventions
and student outcomes converge on 2 key levers by which data use influences student learning:
(a) teachers’ social interaction in data use routines and (b) individual teacher knowledge. Both
of these dimensions are important because they influence teachers’ interpretive processes: what
they notice when they engage with data, how they interpret it, and how they construct implica-
tions for action. These interpretive processes, in turn, may influence how teachers respond to data
in their classroom, which has consequences for student learning. Other studies help broaden our
understanding of these pathways to student learning by identifying features of interventions and
dimensions of context that shape teacher knowledge and the dynamics of social interaction.
To date, many of the potential pathways identified by the framework are suggested by existing
research rather than investigated directly. For example, we know that teachers’ conversations in
data use routines shape their interpretive processes in ways that can influence their instructional
practice with consequences for student learning. At the same time, we also know that relations
of power and authority can influence teachers’ conversations in data use routines and interpretive
processes. But, there are no studies (that we know of) that investigate the link between relations
of power and authority and student learning (see Henig, in press, for a review of the intersection
of power and data use). There are similar issues with other dimensions of context (like norms of
interaction) as well as with key features of interventions (like sanctions and rewards). In laying
out multiple dimensions that may matter in the pathway between intervention and outcomes and
suggesting relationships between them, then, this framework provides quite a bit of fodder for
future research. Here, we outline a few places to begin.
First, given the central role of interpretive processes in this account, it may be important
to investigate the links between these processes and other aspects of the framework in greater
depth. Research that develops a better understanding of the connections between processes and
outcomes is an obvious place to start given that data use is generally advocated and undertaken
with a desired outcome in mind. For example, existing studies that link intervention to student
learning have investigated how interventions influence either teachers’ individual characteristics
(especially the role of professional development in building teacher knowledge) or social inter-
action (especially how designed routines and protocols and, to a lesser extent, school leaders,
influence teacher talk in data routines). But, how does teacher knowledge influence the dynamics
of interpretation and debate in social interaction? Similarly, we know that administrators can play
an important role in shaping what teachers notice and in developing implications for action. And,
a handful of studies suggest that when administrators are involved in data use routines, teachers
are more likely to make changes in practice that increase student learning. But, how do these
processes and outcomes vary depending upon the knowledge of administrators, instructional
coaches, or facilitators with whom teachers interact in data use interventions?
More targeted investigations of the links between interventions and the process of data use
may also be useful. Relevant questions might include these: When do interventions contribute to
the well-established tendency for teachers and others to assimilate information into preexisting
frameworks, leading to stability rather than change, and when do they prompt teachers to recon-
struct their understanding of students, the subject matter, and instructional strategies in more
fundamental ways? How do interventions shape data use routines in ways that harness the power
of divergent points of view and differential expertise to spur learning rather than lead to the con-
flict and disagreement often documented in the literature (e.g., Coburn et al., 2008, 2009; Hallett, 2003, 2010; Spillane, in press; Spillane & Miele, 2007)? Similar questions
can be asked to guide investigations at the intersection of context and processes of data use.
Second, this framework generates new directions for comparative research on data use inter-
ventions. We have identified 6 key features of interventions that seem to matter for the ways that
data use interventions interact with contexts and shape data use processes. Yet the research on
most of these features is mixed. Thus, we know that protocols can influence what teachers and
others notice in data use conversations and how they draw implications for action. But, what are
the qualities of protocols that focus teachers’ attention on issues of student learning, and what
are the qualities that focus it away from student learning toward general instructional strategies?
Does it differ according to the organizational and political contexts that are present in a school?
How do protocols interact with other features of interventions, such as professional develop-
ment, systems of meaning, or the presence of sanctions and rewards? One can imagine similar
research questions for each feature of data use interventions we identified. Knowledge generated
from studies of this sort not only has the potential to contribute to the field’s understanding of data use, it also has the potential to provide insight for redesigning existing interventions and
improving data use efforts overall.
Finally, the framework broadens our understanding of the range of outcomes that may be
important to attend to when studying the phenomenon of data use. For many scholars and
practitioners alike, student learning is the bottom line. And, indeed, there has been a great deal
of attention to the relationship between data use interventions—especially those centered around
formative assessment—and student learning outcomes variously defined (see Black & Wiliam,
1998; Fuchs & Fuchs, 1986; and Young & Kim, 2009, for reviews of this literature).6 Those
studies that move beyond the relationship between data use and student learning tend to focus on
teacher practice as a key interim outcome (see, for example, Christman et al., 2009; Fuchs et al., 1999; Gallimore et al., 2009; McDougall et al., 2007).
Yet, our analysis suggests that administrator practice may also be an important interim out-
come to attend to. We show that administrators play a key role in shaping teacher practice
by participating with teachers in data use routines, focusing teachers’ attention on some data
and not other data within those routines, and playing an active and influential role in con-
structing implications for action. Several studies show that these actions have implications for
student outcomes. At the same time, there is evidence that administrators also shape contex-
tual conditions—including time allocation, access to data, and norms of interaction—in ways that may influence teacher practice both positively and negatively. This suggests that
there may be analytic benefits of attending to administrator practice as a key interim outcome for
understanding teacher practice and student learning.
In addition, studies of data use only rarely attend to organizational change. Yet, while lit-
tle studied, organizational changes may be quite consequential. Change in policy and strategic
direction or the ways that work and work roles are organized have the potential to alter individ-
ual administrator and teacher practice on a widespread basis, as illustrated by Supovitz’s (2006) study
of Duval County, Florida, and Honig and colleagues’ (2010) study of 3 major urban districts.
Furthermore, precisely because these organizational changes are not located in specific individ-
uals, but rather stretched across people, processes, and structures, they have a greater potential
to be sustained than changes that focus on individual practice, given the high levels of turnover
endemic in U.S. public schools (see Sherer & Spillane, 2011, on this point). Thus, attention
to outcomes related to organizational change is a potentially fruitful approach for understand-
ing the consequences of data use on a larger scale and extending beyond the reach of a single
intervention.
It is worth noting that studying linkages that span areas of the framework may require
researchers or research teams to stretch across traditional disciplinary boundaries or research
communities. This framework suggests that understanding the data use phenomenon requires
bringing insights from the assessment and measurement community together with insights from
social psychologists who study the dynamics of deliberation and debate in groups. It requires
bringing insights from cognitive psychologists who study the microprocesses of noticing and
attribution together with insights from organizational theorists who study how these processes
unfold in the context of complex organizations and environments. And it requires bringing the
lessons from political science into the fold, so that we can better understand how relations
of power both inside and outside organizations play a role in people’s experience of data use
interventions, the dynamics of interpretation, and the nature of instructional change.

6 Although, certainly, there could be more attention to the connection between comprehensive initiatives and student outcomes. A recent review of research on comprehensive data initiatives suggests that few studies attend to student
outcomes and many that do have methodological problems that raise questions about their ability to draw conclusions
about outcomes (Marsh, in press).

In many ways, the practice of data use is out ahead of research. Policy and interventions to
promote data use far outstrip research studying the process, context, and consequences of these
efforts. But the fact that there is so much energy promoting data use and so many districts and schools embarking on data use initiatives means that conditions are ripe for systematic,
empirical study. These settings provide opportunities for research to learn from practice about
the conditions that promote data use in schools. The framework we present here is intended to
help provide guidance for this research. By emphasizing the nature of linkages between different
facets of the data use phenomenon, this framework can inform the design of studies that are better
able to connect different aspects of what has heretofore been a disconnected field. In so doing,
we hope to spur a new generation of research on data use that draws on insights from different
research communities to develop new knowledge that has the potential to inform the practice of
data use and improve efforts to intervene.

ACKNOWLEDGEMENTS

We are grateful to the Spencer Foundation for support for writing this article. We thank Andrea
Conklin Bueschel, Paul Goren, Judith Warren Little, Pamela Moss, and 2 anonymous reviewers
for helpful comments. An earlier draft of this article was developed to guide a series of conven-
ings of scholars of data use sponsored by the Spencer Foundation under the auspices of their
Data Use and Educational Improvement Initiative. We are grateful to the participants in the con-
venings for rich and challenging conversations that enriched our understanding of the facets of
data use and greatly improved the article.

REFERENCES

Argote, L. (1999). Organizational learning: Creating, retaining and transferring knowledge. Boston: Kluwer.
Armstrong, J., & Anthes, K. (2001). How data can help: Putting information to work to raise student achievement.
American School Board Journal, 188(11), 38–41.
Baron, J. B., & Wolf, D. P. (Eds.) (1996). Performance-based student assessment: Challenges and possibilities. Ninety-
fifth Yearbook of the National Society for the Study of Education. Chicago, IL: University of Chicago Press.
Bickel, W. E., & Cooley, W. W. (1985). Decision-oriented educational research in school districts: The role of
dissemination processes. Studies in Educational Evaluation, 11(2), 183–203.
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7–74.
Blanc, S., Christman, J. B., Liu, R., Mitchell, C., Travers, E., & Bulkley, K. E. (2010). Learning to learn from data:
Benchmarks and instructional communities. Peabody Journal of Education, 85(2), 205–225.
Booher-Jennings, J. (2005). Below the bubble: “Educational triage” and the Texas accountability system. American
Educational Research Journal, 42, 231–268.
Bransford, J. D., Stipek, D. J., Vye, N. J., Gomez, J. M., & Lam, D. (Eds.) (2009). The role of research in educational
improvement. Cambridge, MA: Harvard Educational Press.
Brunner, C., Fasca, C., Heinze, J., Honey, M., Light, D., Mandinach, E., & Wexler, D. (2005). Linking data and learning:
The Grow Network study. Journal of Education for Students Placed at Risk, 10(3), 241–267.
Bryk, A. S., & Schneider, B. L. (2002). Trust in schools: A core resource for improvement. New York, NY: Russell Sage
Foundation.
Bulkley, K., Fairman, J., Martinez, C., & Hicks, J. (2004). The district and test preparation. In W. A. Firestone & R. Y.
Schorr (Eds.), The ambiguity of test preparation. Mahwah, NJ: Erlbaum.
Burch, P., & Hayes, T. (2009). The role of private firms in data-based decision making. In T. J. Kowalski & T. J. Lasley,
II (Eds.), Handbook of data-based decision making in education. New York, NY: Routledge.
Chen, E., Heritage, M., & Lee, J. (2005). Identifying and monitoring students’ learning needs with technology. Journal
of Education for Students Placed at Risk, 10(3), 309–332.
Christman, J., Neild, R., Bulkley, K., Blanc, S., Liu, R., Mitchell, C., & Travers, E. (2009). Making the most of interim
assessment data: Lessons from Philadelphia. Philadelphia, PA: Research for Action.
Clune, W. H., & White, P. A. (2008). Policy effectiveness of interim assessments in Providence public schools (WCER
Working Paper No. 2008-10). Madison: Wisconsin Center for Education Research.
Coburn, C. E. (2001). Collective sensemaking about reading: How teachers mediate reading policy in their professional
communities. Educational Evaluation and Policy Analysis, 23(2), 145–170.
Coburn, C. E. (2005). Shaping teacher sensemaking: School leaders and the enactment of reading policy. Educational
Policy, 19(3), 476–509.
Coburn, C. E. (2010). Partnership for District Reform: The challenges of evidence use in a major urban district. In C.
E. Coburn & M. K. Stein (Eds.), Research and practice in education: Building alliances, bridging the divide. New
York, NY: Rowman & Littlefield.
Coburn, C. E., Bae, S., & Turner, E. O. (2008). Authority, status, and the dynamics of insider-outsider partnerships at the
district level. Peabody Journal of Education, 83(3), 364–399.
Coburn, C. E., Honig, M. I., & Stein, M. K. (2009a). What’s the evidence on district’s use of evidence? In J. Bransford,
D. J. Stipek, N. J. Vye, L. Gomez, & D. Lam (Eds.), Educational improvement: What makes it happen and why? (pp.
67–86). Cambridge, MA: Harvard Educational Press.
Coburn, C. E., & Russell, J. L. (2008). District policy and teachers’ social networks. Educational Evaluation and Policy
Analysis, 30(3), 203–235.
Coburn, C. E., & Talbert, J. E. (2006). Conceptions of evidence use in school districts: Mapping the terrain. American
Journal of Education, 112(4), 469–495.
Coburn, C. E., Toure, J., & Yamashita, M. (2009b). Evidence, interpretation, and persuasion: Instructional decision
making in the district central office. Teachers College Record, 111(4), 1115–1161.
Coburn, C. E., & Turner, E. O. (in press). The practice of data use: An introduction. American Journal of Education.
Cohen, M. D., March, J. G., & Olsen, J. P. (1988). The garbage can model of organizational choice. In J. G. March (Ed.),
Decisions and organizations (pp. 294–334). Oxford, UK: Basil Blackwell.
Cook, S. D. N., & Brown, J. S. (1999). Bridging epistemologies: The generative dance between organizational knowledge
and organizational knowing. Organization Science, 10(4), 381–400.
Copland, M. (2003). Leadership of inquiry: Building and sustaining capacity for school improvement. Educational
Evaluation and Policy Analysis, 25(4), 375–395.
Copland, M. A., Knapp, M. S., & Swinnerton, J. A. (2009). Principal leadership, data, and school improvement. In T. J.
Kowalski & T. J. Lasley II (Eds.), Handbook of data-based decision making in education. New York, NY: Routledge.
Cromey, A. (2000, November). Using student assessment data: What can we learn from schools? Policy Issues (Brief
No. 6). Oak Brook, IL: North Central Regional Educational Laboratory.
Daly, A. J., & Finnigan, K. S. (in press). The ebb and flow of social network ties between district leaders under high
stakes accountability. American Educational Research Journal.
Datnow, A., & Honig, M. I. (2008). Introduction to the special issue on scaling up teaching and learning improvement
in urban districts: The promises and pitfalls of external assistance providers. Peabody Journal of Education, 83,
328–327.
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to
improve instruction for elementary students. Los Angeles: University of Southern California, Center on Educational
Governance.
David, J. L. (1981). Local uses of Title I evaluations. Educational Evaluation and Policy Analysis, 3(1), 27–39.
Diamond, J. B., & Cooper, K. (2007). The uses of testing data in urban elementary schools: Some lessons from Chicago.
In P. A. Moss (Ed.), Evidence and decision making (pp. 241–263). Malden, MA: National Society for the Study of
Education.
Earl, L. M. (2009). Leadership for evidence-informed conversations. In L. M. Earl & H. Timperley (Eds.), Professional
learning conversations: Challenges in using evidence for improvement (pp. 43–52). Dordrecht: Springer.
Earl, L. M., & Timperley, H. (Eds.) (2009). Professional learning conversations: Challenges in using evidence for
improvement. Dordrecht: Springer.
Englert, R. M., Kean, M. H., & Scribner, J. D. (1977). Politics of program evaluation in large city school districts.
Education and Urban Society, 9(4), 429–450.
Fairman, J. C., & Firestone, W. A. (2001). The district role in state assessment policy: An exploratory study. In S. H.
Fuhrman (Ed.), From the capitol to the classroom: Standards-based reform in the states (pp. 124–147). Chicago, IL: University of Chicago Press.
Feldman, J., & Tung, R. (2001, April). Whole school reform: How schools use the data-based inquiry and deci-
sion making process. Paper presented at the annual meeting of the American Educational Research Association,
Seattle, WA.
Feldman, M. S., & Pentland, B. T. (2003). Reconceptualizing organizational routines as a source of flexibility and change.
Administrative Science Quarterly, 48, 94–118.
Friedland, R., & Alford, R. R. (1991). Bringing society back in: Symbols, practices, and institutional contradictions. In
W. W. Powell & P. J. DiMaggio (Eds.), The new institutionalism in organizational analysis (pp. 232–263). Chicago,
IL: University of Chicago Press.
Fuchs, L. S., & Fuchs, D. (1986). Effects of systematic formative evaluation: A meta-analysis. Exceptional Children, 53,
199–208.
Fuchs, L. S., Fuchs, D., Karns, K., Hamlett, C. L., & Katz, M. (1999). Mathematics performance assessment in the
classroom: Effects of teacher planning and student problem solving. American Educational Research Journal, 36,
609–646.
Gallimore, R., Ermeling, B. A., Saunders, W., & Goldenberg, C. (2009). Moving the learning of teaching closer
to practice: Teacher education implications of school-based inquiry teams. Elementary School Journal, 109(5),
537–553.
Gearhart, M., Nagashima, S., Clark, S., Schwab, C., Vendlinski, T., Osmundson, E., & Herman, J. (2006). Developing
expertise with classroom assessment in K–12 science: Learning to interpret student work. Interim findings from a
2-year study. Educational Assessment, 11(3), 237–263.
Gearhart, M., & Osmundson, E. (2009). Assessment portfolios as opportunities for teacher learning. Educational
Assessment, 14(1), 1–24.
Glynn, M. A., Lant, T. K., & Milliken, F. J. (1994). Mapping learning processes in organizations: A multi-level framework
linking learning and organizing. In J. Meindl, J. Porac, & C. Stubbart (Eds.), Advances in Managerial Cognition and
Organizational Information Processing, 5, 43–83.
Goertz, M. E., Nabors Oláh, L., & Riggan, M. (2010). From testing to teaching: The use of interim assessments in
classroom instruction. (CPRE Research Report No. RR-65). Philadelphia, PA: Consortium for Policy Research in
Education.
Greeno, J. G., Collins, A. M., & Resnick, L. B. (1996). Cognition and learning. In D. C. Berliner & R. C. Calfee (Eds.),
Handbook of Educational Psychology (pp. 15–46). New York, NY: Macmillan.
Hallett, T. (2003). Symbolic power and organizational culture. Sociological Theory, 21(2), 128–149.
Hallett, T. (2010). The myth incarnate: Recoupling processes, turmoil, and inhabited institutions in an urban elementary
school. American Sociological Review, 75(1), 52–74.
Halverson, R., Grigg, J., Prichett, R., & Thomas, C. (2007). The new instructional leadership: Creating data-driven
instructional systems in schools. Journal of School Leadership, 17(2), 158–193.
Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J. A., & Wayman, J. C. (2009). Using student achieve-
ment data to support instructional decision making (No. NCEE 2009-4067). Washington, DC: National Center for
Educational Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Hannaway, J. (1989). Managers managing: The workings of an administrative system. New York, NY: Oxford University
Press.
Heilig, J. V., & Darling-Hammond, L. (2008). Accountability Texas-style: The progress and learning of urban minority
students in a high-stakes testing context. Educational Evaluation and Policy Analysis, 30(2), 75–110.
Henig, J. (in press). The politics of data use. Teachers College Record.
Herman, J., & Gribbons, B. (2001). Lessons learned in using data to support school inquiry and continuous improvement:
Final report to the Stuart Foundation. Los Angeles: University of California, Center for the Study of Evaluation.
Herman, J. L., & Haertel, E. H. (Eds.). (2005). The uses and misuses of data for educational accountability and improvement.
104th Yearbook of the National Society for the Study of Education, Part 2. Malden, MA: Wiley-Blackwell.
Hoffman, J. V., Assaf, L. C., & Paris, S. G. (2001). High-stakes testing in reading: Today in Texas, tomorrow? The
Reading Teacher, 54(5), 482–492.
Honig, M. I. (2003). Building policy from practice: District central office administrators’ roles and capacity for
implementing collaborative education policy. Educational Administration Quarterly, 39(3), 292–338.
Honig, M. I. (2006). Street-level bureaucracy revisited: Frontline district central-office administrators as boundary
spanners in education policy implementation. Educational Evaluation and Policy Analysis, 28(4), 357–383.
Honig, M. I. (2008). District central offices as learning organizations: How sociocultural and organizational learning
theories elaborate district central office administrators’ participation in teaching and learning improvement efforts.
American Journal of Education, 114, 627–664.
Honig, M. I., Copland, M. A., Rainey, L., Lorton, J. A., & Newton, M. (2010). Central office transformation for district-
wide teaching and learning improvement. Seattle, WA: Center for the Study of Teaching and Policy.
Honig, M. I., & Venkateswaran, N. (in press). School-central office relationships in evidence use: Understanding evidence
use as a systems problem. American Journal of Education.
Horn, I. S., & Little, J. W. (2010). Attending to problems of practice: Routines and resources for professional learning in
teachers’ workplace interactions. American Educational Research Journal, 47(1), 181–217.
Hubbard, L. (2010). Research-to-practice: A case study of Boston Public Schools, Boston Plan for Excellence and
Education Matters. In C. E. Coburn & M. K. Stein (Eds.), Research and practice in education: Building alliances,
bridging the divide. New York, NY: Rowman & Littlefield.
Ikemoto, G. S., & Honig, M. I. (2010). Tools to deepen practitioners’ engagement with research: The case of the Institute
for Learning. In C. E. Coburn & M. K. Stein (Eds.), Research and practice in education: Building alliances, bridging
the divide. New York, NY: Rowman & Littlefield.
Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the “data driven” mantra: Different conceptions of data-driven
decision making. In P. A. Moss (Ed.), Evidence and decision making (pp. 105–131). Malden, MA: National Society
for the Study of Education.
Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to
the use of data to improve practice. Teachers College Record, 106, 1258–1287.
Jennings, J. L. (in press). The effects of accountability system features on data use. Teachers College Record.
Kennedy, M. M. (1982). Working knowledge and other essays. Cambridge, MA: Huron Institute.
Kennedy, M. M. (1984). How evidence alters understanding and decisions. Educational Evaluation and Policy Analysis,
6(3), 207–226.
Kerr, K. A., Marsh, J. A., Ikemoto, G. S., & Barney, H. (2006). Strategies to promote data use for instructional
improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112(4),
496–520.
Knapp, M. S., Swinnerton, J. A., Copland, M. A., & Monpas-Huber, J. (2006). Data-informed leadership in education.
Seattle, WA: Center for the Study of Teaching and Learning.
Koretz, D., & Jennings, J. (2010). The misunderstanding and use of data from educational tests. Unpublished manuscript.
Kowalski, T. J., & Lasley, T. J. (Eds.) (2009). Handbook of data-based decision making in education. New York, NY:
Routledge.
Lachat, M. A., & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for
Students Placed at Risk, 10(3), 333–349.
Lasky, S., Schaffer, G., & Hopkins, T. (2009). Learning to think and talk from evidence: Developing system-wide capac-
ity for learning conversations. In L. M. Earl & H. Timperley (Eds.), Professional learning conversations: Challenges
in using evidence for improvement (pp. 95–108). Dordrecht: Springer.
Levitt, B., & March, J. G. (1988). Organizational learning. Annual Review of Sociology, 14, 319–340.
Light, D., Honey, M., Heinze, J., Brunner, C., Wexler, D., Mandinach, E., & Fasca, C. (2005). Linking data and learning:
The Grow Network study. Summary Report. New York, NY: Center for Children and Technology.
Little, J. W. (1990). The persistence of privacy: Autonomy and initiative in teachers’ professional relations. Teachers
College Record, 91(4), 509–536.
Little, J. W. (2007). Teachers’ accounts of classroom experience as a resource for professional learning and instructional
decision making. In P. A. Moss (Ed.), Evidence and decision making (pp. 217–240). Malden, MA: National Society
for the Study of Education.
Little, J. W. (in press). Understanding data use practices among teachers: The contribution of micro-process studies.
American Journal of Education.
Little, J. W., & Curry, M. W. (2009). Structuring talk about teaching and learning: The use of evidence in protocol-based
conversation. In L. M. Earl & H. Timperley (Eds.), Professional learning conversations: Challenges in using evidence
for improvement (pp. 29–42). Dordrecht: Springer.
Little, J. W., Gearhart, M., Curry, M., & Kafka, J. (2003). “Looking at student work” for teacher learning, teacher
community, and school reform. Phi Delta Kappan, 85(3), 184–192.
Lortie, D. C. (1975). Schoolteacher: A sociological study. Chicago, IL: University of Chicago Press.
Mandinach, E. B., & Honey, M. (Eds.). (2008). Data-driven school improvement: Linking data and learning. New York,
NY: Teachers College Press.
Marsh, J., Hamilton, L., & Gill, B. (2008). Assistance and accountability in externally managed schools: The case of
Edison Schools, Inc. Peabody Journal of Education, 83(3), 423–458.
Marsh, J. A. (in press). Interventions promoting educators’ use of data: Research insights and gaps. Teachers College
Record.
Marsh, J. A., McCombs, J. S., & Martorell, F. (2009). How instructional coaches support data-driven decision making.
Educational Policy, 24(6), 872–907.
Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education: Evidence
from recent RAND research (OP-170). Santa Monica, CA: RAND Corporation.
Mason, S. (2002). Turning data into knowledge: Lessons from six Milwaukee Public Schools. Madison, WI: Wisconsin
Center for Education Research.
Massell, D., & Goertz, M. E. (2002). District strategies for building instructional capacity. In A. M. Hightower, M. S.
Knapp, J. A. Marsh, & M. W. McLaughlin (Eds.), School districts and instructional renewal (pp. 43–60). New York,
NY: Teachers College Press.
McDonnell, L. M. (2004). Politics, persuasion, and educational testing. Cambridge, MA: Harvard University Press.
McDougall, D., Saunders, W., & Goldenberg, C. (2007). Inside the black box of school reform: Explaining the how and
why of change at Getting Results schools. International Journal of Disability, Development and Education, 54(1),
51–89.
Means, B., Gallagher, L., & Padilla, C. (2007). Teachers’ use of student data management systems to improve instruction.
Report prepared for U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Prepared
by SRI International, Menlo Park, CA.
Means, B., Padilla, C., DeBarger, A., & Bakia, M. (2009). Implementing data-informed decision making in schools:
Teacher access, supports and use. Report prepared for U.S. Department of Education, Office of Planning, Evaluation
and Policy Development. Prepared by SRI International, Menlo Park, CA.
Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From accountability to instruc-
tional improvement. Report prepared for U.S. Department of Education, Office of Planning, Evaluation and Policy
Development. Prepared by SRI International, Menlo Park, CA.
Mintrop, H. (2004). Schools on probation: How accountability works (and doesn’t work). New York, NY: Teachers
College Press.
Moss, P. A. (Ed.). (2007). Evidence and decision making. 106th Yearbook of the National Society for the Study of
Education, Part I. Malden, MA: Blackwell.
Murnane, R., Sharkey, N., & Boudett, K. (2005). Using student-assessment results to improve instruction: Lessons from
a workshop. Journal of Education for Students Placed at Risk, 10(3), 269–280.
National Research Council. (2011). Incentives and test-based accountability in public education. Committee on
Incentives and Test-Based Accountability in Public Education, M. Hout & S. W. Elliott (Eds.), Board on Testing
and Assessment, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academies
Press.
Nelson, T. H., & Slavit, D. (2007). Collaborative inquiry among science and mathematics teachers in the USA:
Professional learning experiences through cross-grade, cross-discipline dialogue. Professional Development in
Education, 33(1), 23–39.
Ogawa, R. T., Sandholtz, J. H., & Scribner, S. P. (2004). Standards gap: Unintended consequences of local standards-
based reform. Teachers College Record, 106(6), 1177–1202.
Phillips, D. C. (2007). Adding complexity: Philosophical perspectives on the relationship between evidence and policy.
In P. A. Moss (Ed.), Evidence and decision making. 106th Yearbook of the National Society for the Study of Education
(pp. 376–402). Malden, MA: Blackwell.
Porter, K. E., & Snipes, J. C. (2006). The challenge of supporting change: Elementary student achievement and the Bay
Area School Reform Collaborative’s focal strategy. New York, NY: MDRC.
Ryan, K. E., & Shepard, L. A. (Eds.). (2008). The future of test-based educational accountability. New York, NY:
Routledge, Taylor & Francis.
Sauder, M., & Espeland, W. N. (2009). The discipline of rankings: Tight coupling and organizational change. American
Sociological Review, 74, 63–82.
Saunders, W. M., Goldenberg, C. N., & Gallimore, R. (2009). Increasing achievement by focusing grade-level teams
on improving classroom learning: A prospective quasi-experimental study of Title I schools. American Educational
Research Journal, 46(4), 1006–1033.
Scott, W. R., & Davis, G. F. (2007). Organizations and organizing: Rational, natural, and open systems perspectives.
Englewood Cliffs, NJ: Prentice Hall.
Sherer, J. Z., & Spillane, J. P. (2011). Constancy and change in work practice in schools: The role of organizational
routines. Teachers College Record, 113(3), 611–657.
Sloan, K. (2006). Teacher identity and agency in school worlds: Beyond the all-good/all-bad discourse on
accountability-explicit curriculum policies. Curriculum Inquiry, 36(2).
Snipes, J., Doolittle, F., & Herlihy, C. (2002). Foundations for success: Case studies of how urban school systems improve
student achievement. Washington, DC: Council of the Great City Schools.
Spillane, J. P. (in press). Data in practice: Conceptualizing the data-based decision-making phenomena. American Journal
of Education.
Spillane, J. P., & Miele, D. B. (2007). Evidence in practice: A framing of the terrain. In P. A. Moss (Ed.), Evidence and
decision making (pp. 46–73). Malden, MA: National Society for the Study of Education.
Spillane, J. P., Parise, L. M., & Sherer, J. Z. (2011). Organizational routines as coupling mechanisms: Policy, school
administration, and the technical core. American Educational Research Journal, 48, 586–619.
Spillane, J. P., Reiser, B. J., & Reimer, T. (2002). Policy implementation and cognition: Reframing and refocusing
implementation research. Review of Educational Research, 72(3), 387–431.
Stecher, B., Hamilton, L., & Gonzalez, G. (2003). Working smarter to leave no child behind: Practical insights for school
leaders. Santa Monica, CA: RAND.
Stone, D. (2002). Policy paradox: The art of political decision making (Revised ed.) New York, NY: Norton.
Supovitz, J. A. (2006). The case for district-based reform: Leading, building, and sustaining school improvement.
Cambridge, MA: Harvard Education Press.
Supovitz, J. A. (2011). The assessment mariner. Unpublished manuscript.
Supovitz, J. A., & Klein, V. (2003). Mapping a course for improved student learning: How innovative schools use student
performance data to guide improvement. Philadelphia, PA: Consortium for Policy Research in Education.
Supovitz, J. A., & Weathers, J. (2004). Dashboard lights: Monitoring implementation of district instructional reform
strategies. Philadelphia, PA: Consortium for Policy Research in Education.
Symonds, K. W. (2004). After the test: Closing the achievement gaps with data. Learning Points Associates and Bay
Area School Reform Collaborative.
Thorn, C., Meyer, R. H., & Gamoran, A. (2007). Evidence and decision making. In P. A. Moss (Ed.), Evidence and
decision making (pp. 340–361). Malden, MA: National Society for the Study of Education.
Thorn, C. A. (2001, November 19). Knowledge management for educational information systems: What is the state of
the field? Education Policy Analysis Archives, 9(47). Retrieved from http://epaa.asu.edu/epaa/v9n47/
Timperley, H. (2009). Evidence-informed conversations making a difference to student achievement. In L. M. Earl & H.
Timperley (Eds.), Professional learning conversations: Challenges in using evidence for improvement (pp. 69–80).
Dordrecht: Springer.
Wayman, J. C. (2007). Student data systems for school improvement: The state of the field. In Texas Computer Education
Association’s TCEA Educational Technology Research Symposium (Vol. 1, pp. 156–162). Lancaster, PA: ProActive.
Wayman, J. C., Conoly, K., Gasko, J., & Stringfield, S. (2008). Supporting equity inquiry with student data computer
systems. In E. B. Mandinach & M. Honey (Eds.), Data-driven school improvement: Linking data and learning
(pp. 171–190). New York, NY: Teachers College Press.
Wayman, J. C., & Stringfield, S. (2006). Technology-supported involvement of entire faculties in examination of student
data for instructional improvement. American Journal of Education, 112(4), 549–571.
Wayman, J. C., Stringfield, S., & Yakimowski, M. (2004). Software enabling school improvement through analysis of
student data. Baltimore, MD: Center for Research on the Education of Students Placed at Risk, Johns Hopkins
University. Retrieved from http://www.csos.jhu.edu/crespar/techReports/Report67.pdf
Weick, K. E. (1995). Sensemaking in organizations. Thousand Oaks, CA: Sage.
Weinbaum, E. (2009). Learning about assessment: An evaluation of a ten-state effort to build assessment capacity in
high schools. Philadelphia, PA: Consortium for Policy Research in Education.
West, R. F., & Rhoton, C. (1994). School district administrators’ perceptions of educational research and barriers to
research utilization. ERS Spectrum, 12(1), 23–30.
Wright, W. E., & Choi, D. (2006). The impact of language and high-stakes testing policies on elementary school English
language learners in Arizona. Education Policy Analysis Archives, 14(13).
Young, V. (2008). Supporting teachers’ use of data: The role of organization and policy. In E. B. Mandinach & M. Honey
(Eds.), Data-driven school improvement: Linking data and learning (pp. 87–106). New York, NY: Teachers College
Press.
Young, V. M., & Kim, D. H. (2010). Using assessments for instructional improvement: A literature review. Education
Policy Analysis Archives, 18(19), 1–36.
