
Journal for Research in Mathematics Education

2019, Vol. 50, No. 2, 114–120

Editorial

Posing Significant Research Questions


Jinfa Cai, Anne Morris, Charles Hohensee, Stephen Hwang, Victoria Robison,
Michelle Cirillo, Steven L. Kramer, and James Hiebert
University of Delaware

In 2002, the National Research Council (NRC) released Scientific Research in
Education, a report that proposed six principles to serve as guidelines for all
scientific inquiry in education. The first of these principles was to “pose signifi-
cant questions that can be investigated empirically” (p. 3). The report argued that
the significance of a question could be established on a foundation of existing
theoretical, methodological, and empirical work. However, it is not always clear
what counts as a significant question in educational research or where such ques-
tions come from. Moreover, our analysis of the reviews for manuscripts submitted
to JRME1 suggests that some practical, specific guidance could help researchers
develop a significant question or make the case for the significance of a research
question when preparing reports of research for publication.
Building on the JRME archive of nearly 50 years of research articles, this issue
marks the beginning of a series of editorials aimed at discussing how to conduct
and report high-quality research in mathematics education. In this first editorial
in the series, we discuss what counts as a significant research question in math-
ematics education research, where significant research questions come from, and
how researchers can develop their manuscripts to make the case for the signifi-
cance of their research questions. Although we are beginning a new series of
editorials, we will continue to draw on the ideas from our editorials over the past
2 years (e.g., Cai et al., 2018; Cai et al., 2017). In particular, we consider what
significant research questions might look like in the aspirational future world of
research that we have described in those editorials—a world in which mathematics
education research is carried out by widespread, stable partnerships of teachers
and researchers and in which research both takes root in and shapes the everyday
practices of mathematics teaching and learning.

Significant Research Questions


It is difficult, if not impossible, to judge the significance of a research question
just by reading the question. Certainly, significant research in mathematics educa-
tion should advance the field’s knowledge and understanding of the teaching and
learning of mathematics (Heid, 2010; Simon, 2004). We believe this implies that
the characteristics that make a research question significant are dependent on
context and specifically on assumptions about what kind of knowledge is useful.

1 We analyzed the reviews for every manuscript that underwent full review and received a decision
in 2017. For those manuscripts that were ultimately rejected, not a single reviewer stated that the
research questions were particularly relevant or insightful. In contrast, for those manuscripts that
ultimately received a revise and resubmit decision or were accepted (pending revisions), only one
reviewer raised the concern that the research questions would not make a contribution to the field.


Research can advance our understanding of teaching and learning mathematics
in ways that are more distant from the classroom or more connected to practice.
Although we acknowledge the value of research that is more distant and might
eventually have an effect on classroom teaching or learning, we have developed
the argument in our previous editorials that significant research in mathematics
education can, and perhaps should, be much closer to the classroom and aim to
directly impact practice. From this perspective, we begin by asserting that a
research question is likely to be significant if it addresses teachers’ shared
instructional problems and if its answer helps the field (students, teachers,
policy makers, researchers) understand why and how that answer solves those
problems.

Addressing Instructional Problems


We focus on teachers’ instructional problems because they provide a strong
basis for connecting the work of research to the challenges of teaching and learning
mathematics. Confrey (2017) argued that mathematics education research is
grounded in a “practical wisdom” that reflects the challenge of operating in
complex decision-making environments like classrooms and schools. Because of
this, significant research questions can and do arise directly or indirectly from
teachers’ problems of practice.2 Within the idealized portrait of a future world of
mathematics education research described in our previous editorials (Cai et al.,
2017a, 2019), significant research questions arise from interactions between
researchers and teachers about challenges that teachers face in establishing and
helping students achieve well-defined learning goals. Grounding a research ques-
tion in instructional problems that are experienced across multiple teachers’
classrooms helps to ensure that the answer to the question will be of sufficient
scope to be relevant and significant beyond the local context.
Significance is also drawn from the importance of the mathematics that is
investigated. Instructional problems that lead to significant research questions are
problems related to teaching and learning powerful mathematics—mathematics
that is valued by the mathematics education community (broadly conceived). Our
vision of important mathematics is inclusive. It includes mathematics content, the
nature and practices of mathematics as a discipline, beliefs about mathematics and
affective perceptions of mathematics as a powerful and useful tool, and the role
and use of mathematics in addressing inequities (Cai et al., 2017b).3

2 This point also finds support in Scientific Research in Education (NRC, 2002) in its discussion
of Pasteur’s quadrant—the intersection of the quest for fundamental understanding and consid-
erations of use (Stokes, 1997).
3 Indeed, Confrey (2017) points out that the fourth of Flyvbjerg’s (2001) questions that
characterize research in social science—Who gains and loses from the intervention?—puts
questions of equity squarely in the sights of mathematics education researchers.

For an example, we revisit the instructional problem faced by the fourth-grade
teacher, Mr. Lovemath, described in our earlier editorials (Cai et al., 2017a, 2017b).
Mr. Lovemath intended for his students to explore multiple strategies for
completing a fraction comparison task, but the students were unable to make
progress on the task and ended up employing a single procedure (using common
denominators) to perform all the comparisons. The students’ difficulties with the
task led Mr. Lovemath and Ms. Research, a mathematics education researcher, to
identify several relevant questions: Why did the students encounter difficulties?
Why did the intended opportunity to learn mathematics not materialize? What
prior knowledge do students need to take advantage of this learning opportunity?
These questions are grounded in an instructional problem that is likely shared by
many teachers who are trying to help their students achieve this learning goal.
Answering these questions would generate insight into students’ learning of
important mathematics and would also shed light on ways to make the learning
opportunities in the task available to all students.

Understanding How and Why


Research questions that focus on teachers’ instructional problems gain addi-
tional significance when they move from only finding answers to the problem to
also understanding how and why the answer is a solution to the instructional
problem. This distinguishes significant research from many other educational
activities (e.g., conducting a successful professional development program). Understanding
how and why builds knowledge of a type that enables the solution to the initial
instructional problem to be adapted for a related problem or a different context.
Research questions that aim to understand often ask about the conditions under
which the solution to a problem will work rather than simply asking about the
nature of the solution. Years ago, Cronbach (1986) noted the value of these kinds
of questions, using the classic Brownell and Moser (1949) study4 as an example
of research that went beyond a simple comparison of treatments (a what-works-best
horse race) to examine how different treatments operated under different condi-
tions. More recently, Maxwell (2004) argued for the importance of causal explana-
tion in educational research, specifically highlighting the explanatory significance
of underlying causal processes and the importance of the context in shaping those
processes in particular situations. Studying the conditions under which a solution
works allows teachers and researchers to generate further hypotheses about the
changes in students’ learning that other instructional choices might produce.
For example, in the case of Mr. Lovemath’s instructional problem, Ms. Research
might ask what prior knowledge the students need to take advantage of the learning
opportunity in the fraction comparison task. An answer that reflects the potential
significance of this question would be more than just a list of prerequisite concepts.
Ms. Research’s answer should address the field’s theoretical understanding of the
role that earlier concepts play in new learning. Furthermore, the answer’s descrip-
tion of prior knowledge should, itself, attend to important mathematics—the
conceptual structures that underlie fraction comparison and how particular
concepts are needed at different points to engage productively with the task.

4 Brownell and Moser (1949) studied two approaches for teaching subtraction (meaningful and
mechanical) under two different conditions (using the regrouping algorithm for subtraction and
using the equal additions algorithm). By crossing the two instructional approaches with the two
different algorithms (regrouping and equal additions), Brownell and Moser found, among other
results, that the meaningful approach produced better outcomes than the mechanical approach for
the regrouping algorithm but not for the equal additions algorithm.


Making these conditions explicit would allow Mr. Lovemath and Ms. Research to
make and test new predictions about how students would engage with the task
after making specific changes to the instruction leading up to the task. The
thinking behind these predictions could also then inform research and practice in
other classrooms (perhaps using different curricula) in which teachers encounter
a similar instructional problem.
Looking across the problems of practice discussed in our previous editorials,
we can identify additional types of significant questions that might arise from
problems of practice. These include questions about the resources (in addition to
the prior knowledge discussed above) that students bring with them that would
help or hinder them in taking advantage of a learning opportunity, questions about
the arrangements of learning goals and subgoals into learning trajectories, ques-
tions about the kinds of data that would usefully inform teaching, and questions
that focus on teacher–researcher partnerships and their work. Fundamentally, our
message is that significant research questions can be generated by addressing
problems of practice while striving to understand underlying mechanisms and
their interactions with the context. Although we have not described every signif-
icant research question that can be posed in mathematics education, we believe
that the kind of knowledge produced by answering research questions like these
is useful and likely to have an impact on practice.

Communicating the Significance of Research Questions


Perceiving and formulating a significant research question is both a science and
an art. The mathematician Jacques Hadamard (1945) wrote that “this delicate
choice is one of the most important things in research” (p. 126). Einstein and Infeld
(1938) claimed that “to raise new questions, new possibilities, to regard old prob-
lems from a new angle, requires creative imagination and marks real advance in
science” (p. 95). Indeed, Klamkin (1968) claimed that “among professional math-
ematicians, asking questions rates almost as high as answering them” (p. 132).
However, formulating a significant research question does not ensure that audi-
ences will perceive its significance. It is still necessary to communicate that ques-
tion in such a way that the field appreciates its significance. This communication,
often embedded in research papers, depends on clearly formulating the research
question for readers and making convincing arguments for its significance.
It can be a challenge to formulate a research question clearly in a research report.
As Heid and Blume (2011) observed regarding manuscripts submitted to JRME,
the statement of the research question is often an issue in submissions. Authors some-
times fail to specify their research question(s), and even when they do, they sometimes
report only a general research problem or area of interest rather than a specific research
question. (p. 106)

Our analysis of the reviews for the manuscripts submitted to JRME that received
a full-review decision in 2017 provides empirical data supporting this observation
by Heid and Blume. Fully 55% of the reviews for those manuscripts that were
rejected in 2017 included concerns about the research questions, including the lack
of a clear motivation for the research questions and a failure to appropriately
connect the research questions to other parts of the manuscript (e.g., situating the

questions with respect to the theoretical framework or describing methods appro-
priate for investigating the questions). Even for those manuscripts that were ulti-
mately accepted pending revisions or that received a decision of revise and
resubmit, 17% and 23% of the reviews, respectively, included concerns about the
research questions such as the need to make the statement of the research questions
somewhat clearer.
Thus, communicating the significance of a research question involves several
considerations. First, the question must be explicitly stated with specificity and
precision. It is neither sufficient nor fair to the reader to merely imply the question,
to phrase it only as a goal of the study, or to merely describe a general problem,
instructional or otherwise. A precisely stated research question should make clear
what kinds of data are needed to answer the question and what an answer would
look like. Precision in the statement of a research question can pay dividends in
terms of how well the data will help the researcher and, ultimately, the readers to
understand the phenomenon being studied.
A second consideration is that the research question must be clearly connected
to prior research to situate it in the larger field of mathematics education research.
The significance of a research question cannot be determined just by reading it.
Rather, its significance stands in relation to the knowledge of the field. The justi-
fication for the research question itself—why it is a significant question to inves-
tigate—must therefore be made clear through an explicit argument that ties the
research question to what is and is not already known. Indeed, nearly one quarter
of the JRME reviews that highlighted issues with the research questions in manu-
scripts rejected in 2017 specifically called for authors to make this kind of argu-
ment to motivate the research questions, whereas none of the manuscripts that
were ultimately accepted (pending revisions) received this kind of comment.
Thus, through a research question’s connections to prior research, it should be
clear how answering the question extends the field’s knowledge because it is based
on hypotheses suggested by previous research. The argument that there is a lack
of research in a particular area is not, on its own, a strong justification. To success-
fully make the case that a research question extends the field’s knowledge, the
question must be situated within a theoretical framework that helps readers under-
stand how answering the question informs the field and, consequently, practice or
policy. Although an appeal to an external source can be helpful to establish that
the field has an interest in the question, it is not a shortcut to making the case for
the significance of the question. That case relies on a chain of justification forged
from a theoretical framework that draws on the knowledge of the field. From the
perspective of the future world of mathematics education research that we
described in our previous editorials, that case can rest on whether the question
addresses instructional problems shared by teachers and how the question will
aim the investigation toward the conditions under which a solution to the instruc-
tional problem works (i.e., how answering the question will help the field under-
stand why and how the answer is a solution to the problem).
Finally, a clear and warranted question must be presented in such a way that it
can be empirically investigated and such that the methods for investigation make
sense and follow logically from the question. Indeed, the research question should
be coherent with the methods and data analysis so that, together, they make a
strong argument. That argument should be tight but should also flow smoothly
like a convincing story or a winning argument in a debate. It should make it easy
for readers to be convinced, and readers should not need to fill in part of the argu-
ment. On the one hand, the argument for the significance of the research question
depends on a theoretical framework. The theoretical framework shapes the
researcher’s conception of the phenomenon of interest, provides insight into it, and
defines the kinds of questions that can be asked about it. On the other hand, there
are many possible theoretical frameworks. Choosing among them depends on how
productively they allow the researcher to engage with the research problem and
to formulate good questions. This mutual dependence means that formulating a
significant research question is an iterative process, one that successively moves
from a broad, general sense of an idea which is potentially fruitful to a well-
specified theoretical framework and a clearly stated research question.
Like formulating a significant research question, the choice or construction of
a theoretical framework is also something of an art.5 In the next editorial, we will
discuss in detail how a theoretical framework can be chosen or constructed to
justify and communicate the significance of research questions. In addition, we
will address how the coherence of the research question, design, data coding and
analyses, and presentation and discussion of the findings as a chain of arguments
depends on presenting a relevant theoretical framework.

5 We see a parallel to formulating good research questions in mathematics, as Hadamard (1945)
describes it: “The guide we must confide in is that sense of scientific beauty, that special esthetic
sensibility” (p. 127).

References
Brownell, W. A., & Moser, H. E. (1949). Meaningful vs. mechanical learning: A study in Grade III
subtraction. Durham, NC: Duke University Press.
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2017a). A future vision
of mathematics education research: Blurring the boundaries of research and practice to
address teachers’ problems. Journal for Research in Mathematics Education, 48(5), 466–473.
doi:10.5951/jresematheduc.48.5.0466
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2017b). Clarifying the
impact of educational research on students’ learning. Journal for Research in Mathematics
Education, 48(2), 118–123. doi:10.5951/jresematheduc.48.2.0118
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2018). Reconceptualizing
the roles of researchers and teachers to bring research closer to teaching. Journal for Research in
Mathematics Education, 49(5), 514–520. doi:10.5951/jresematheduc.49.5.0514
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2019). Research pathways
that connect research and practice. Journal for Research in Mathematics Education, 50(1), 2–10.
doi:10.5951/jresematheduc.50.1.0002
Cai, J., Morris, A., Hwang, S., Hohensee, C., Robison, V., & Hiebert, J. (2017). Improving the impact
of educational research. Journal for Research in Mathematics Education, 48(1), 2–6. doi:10.5951/
jresematheduc.48.1.0002
Confrey, J. (2017). Research: To inform, deform, or reform? In J. Cai (Ed.), Compendium for research
in mathematics education (pp. 3–27). Reston, VA: National Council of Teachers of Mathematics.
Cronbach, L. J. (1986). Social inquiry by and for earthlings. In D. W. Fiske & R. A. Shweder (Eds.),
Metatheory in social science: Pluralisms and subjectivities (pp. 83–107). Chicago, IL: University
of Chicago Press.
Einstein, A., & Infeld, L. (1938). The evolution of physics: The growth of ideas from early concepts
to relativity and quanta. Cambridge, United Kingdom: Cambridge University Press.
Flyvbjerg, B. (2001). Making social science matter: Why social enquiry fails and how it can succeed
again. Cambridge, United Kingdom: Cambridge University Press.
Hadamard, J. (1945). An essay on the psychology of invention in the mathematical field. Princeton,
NJ: Princeton University Press.
Heid, M. K. (2010). The task of research manuscripts—Advancing the field of mathematics
education. Journal for Research in Mathematics Education, 41(5), 434–437.
Heid, M. K., & Blume, G. W. (2011). Strengthening manuscript submissions. Journal for Research
in Mathematics Education, 42(2), 106–108. doi:10.5951/jresematheduc.42.2.0106
Klamkin, M. S. (1968). On the teaching of mathematics so as to be useful. Educational Studies in
Mathematics, 1(1–2), 126–160. doi:10.1007/BF00426240
Maxwell, J. A. (2004). Causal explanation, qualitative research, and scientific inquiry in education.
Educational Researcher, 33(2), 3–11. doi:10.3102/0013189X033002003
National Research Council. (2002). Scientific research in education. Washington, DC: National
Academies Press. doi:10.17226/10236
Simon, M. A. (2004). Raising issues of quality in mathematics education research. Journal for
Research in Mathematics Education, 35(3), 157–163. doi:10.2307/30034910
Stokes, D. E. (1997). Pasteur’s quadrant: Basic science and technological innovation. Washington,
DC: Brookings Institution Press.

Journal for Research in Mathematics Education
2019, Vol. 50, No. 3, 218–224

Editorial

Theoretical Framing as Justifying


Jinfa Cai, Anne Morris, Charles Hohensee, Stephen Hwang, Victoria Robison,
Michelle Cirillo, Steven L. Kramer, and James Hiebert
University of Delaware

In our March editorial (Cai et al., 2019), we discussed the nature of significant
research questions in mathematics education. We asserted that the choice of a suit-
able theoretical framework is critical to establishing the significance of a research
question. In this editorial, we continue our series on high-quality research in
mathematics education by elaborating on how a well-constructed theoretical
framework strengthens a research study and the reporting of research for publica-
tion. In particular, we describe how the theoretical framework provides a
connecting thread that ties together all of the parts of a research report into a
coherent whole. Specifically, the theoretical framework should help (a) make the
case for the purpose of a study and shape the literature review; (b) justify the study
design and methods; and (c) focus and guide the reporting, interpretation, and
discussion of results and their implications.
JRME reviewers frequently comment on theoretical frameworks in their evalu-
ations of manuscripts. Our analysis of the reviews for every manuscript that
underwent full review and received a decision in 2017 revealed that reviewers
raised concerns related to the theoretical framework in nearly 90% of manuscripts
that were ultimately rejected. Indeed, approximately 70% of the individual reviews
for these manuscripts included concerns related to the theoretical framework. Even
for those manuscripts that were ultimately accepted, nearly 30% of the individual
reviews still raised such concerns. Common concerns expressed by reviewers
included the following: that the manuscript lacks a sufficiently developed frame-
work, that the framework is not appropriate, that the framework is overly broad or
generic, that the framework is overly narrow or myopic, and that the framework is
disconnected from the other parts of the study. Concerns like these often reflect
serious issues with a manuscript that generally require significant revisions if these
concerns are to be effectively addressed.

What Is a Theoretical Framework?


Much has been written about theoretical frameworks, and some researchers have
explicitly called for increased attention to theoretical frameworks in mathematics
education research (e.g., Leatham, in press; Lester, 2005; Silver & Herbst, 2004;
Skott, Van Zoest, & Gellert, 2013; Spangler & Williams, in press). Despite these
calls, the notion of a theoretical framework can remain somewhat mysterious and
confusing for novice and experienced researchers alike. Moreover, novice
researchers may mistakenly believe that a theoretical framework is merely a
straightforward summary of related studies. We recognize that some researchers
make an explicit distinction between theoretical frameworks and conceptual
frameworks (e.g., Eisenhart, 1991; Imenda, 2014; Lester, 2005). However, these
terms have often been used interchangeably in the literature. In this editorial, we
use the term theoretical framework broadly (similar to the treatment of conceptual
frameworks by Eisenhart, 1991, and Lester, 2005) to encompass the set of assump-
tions, theories, hypotheses, and claims (as well as the relationships between them)
that guide a researcher’s thinking about the phenomenon being studied.
Researchers have used a number of different metaphors to describe theoretical
frameworks. Maxwell (2005) referred to a theoretical framework as a “coat closet”
that provides “places to ‘hang’ data, showing their relationship to other data,”
although he cautioned that “a theory that neatly organizes some data will leave
other data disheveled and lying on the floor, with no place to put them” (p. 49).
Lester (2005) referred to a framework as a “scaffold” (p. 458), and others have
called it a “blueprint” (Grant & Osanloo, 2014). Eisenhart (1991) described the
framework as a “skeletal structure of justification” (p. 209). Spangler and Williams
(in press) highlighted this structural role of theoretical frameworks by drawing an
analogy to the role that a house frame provides in preventing the house from
collapsing in on itself. Each of these metaphors draws on notions of connection
and structure for the purpose of organizing and supporting work. They portray the
theoretical framework as something purposefully constructed from multiple
components. It is not simply found or chosen—ready-made, say, by searching the
literature—nor can it be so generic that it provides little guidance for conducting
the study or writing a report.
We take a strong position that, to be useful, the theoretical framework should be
constructed by the researcher as a critical part of conceptualizing and carrying out
the research. To this point, as one JRME reviewer explained, “It is not enough to
use definitions that appear in the literature to provide a theoretical grounding.”1
One must do more than simply present an assemblage of existing parts from the
literature. Even when using existing theories and frameworks, researchers must
explain how they draw upon and combine them to build a framework that is suited
to the present study.
In particular, we believe that a theoretical framework for a study is constructed
through and for justification. It is constructed through justification when
researchers ask themselves a series of questions as they conceptualize and conduct
their studies: Why is this topic an important thing to study? What do I expect to
find? What do I think the answers to my research questions will be? Why do I
expect those findings? This last question often leads to a first-level set of general
reasons like “because students won’t understand the tasks well enough to score
well” or “because instruction will not be sustained long enough” and so on. Then,
by justifying their answers to this question—asking themselves why these are good
reasons—researchers can develop a second-level set of reasons (like “if the task is
not in students’ zone of proximal development, they are unlikely to understand it”)
that begins producing hypotheses that are connected with previous research. These
connections between what is new and what is known form the basis of a theoretical
framework that guides the selection of research questions, research methods, and
data collections and that supports compelling explanations of the findings that can
move the field forward. In this way, the theoretical framework can, for example,
ensure that a study provides new information addressing teachers’ shared instruc-
tional problems and helps the field (students, teachers, policy makers, researchers)
understand why and how the results will help solve those problems.

1 All reviewer comments in this editorial have been paraphrased to respect the confidentiality of
the review process.

The theoretical framework is also constructed for justification and, in particular,
for explaining to others the reasoning that underlies the decisions made in a
research study. Although we recognize that the theoretical framework guides the
conceptualization and conduct of a research study, below we primarily focus on
the role of the theoretical framework in communicating research to the wider
mathematics education research community. At minimum, the theoretical frame-
work must support three kinds of justifications in the report: the why (the purpose
of the study), the how (the methodology of the study), and the what (the discussion
of the study’s findings and their implications).
These components of justification are interconnected, link by link, into a larger,
coherent chain of reasoning that permeates the report and holds it together. A
missing or broken link obscures the logic of the study, making it seem incoherent.
As one JRME reviewer put it, “The research design lacks coherence because of the
lack of coordination among the frameworks used; this makes the methods seem
disconnected from both the question and the findings in the discussion.”

The Why: Justifying the Purpose of the Study and the Scope
of the Literature Review
“The authors introduce many frameworks and constructs in the theoretical frame-
work and the literature review. However, it is not clear which one will be the
focus.”—A JRME reviewer

As we discussed in our March editorial (Cai et al., 2019), significant research
questions—ones that extend the field’s knowledge—rely “on a chain of justifica-
tion forged from a theoretical framework that draws on the knowledge of the field”
(p. 118). Research studies are built on a foundation of knowledge developed through
earlier work, both theoretical and empirical. “Through a research question’s
connections to prior research, it should be clear how answering the question
extends the field’s knowledge because it is based on hypotheses suggested by
previous research” (Cai et al., 2019, p. 118), hypotheses that we refer to as educated
hypotheses. These educated hypotheses, stemming from the review of the litera-
ture, allow readers to anticipate the possible findings and potential contribution of
the work. When authors do not clearly present the theoretical framework that
connects the study to earlier work, they give readers (and reviewers) the impression
that the study exists in a vacuum. More precisely, readers are left guessing how it
advances the field’s understanding of mathematics teaching and learning.
By laying out the theoretical framework, researchers situate their perspectives
and their research questions in the broader field.2 During the conceptualization
and conduct of a research study, this means that researchers make explicit for
themselves how their research questions are similar to and different from related
questions already studied by other researchers. They construct and refine the
theoretical framework to better understand and analyze the phenomenon being
studied and to decide what to read and look for in prior research (e.g., peripheral
areas versus areas where the researcher needs to be an expert). When preparing a
research report, this means that the scope of the literature review—what counts as
relevant to this particular study—is justified by the theoretical framework that has
been constructed over the course of the study. Because the theoretical framework
provides a connected set of reasons for the decisions made in conducting the study,
only the previous research that made a difference in those decisions is essential to
include in the literature review.

2 A key consideration for authors preparing a manuscript for JRME is that the journal is focused
on mathematics education research. Thus, although the journal does not prescribe a set of theoreti-
cal frameworks specific to the domain of mathematics education, it remains extremely important to
draw connections between the theoretical framework that has been employed and relevant theories
about the teaching and learning of mathematics. This is particularly important in cases where re-
search has been conducted in a different, but related, domain (e.g., cognitive science, educational
psychology, and so forth).

Reviewers will often suggest additional literature to review. However, the
researcher must still carefully consider what prior research is truly relevant. The
literature review should not become a laundry list of relevant research (although it
may seem tempting to take this approach in response to reviewers’ calls to include
additional literature). Rather, it should draw on the theoretical framework to orga-
nize the literature in a useful, and perhaps novel, way that justifies why the contri-
bution of the particular study is significant. When a reviewer raises the concern
that the researcher failed to review a relevant study or line of research, this may
mean that the study is not properly positioned with respect to what is already
known. In other words, if a reviewer chooses to raise this kind of concern, it should
be because the reviewer believes that if the omitted literature had been taken seri-
ously, the researcher would have made different decisions and would have
conducted the study or interpreted the findings differently.

The How: Justifying the Design of the Study and the
Research Methodology
“A number of critical methodological choices were not well justified, and I wanted
to know more about the theoretical support for those choices.”—A JRME reviewer

In any research study, a variety of methods and approaches can be used to answer
the research questions. A theoretical framework, even one that is still being devel-
oped over the course of a study, helps provide the researcher with reasons for
making particular methodological choices. As Mason (2005) pointed out, frame-
works
inform the researcher in the design of their study, such as when seeking tasks to reveal
dimensions of variation of which subjects are aware or can access, to get them doing
and talking as well as making records, to provoke them into displaying mathematical
thinking and to stimulate them to expose the subtle shifts in the structure of their
attention. (p. 18)

In the reporting of a research study, the theoretical framework therefore helps
justify for readers why the chosen design for the study makes sense to answer the
research questions. It should be clear from the theoretical framework how the
methods chosen for the study will lead to collecting data that will address the
research questions. Simply stating the choice of a particular methodology is not
justification enough. The researcher must make an argument, based on the
theoretical framework, to motivate the choices made for the design of the study,
the methods of analysis, and so on.
The theoretical framework helps researchers make decisions about the choice of
methods in multiple ways. For example, a researcher who is studying questions
about teaching mathematical proof recognizes that sociomathematical norms have
been used to explain how what counts as a valid proof or mode of proof is socially
negotiated in classrooms. So, the researcher includes sociomathematical norms as
part of the theoretical framework of the study. Because these norms are negotiated
through discourse and argumentation in the classroom, the researcher also includes
the theoretical machinery of discourse analysis, including Toulmin’s model of
argumentation. These decisions prompt the researcher to include observations of
socially negotiated products during class discussion to capture activity that might
contribute to the classroom development of proof. In turn, these choices in
constructing the theoretical framework motivate and justify the researcher’s choice
of techniques for data collection (e.g., video-recording discourse in the classroom)
and data analysis.
In contrast, if there is a mismatch or a lack of connection between the theoretical
framework and the methodological choices, readers and reviewers may rightfully
question the validity of the instruments and the analysis. For example, the constant
comparison method and building grounded theory (Corbin & Strauss, 1990) are
frequently cited somewhat loosely as methodological (and theoretical) choices
(Mewborn, 2005). Reviewers become concerned when a research report refers to
grounded theory without any indication of how this influenced the decisions that
were made to conduct or report the study. Reviewers also raise concerns when a
research report invokes this methodology but also describes a detailed and highly
specified theoretical framework that would preempt the development of a grounded
theory. Lack of connections or contradictions like these between the theoretical
framework and methodological choices ultimately weaken the contribution of a
report of research.

The What and the So What: Justifying the Presentation of the Findings
and the Interpretation of the Findings
“Because this manuscript is missing a theoretical framework, the discussion lacks
support, and it is impossible to judge the merit of the findings.”—A JRME reviewer

Thus far, we have made the case that a study should be guided by educated
hypotheses and a justified methodological design. With these two components in
place, the findings of a study will emerge from data that address the research ques-
tions and confirm or disconfirm the hypotheses. Interpreting the findings can then
take the form of comparing theoretically grounded predictions to actual results and
then refining or extending the theoretical framework to support revised hypotheses
that align with what was actually observed. The revised framework can be
presented as the study’s contribution to the field, and the new, more educated
hypotheses can be tested in future studies. In contrast, if the study is not situated
within clearly justified hypotheses, the findings are not anchored to their intended
purpose, and researchers can be tempted to make overreaching claims.


The theoretical framework also provides context for the discussion of the find-
ings. As a vital connection between the findings that have been presented and the
larger argument that is made, the theoretical framework gives the researcher a
mechanism to explain how the findings address and answer (or fail to answer) the
research questions. For example, in a quantitative study there may be many results
that are statistically significant. It is incumbent on the researcher to use those
results to justify which of the educated hypotheses have or have not been
confirmed. More broadly, the theoretical framework, having already been used to
establish the relevance of the study to the field, is key to explaining to readers the
new contribution of the findings. In short, the discussion of the findings should
revisit the educated hypotheses that emerged from the review of the literature,
demonstrating the significance of the findings that result from the present study
in light of that other work and informing refinements to the theoretical framework.

Conclusion
Too frequently, we find JRME reviewers lamenting that the theoretical frame-
work is insufficiently developed and disconnected from the rest of the manuscript
(e.g., “the theoretical framework and methodology are not congruent” and “the
theoretical framework is only arbitrarily connected to the data”). Indeed, more than
one quarter of the reviews for rejected manuscripts in 2017 included such
comments. We believe that a well-constructed theoretical framework comes from
researchers’ careful thinking about the reasons—the justification—for the hypoth-
eses they formulate about the likely outcomes of the study. The framework is then
used to guide the choice of literature reviewed, the research methods applied, and
the claims of significance and contribution to the field. The theoretical framework
thus ties together the background, methodology, and findings of a study into a
single cohesive narrative.
In our next editorial (July 2019), we will focus on choosing methods for
conducting a study and describing these methods in a report of the study. We will
argue that research questions dictate the choice of research methods; the theoretical
framework helps researchers choose the methods that will generate the kind of data
needed to address the research questions. But researchers still must make decisions
among a variety of methods that could be used. How do researchers make these
decisions? To develop our argument, we will again point out common errors in
choosing methods and describing them.

References
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., Cirillo, M., . . . Hiebert, J. (2019). Posing
significant research questions. Journal for Research in Mathematics Education, 50(2), 114–120.
doi:10.5951/jresematheduc.50.2.0114
Corbin, J., & Strauss, A. (1990). Grounded theory research: Procedures, canons, and evaluative
criteria. Qualitative Sociology, 13(1), 3–21. doi:10.1007/BF00988593
Eisenhart, M. A. (1991). Conceptual frameworks for research circa 1991: Ideas from a cultural
anthropologist; implications for mathematics education researchers. In R. G. Underhill (Ed.),
Proceedings of the thirteenth annual meeting of the North American Chapter of the International
Group for the Psychology of Mathematics Education (Vol. I, pp. 202–219). Blacksburg, VA:
Division of Curriculum & Instruction.
Grant, C., & Osanloo, A. (2014). Understanding, selecting, and integrating a theoretical framework
in dissertation research: Creating the blueprint for your “house.” Administrative Issues Journal:
Connecting Education, Practice, and Research, 4(2), 12–26.
Imenda, S. (2014). Is there a conceptual difference between theoretical and conceptual frameworks?
Journal of Social Sciences, 38(2), 185–195.
Leatham, K. R. (in press). Principles for effectively communicating the theoretical framing of
our work. In K. R. Leatham (Ed.), Designing, conducting, and publishing quality research in
mathematics education. Cham, Switzerland: Springer.
Lester, F. K., Jr. (2005). On the theoretical, conceptual, and philosophical foundations for research in
mathematics education. ZDM Mathematics Education, 37(6), 457–467. doi:10.1007/BF02655854
Mason, J. (2005). Frameworks for learning, teaching and research: Theory and practice. In G. M.
Lloyd, M. Wilson, J. L. M. Wilkins, & S. L. Behm (Eds.), Proceedings of the 27th annual meeting
of the North American Chapter of the International Group for the Psychology of Mathematics
Education (pp. 9–30).
Maxwell, J. A. (2005). Qualitative research design: An interactive approach (2nd ed.). Thousand
Oaks, CA: Sage.
Mewborn, D. S. (2005). Framing our work. In G. M. Lloyd, M. Wilson, J. L. M. Wilkins, & S. L.
Behm (Eds.), Proceedings of the 27th annual meeting of the North American Chapter of the
International Group for the Psychology of Mathematics Education (pp. 31–39).
Silver, E. A., & Herbst, P. (2004, April). “Theory” in mathematics education scholarship. Paper
presented at the research precession of the annual meeting of the National Council of Teachers of
Mathematics, Philadelphia, PA.
Skott, J., Van Zoest, L., & Gellert, U. (2013). Theoretical frameworks in research on and with
mathematics teachers. ZDM Mathematics Education, 45(4), 501–505. doi:10.1007/s11858-013-
0509-3
Spangler, D. A., & Williams, S. R. (in press). The role of theoretical frameworks in mathematics
education research. In K. R. Leatham (Ed.), Designing, conducting, and publishing quality
research in mathematics education. Cham, Switzerland: Springer.

Journal for Research in Mathematics Education
2019, Vol. 50, No. 4, 342–348

Editorial

Choosing and Justifying Robust Methods for Educational Research
Jinfa Cai, Anne Morris, Charles Hohensee, Stephen Hwang, Victoria Robison,
Michelle Cirillo, Steven L. Kramer, and James Hiebert
University of Delaware

In our recent editorials (Cai et al., 2019a, 2019b), we discussed the important
roles that research questions and theoretical frameworks play in conceptualizing,
carrying out, and reporting mathematics education research. In this editorial, we
discuss the methodological choices that arise when one has articulated research
questions and constructed at least a rudimentary theoretical framework. Just as the
researcher must justify the significance of research questions and the appropriate-
ness of the theoretical framework, we argue that the researcher must thoroughly
describe and justify the selection of methods. Indeed, the research questions and
the theoretical framework should drive the choice of methods (and not the reverse).
In other words, a sufficiently well-specified set of research questions and theo-
retical framework establish the parameters within which the most productive
methods will be selected and developed.
We have argued previously that research should be guided by educated hypoth-
eses—hypotheses about what one expects to find as possible answers to the
research questions based on a foundation of earlier empirical and theoretical work.
These educated hypotheses shape the choice of methods. One useful heuristic for
choosing methods begins with the researchers formulating, as precisely as possible,
their set of educated hypotheses about what they will find and the claims that they
hope to make. Then, the researchers can work backward to determine what kinds
of data would be needed to address these hypotheses and, in turn, what methods
would yield these kinds of data. Although this heuristic provides a general blueprint
for selecting and refining methods, its benefits can best be understood by exam-
ining how it can be applied to avoid many common methodological flaws that arise
in manuscripts submitted to JRME.

Methodology and Traditional Pathways for Educational Research


In our January 2019 editorial (Cai et al., 2019), we defined research pathways as
“the collection of assumptions that define the purposes of educational research,
the principles that differentiate research from other educational activities, and the
guidelines for how research should be conducted” (p. 2). Manuscripts submitted
to JRME typically report the findings of mathematics education research studies
that follow a traditional pathway for research. That is, the report describes a single
study in which the researcher posed one set of research questions and gathered one
set of data to answer these questions.


Common Methodological Problems


Based on our analysis of reviewers’ comments,1 one of the most common meth-
odological flaws in 2017 JRME submissions was a lack of justification for the
chosen methods. For example, one JRME reviewer wrote, “The researchers made
a number of seemingly random decisions with no explanation.” Nearly one third
of the reviews for manuscripts that were ultimately rejected included this concern,
as did one fourth of the reviews for manuscripts that received a decision of revise
and resubmit. In contrast, this concern was raised in only about 15% of reviews for
manuscripts that were ultimately accepted. Especially problematic are cases in
which authors simply cite other researchers to justify their methods without
unpacking the specifics of the methods and presenting an explicit argument for
why the methods used are the most appropriate and productive.
Concerns about justifying methodological choices can be attributed, in part, to
a lack of alignment among the research questions, the theory, and the methods. As
another JRME reviewer noted, “The methods need to have more than a cursory
connection to the theoretical construct that is at the heart of the study.” Indeed,
reviewers explicitly highlighted misalignments in one fifth of the reviews of ulti-
mately rejected manuscripts, twice as often as they did for manuscripts that were
ultimately accepted or given a decision of revise and resubmit. These misalign-
ments can take many forms, including the scale of the methods not fitting the scale
of the research question (e.g., when analyzing too much data interferes with
addressing the research question at the appropriate level of depth) or the tasks given
to participants not generating responses that directly address the hypotheses (and
the associated research questions). Although it may not be obvious when
conducting a study, such misalignments become particularly evident in the report.
The coherence of the report suffers when authors do not or cannot explain their
choice of procedures by connecting them to the hypotheses that constitute the
theoretical framework or by showing how the procedures would generate the kind
of data needed to address the research questions.
The problem of justification is exacerbated when the methods are not described
in enough detail for readers to understand exactly how the data were gathered or
analyzed. In roughly 40% of the reviews for manuscripts that were ultimately
rejected or received a decision of revise and resubmit, reviewers called for more
detailed descriptions of the methods to better understand the choices the researchers
made. Even for manuscripts that were ultimately accepted, over one fourth of the
reviews called for describing the methods in greater detail. Data coding, in partic-
ular, should be described in enough detail to help the reader understand how the
researcher interpreted the data with respect to the research questions and the
hypotheses (e.g., clearly stating what evidence was needed in order to say a
particular code had been satisfied). Coding was a concern in nearly one fourth of
the reviews for ultimately rejected manuscripts—double the rate for manuscripts
that were ultimately accepted.

1 We analyzed the reviews for every manuscript that underwent a full review and received a deci-
sion in 2017. Reviewer comments in this editorial have been paraphrased to respect the confidential-
ity of the review process.

Several other methodological concerns, including issues of validity, reliability,
and considering alternative explanations for findings, were regularly raised in the
reviews we analyzed. These concerns are also related to justifying the choice of
methods. Although authors sometimes fail to include enough information about
how they addressed validity and reliability, that information is essential to making
the argument that the chosen methods should produce trustworthy data that address
the research questions and that the methods are being executed properly. It is also
common for researchers to collect data that do not suffice to rule out alternative
explanations for the findings. For example, a qualitative analysis may not make
sufficient use of methods like triangulation to challenge and test conclusions, or a
quantitative analysis might fail to employ a multilevel model when one is needed,
thus leaving open the possibility that the observed results are spurious.
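As one concrete illustration of how reliability might be documented, interrater agreement on a coding scheme is commonly summarized with a chance-corrected statistic such as Cohen’s kappa alongside raw percent agreement. The sketch below is hypothetical rather than drawn from any study discussed here: the code categories, the coders’ decisions, and the use of the scikit-learn library are all our own assumptions for illustration.

# Hypothetical sketch: checking interrater agreement for a coding scheme.
# The code categories and coder decisions below are invented for illustration.
from sklearn.metrics import cohen_kappa_score

coder_a = ["conceptual", "procedural", "procedural", "conceptual", "other",
           "conceptual", "procedural", "other", "conceptual", "procedural"]
coder_b = ["conceptual", "procedural", "conceptual", "conceptual", "other",
           "conceptual", "procedural", "other", "procedural", "procedural"]

agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Percent agreement: {agreement:.2f}")
print(f"Cohen's kappa: {kappa:.2f}")

Reporting an agreement statistic of this kind, together with an account of how disagreements were reconciled, is one way to give readers the level of methodological detail that reviewers ask for.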

Suggestions for Addressing the Common Problems


The heuristic we proposed earlier—formulate hypotheses about likely answers
to the research questions, identify the nature of the data that would address these
hypotheses, and develop the data collection procedures that would yield these
data—could solve many of these common methodological problems. To demon-
strate its usefulness, we elaborate how it might be applied.
We imagine two overlapping but distinct phases in the process of selecting,
developing, and refining the methods for a study. The first phase consists of the
initial selection and description of research methods through (a) ensuring that the
research questions are specified as precisely as possible; (b) formulating predic-
tions or hypotheses (and making explicit one’s own implicit hypotheses) about
expected answers to the questions based on previous theoretical development,
empirical research, and one’s own prior experiences; (c) imagining the kinds of
data that will be needed to test these hypotheses and answer the research questions;
and (d) determining the best ways to gather these kinds of data and analyze them
so that the hypotheses can be directly addressed. This phase can also include
imagining alternative explanations for the data—alternatives to the initial hypoth-
eses. Doing this helps to ensure that the methods chosen will provide data that can
address the competing hypotheses and ultimately support plausible and evidence-
based arguments that one hypothesis is more likely to explain the data than another.
Table 1 shows an example of how research questions, hypotheses, data, and
analyses should be tightly and explicitly connected both in designing and
conducting a study and in reporting the study. In this example, two hypotheses can
be formulated for how Research Question 1 will be answered and one hypothesis
can be formulated for Research Question 2. Several kinds of data (Data 1, 2, and
3) are required to address the hypotheses for Research Question 1, and those data
need to be analyzed using three procedures (Analyses 1, 2, and 3). Two forms of
data (Data 1 and 4) are required to address the hypothesis for Research Question
2, and they overlap with the data relevant to Research Question 1 as do their
appropriate analytic procedures (Analyses 2, 3, 4).

Table 1
Coherence Among Research Questions, Hypotheses, Data, and Analysis Procedures

Question               Hypothesis     Data             Analysis
Research Question 1    Hypothesis 1   Data 1, Data 2   Analysis 1, Analysis 2
                       Hypothesis 2   Data 2, Data 3   Analysis 3
Research Question 2    Hypothesis 3   Data 1, Data 4   Analysis 2, Analysis 3, Analysis 4
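
One informal way to keep this alignment visible while a study is being designed is to record the mapping in a form that can be checked at a glance. The sketch below mirrors the hypothetical entries of Table 1 rather than any actual study; representing the design as a small Python data structure is our own illustrative assumption, not a procedure prescribed by the editorial.

# Hypothetical sketch mirroring Table 1: each hypothesis is tied to the data
# sources and analysis procedures intended to address it.
design = {
    "Research Question 1": {
        "Hypothesis 1": {"data": ["Data 1", "Data 2"],
                         "analyses": ["Analysis 1", "Analysis 2"]},
        "Hypothesis 2": {"data": ["Data 2", "Data 3"],
                         "analyses": ["Analysis 3"]},
    },
    "Research Question 2": {
        "Hypothesis 3": {"data": ["Data 1", "Data 4"],
                         "analyses": ["Analysis 2", "Analysis 3", "Analysis 4"]},
    },
}

# Flag any hypothesis that is not yet connected to data or to an analysis.
for question, hypotheses in design.items():
    for hypothesis, plan in hypotheses.items():
        for part in ("data", "analyses"):
            if not plan[part]:
                print(f"{question} / {hypothesis}: no {part} specified yet")

Kept up to date, such a record makes any gap in the chain from question to hypothesis to data to analysis visible before data collection begins.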
Enacting this systematic process for developing methods will solve a number of
the most common methodological problems. The alignment among research
questions, theoretical framework (hypotheses), and methods is prioritized as the
methods are selected and developed. Documenting this process of methods devel-
opment can yield a coherent description showing how these key aspects of a
research study support each other. In most cases, sufficient detail about the
methods can be provided by describing all of the ways the methods are designed
to address the hypotheses and, in turn, the research questions.
In our view, the development of research methods can be improved further
through a second phase of methods development. The second phase involves
iterative, brief cycles of testing the choice of methods and refining them to ensure
the most productive methods are used for addressing the research questions. Trying
out the methods by gathering and analyzing a small set of data can help researchers
quickly determine whether the methods need to be refined. For example, the tasks
given might need to be adjusted to generate informative responses or the interview
questions might need to be reworded. Perhaps the coding scheme might need to be
changed, which, in turn, could suggest changes to the kinds of data that are needed.
Even the sample might need to be adjusted or different analytic procedures might
need to be selected. Researchers might find it helpful to run through this cycle
several times, each time gathering just enough data to identify small tweaks that
could improve the chances of addressing the hypotheses and the research questions.
Some specific methodological choices cannot be made or justified properly
during the first phase. Data must be collected and some initial analyses must be
conducted before final choices can be made. For example, in a study with data that
have a nested structure (e.g., students nested in classrooms that are also nested in
schools), researchers might decide in the first phase that hierarchical linear
modeling (HLM) could be an appropriate quantitative methodological choice to
analyze the data. However, it would still be necessary to justify the use of HLM
and the chosen model (e.g., the choice of predictor variables or whether the model
has two or three levels) through a systematic process of model building in which
the model fit is evaluated at each step both quantitatively and with respect to the
theoretical framework. This process cannot happen a priori—it depends on having
data to analyze in potential models, and would therefore have to occur in the second
phase of methods development.
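As a hedged sketch of what one step of this model building might look like, the code below compares a two-level specification (students within schools) with a three-level specification that adds a classroom variance component. The file name, variable names, and choice of the statsmodels and scipy packages are assumptions made for illustration; they are not part of the editorial, and the likelihood-ratio check shown is only one piece of the quantitative and theoretical evaluation described above.

# Hypothetical sketch: students nested in classrooms nested in schools.
# File and variable names are invented; adapt to the actual data set.
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("student_scores.csv")  # columns: score, treatment, classroom, school

# Two-level model: a random intercept for each school.
m2 = smf.mixedlm("score ~ treatment", df, groups=df["school"],
                 re_formula="1").fit(reml=False)

# Three-level model: add a classroom variance component nested within schools.
m3 = smf.mixedlm("score ~ treatment", df, groups=df["school"], re_formula="1",
                 vc_formula={"classroom": "0 + C(classroom)"}).fit(reml=False)

# Likelihood-ratio comparison; conservative because the added variance
# parameter is tested on the boundary of its parameter space.
lr = 2 * (m3.llf - m2.llf)
print(f"LR statistic = {lr:.2f}, p = {stats.chi2.sf(lr, df=1):.3f}")

Whether the classroom level is retained would then be justified both by this kind of fit evidence and by what the theoretical framework says about where variation is expected to lie.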
This second phase of empirically improving the methods yields additional
benefits. More of the common methodological problems can be resolved. For
example, methods can be adjusted to generate the optimal amount of data to
address the research questions. Tasks can be sharpened to generate the data most
relevant for answering the questions. Further details of method development can
be provided and justified based on these pilot tests, model-building processes, or
cycles of empirical refinement. In summary, we believe the two phases we have
described can solve many common methodological problems by tightly aligning
the research methods with other key aspects of a research study.
We conclude this section by noting that some readers might be concerned that
our emphasis on formulating predictions or hypotheses about possible answers to
the research questions before conducting a study could preclude researchers from
being open to, or aware of, contradictory or unexpected findings. We do not think
this is the case. Indeed, we believe that carefully thinking about likely findings
creates increased sensitivity to unexpected findings. By making explicit for oneself
what one is expecting to find, the likelihood increases that one will notice surprises
when they occur.

Methodology and Alternative Pathways for Educational Research


Our first series of editorials, spanning 2017–2018, gradually developed a vision of a
future world of research that aspires to have greater impact on practice. In this vision of
the future, we proposed an alternative pathway for mathematics education research
that involves teacher–researcher partnerships working on solving problems of
practice through iterative cycles of innovation in tasks and lessons. We described
in some detail how these teacher–researcher partnerships could operate through
multiple phases of work, gradually improving teaching and student learning by
iteratively making small adjustments to instruction, gathering just enough data,
analyzing the data, using the findings to make further adjustments, and repeating
the cycle. The essential elements of this process are similar to the process we
recommended above for improving the methods to maximize the benefit of a
particular study. We believe this is because empirically based improvement,
regardless of the goal, requires similar iterative cycles. In the argument we offered
above, the goal is the improvement of methods for maximizing the benefit of a
particular study. In the future world of this alternative research pathway, the goal
is the direct improvement of classroom teaching and learning. In both cases,
researchers form hypotheses, develop methods to test hypotheses, implement the
methods to test the hypotheses, use the findings to revise the hypotheses, and repeat
the cycle.
A fundamental difference between the traditional and alternative pathways is
that the alternative pathway assumes that maximizing the benefits of research (for
practice) comes not from a single study, or even from a small number of discrete
studies, but rather from a continuing series of connected studies. In fact, the find-
ings from these connected studies constitute the actual findings of interest; they
are not just recalibrations of methods developed for a larger study. Rather than
being methodological steps on the way to conducting an individual study, the
iterative cycles are the studies. Task design, implementation and refinement of an
intervention, assessment calibration, and revision of hypotheses are the research.
A methodological advantage of the alternative pathway is that, to proceed down
this path, many of the common problems cited above must be solved as an
embedded aspect of the iterative cycles of work. Engaging in these cycles means
that teacher–researcher partnerships are constantly aligning research questions,
hypotheses, and methods. They are testing and refining practical hypotheses about
how to improve student learning, drawing from both the wisdom of practice and
insights from research. The methods used by a partnership in one cycle must be
fully described so the next cycle can build on what the partnership learned and on
how they learned it. As we described in an earlier editorial (Cai et al., 2018), the
data and artifacts generated by a teacher–researcher partnership should be stored
as knowledge packages in a professional knowledge base that is continually
updated as the partnership engages in iterative cycles of work. These knowledge
packages would hold all the information for other researchers to interpret the partial
solutions to the instructional problems that are continuously improved, including
the justification and reasoning for the methodological choices made in each cycle.
Although we contend that research that follows the alternative pathway described
in our previous editorials avoids many typical methodological pitfalls, we also
recognize that this alternative pathway has not yet gained much traction in our
field. We expect that as alternative pathways are developed further and imple-
mented more frequently, a new series of methodological problems could appear.
We cannot yet address these problems, but we can be quite sure they will be different
from those common today. For now, we note one set of methodological questions that
may arise: What methods would be appropriate to support generalizations gener-
ated by this type of work? For example, if a teacher–researcher partnership engages
in numerous cycles of work and produces a set of knowledge packages based
around an instructional unit of lessons, how can this knowledge inform another
teacher–researcher partnership engaged in work in another content area? Is this
work always entirely tied to contexts, or are there generalizations that can inform
this work across content areas and lessons? What methods would help to identify
such generalizations?

Conclusion
Choosing appropriate and effective methods and justifying that choice is a
critical part of conducting and communicating high-quality research in education.
By carefully and explicitly connecting the research questions and the hypotheses
that form the theoretical framework to the selection of methods, it is possible to
avoid many common methodological problems. Indeed, methods that are well
justified and closely connected to the other components of the study form the basis
for generating trustworthy and insightful findings and for producing a coherent
report of the study. If the field moves to other research pathways, many current
methodological problems might be solved. But new problems are likely to arise
that require a similar degree of attention.
In our November editorial, we will turn our attention to issues of interpreting
findings in educational research. For example, we will consider how to avoid the
common pitfall of making claims that are insufficiently supported by data, both in
research that follows the traditional pathway and in the cycles of iterative work that
make up the alternative pathway. Indeed, this is frequently an issue with manu-
scripts submitted to JRME that are ultimately rejected. We will argue that the
heuristics for choosing and justifying methodology that we have described in this
editorial can also help researchers ensure that their claims are well supported by
their data.


References
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., Cirillo, M., . . . Hiebert, J. (2019a). Posing
significant research questions. Journal for Research in Mathematics Education, 50(2), 114–120.
doi:10.5951/jresematheduc.50.2.0114
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., Cirillo, M., . . . Hiebert, J. (2019b).
Theoretical framing as justifying. Journal for Research in Mathematics Education, 50(3), 218–
224. doi:10.5951/jresematheduc.50.3.0218
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2018). Using data to
understand and improve students’ learning: Empowering teachers and researchers through
building and using a knowledge base. Journal for Research in Mathematics Education, 49(4),
362–372. doi:10.5951/jresematheduc.49.4.0362
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2019). Research pathways
that connect research and practice. Journal for Research in Mathematics Education, 50(1), 2–10.
doi:10.5951/jresematheduc.50.1.0002

Journal for Research in Mathematics Education
2019, Vol. 50, No. 5, 470–477

Editorial
So What? Justifying Conclusions and
Interpretations of Data
Jinfa Cai, Anne Morris, Charles Hohensee, Stephen Hwang, Victoria Robison,
Michelle Cirillo, Steven L. Kramer, and James Hiebert
University of Delaware

Although often asked tactfully, a frequent question posed to authors by JRME
reviewers is “So what?” Through this simple and well-known question, reviewers
are asking: What difference do your findings make? How do your results advance
the field? “So what?” is the most basic of questions, often perceived by novice
researchers as the most difficult question to answer. Indeed, addressing the “so
what” question continues to challenge even experienced researchers. All
researchers wrestle with articulating a convincing argument about the importance
of their own work. When we try to shape this argument, it can be easy to fall into
the trap of making claims about the implications of our findings that reach beyond
the data.
We use this editorial to propose some ideas for presenting and interpreting
results with an eye toward addressing the “so what” question. We do so by lever-
aging the alignment among research questions, theoretical framework, and
methods in a well-designed research study. Our aim is to present some practical
ideas that could help researchers evaluate their findings with this question in mind.

Aligning Interpretations With Earlier Parts of the Report


In previous editorials, we argued that justifying the significance of a study
requires developing a coherent chain of reasoning connecting the theoretical
framework (Cai et al., 2019c), the research questions (Cai et al., 2019b), and the
research methods chosen to address the research questions (Cai et al., 2019a). In
this editorial, we argue that the chain of reasoning is not complete until the results
are interpreted and discussed. The results do not stand alone; they fit within the
story developed up to that point in the report. Therefore, the importance of the
findings—the answer to the “so what” question—depends on the story developed
before the results are presented. The importance of the findings, and of the study
itself, emerges from interpreting the findings in a way that explicitly connects the
data to earlier links in the chain.

Connecting Interpretations With the Research Questions


A first suggestion for connecting interpretations with research questions is that
authors carefully consider how their findings address the research questions.
Although this might seem like an obvious step in interpreting the data, authors
often do not give it sufficient attention, perhaps because answering research ques-
tions is deceptively complex. The answers to research questions in mathematics
education are (almost) never “yes” or “no.” Because educational settings are filled
with interactions among multiple, and often confounding, factors, research ques-
tions that anticipate a yes or no answer hide important complexities. Appropriate
answers often include tentative observations about why a particular phenomenon
occurred, the conditions under which outcomes were found, nuances that require
a more complicated answer than expected, or subtle but important differences
between the results obtained and the results predicted.
In our March 2019 editorial, we argued that research questions “gain additional
significance when they move from only finding answers to the problem to also
understanding how and why the answer is a solution” (Cai et al., 2019b, p. 116).
Interpreting the findings in ways that help readers understand why the results
turned out the way they did is, in our view, a hallmark of interpretations that
persuasively answer the “so what” question. How can authors achieve this? One
way is to describe the conditions under which the results occurred and offer
hypotheses about how the results would be the same or different under
different conditions.
The educational significance of describing the contextual conditions under
which the results occurred is perhaps most salient for studies that ask questions
about teachers’ instructional problems. In the future world of research that we
previously envisioned, teacher–researcher partnerships pose research questions
that directly address pressing instructional problems (Cai et al., 2019b).
Understanding the conditions under which the results answer the questions and
solve the problems allows partnerships to predict how the outcomes might be
similar and different in different classrooms, with different students, for different
topics, and so on. In obvious ways, these interpretations lead to further targeted
studies. The “so what” question is answered with little additional effort.
How do researchers make sure that the data they collect and the analyses they
conduct generate results that can be interpreted in ways that further the field’s
understanding of the phenomena? That is, how do researchers make sure that their
research identifies and describes the conditions under which phenomena occur?
Most simply, they phrase their research questions in ways that ask about these
conditions (Cai et al., 2019b). But how do researchers know what conditions to
explore in their research? Answering these questions takes us back to the theo-
retical framework that motivates, shapes, and justifies the research questions (Cai
et al., 2019c).

Connecting Interpretations With the Theoretical Framework


The ultimate answer to the “so what” question is found both in the theoretical
framework developed for the study and in the way the results inform the further
development of the framework. We previously described several critical functions
of a well-constructed theoretical framework (Cai et al., 2019c). One is to answer
the question above—under what conditions are particular outcomes expected?
Because the theoretical framework is tailored to a particular study, it uses past
research to identify the relevant factors that could influence the outcomes and
explains why these factors are important. A finely tuned theoretical framework is
what allows researchers to pose research questions that ask about the effects of
particular factors, and that provides the bases for them to make informed hypoth-
eses about outcomes.
Armed with hypotheses that predict the outcomes and that explain why they are
the most likely given the conditions of the study, researchers can interpret the
results in terms of these hypotheses. We view interpreting results in terms of
hypotheses to mean examining the way in which hypotheses should be revised to
more fully account for the results. For example, if researchers expect Outcome A
but instead find Outcome B, they must ask what changes to the hypotheses could
have resulted in predicting Outcome B rather than Outcome A. Are there condi-
tions that were not accounted for that should be included in the revised hypotheses?
We believe that revising hypotheses is an optimal response to the “so what” ques-
tion because a researcher’s initial hypotheses plus the revisions suggested by the
data are the most productive way to tie a study into the larger chain of research of
which it is a part.
We view presenting revised hypotheses as a central part of interpreting data and
drawing conclusions because revised hypotheses are the touchstones that demon-
strate growth in knowledge. Building on other researchers’ revised hypotheses
and revising them further by more explicitly and precisely describing the condi-
tions that are expected to influence the outcomes in the next study accumulates
knowledge in a form that can be built upon and improved by future researchers.
Comparing the revised hypotheses with those proposed by previous researchers
is a compelling way to answer the “so what” question. These comparisons show
how the study advances the field.
Interpreting findings in order to revise hypotheses is not a straightforward task.
Usually the hypotheses in any particular study arise from a theoretical framework
that blends multiple constructs or variables and predicts multiple outcomes, with
different outcomes connected to different research questions and addressed by
different sets of data. We previously illustrated some of this complexity in a table
(Cai et al., 2019a). In Table 1, we add two additional columns to that table to incor-
porate findings and justifiable revisions to the initial hypotheses.

Table 1
Coherence Among All Parts of a Research Report

Question               Hypothesis     Data             Analysis                             Results     So What?
Research Question 1    Hypothesis 1   Data 1, Data 2   Analysis 1, Analysis 2               Finding 1   Revision of Hypothesis 1; Future Directions for Study
                       Hypothesis 2   Data 2, Data 3   Analysis 3                           Finding 2   Revision of Hypothesis 2; Future Directions for Study
Research Question 2    Hypothesis 3   Data 1, Data 4   Analysis 2, Analysis 3, Analysis 4   Finding 3   Revision of Hypothesis 3; Future Directions for Study

Not shown in Table 1 is the theoretical framework that underlies the operational
parts of a study. The theoretical framework infuses the research questions with
meaning and significance; generates specific hypotheses; and suggests methods,
data, and analyses that will most directly address the questions and hypotheses.
The theoretical framework again comes into play at the interpretation phase as the
hypotheses are revised to yield a revised theoretical framework. At the most
general level, the answer to the “so what” question is contained in the revisions to
the theoretical framework. As we said in the May 2019 editorial:

Interpreting the findings can then take the form of comparing theoretically
grounded predictions to actual results and then refining or extending the
theoretical framework to support revised hypotheses that align with what
was actually observed. The revised framework can be presented as the
study’s contribution to the field, and the new, more educated hypotheses
can be tested in future studies. (Cai et al., 2019c, p. 222)

Comparing the initial and the revised framework allows readers to see clearly the
contributions of this study.

Connecting the Interpretations With the Methods


When choosing the best methods to collect and analyze data in a research study,
researchers can rely on the educated hypotheses of the theoretical framework to
help define what kinds of data will be needed to address the hypotheses, how best
to gather these kinds of data, and what analyses should be performed (Cai et al.,
2019a). As with the research questions and the theoretical framework, the methods
of analysis need to be revisited when one is interpreting the findings. For example,
it is important to consider the kinds of claims that one’s methods are capable of
supporting. This is as true for quantitative analytic methods as for
qualitative methods.

Writing the Interpretation and Discussion


Interpreting the findings in ways that move the field forward by addressing the
“so what” question is an ambitious undertaking. There also remains the challenge
of structuring and writing the discussion to present these observations in a
convincing but not overreaching way. In this section, we discuss some of the
common concerns that reviewers raise about the discussion. We then conclude this
editorial by addressing a challenge that many researchers have faced related to
unexpected findings.

Structuring the Discussion Section and Avoiding Common Errors


Although there is no rigid formula for structuring the discussion section of a
report, we do see structures that seem to work better than others. We recommend
that the discussion begin with a brief summary of the main results, especially those
the authors will interpret in the discussion. This summary should not contain data
or results not previously presented. The discussion could then move to interpreting
the results and addressing the “so what” question in the ways we have described.
This makes up the bulk of the discussion. If authors choose to describe limitations
in the discussion, they could do so by showing how their interpretations are
explicitly constrained by limitations of the study or they might point to claims they
are unable to make. If the authors have chosen to embed limitations in earlier
sections of the paper, they will have presented their findings in ways that have
already constrained the interpretations of the findings. Finally, the discussion
should conclude with the implications of the findings. These implications might
suggest directions for future research or applications to educational practice. There
could also be methodological implications that inform and enrich the field’s
toolbox for conducting research.
In our analysis of JRME reviewer comments,1 we found several common
concerns that correspond to errors or omissions in the discussion structure
outlined above. One common error about which reviewers raised concerns was
claiming more than the data showed or could support. Fully 30% of the reviews
we analyzed included such concerns. Generally speaking, concerns about the
support for claims fell into two categories. On the one hand, reviewers raised
concerns about claims for which the authors provided insufficient or unclear
support and for which reviewers felt authors could have provided more support by
a more extensive or careful analysis of the data collected (e.g., “The authors have
collected excellent data, but it must be analyzed and interpreted to provide more
meaningful support for the results.”). For manuscripts that ultimately received a
decision of Accept with Revisions, the majority of reviewers’ concerns about
support for claims fell into this category. On the other hand, some reviewers raised
critical concerns about claims that the data and analysis or the overall design of
the study could not support. For manuscripts that ultimately received a decision
of Reject, the majority of reviewers’ concerns about support for claims fell into
this category. Concerns of this type challenge the viability of a manuscript because
they involve fundamental breaks in the chain of reasoning that aligns the research
questions, the theoretical framework, the methods, and the findings. For example,
one JRME reviewer stated, “The task given to the participants does not provide
the evidence that would be necessary to support these claims.” That is, the task
was not aligned with the research questions, and the evidence it could provide
would not address the claims the researcher wanted to make. In order to address
concerns like this, it is typically not enough to simply narrow the claims because
the nature of the data and data collection is at odds with the questions and with
the theoretical framework the author has constructed. As another JRME reviewer
commented, “I cannot see how these data would allow a robust analysis within
the authors’ framework.”
Another fundamental but common issue highlighted by reviewers was that the
“so what” question was not being addressed satisfactorily. In other words, it was
unclear why the contribution of the work being reported was significant or worth-
while, either theoretically or practically. About one third of the reviews for manu-
scripts that were ultimately rejected included such concerns. As one JRME
reviewer put it, “The manuscript left me unsure of what the contribution of this
work to the field’s knowledge is, and therefore I doubt its significance.” Even for

1 We analyzed the reviews for every manuscript that underwent full review and received a deci-
sion in 2017. Reviewer comments in this editorial have been paraphrased to respect the confidential-
ity of the review process.

manuscripts that were ultimately accepted, 14% of reviews included some
concerns about the contribution and significance of the work. However, in many
of these cases, rather than posing their concerns as a reason against publication,
reviewers offered suggestions to the authors about how to strengthen their argu-
ment (e.g., “These suggestions are intended to help the authors make a stronger
argument for the contribution of their work” and “I urge you to be more explicit
about how your findings are important. What might researchers or teachers learn
from your work?”).

Dealing With Unexpected Findings


Formulating hypotheses encourages researchers to be explicit and precise about
how much is known in the field; it does not preclude researchers from keeping an
open mind to observe the full range of outcomes. On the contrary, hypotheses that
generate predictions allow researchers to distinguish between those findings they
expected to see and those they did not. Researchers are often faced with unex-
pected and perhaps surprising results, even when they have developed a carefully
crafted theoretical framework, posed research questions tightly connected to this
framework, presented hypotheses about expected outcomes, and selected methods
that should help answer the research questions. Indeed, the unexpected findings
can be the most interesting and valuable products of the study. How, then, should
researchers treat unexpected findings? Our answer is to treat them in a way that
is most educative for the reader.
When researchers are confronted with unexpected findings, we see at least three
possible paths that would help the reader understand more fully the phenomenon
under investigation. The choice of which path to take depends on the researchers’
reevaluation of their own work, a reevaluation guided by their unexpected findings.
The first path we describe is appropriate when researchers reexamine their
theoretical framework and decide that it is still a compelling framework based on
previous work. They reason that readers are likely to have been convinced by this
framework and would likely have made similar predictions. In this case, we believe
that it is educative for researchers to (a) summarize their initial framework, (b)
present the findings and distinguish those that confirmed the hypotheses from
those that did not, and (c) conjecture why the framework was inadequate and
propose changes to the framework that would have created more alignment with
the unexpected findings. Revisions to initial hypotheses are especially useful if
they include explanations about why a researcher might have been wrong (and
researchers who ask significant questions in domains as complex as mathematics
education are almost always wrong in some way). Depending on the ways in which
the revised framework differs from the original, the authors have two options. If
the revised framework is an expansion of the original, it would be appropriate for
the authors to propose directions for future research that would extend this study
beyond its intended scope. Alternatively, if the revised framework is still largely
within the scope of the original study and consists of revisions to the original
hypotheses, the revisions could guide a second study to check the adequacy of the
revisions. This second study could be conducted by the same researchers (perhaps
before the final manuscript is written and presented as two parts of the same
report), or it could be proposed in the discussion as a specific study that could be
conducted by other researchers.
The second path is appropriate when researchers reexamine their theoretical
framework in light of their unexpected findings and recognize serious flaws. The
flaws could result from a number of factors, including casting the elements of the
framework in too general a way to formulate well-grounded hypotheses or not
accounting carefully enough for the previous work in this domain, both theoretical
and empirical. In many of these cases, readers would not be well served by reading
the chronology of the researchers’ flawed or loose reasoning. We believe the reader
would learn more if the researchers reconstructed their framework, more carefully
built from prior work and in a form closer to that in the first path. If the findings
remain unexpected based on the hypotheses generated by this revised, more
compelling framework, then the first path applies. But it is likely that the new
framework presented in the report will better predict the findings. After all, the
researchers now know the findings they will report. The key is for the researchers
to show how the new theoretical framework necessarily generates the hypotheses
and predictions they present in the report. The researchers should then explain
why they believe particular hypotheses were confirmed and why others should be
revised, even in small ways. The point we are making is that we believe it is accept-
able to reconstruct frameworks before writing research reports if doing so would
be more educative for the reader.
The third path becomes appropriate when researchers, in reexamining their
theoretical framework, trace the problem to a misalignment between the methods
they used and the theoretical framework or the research questions. Perhaps the
researchers recognize that the tasks they used did not yield data that could have
addressed the research questions directly. Or perhaps the researchers realize that
the sample they selected would likely have been heavily influenced by a factor
they failed to take into account. In other words, the researchers decide that the
unexpected findings were due to a problem with the methods they used, not with
the framework or the accompanying hypotheses. In this case, we recommend that
the researchers correct the methodological problems and reconduct the study.

Summary
This editorial concludes a series of four editorials about conducting and commu-
nicating research in (mathematics) education. We have discussed the formulation
of research questions (Cai et al., 2019b), the construction of theoretical frameworks
(Cai et al., 2019c), the choice of methods (Cai et al., 2019a), and, now, the interpre-
tation of findings. We organized these editorials around three main ideas that
permeate the research process: justification, coherence, and significance.
Justification is necessary at every step of research, whether in arguing for the
significance of research questions, making clear why a methodological choice is
appropriate, or convincing readers that one’s interpretations of the data are well
supported. Coherence requires that all of the components of research fit together
into a consistent narrative—a chain of reasoning that connects the theoretical
framework, the research questions, the methods, and the ways in which the results
are presented and interpreted. An effective choice of theoretical framework can
help researchers achieve coherence by providing a structure in which all the parts
of the research can connect. Finally, especially from the perspective of publishing
work in a research journal, research must be significant. It must advance our
knowledge and understanding of the teaching and learning of mathematics in a
substantive and powerful way. With this in mind, we look ahead to our next series
of editorials.
In January 2020, to mark the auspicious occasion of the 50th anniversary of
JRME and the 100th anniversary of the National Council of Teachers of
Mathematics, we will begin a new set of five editorials focused on identifying
future directions for promising research in the field of mathematics education. In
these editorials, our goal will be to identify those research questions that will shape
our field’s work for decades to come. What do we need to understand better in
mathematics education in the next 50 years to improve learning opportunities for
all students? As with our previous editorials, we approach this task with a mindset
of driving the field forward to conduct research that has the greatest positive
impact on the teaching and learning of mathematics in classrooms. In that regard,
we have come full circle to the driving theme of our first series of editorials:
improving the impact of education research by carefully rethinking the pathways
through which education research is conceived, conducted, and communicated
(Cai et al., 2017; Cai et al., 2019). We look forward to engaging the field by delib-
erately considering what we could collectively accomplish in the next 50 years.

References
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., Cirillo, M., . . . Hiebert, J. (2019a).
Choosing and justifying robust methods for educational research. Journal for Research in
Mathematics Education, 50(4), 342–348. doi:10.5951/jresematheduc.50.4.0342
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., Cirillo, M., . . . Hiebert, J. (2019b). Posing
significant research questions. Journal for Research in Mathematics Education, 50(2), 114–120.
doi:10.5951/jresematheduc.50.2.0114
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., Cirillo, M., . . . Hiebert, J. (2019c).
Theoretical framing as justifying. Journal for Research in Mathematics Education, 50(3), 218–
224. doi:10.5951/jresematheduc.50.3.0218
Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2019). Research pathways
that connect research and practice. Journal for Research in Mathematics Education, 50(1), 2–10.
doi:10.5951/jresematheduc.50.1.0002
Cai, J., Morris, A., Hwang, S., Hohensee, C., Robison, V., & Hiebert, J. (2017). Improving the
impact of educational research. Journal for Research in Mathematics Education, 48(1), 2–6.
doi:10.5951/jresematheduc.48.1.0002

