Analyzing Collegiate Critical Thinking Course Effectiveness - Evidence From A Quasi-Experimental Study in China
A R T I C L E I N F O

Keywords:
Critical thinking
Independent
Embedded
Effectiveness
Quasi-experimental study

A B S T R A C T

Collegiate critical thinking skills are taught via independent courses or embedded modules, but the effectiveness of these instruction methods needs further study. Our study adopted a pre- and post-test quasi-experimental design with intervention and control groups to explore the effectiveness of independent and embedded critical thinking modules at two Chinese universities. Embedded critical thinking modules had a significant, positive effect on the development of collegiate critical thinking skills, but the independent critical thinking course did not have a statistically significant effect. This study incorporated a qualitative component to provide critical reflection on why independent critical thinking courses may be less effective, including aspects of course logistics, characteristics, and teaching methods, to provide guidance for future research and practical considerations for teaching critical thinking courses.
1. Introduction
In the current era of rapidly changing economies and information infrastructures, innovative talents are the main driving force of
economic and social development. Cultivating college students’ critical thinking ability, among other core skills, not only promotes
innovative talents but also could be considered a responsibility of higher education institutions (Janssen et al., 2019; Peruza, Assilbayeva, Saidakhmetova & Arenova, 2021). College students are expected to gain lifelong learning and innovation skills, which require
them to possess solid critical thinking skills (Partnership for 21st Century Skills, 2019; Zhang, 2016). Though researchers from various
disciplines, methodological approaches, and theoretical orientations define critical thinking differently (Dewey, 1910; Fisher &
Scriven, 1997; Norris & Ennis, 1989; Zhang & Shen, 2018), one widely accepted definition of critical thinking is a “purposeful,
self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential,
conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” (Facione, 1990, p. 2).
This is the definition that we adopt to guide our study.
Many researchers believe that students can develop critical thinking skills through educational interventions, that students can
learn and be taught critical thinking skills, and that critical thinking skills can be measured objectively (Bernard et al., 2008; Facione,
2000; Loyalka et al., 2021; Smith, 1992; Sternberg, 1990). Universities worldwide have set up critical thinking courses and view them
as an effective way to develop students’ critical thinking skills. Previous literature has demonstrated that over 40% of universities in
the United States, Canada, Australia, and New Zealand offer courses on critical thinking in various formats, including independent
courses and embedded modules (Huang, 2010). Critical thinking coursework was first introduced for humanities and social sciences
majors in fields such as philosophy and linguistics (Coles & Robinson, 1989). Then, some universities in the United States, Canada, and
Australia began offering critical thinking coursework for all social science students (Olga, 2016; Wu, 2003).
Although there has been discussion on the importance of promoting students’ thinking skills development in China, the concept of
critical thinking was not introduced into the college curriculum until the past decade. Introducing this concept in classroom teaching in
China has not been a widely successful effort (Huang, 2019): only 2% of 2100 higher education institutions provide courses with “critical thinking” in their titles (Dong, 2014). Typical offerings of critical thinking courses include independent course designs, such as the specialized critical thinking course taught in the Seed Class at Huazhong University of Science and Technology and Tsinghua University’s “critical thinking and ethical reasoning” course for students in its school of economics and management, and embedded course designs, such as Shantou University’s “Collegiate Medical English Integrated Program” and Tsinghua University’s writing class for first-year students, which adopts a critical thinking component (Su, 2021; Yang &
Yang, 2015). Existing publications have provided an insider view of the course history and pedagogy design, but little is known about
the effectiveness of those courses. Though critical thinking courses are offered on a broader scale across Chinese universities, we want
to further investigate the effectiveness of various designs of critical thinking courses. This study provides empirical data on the
effectiveness of two efforts to develop college students’ critical thinking skills using a quasi-experimental and mixed-method design at
two universities in China.
2. Literature review
Though there is broad agreement that college students should develop critical thinking skills, there is no consensus on the most
effective ways to teach critical thinking skills through the existing college curriculum—namely, through a standalone course or integrated into existing courses (Behar-Horenstein & Lian, 2011; Chen, 2014; Hitchcock, 2012; Swartz & Perkins, 1990; Yu & Gao,
2017). Some theorists believe that critical thinking is a general concept; that its core tenets, techniques, attitudes, and methods can be
applied to multiple disciplines and contexts; and that critical thinking should be taught as a standalone course to provide systematic
training with the necessary foundation (Costa, 1985; Lipman, 1988; Sternberg, 1990). Other scholars have criticized the general-skill
view of critical thinking and claimed that critical thinking skills are highly contextualized; therefore, one cannot develop critical
insights without the proper topical knowledge (Coles & Robinson, 1989; Howie, 2011; McPeck, 1981; Smith, 1992). Particularly, as
Smith (1992) stated, “Only with rich knowledge of a field can we think critically about it, and it is impossible to think critically without
understanding what we want to think” (p. 92). These scholars have advocated for critical thinking skills development to be embedded
in disciplinary courses. With the embedded design, critical thinking training is an integral part of instructional design, and relevant
skills are trained in disciplinary-based course tasks, so students’ critical thinking skills develop as they further their knowledge in their
field of study.
Accordingly, there are two common critical thinking course design types: independent courses and embedded modules. Independent critical thinking courses provide general education for undergraduate students to enhance their awareness of critical thinking
skills and typically cover theories, applications, and techniques related to critical thinking. Embedded critical thinking modules build
upon existing courses of various disciplines and incorporate critical thinking elements in the curriculum (Hitchcock, 2012).
Literature evaluating the effectiveness of courses designed to develop students’ critical thinking skills has demonstrated inconsistent findings, partly because studies were conducted in different countries using various learning assessment tools. Although some study results have indicated the positive impact of independent critical thinking courses (e.g., Liu, Yu, Liu, Wang & Yang, 2021; Penningroth, Despain & Gray, 2007; Rowe et al., 2015), other research has suggested that such courses do not make much of a difference in promoting students’ critical thinking skills (Smith et al., 2019). Similarly, whereas Djamàa (2018) found that incorporating
critical thinking skills teaching in a literature class yielded positive outcomes for students’ critical thinking skills development,
Eunyoung, Ruth and Yeoungsuk (2014) found that a similar course for nursing students did not significantly enhance students’ critical
thinking skills.
Despite differing conclusions about the effectiveness of critical thinking courses in these studies, previous research identified
potential factors that could enhance the impact of critical thinking training for undergraduate students, such as dialog opportunities,
exposure to authentic problems, synthesis of information, and more (Abrami et al., 2015; Bezanilla, Fernández-Nogueira, Poblete &
Galindo-Domínguez, 2019). For example, in-class interaction between students and teachers seemed generally positive, as it allowed
students to gain constructive and immediate feedback from their teachers about how they answered questions, providing an effective
learning scaffold (Y. Chen, 2016; Li & Shi, 2020; Penningroth et al., 2007; Zarei, Mojtahedzadeh, Mohammadi, Sandars & Emami,
2021). Peer interaction and collaborative learning environments were observed in effective critical thinking courses (Yao, 2001). Also,
studies of effective learning environments included exercises in which students could apply their critical thinking techniques to solve
problems, such as software simulation for a professional work environment, case study activities, and in-class debate exercises (Gu, Liu
& Gu, 2021; Rowe et al., 2015; Seybert & Kane-Gill, 2011; Smith, Rama, & Helms, 2018). In embedded critical thinking courses,
deliberate incorporation of critical thinking skill training into teaching disciplinary knowledge has contributed to students’ critical
thinking skill development (Djamàa, 2018; Li & Shi, 2020).
In addition to the module differences, studies comparing students’ performance on critical thinking assessments indicate that
cultural factors may impact students’ critical thinking skills across learning contexts (Manalo, Kusumi, Koyasu, Michita & Tanaka,
2013; Tian & Low, 2011). Particularly for Chinese students, Chen (2017) conducted a case study and suggested unique qualities of
Chinese students’ understanding of critical thinking. Existing literature published in the Chinese context mostly focuses on introducing
the concept and pedagogy of critical thinking. Among the limited empirical studies conducted in the Chinese context, most are in the fields of medical education and English education. Further, few studies have adopted a rigorous quasi-experimental design.
Table 1
Quality analysis of the critical thinking skill assessment tool.
Analysis based on classical test theory (Dai, 2010; Murphy & Davidshofer, 2004):
Difficulty (pass rate): (0.11, 0.3): 2 items, relatively difficult; (0.31, 0.7): 6 items, moderate; (0.71, 0.9): 10 items, relatively simple.
Degree of discrimination (discrimination index): (0.1, 0.19): 3 items, relatively poor; (0.2, 0.29): 5 items, acceptable; (0.3, 0.39): 6 items, relatively good; (0.4, 0.49): 4 items, very good.
Reliability: the Cronbach reliability coefficient of the test tool is 0.723 (acceptable), which meets the internal consistency requirements.
Analysis based on item response theory (Luo, 2012; Rupp, 2006):
Difficulty (difficulty index): (-∞, -3): 2 items, relatively simple; (-3, 0): 13 items, moderately simple; (0, 3): 2 items, moderately difficult; (3, ∞): 1 item, relatively difficult.
Degree of discrimination (discrimination index): (0, 0.4): 4 items, relatively poor; (0.4, 0.6): 4 items, acceptable; (0.6, 3): 10 items, relatively good.
Reliability: the information curves of the test tool show a left-skewed distribution. When the ability values of the samples fall within [-3, 1.5], the information values are relatively high, indicating that the tool is suitable for measuring samples with ability values in [-3, 1.5]. For samples with ability values below -3 or above 1.5, the information values are relatively low and the measurement error is relatively large. The tool is suitable for evaluating samples with moderate or relatively lower ability values.
Validity analysis:
a. Criterion-related validity is high: the test scores are significantly correlated with academic performance, college entrance examination scores, and critical thinking skill self-assessment scores.
b. The test scores show significant heterogeneity among different subjects.
The present study advances the understanding of critical thinking course effectiveness in the Chinese context in two ways. First, our
study employed a learning assessment tool developed in the Chinese context, which provided an advantage in terms of validity for
cultural sensitivity (Zhang & Shen, 2018). To date, tools used to assess students’ learning have been developed in contexts outside
China (e.g., Critical Thinking Disposition Inventory, California Critical Thinking Skills Test). Though these tools were translated into
Chinese and validated with sufficient evidence, linguistic and cultural issues may affect students’ interaction with these assessment
tools (O. Liu et al., 2018). Second, though some studies conducted in China used a pre- and post-test design to assess course effectiveness, none used a quasi-experimental approach. A pre-post test design uses only a single sample and evaluates course effectiveness by measuring students’ change (e.g., Chen, 2016). Such a design cannot fully rule out other influencing factors, whereas a quasi-experimental design can estimate the causal impact of the critical thinking courses while eliminating the influences of
other factors. For this study, we used a quasi-experimental design and a critical thinking skill assessment tool developed in China to
explore the effectiveness of critical thinking courses, including one independent course and one embedded course, using two samples
in the 2020–2021 academic year.
3. Methods
We used a sequential explanatory mixed-method design to evaluate and analyze the effectiveness of various types of critical
thinking courses. An umbrella research question guides the study: Can collegiate critical thinking courses increase students’ critical
thinking skills? More specifically, we want to understand how the various modules of critical thinking courses impact students’ critical
thinking skills in Chinese universities. To answer this question, we chose an explanatory sequential mixed-method design (Creswell &
Plano Clark, 2011) to first evaluate two critical thinking courses for their effectiveness as conducted, followed by a qualitative
sub-study to understand why the course was effective or ineffective (Century & Cassata, 2016). We also want to acknowledge that our
study focuses on understanding the skill acquisition of critical thinking but not the critical thinking disposition. An overview of our
study design is described below.
First, through the quasi-experimental design, we used a culture-sensitive critical thinking assessment tool to measure students’
critical thinking skills before and after the courses. Second, after analyzing the pre- and post-test scores of the critical thinking
assessment across the groups, we interviewed students to acquire qualitative data on their experience with the courses. Results from
both strands were synthesized to understand the effectiveness of critical thinking courses in this study.
The research question guiding the quantitative analysis was: How did students’ critical thinking skills change before and after
taking the course? This section outlines the measurement used in assessing students’ critical thinking skills and two quasi-experimental studies focusing on an independent and an embedded critical thinking course, respectively.

Table 2
Characteristics and distribution of Sample 1.
Characteristics; Intervention group (n = 86): n, Percentage (%); Control group (n = 81): n, Percentage (%); Statistics (p-value); Cramer’s V/Cohen’s d
Note. We used a t-test and calculated Cohen’s d for continuous variables; the effect size cut-offs for small, medium, and large effects are 0.20, 0.50, and 0.80. We used the chi-square test and calculated Cramer’s V for categorical variables; the effect size cut-offs for small, medium, and large effects are 0.10, 0.30, and 0.50.
3.1.1. Measures
The critical thinking skill assessment tool in this research was developed and validated in the Chinese context by our international
and interdisciplinary team, covering six subdimensions: analytical argumentation structure, meaning clarification, analysis and evaluation of argumentation and reasoning, evaluation of the implicit meaning of narrative information, assessment of the credibility of information, and the identification of implicit assumptions (Shen, Wang & Zhang, 2019). We conducted pilot tests at universities and colleges of different levels to improve the quality of this assessment tool. The final assessment tool contains 18 objective
multiple-choice items, which participants needed to complete independently within 25 min. The maximum score on this assessment
was 100 points. No reference books, computers, cell phones, or teamwork were allowed during the assessment process.
Based on classical test theory and item response theory, we tested the reliability, validity, difficulty, and degree of discrimination of
the assessment (see results in Table 1). Results indicated sufficient reliability, validity, appropriate difficulty, and a good degree of
discrimination for the assessment.
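To make the classical test theory indices in Table 1 concrete, the sketch below shows how item difficulty (pass rate), a discrimination index, and Cronbach’s alpha could be computed from a 0/1 item-response matrix. This is an illustrative sketch only, not the authors’ analysis code: the `responses` array is simulated, and the top-versus-bottom 27% formulation of the discrimination index is one common convention rather than necessarily the one used in the study.

```python
import numpy as np

def item_quality(responses: np.ndarray):
    """Classical test theory statistics for a binary (0/1) response matrix.

    responses: shape (n_examinees, n_items); 1 = correct, 0 = incorrect.
    Returns pass rates (difficulty), discrimination indices, and Cronbach's alpha.
    """
    n, k = responses.shape
    total = responses.sum(axis=1)

    # Difficulty as pass rate: proportion of examinees answering each item correctly.
    pass_rate = responses.mean(axis=0)

    # Discrimination index: item pass rate in the top 27% of total scorers minus
    # the pass rate in the bottom 27% (a common classical formulation).
    cut = max(1, int(round(0.27 * n)))
    order = np.argsort(total)
    low, high = responses[order[:cut]], responses[order[-cut:]]
    discrimination = high.mean(axis=0) - low.mean(axis=0)

    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score).
    item_var = responses.var(axis=0, ddof=1)
    alpha = k / (k - 1) * (1 - item_var.sum() / total.var(ddof=1))
    return pass_rate, discrimination, alpha

# Demonstration with simulated data for an 18-item test (not the study's data).
rng = np.random.default_rng(0)
demo = (rng.random((200, 18)) < 0.6).astype(int)
difficulty, disc, alpha = item_quality(demo)
print(difficulty.round(2), disc.round(2), round(alpha, 3))
```

Under such a scheme, items with pass rates between 0.31 and 0.70 would fall into the “moderate” difficulty band reported in Table 1.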
In this study, we sampled one independent course (i.e., Study 1) and one embedded course (i.e., Study 2) to understand the
effectiveness of the critical thinking courses implemented.
3.1.2. Study 1
We collected data from a university in Central China. Since 2020, this university has provided a general elective course called
Critical Thinking for College Students, accessible to all undergraduate students. The course was designed as a typical independent
critical thinking course and consisted of eight elements: (a) history and foundation of critical thinking; (b) critical thinking in reading
and reasoning; (c) concept clarification and assessment; (d) deduction: relevance, thoroughness, and prudency; (e) deduction in
science and practice: optimal choice; (f) hypothesis testing; (g) dialectics, innovation, and synthesis; and (h) critical thinking in
writing. The course was spread across eight weeks, and students met once a week. Each session consisted of 3.5 contact hours to cover
each topic listed. The instruction team included eight instructors to cover each of the eight topics.
We used a nonequivalent group design to identify participants for our intervention and control groups. The university used an
online registration system all undergraduate students could access equally. All undergraduate students were given a timeframe to use
the system and register for courses. When the number of registered students reached course capacity, registration closed. Two sections
of Critical Thinking for College Students were chosen as intervention groups. Based on student demographics, we chose students from
three classes as control groups: Mental Health for College Students, Psychology of Love, and Physical Well-Being for College Students.
We considered three criteria while choosing the control group to ensure the intervention group and control group were comparable.
First, the students’ demographics regarding grade level, discipline, gender, and age are comparable. Second, we collected data from the
same cohort, ensuring that the intervention group and the control group took classes in the same semester and the classes had the same
contact hours. Third, we confirmed that the control group’s course did not cover any critical thinking content by checking the syllabus
and confirming with the instructors. Since the study happened in a natural university teaching environment, we acknowledge that we
could not completely eliminate the influence of instructors.
All students in the control and intervention groups took the critical thinking assessment before and after the 8-week courses. After
data cleaning, we had a valid sample of 103 for pre-assessment in the intervention group and 111 for the control group. For post-
assessment, we had 103 valid samples for the intervention group and 105 for the control group. Due to attrition from missing attendance and opting out of the assessment, we matched the pre- and post-assessments for both groups. Sample 1 consisted of 86 students in the intervention group and 81 in the control group. Detailed demographics and academic information are shown in Table 2. Students in the intervention and control groups were comparable in grade level and academic discipline. Also, there was no statistically significant difference between the two groups in their critical thinking skills on the pre-test at the α = 0.05 level, but there was a significant difference at the α = 0.1 level (0.05 < p = 0.063 < 0.1).

Table 3
Characteristics and distribution of Sample 2.
Characteristics; Intervention group (n = 49): n, Percentage (%); Control group (n = 27): n, Percentage (%); Statistics (p-value); Cramer’s V/Cohen’s d

Table 4
Background information of interviewees.
Code of interviewees; Class; Gender; Discipline
Note. All interviewees were recruited from intervention groups: A1-A3 from Section 1 and B1-B2 from Section 2 of Study 1, and C1 from Study 2.
3.1.3. Study 2
We collected data from another university in Central China for Sample 2. The College of Education provided junior students with a
required Writing for Education Students course. The course was led by one instructor covering writing topics such as nonfiction
writing, narrative writing, dialog, character, scene, and interview. In addition to the traditional writing topics, the course covered
topics on typical critical thinking skills, such as inference, deduction, and logic fallacies. The course was spread across 16 weeks, and
each session lasted 2 h. In this embedded critical thinking course, the instructor used an active learning approach and incorporated
critical thinking elements in writing exercises, aiming to concurrently improve writing and critical thinking skills. In addition to in-
class exercises and discussions, the course adopted an online platform for student interaction, discussion, and writing exercises.
We used a nonequivalent group design to identify our intervention and control groups. We chose students from Writing for Education Students as the intervention group and students from a section of Foreign Education History as the control group. Similarly, we used the same three criteria articulated in Study 1 when choosing the control group. The latter course was required for junior students in
the College of Education, and the course content did not involve critical thinking elements.
We conducted the pre- and post-tests using the aforementioned assessment tool to evaluate students’ critical thinking skills. After
data cleaning, there were 49 valid samples in the intervention group and 34 valid samples in the control group for the pre-test; there
were 55 valid samples in the intervention group and 37 in the control group for the post-test. Due to students’ lack of attendance or
engagement in the assessment process, we identified more attrition from the sample. After comparing the pre- and post-test results for
both groups, our final sample included 49 valid participant results for the intervention group and 27 for the control group. Both groups
were junior students in education majors. We used a t-test and chi-square test to evaluate the homogeneity of the intervention and
control groups. As shown in Table 3, the two groups were in the same grade level and discipline, and there was no significant difference in age or gender. In addition, there was no statistically significant difference in the pre-test between the two groups at the α = 0.01 level, but there was a significant difference at the α = 0.05 level. Therefore, the developmental gain of critical thinking skills needed to
be considered in the analysis of our quasi-experimental results.
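As a rough illustration of the homogeneity checks reported in Tables 2 and 3, the sketch below compares a continuous baseline variable with a t-test plus Cohen’s d and a categorical variable with a chi-square test plus Cramer’s V, using the effect-size cut-offs noted under Table 2. The variable names and numbers are hypothetical stand-ins, not the study’s data, and the paper does not state which software performed these tests.

```python
import numpy as np
from scipy import stats

def cohens_d(a, b):
    # Pooled-SD Cohen's d; 0.20/0.50/0.80 mark small/medium/large effects.
    a, b = np.asarray(a, float), np.asarray(b, float)
    pooled = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                     / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled

def cramers_v(table):
    # Cramer's V from a contingency table; 0.10/0.30/0.50 mark small/medium/large effects.
    table = np.asarray(table, float)
    chi2, p, _, _ = stats.chi2_contingency(table)
    v = np.sqrt(chi2 / (table.sum() * (min(table.shape) - 1)))
    return chi2, p, v

# Hypothetical baseline data (group sizes mirror Sample 1, values are simulated).
rng = np.random.default_rng(1)
pre_intervention = rng.normal(74, 12, 86)   # pre-test scores, intervention group
pre_control = rng.normal(71, 14, 81)        # pre-test scores, control group
t, p = stats.ttest_ind(pre_intervention, pre_control)
print(f"t = {t:.3f}, p = {p:.3f}, d = {cohens_d(pre_intervention, pre_control):.3f}")

gender_table = [[40, 46], [38, 43]]         # rows: groups; columns: gender counts
chi2, p, v = cramers_v(gender_table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, Cramer's V = {v:.3f}")
```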
Here we discuss a limitation of the study design and our efforts to address it. We used the same assessment tool for the pre- and post-test. To reduce memory effects and other potential factors that could compromise the research design, we took the following measures. First, we did not tell the students that there would be a post-test when they were given the pre-test, to reduce participants’ motivation to memorize the questions. Second, the assessment contains 18 questions, each with an extensive description, and participants were given only 25 min to complete it, so it would have been very challenging for them to memorize the questions and work on them afterward. Third, the assessment tool was confidential, and only the research team had access to it outside the test events. Lastly, there was more than two months between the pre- and post-test, and participants were unlikely to retain their memories of the questions as time went by.

Fig. 1. The pre- and post-test scores of critical thinking skills in the intervention group and control group of Sample 1.
In the qualitative strand, we want to understand what influencing factors impacted the students’ experience and the courses’
effectiveness. We collected qualitative data through semi-structured interviews with six students from the intervention groups at the
end of the courses to understand further the effectiveness and influences of the critical thinking courses. We considered participants’
attendance and engagement with the course and their assessment scores to recruit a representative sample of interview participants.
The students were informed of the purpose of the interview and voluntarily participated. Demographic information is shown in
Table 4.
In the interview, we asked participants questions about their participation in the critical thinking class, information retention and
skills development after completing the class, and their general reflection on the course offering. The interview questions are listed in
Table 7 in the appendix. Using a thematic analysis approach, we explored potential influencing factors on the effectiveness of the
critical thinking courses from the quasi-experiment (Braun & Clarke, 2006). To ensure the trustworthiness of the qualitative analysis,
we attended to the quality criteria in our analytical process described by Seale (1999) and Tracy (2010). The authors regularly held group discussions to ensure consensus on the codes and themes and used appropriate quotes to support the claims. Results
from the thematic analysis provided an explanation for our quasi-experimental outcomes.
4. Findings
Fig. 1 presents the pre- and post-test results of the intervention and control group. Descriptive statistics indicate that most scores
were between 60 and 90 for both pre- and post-testing. We categorized pre-test scores into three groups: (a) high (greater than 80), (b)
medium (between 60 and 80), and (c) low (below 60). In the intervention group, the percentage of students in each group was 36.05%
(high), 53.48% (medium), and 10.47% (low); in the control group, the percentage of students in each group was 27.16% (high),
56.79% (medium), and 16.05% (low). Though there was a difference in the distribution of scores, there was no significant difference in the mean pre-assessment scores between the two groups: the means were 74.354 for the intervention group and 70.576 for the control group (t = 1.871, p = 0.063). We considered the two groups homogeneous for the comparison analysis.
Because the intervention and control groups were independent samples, we tested for normality and homogeneity of variance. Test results are
shown in Table 5. From the normality test, the following scores followed the assumption of normality: pre- and post-test scores of
intervention groups, pre-test scores, and the difference between pre- and post-test scores of the control groups, whereas the difference
between the pre- and post-test scores of the intervention group and the post-test scores of the control group did not follow a normal distribution. Post-test scores and the differences between pre- and post-test scores for the control and intervention groups met the criterion of homogeneity of variance, whereas the pre-test scores did not. Therefore, we used a t-test to evaluate the effectiveness of the independent critical thinking course.

Table 5
Comparative analysis of the test scores of collegiate critical thinking skills in the intervention group and control group of Sample 1.
Pretest. Intervention group (n = 86): M = 74.354, SD = 11.533, normality test 0.120, 0.211, 0.128; Control group (n = 81): M = 70.576, SD = 14.472, normality test 0.071, 0.851, 0.182; homogeneity of variance 0.040; t = 1.871, p = 0.063; effect size 0.2897.
Posttest. Intervention group: M = 74.225, SD = 12.288, normality test 0.149, 0.963, 0.342; Control group: M = 71.948, SD = 15.161, normality test 0.000, 0.061, 0.001; homogeneity of variance 0.057; t = 1.069, p = 0.287; effect size 0.1655.
Score gap. Intervention group: M = −0.129, SD = 10.296, normality test 0.016, 0.157, 0.028; Control group: M = 1.372, SD = 12.157, normality test 0.284, 0.117, 0.155; homogeneity of variance 0.132; t = −0.863, p = 0.390; effect size 0.1336.
Note. a. In the normality test, when p > 0.05, we accepted the null hypothesis; b. In the homogeneity of variance test, when p > 0.05, we accepted the null hypothesis.

Fig. 2. The pre- and post-test scores of critical thinking skills in the intervention group and control group of Sample 2.
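The normality and variance-homogeneity screening summarized in Table 5 can be sketched as follows. The paper does not name the specific tests behind the reported values, so this sketch uses the Shapiro-Wilk test for normality and Levene’s test for homogeneity of variance purely as stand-ins, on hypothetical score arrays.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post scores for the intervention (n = 86) and control (n = 81) groups.
rng = np.random.default_rng(2)
pre_i, post_i = rng.normal(74, 12, 86), rng.normal(74, 12, 86)
pre_c, post_c = rng.normal(71, 14, 81), rng.normal(72, 15, 81)

# Normality: p > 0.05 keeps the null hypothesis of a normal distribution.
for label, scores in [("intervention pretest", pre_i), ("intervention posttest", post_i),
                      ("control pretest", pre_c), ("control posttest", post_c),
                      ("intervention score gap", post_i - pre_i),
                      ("control score gap", post_c - pre_c)]:
    w, p = stats.shapiro(scores)
    print(f"{label:24s} Shapiro-Wilk p = {p:.3f}")

# Homogeneity of variance between groups: p > 0.05 keeps the null of equal variances.
for label, a, b in [("pretest", pre_i, pre_c), ("posttest", post_i, post_c),
                    ("score gap", post_i - pre_i, post_c - pre_c)]:
    stat, p = stats.levene(a, b)
    print(f"{label:10s} Levene p = {p:.3f}")
```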
We used classical statistical tests to assess the effectiveness of the independent critical thinking course. First, we performed a t-test on the control and intervention groups’ post-test scores and found no significant difference in students’ critical thinking skills between the intervention group and the control group after the semester-long course (p = 0.287). Then, we performed a t-test on the difference between pre-test and post-test scores of both groups and found no difference between the intervention and control groups in students’ score change from pre-test to post-test (p = 0.390). Lastly, we performed a paired t-test on the pre- and post-test scores of
the intervention group and found no significant change between pre- and post-test scores for the intervention group (p = 0.908) with a
very small effect size of 0.0125. These three test results indicate that the independent critical thinking course did not have a significant
impact on college students’ critical thinking skills development.
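The three comparisons just described (an independent-samples t-test on post-test scores, an independent-samples t-test on pre-to-post gains, and a paired t-test within the intervention group) could be run as in the sketch below. The score arrays are simulated placeholders, and the pooled-SD Cohen’s d is an assumed effect-size formula, since the paper does not spell out how its effect sizes were computed.

```python
import numpy as np
from scipy import stats

def cohens_d(a, b):
    # Pooled-SD Cohen's d (assumed formula, for illustration only).
    a, b = np.asarray(a, float), np.asarray(b, float)
    pooled = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                     / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled

# Simulated score arrays with the same group sizes as Sample 1 (not the study's data).
rng = np.random.default_rng(3)
pre_i, post_i = rng.normal(74, 12, 86), rng.normal(74, 12, 86)   # intervention group
pre_c, post_c = rng.normal(71, 14, 81), rng.normal(72, 15, 81)   # control group

# 1) Independent-samples t-test on post-test scores between groups.
t1, p1 = stats.ttest_ind(post_i, post_c)
# 2) Independent-samples t-test on pre-to-post score gains between groups.
t2, p2 = stats.ttest_ind(post_i - pre_i, post_c - pre_c)
# 3) Paired t-test on pre- versus post-test scores within the intervention group.
t3, p3 = stats.ttest_rel(post_i, pre_i)

print(f"post-test:  t = {t1:.3f}, p = {p1:.3f}, d = {cohens_d(post_i, post_c):.3f}")
print(f"score gain: t = {t2:.3f}, p = {p2:.3f}")
print(f"paired:     t = {t3:.3f}, p = {p3:.3f}")
```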
Fig. 2 shows the pre- and post-assessment scores of the intervention and control groups in Sample 2. In the pre-assessment, 40.81%
of students in the intervention group and 59.26% in the control group scored below 60. In the post-assessment, only 14.29% of students
in the intervention group scored below 60, whereas 44.44% of the control group scored below 60. The mean difference in pre-
assessment scores between the two groups was significant (p = 0.024), suggesting the two groups differed at the 0.05 significance level.
Similar to the statistical treatment of Sample 1, we tested the normality and homogeneity of the intervention and control groups’
pre- and post-test scores (see Table 6). Then, we performed a t-test for post-test scores between the intervention and control groups.
Results showed there was a significant difference observed between the two groups at the end of the semester at the 0.01 significance
level (p = 0.000), with a large effect size of 1.1326. Then, we performed a t-test for the pre- and post-test score differences for both the intervention and control groups. Results showed that the difference in score gains between the two groups approached but did not reach significance at the 10% level (p = 0.109). Lastly, we performed a paired t-test for the pre- and post-test scores for the
intervention group and found a statistically significant difference, which suggested the critical thinking course tested had a positive
influence on students’ critical thinking skill development. To summarize the results of the tests in Study 2, the embedded critical
thinking course in the experiment increased students’ critical thinking skill development.
While exploring the mechanisms of critical thinking course effectiveness, we identified one possibly underestimated and long-lasting gain of taking a critical thinking course: students come to appreciate, and advocate for, learning and practicing critical thinking skills. We also found potential factors impeding the effectiveness of critical thinking courses, including a lack of opportunities to use critical thinking skills to solve problems and ineffective course logistics, elaborated in the sections below.
Table 6
Comparative analysis of the test scores of collegiate critical thinking skills in the intervention group and control group of Sample 2.
Test; Intervention group (n = 49): M, SD, Normality test; Control group (n = 27): M, SD, Normality test; Homogeneity of variance; t-test; Effect size
4.3.1. Theme 1: increased awareness of the importance of critical thinking skills and deliberate practice
Although students in the intervention group had varying levels of exposure to critical thinking concepts and skills prior to the
classes, all participants agreed that college students, regardless of academic background, should take courses to gain critical thinking
skills. Ailin, Anyang, and Bailin all commented on the transferability of critical thinking skills across academic backgrounds. After
taking the independent critical thinking course, Ailin viewed critical thinking as a tool to use in her field of study to integrate
interdisciplinary knowledge, stating, “Critical thinking is a tool, and it could be helpful for integrating knowledge within my field of
study. I’m also assuming it will be pretty useful to integrate knowledge across various disciplines.” Though students viewed critical
thinking skills as transferrable across disciplines, they emphasized the learning process of mastering the application of critical thinking
skills. Anyang shared, “Critical thinking is broadly applicable, but learning how to apply it to different fields needs further knowledge
about the topic. It’s important to know how to use critical thinking skills than just knowing the concepts.” Impressed by the usefulness
of critical thinking skills, Bailin became an advocate for teaching critical thinking even before college. Bailin stated, “I would like to see
everyone developing critical thinking skills. It is such a fundamental skill, just like learning how to write…It will be great if we can start
learning this in elementary school.”
Anming and Anyang commented on the importance of the development and application of critical thinking skills for various populations. Anming thought that taking the critical thinking course had a long-term impact on how he approached problem-solving in
various scenarios. He said, “The course taught me how to develop my own view towards different topics, or solutions towards different
problems. I believe this will leave a further impact on how I approach things in the future.” Going beyond his personal experience,
Anyang provided his view on how critical thinking is crucial for people who need to solve complex problems, such as researchers. He
said, “For decisions in research settings or complicated social issues, people need to develop solutions in a very complicated situation.
Critical thinking skills are definitely needed in solving those complex problems.”
All interview participants in this study agreed that critical thinking development is a long-term effort and requires deliberate
practice. Ci from Study 2, with the embedded course design, said she learned writing and critical thinking skills but commented,
“Learning in class is not enough for skill development, and it might take a while for me to master this through constant practice.”
Similarly, Ailin from Study 1, the independent course, shared that learning critical thinking skills is not a linear process and involves
internalization and application of the skills, saying:
I think the growth of critical thinking skills does not happen right after taking a class. It seems to me like a long process and
requires intentional practice in daily life. For example, there might be a moment long after class where you realize, “Oh, I can
use this technique learned from my critical thinking course.” Then you just apply it to solve problems in real life.
Learning to think critically is a long process, and it seemed the 8-week class did not provide enough time for students to internalize
all the skills learned, but the 16-week class showed better results. On the other hand, regardless of the course length, taking a critical
thinking class can increase students’ awareness of the importance of critical thinking skills and plant a seed for them to use the acquired critical thinking skills after the class.
4.3.2. Theme 2 – lacking opportunities to apply critical thinking skills to problem solving and getting feedback impedes effective learning
Students appreciate opportunities to apply critical thinking skills to scenarios in real-life and academic settings. However, in the
interview, students provided feedback that there were not enough opportunities to apply critical thinking skills in and outside class,
and they reflected on how the missing elements influenced their learning experience negatively. First, students appreciated in-class
interaction and practice when the interaction was active and meaningful, such as students answering questions, instructors explaining misunderstood concepts, and group discussion. Anming observed student-teacher interaction, saying:
The teacher will propose a question and some students will volunteer to answer, or the teachers would call on us . . . not only get
feedback on our answers, but we also learned the process to apply critical thinking skills to solve those problems and observed
the differences in the approaches.
However, most participants from Study 1 shared that there were not many opportunities to interact in class. As Anyang shared,
“The teacher would ask questions in class, but students were not so active to participate since it was a large class and no small groups.”
Similarly, Bailin shared, “Most of the time it was the teacher lecturing, of course, the lecture was really good because there were many
great examples, but mostly, it was just lecture.”
Second, students recognized the importance of practicing the application of critical thinking through homework assignments but considered this format ineffective due to inadequate understanding of the course content and the lack of in-depth feedback on
homework. Though a few participants in Study 1 mentioned the homework workload was not light, considering it was an elective
course, most participants agreed that opportunities to practice strengthened their knowledge gained in class. Bailin commented on
how homework and in-class exercises gave them opportunities to grasp the course content better through reflection:
We were given homework to write a short essay after every class. For example, the teacher of the second class gave us a case
study for homework . . . it would be best if we could practice using critical thinking in real life when this [using in real life] is not
feasible, I think a case study is the best alternative . . . it took me a while to finish the homework for each class.
Third, though the debate activity in Study 1 provided a positive learning experience for students who were ready for high-level
engagement, students who had not developed enough critical thinking skills found the activity unhelpful. For example, Anming
actively participated in the debate exercise and considered this learning activity helpful. He shared:
For some important topics . . . when I’m thinking on my own, I might have some limitations. But when there’s a debate, I get to
hear contrasting ideas, which helps me to generate new ideas or complete my premature thoughts.
Bing reflected on his lack of commitment and preparation for the debate, thus not gaining much. He shared, “Yes, I opted in for the
debate, but few people really prepared. So, the debate was not very fruitful. Maybe it’s we are not good enough or something.”
Students savored the interactive activities in class and regretted not having enough of these opportunities throughout the class, which points to the major missing elements of the independent critical thinking course in Study 1: in-class interaction and practice for problem-solving. Although homework and the debate activity provided opportunities to learn to apply critical thinking skills, these opportunities were ineffective when there was no constructive feedback on homework or when the interaction in the debate exercise was of low quality.
4.3.3. Theme 3 – better pedagogy and logistics needed: critiques on the course design regarding class size, schedule, and course content
The last theme addressed critiques of the course design. In Study 1, each course session lasted for 3.5 h, which led to students feeling
tired, and they found it hard to concentrate in the last hour of the class. This sentiment was shared by Bing, who said, “I would [actively
participate in the class], but when it comes to the later part of each class, I feel tired. The 3.5-hour class is too long.” The class size also
played an important role in students’ perceived learning outcomes. When the class size was too large to provide opportunities for
effective instructor-student interaction, students felt they did not learn as much as they wished. Anyang enrolled in a section with over
100 students and constantly felt a lack of motivation to engage with the course. He said:
There was no group discussion due to the class size . . . I think 20 to 30 would be a good number for the class size so that the
teachers can use more interactive activities to engage everyone and practice using critical thinking skills.
The instructional team also asked for students’ reflections on the pros and cons of having an instructional team of teachers from various academic backgrounds. Although students appreciated the teachers’ diverse backgrounds, they shared a concern about the lack of consistency in the materials taught, and they missed the opportunities for in-depth discussion with teachers that a single-instructor course offers. Bailin shared how the arrangement of multiple instructors impacted their learning in positive and negative ways:
It’s basically all good to me. Critical thinking requires us to view things from different perspectives. When there was more than
one instructor, we got to experience examples from various disciplines and different perspectives… But some of the course
content requires consistency, and each topic is built upon the previous one; multiple teachers made it feel inconsistent when
learning those fundamentals. I personally don’t think it’s a big problem to get used to different teachers’ teaching styles.
In Study 1, as multiple instructors were teaching the course, students expressed it was difficult to engage with instructors for further
discussion. For example, Anming shared:
Sometimes, I realize that I might want to have a further discussion on certain topics with the professor from the last session, but
since a new professor was covering a different topic every week, it was not very convenient to try to reach out to the last
professor.
In contrast, Ci from Study 2 complimented the process of rapport building between students and the instructor: “We got to
know the teacher pretty well, and the teacher was familiar with our learning progress. Our communication and interaction were great.”
The missing instructor-student interaction seemed detrimental to students’ engagement, but the positive instructor-student interaction
led to strong student engagement.
Also, when interview participants shared their opinions on whether the critical thinking course should be made a required course,
they used critical thinking skills and reached this consensus: All college students can benefit from a critical thinking course. As students’ motivation to take a critical thinking course varies, there should be critical thinking courses with various instructional designs to meet students’ diverse needs. For example, Anming believed the university should not push all students to take a critical thinking course: “I think the course could remain an elective course and only those who are ready to learn can benefit the most. When students have similar motivation levels, they are more likely to learn together.” In addition to suggesting courses for learners with various levels of previous exposure and motivation, participants also reflected on the course content and concluded that there should be a spiral structure of courses in which they can take courses at different levels as they become ready.
5. Discussion
Through a quasi-experimental design and an objective critical thinking assessment tool created and validated in the study context, we found that the independent critical thinking course in the sample did not significantly impact students’ critical thinking skill development, but the embedded course did. The qualitative strand in this sequential explanatory mixed-method design provided insights into the potential factors associated with the lack of effectiveness of the independent critical thinking course (e.g., lack of in-class interaction and lack of feedback on homework), which were found to be related to large class size, long lecture hours, and the lecture-focused pedagogy.
We identified a core element of effective course design for critical thinking courses: dissecting the thinking process in an application
scenario within students’ knowledge domain. More specifically, students need to observe how experienced teachers and peers apply
critical thinking skills, and they also need constructive feedback on their thought processes when applying critical thinking skills for
problem-solving. This is consistent with the learning theory proposed by Facione (1990) that students need to learn critical thinking
through application, as well as empirical findings that advocate using performance tasks and authentic problems for student participation (Abrami et al., 2015; Cargas, Williams & Rosenberg, 2017; Dekker, 2020). Although applying critical thinking skills in
problem-solving is necessary, our findings also confirmed that students need a lecturing component to understand the concepts and
observe the instructor modeling critical thinking skills (Xia & Zhong, 2017). The effectiveness of the embedded course in Study 2 supported the view that disciplinary knowledge and critical thinking skills can be taught as separate course content but need some integration (e.g., designing homework that requires students to use critical thinking skills). An effective critical thinking course is defined by its effective teaching elements.
When teachers design interactive course elements for students to use critical thinking skills for problem-solving, they should
consider the target students’ knowledge domain when designing activities and homework. Students have an easier time performing
critical thinking activities in a subject area with prior training and knowledge. This allows them to focus on practicing the mechanism
of critical thinking instead of using their attention and energy to understand the topic and context. This finding is consistent with Wu, Li and Hong’s (2014) proposal that students learn critical thinking skills (e.g., writing and reading) that need practice in everyday life. In the interviews, although students enjoyed learning from multidisciplinary teachers to deepen their understanding of critical thinking concepts, doing homework on topics outside their knowledge domain added to the difficulty of completing it.
Though our study design focuses on critical thinking skills rather than disposition, Theme 1 from the qualitative analysis suggested that students developed a stronger awareness of the importance of critical thinking, indicating a stronger disposition towards it. Answering Tian
and Low’s (2011) call for more empirical studies on Chinese students’ critical thinking development, our findings suggest that Chinese
students can develop critical thinking when appropriate training opportunities are provided.
We propose some considerations for course designers with both teaching modalities based on the findings. A core principle for
course design of either modality is to create opportunities for effective exercises where all students can engage and get feedback on
their application of critical thinking skills. Here are some specific suggestions for designing independent and embedded critical
thinking courses.
For independent critical thinking courses, understanding students’ academic backgrounds and motivation levels can help instructors decide how to pace the course and how much content to cover to facilitate the best learning outcomes.
Although exposing students to case studies in various disciplines can broaden and deepen their understanding of the transferability of
critical thinking skills, it might be challenging for them to practice critical thinking skills deliberately without a familiar problem-solving context (i.e., one in which they already have some topical knowledge or expertise). Also, an independent critical thinking course could be
offered as a series that covers different aspects and techniques in applying critical thinking skills or it could be designed at different
difficulty levels so students can choose the courses according to their existing knowledge.
For embedded critical thinking courses, instructors need to bring up critical thinking concepts and techniques explicitly. An
advantage of the embedded critical thinking course is the in-depth course teaching on a disciplinary topic, which provides students
with a ready-to-play environment to apply critical thinking skills and monitor their learning progress. Instructors can use the course
space to model how critical thinking is used in the disciplinary context. Also, incorporating critical thinking elements into the exercises
can allow students to strengthen their learning of the course content.
Regardless of the course module, a small class size that promotes interaction and relationship building between instructor and
students is preferred. This environment allows students to practice using critical thinking skills and getting constructive feedback,
which may contribute to effective critical thinking learning. This suggestion is consistent with Dekker’s (2020) finding that a small and
close-knit community enables students to express their thoughts openly. Although Study 1 and 2 did not show statistical significance in
the increase of critical thinking skills from the objective assessment, interviewees from both Study 1 and 2 expressed increased
awareness of critical thinking courses and an intention to practice problem-solving outside of class. We consider this an overlooked and undervalued learning outcome when assessing the effectiveness of critical thinking courses, given the importance of critical thinking skills in college students’ development. As critical thinking skill development is a lifelong process, it is meaningful for educators to propose course designs that strengthen this learning outcome.
6. Conclusion
Though there have not been consistent findings on the effectiveness of critical thinking courses, studies from various countries
using different course modalities have shown that critical thinking skills can be taught and learned in college. Opportunities to apply
critical thinking skills to problem-solving, in-class interaction among students and instructors for constructive feedback, and teaching
critical thinking skills in disciplinary-based courses can all promote students’ critical thinking skills development. We also propose that
future educators and researchers explore the potential underestimated long-term gain of taking a critical thinking course through
longitudinal observation and study.
Disclosure statement
Qinggen Zhang: Funding acquisition, Conceptualization, Methodology, Formal analysis, Writing – original draft, Writing – review
& editing. Huanli Tang: Investigation, Formal analysis. Xinrui Xu: Methodology, Formal analysis, Writing – original draft, Writing –
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to
influence the work reported in this paper.
Funding
This research was supported by funds from the Youth Project of the National Natural Science Foundation of China under Grant
71904054 (The effectiveness and implementation mechanism of undergraduates’ critical thinking courses: A longitudinal research
based on quasi-natural experiments); Key Project of Educational Science Planning in Hubei Province under Grant 2020GA004
(Research on the relationship between undergraduate academic performance and value-added of critical thinking skills); Double First
Class Funds for Humanities and Social Sciences from Huazhong University of Science and Technology (Think Tank and Social Services
Project).
Appendix
See Table 7
Table 7
Interview protocol.
Sections Interview questions
Individual participation in the class 1 Why did you decide to enroll in the course? (If applicable)
2 What was the size of your class?
3 How was the attendance rate of the course? What about your attendance?
4 How were the learning outcomes assessed in this course? (e.g., homework/exam/
project)
Information retention and skills development after 1 Can you recall what topics were covered in the class?
completion of the course 2 What was your general impression of the teaching quality of the course?
3 What’s your understanding of critical thinking? How has your idea of critical thinking
changed after taking the course?
4 What are some learning outcomes you perceived in terms of knowledge? What about
skills and competency?
Reflection on the course offering 1 Would you agree that the instructors guide the students to use critical thinking to solve
real life problems?
2 Would it be better if the course is offered as a required or elective?
3 Who should take the course? Should it be offered to a specific major or grade level?
4 What would be an ideal class size for this course?
5 What would be the best composition of instructor background? Multidisciplinary?
6 How would you assess the intensity of the course? What’s your overall impression on the
course?
References
Abrami, P. C., Bernard, R. M., Borokhovski, E., Waddington, D. I., Wade, C. A., & Persson, T. (2015). Strategies for teaching students to think critically: A meta-analysis. Review of Educational Research, 85(2), 275–314.
Behar-Horenstein, L. S., & Lian, N. (2011). Teaching critical thinking skills in higher education: A review of the literature. Journal of College Teaching and Learning, 8(2), 25–41.
Bernard, R. M., Dai, Z., Abrami, P. C., Sicoly, F., Borokhovski, E., & Surkes, M. A. (2008). Exploring the structure of the Watson–Glaser critical thinking appraisal: One
scale or many subscales? Thinking Skills and Creativity, 3(1), 15–22.
Bezanilla, M. J., Fernández-Nogueira, D., Poblete, M., & Galindo-Domínguez, H. (2019). Methodologies for teaching-learning critical thinking in higher education:
The teacher’s view. Thinking Skills and Creativity, 33, Article 100584.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
Cargas, S., Williams, S., & Rosenberg, M. (2017). An approach to teaching critical thinking across disciplines using performance tasks with a common rubric. Thinking
Skills and Creativity, 26, 24–37.
Century, J., & Cassata, A. (2016). Implementation research: Finding common ground on what, how, why, where, and who. Review of Research in Education, 40(1),
169–215. https://doi.org/10.3102/0091732X16665332
Chen, L. (2017). Understanding critical thinking in Chinese sociocultural contexts: A case study in a Chinese college. Thinking Skills and Creativity, 24, 140–151.
Chen, Y. (2016). Teachers’ questioning and the cultivation of learners’ critical thinking ability. Foreign Languages and Their Teaching, 2, 87–96.
Chen, Z. (2014). The teaching modes of critical thinking and its practical significance. Journal of Higher Education, 35(9), 56–63.
Coles, M. J., & Robinson, W. D. (1989). Teaching thinking: A survey of programmes in education. Bristol, England: The Bristol Press.
Costa, A. L. (1985). Developing minds: A resource book for teaching thinking. Alexandria, VA: ASCD.
Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage Publications.
Dai, H. (2010). Psychometrics (pp. 69, 73, 286). Beijing: Higher Education Press.
Dekker, T. J. (2020). Teaching critical thinking through engagement with multiplicity. Thinking Skills and Creativity, 37, Article 100701.
Dewey, J. (1910). How we think (p. 6). Boston, MA: D.C. Heath.
Djamàa, S. (2018). From book to screen: Adopting cinematic adaptations of literature in the EFL classroom to hone students’ critical thinking skills. Computers in the
Schools, 35(2), 88–110.
Dong, Y. (2014). What kind of critical thinking courses should we teach? Industry and Information Technology Education, 2(3), 36–42.
Eunyoung, C., Ruth, L., & Yeoungsuk, S. (2014). Effects of problem-based learning vs. traditional lecture on Korean nursing students’ critical thinking, problem-
solving, and self-directed learning. Nurse Education Today, 34(1), 52–56.
Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations
prepared for the Committee on Pre-College Philosophy of the American Philosophical Association. ERIC Doc.#, 315423.
Facione, P. A. (2000). CCTDI test manual. Millbrae, CA: The California Academic Press.
Fisher, A., & Scriven, M. (1997). Critical thinking: Its definition and assessment. Point Reyes, CA.
Gu, Y., Liu, Z., & Gu, M. (2021). Teaching critical thinking through debate games: A case study of journalism and communication course. Higher Education Development
and Evaluation, 37(2), 105–114.
Hitchcock, D. (2012). Critical thinking as an educational idea. Journal of Higher Education, 33(11), 54–63.
Howie, D. (2011). Teaching students thinking skills and strategies. London: Jessica Kingsley Publishers.
Huang, C. (2010). Strengthening critical thinking, cultivating creative talents. Educational Research, 31(5), 69–74.
Huang, C. (2019). A study on curriculum design of university critical thinking from the perspective of general education curriculum. Shanghai: Shanghai Normal University.
Janssen, E. M., Mainhard, T., Buisman, R. S. M., Verkoeijen, P. P. J. L., Heijltjes, A. E. G., van Peppen, L. M., et al. (2019). Training higher education teachers’ critical
thinking and attitudes towards teaching it. Contemporary Educational Psychology, 58, 310–322.
Li, Y., & Shi, Y. (2020). An experimental study on the effectiveness of college English PBL model teaching in cultivating undergraduates’ critical thinking ability.
Higher Education Exploration, 36(7), 73–79.
Lipman, M. (1988). Critical thinking: What can it be? Educational Leadership, 46(1), 38–43.
Liu, O. L., Shaw, A., Gu, L., Li, G., Hu, S., Yu, N., et al. (2018). Assessing college critical thinking: Preliminary results from the Chinese HEIghten Critical Thinking
assessment. Higher Education Research & Development, 37(5), 999–1014.
Liu, T., Yu, X., Liu, M., Wang, M., & Yang, X. (2021). A mixed method evaluation of an integrated course in improving critical thinking and creative self-efficacy
among nursing students. Nurse Education Today, 106(4), Article 105067.
Loyalka, P., Liu, O. L., Li, G., Kardanova, E., Chirikov, I., Hu, S., et al. (2021). Skill levels and gains in university STEM education in China, India, Russia and the United
States. Nature Human Behaviour, 5(7), 892–904.
Luo, Z. (2012). Theoretical basis of item response. Beijing: Beijing Normal University Press.
Manalo, E., Kusumi, T., Koyasu, M., Michita, Y., & Tanaka, Y. (2013). To what extent do culture-related factors influence university students’ critical thinking use?
Thinking Skills and Creativity, 10, 121–132.
McPeck, J. (1981). Critical thinking and education. Oxford: Robertson.
Murphy, K. R., & Davidshofer, C. O. (2004). Psychological testing: Principles and applications (6th ed.). Upper Saddle River, NJ: Pearson.
Norris, S., & Ennis, R. (1989). Evaluating critical thinking. Pacific Grove, CA: Midwest Publications.
Olga, K. (2016). Thinking different as a method to educate students in humanities. Educational Alternatives, 14(1), 429–436.
Partnership for 21st Century Skills. (2019). Framework for 21st century learning definitions. http://static.battelleforkids.org/documents/p21/p21_framework_definitionsbfk.pdf.
Penningroth, S. L., Despain, L. H., & Gray, M. J. (2007). A course designed to improve psychological critical thinking. Teaching of Psychology, 34(3), 153–157.
Peruza, Z., Assilbayeva, F., Saidakhmetova, L., & Arenova, A. (2021). Psychological and pedagogical foundations of practice-oriented learning of future STEAM
teachers. Thinking Skills and Creativity, 41, Article 100886.
Rowe, M. P., Gillespie, B. M., Harris, K. R., Koether, S. D., Shannon, L. Y., & Rose, L. A. (2015). Redesigning a general education science course to promote critical
thinking. CBE–Life Sciences Education, 14(3), 1–12.
Rupp, A. A., & Zumbo, B. D. (2006). Understanding parameter invariance in unidimensional IRT models. Educational and Psychological Measurement, 66(1), 63–84.
Seale, C. (1999). Quality in qualitative research. Qualitative Inquiry, 5(4), 465–478.
Seybert, A. L., & Kane-Gill, S. L. (2011). Elective course in acute care using online learning and patient simulation. American Journal of Pharmaceutical Education, 75
(3), 1–5.
Shen, H., Wang, Y., & Zhang, Q. (2019). Development and test of national assessment of collegiate capacity of critical thinking. Journal of Higher Education, 40(10),
65–74.
Smith, F. (1992). To think in language, learning and education. London: Routledge.
Smith, L., Gillette, C., Taylor, S. R., Manolakis, M., Dinkins, M., & Ramey, C. (2019). A semester-long critical thinking course in the first semester of pharmacy school:
Impact on critical thinking skills. Currents in Pharmacy Teaching and Learning, 11(5), 499–504.
Smith, T. E., Rama, P. S., & Helms, J. R. (2018). Teaching critical thinking in a GE class: A flipped model. Thinking Skills and Creativity, 28, 73–83.
Sternberg, R. J. (1990). Thinking styles: Keys to understanding student performance. Phi Delta Kappan, 71(5), 366–371.
Su, J. (2021). Thinking: The core competence of academic writing. News and Writing, 38(8), 104–107.
Swartz, R., & Perkins, D. (1990). Teaching thinking: Issues and approaches. Pacific Grove, CA: Midwest Publications.
Tian, J., & Low, G. D. (2011). Critical thinking and Chinese university students: A review of the evidence. Language, Culture and Curriculum, 24(1), 61–76.
Tracy, S. J. (2010). Qualitative quality: Eight “big-tent” criteria for excellent qualitative research. Qualitative Inquiry, 16(10), 837–851.
Wu, H. (2003). Critical thinking and logic education. Journal of Yanan University (Social Sciences Edition), 25(1), 19–23.
Wu, Y., Li, C., & Hong, Z. (2014). On the teaching mode of cultivating critical thinking. E-education Research, 35(11), 71–77.
Xia, H., & Zhong, B. (2017). On influencing factors and cultivation strategies of university students’ critical thinking. Educational Research, 38(5), 67–76.
Yang, M., & Yang, J. (2015). Medical humanity content-based English curriculum for the cultivation of medical academic competencies: The case of SUMC. Medical
Education Management, 1(4), 243–249.
Yao, L. (2001). Foreign research and enlightenment on teaching promoting the development of college students’ critical thinking. Higher Education of Science, 8(5),
18–21.
Yu, Y., & Gao, S. (2017). Cultivation mode of critical thinking of American college students and its enlightenment. Modern University Education, 33(4), 61–68.
Zarei, A., Mojtahedzadeh, R., Mohammadi, A., Sandars, J., & Emami, S. A. H. (2021). Applying digital storytelling in the medical oncology curriculum: Effects on
students’ achievement and critical thinking. Annals of Medicine and Surgery, 70, Article 102528.
Zhang, H. (2016). On the connotations of a key competence. Global Education, 45(4), 10–24.
Zhang, Q., & Shen, H. (2018). Undergraduates’ ability of critical thinking and its added value in first-class universities: An empirical study based on the assessment of
undergraduates in China’s 83 universities. Educational Research, 39(12), 109–117.