FIE2013 Expectations CameraReady-KarlEdits
Elise A. Larsen¶, Andrea A. Andrew‖, Mara R. Dougherty‖∗∗, Matthew W. Miller††,
Artesha C. Taylor‡‡, Breanne Robertson, Alexis Y. Williams, and Spencer A. Benson‡
∗ ECE Dept., † Mathematics Dept., ‡ Center for Teaching Excellence, § Dept. of Family Science, ¶ Dept. of Biology,
‖ Dept. of Chemistry & Biochemistry, xi Dept. of Human Development, xii Dept. of Cell Biology & Molecular Genetics,
†† Dept. of Kinesiology, ‡‡ Dept. of Communication, x Dept. of Art & Art History
∗ [email protected], † [email protected], ‡ [email protected], xii [email protected]
∗∗ American University, †† Auburn University, x Wesleyan University, xiii University of Maryland, College Park, MD 20742
Abstract—Students begin each new course with a set of expectations. These expectations are formed from their experiences in their major, class level, culture, skills, etc. However, faculty and students are not on the same page with respect to expectations, even though students get course syllabi. It is crucial for faculty to understand students' expectations in order to maximize students' learning, satisfaction, and success. Furthermore, it would ensure classroom transparency: there would be no hidden, unstated expectations, and disappointments during the course can potentially be minimized. We present the results of a survey focused on understanding student expectations. Specifically, we focus on examining the differences in expectations of the students of Computer Science and Engineering (CSE) courses and non-computing STEM courses. We present our analysis and observations of the results using aggregate data for all students at all class levels. We observe various interesting differences and similarities among the STEM fields. Identifying differences is crucial since many non-computing STEM majors are enrolled in computing courses, especially in the lower level courses. We provide a detailed comparison among sophomore and senior level courses in computing, biology, and chemistry, and we also compare sophomore and senior CSE courses. Lastly, we discuss the importance of paying attention to all students' needs and expectations. Armed with this knowledge, faculty members can increase transparency in the classroom, student satisfaction, and possibly student retention.

I. INTRODUCTION

Education is a national interest, and President Obama has said "If we want America to lead in the 21st century, nothing is more important than giving everyone the best education possible from the day they start preschool to the day they start their career" [17]. The President's Council of Advisors on Science and Technology (PCAST), in its 2012 report to the president, set an ambitious goal of producing one million additional college graduates with degrees in the different Science, Technology, Engineering, and Mathematical (STEM) fields [18]. Computer and Computational Sciences and Engineering have a significant impact on almost all other STEM fields and are a vital and crucial foundation for the other STEM fields to build upon [28]. This translates to a desire to push for multi-disciplinary connections between computing fields and other STEM fields. Projects such as Engaging Computer Science in Traditional Education (ECSITE) [9], [10], which developed curricular units embedded into non-computing K-12 classes to bring computing to K-12 students, and the special session chaired by U. Wolz and L. Cassel to promote and discuss interdisciplinary computing [5], are two efforts that show the existence of initiatives in improving and introducing computing to curricula at different levels and stages. These initiatives originate from the reality that computational technologies have become an integral part of daily life, all over the world and in every aspect of our lives. Other efforts that aim at improving science education, such as Anderson et al. [2], have made recommendations for improving general science education, including adopting techniques already supported by evidence of improved learning.

The two thrusts above, namely the need and importance of integrating computing into current and future curricula in the K-12 realm and at colleges and universities, and the goal of producing more STEM graduates, mean that a larger number of students are expected to enroll in colleges in STEM fields, and they will have more exposure to computing. If these efforts succeed, Computer Science & Engineering faculty should expect more students from other fields, specifically other STEM students, in their classes. It is important to understand and address the needs and expectations of this new category of students, not only for their own growth, but also for the future of computing in general. Understanding, contemplating, and acting upon students' expectations will ultimately maximize the efficiency of the time the students are in the classroom and will help guarantee that the learning goals are achieved. We may even interest a good portion of the students in transitioning to a computing degree or even double majoring. Without spending time understanding our students and their expectations, it will be very difficult to move forward or to achieve these goals.

In this paper, we introduce a survey conducted with 816 students in 25 different STEM courses that addresses the need for examining students' expectations at the beginning of any individual course. Results are presented alongside data from two specific computing courses. The rest of the paper is organized as follows: Section 2 discusses the related work and provides an overview of literature on student expectations. Section 3 describes the development and content of the survey tool we piloted. Section 4 presents the results from the pilot with a variety of breakdowns. Section 5 is a discussion of the impact such a tool can make. Section 6 describes future directions of research.

II. RELATED WORK

Understanding students and their expectations is not a new idea; a large volume of work in the literature has investigated these ideas. For example, Tricker [25] tried to find the reasons behind the changes happening in student
expectations over a period of three decades. The most relevant, closest, and most prominent work is the National Survey of Student Engagement (NSSE) [13]. NSSE assesses how students are engaged in learning, what practices affect this, and several other measures. It focuses on institution-level measures and only collects data from freshmen and graduating students. Likewise, Sander et al. [20] introduce a survey called University Students' Expectations of Teaching (USET), measuring incoming students' hopes, dislikes, and expectations for teaching and learning methods. These learning methods are general pedagogical choices such as regular formal lectures, group work, or private study. The limitation of Sander's work is shared with NSSE's: both promise a comprehensive course design but are cumbersome and lack the necessary details for an instructor who wants to address a specific class. Another limitation of Sander's work, compared to our work, is that the courses surveyed are only in business, psychology, and medicine.

Trudeau and Barnes [26] focus on identifying and ranking the 'teaching dimensions' that students value the most and the least. These dimensions, taken from regular course evaluations, again address ideas like "Knows Material" or "Instructor Availability", without giving concrete knowledge about what students expect to happen exactly in a course. Furthermore, it did not give the students free space where they could express their views in plain text, rather than limiting the choices to course evaluation contents. Several studies aim to determine the effects of cultural differences in international students [23], first-generation and traditional students [6], or working-class students [15]. Additional studies have focused on the expectations that surround technology [8], [16].

There are real concerns, many initiatives, and research done in trying to increase enrollment in computing. Adams and Pruim [1] recognize the lack of supply of graduates with advanced computing skills and suggest ways to encourage non-CS majors to take CSE courses. They present strategies to attract science students to take additional CSE courses.

Our work is complementary since we are trying to provide and illustrate the differences and similarities between the expectations of computing majors and non-computing STEM majors, which can be used to pitch computing courses to non-computing STEM students.

In other publications [21], [22], we have discussed different aspects of our expectations study that we have conducted at UMD as part of a year-long fellowship. In [22], we discuss the survey and its class-specific use in biology courses. In [21], we slightly touched on some of the issues pertaining to computer science versus STEM in terms of expectations.

All in all, there is a significant volume of work on expectations. However, we did not find work that is focused enough to be used in redesigning courses, either to meet student expectations or to clarify why something is not perceived well.

III. DEVELOPING THE SURVEY (METHODS)

A. Motivation

To guarantee both success and satisfaction for students in a course, we have to understand the expectations of our students. A mismatch between student expectations and what happens in the classroom can have serious negative consequences [12]. Lang, in "The Chronicle of Higher Education" [14], discusses classroom transparency, where openness is a key factor in mutual understanding and success between faculty and their students. The University of Illinois, Urbana-Champaign has developed an initiative [27] to increase transparency in the classroom. The initiative provides a platform for collecting results from testing different course changes, but only analyzes the effect at the end of a semester. It is important to increase transparency and close the expectations mismatch gap between faculty and students.

Our goal in this work is to create a resource for faculty that could inform them of what students were expecting in a course and open up the floor for a dialogue with our students about what faculty themselves expected and planned to utilize in a course. The survey was created as an adaptable, customizable, portable, and generic tool that can be used in any course on any university campus. Finally, we believe that the length of the survey can hurt participation and completion if it is too long. We decided to make the survey as short as we possibly could while still determining course expectations. The design goal was to create something that would take only a few (5-10) minutes to finish. In summary, our design goals were: 1. Collect information directly related to a specific course; 2. Examine course components that were applicable to any course; 3. Focused content that can be easily and quickly completed, but is still relevant for instructors.
Fig. 1. A selected subset of the aggregate data for students' expectations in courses (components grouped as Computing Iconic Pedagogy, Technology, Learning Activities, Learning Assessments, and Instructor Interaction)
B. Survey Structure

To satisfy our above-stated design goals for the survey, we divided the survey questions into two groups. The first was a set of five demographic questions to provide context and classification for the results and to help us understand the make-up of the student population. The second was designed to gauge the expectations of the students to help faculty in planning classroom management, assignments, and interactions with and among students. This second set of questions fell into five pedagogical and learning categories: 1. Technology and its Use; 2. Learning Activities; 3. Learning Assessments; 4. Faculty-Student Interactions; 5. Timeliness of Instructor's Actions.

Table I (adapted from [21] with some modifications) has a concise summary of the second group of our survey. The table outlines each of the five subcategories with the questions being asked and the choices the students can make for each question. The actual complete survey can be found here [11]. The last question we asked (not included in Table I) was an open-ended question which gave the students the opportunity to elaborate on: "What misconceptions do you believe faculty have about students?"

C. Deployment & Updates

We deployed the survey in early Spring 2012 at the University of Maryland, College Park, an R-1 state university. After the deployment, feedback was solicited from instructors whose students participated in the study. We were lucky that the instructors were willing to give us useful suggestions. Notably, because of feedback from the instructors, in the "timeliness of action" category we added a new choice, namely "longer than a week", to address a gap between "within a week" and "never" in the options originally provided (Table I). We also added "lecture" as a selection option for the first three categories (1-3). Table I includes the suggested changes, indicated by square brackets around them. We also clarified some of the questions. We do not expect that these missing items would jeopardize the validity of our results, nor would they weaken our conclusions, especially in the case of the missed "lecture" option. As for the timeliness category, we did not report any of its data in this paper. The edited and updated survey consisted of the three relevant demographic questions and six learning-related questions. The survey is freely available via a Creative Commons license through the Center for Teaching Excellence [11]. The Department of Family Science is adopting this idea and thus will deploy the tool in all of its undergraduate curriculum starting Fall 2013.

IV. RESULTS

A. Participation

Instructors from across a wide range of departments were solicited for the study, with 27 instructors opting to participate. This provided data from 25 different courses in 8 different departments across campus. Although this sampling of departments may seem small, students from all 13 undergraduate colleges and schools at the university were represented. The final sample included 816 undergraduate students enrolled in STEM courses. Within this comprehensive data set, there were two Computing courses surveyed whose results we will discuss in greater detail: Introduction to Computer Systems (CMSC 216) and Programming Language Technologies and Paradigms (CMSC 433). The responses included 42 from CMSC 216 and 15 from CMSC 433.

The remainder of this section will present selected highlights of the aggregate data, including a comparison to computing courses, a breakdown by class status, a breakdown by course level, and a STEM vs. Non-STEM comparison. We present two detailed case study comparisons of four 200-level courses from three STEM departments: Computer Science, Biology, and Chemistry, and three 400-level courses from Computer Science, Biology, and Bio-Chemistry. The complete results are available upon request.

B. Aggregate vs. Computing

Figure 1 presents a large series of comparisons between the aggregate data from all STEM courses, including Computing courses, and the average result of the two Computing courses surveyed. A few categories with minimal differences have been omitted for brevity. However, the majority of component categories (~75%) did have differences. Several of these are expected from the historically iconic pedagogies that are normally used by computing courses, like programming projects. They are shown in the first section of Figure 1, and
presented to remind instructors that Non-Majors may not be expecting the level and volume of individual projects in a computing course.

The difference in Learning Management Systems points out another trend that is challenging to address. Students from other majors frequently expect professors to utilize university-wide course management systems. However, instructors that are accustomed to an in-house, department-specific, or non-standard learning management system often struggle or refuse to do so.

Within the technology components, several unexpected results appeared. First, while many educators are exploring how to utilize e-Textbooks and social media, students themselves seem to expect very little use of them, with only 10% and 5%, respectively, supporting the claims of low technology expectations by Lohnes & Kinzer [16]. This suggests that the instructors are not using these techniques, since they are relatively new in the classrooms; thus students did not see them in their past experiences and therefore are not expecting them. Even though students are not expecting such technologies, there is work in some classrooms, and in some fields more than others, to utilize such techniques to enhance the learning process [24]. The question will be whether the students will see enhanced learning using these techniques. Instructors will need to take a step-by-step approach when introducing such technology. If these technologies prove even marginally beneficial in a certain way, it would make sense to spend time experimenting with them to find their rightful place in classroom pedagogy next to other well-documented and researched pedagogies such as active learning [19].

Meanwhile, PowerPoint is highly expected, more so than in other STEM courses. The need to present pre-written code might explain this; however, caution should be used, since Craig and Amernic [7] have evaluated a significant body of literature related to PowerPoint usage and come to a very 'clear' ambivalence about the success of PowerPoint in enhancing learning. Alternatively, clickers, which several studies have shown to be effective for enhancing learning, are not being used or expected within Computing, even in the larger CMSC 216 course, simply because the instructors in the courses surveyed did not use that technology, whereas in lower-level CMSC courses clickers are used heavily. In other STEM courses, clickers are clearly expected.

The differences shown in Learning Assessments for Homework and Group Projects perhaps reflect the significantly higher expectation of Individual Projects in Computing in general. However, the very low expectation of Group Projects is not unexpected, since in lower-level undergraduate courses most instructors of computing courses opt to use individual projects and not group projects. This probably should be balanced, though, given the prevalence of team projects in industry, and it raises the question of whether we are realistically preparing our students for their future jobs. Finally, class participation points are shown as something instructors should be aware of, and make explicit for students who might have other expectations, as they can create an immediate mismatch between reality and expectations (as explained in [12]) for interdisciplinary courses.

The three Learning Activities components with significant differences perhaps again harken to the different pedagogical style inherent in Computing, though these are certainly less prevalent than the components already separated out. Both study guides and small discussion groups highlight areas where instructors might easily introduce important learning activities which will increase student engagement with materials. While study guides are frequently a source of contention, discussion groups are a successful and proven method to help students process materials in class.

The final category of instructor interaction highlights two areas for focus, especially in higher-level interdisciplinary courses. Students from other STEM disciplines fully expect instructors to be accessible (66%), while Computing students have a noticeably decreased expectation (48%). They also have more than twice the expectancy for an instructor to know their name. Both of these aspects reflect the more personalized approach and availability that non-Computing students expect. This might suggest that we have to be more savvy in dealing with non-Computing students and be more approachable, more available, and build more rapport with them, which might make them more interested in computing.

Fig. 2. Mean technology expectation by class status (Freshman, Sophomore, Junior, Senior, All Students)

Fig. 3. Expectations in 100, 200, and 400 level courses in selected categories (Clickers, Homework, Essay Exams, PowerPoint)

C. Breakdown by Class Status

Within the comprehensive 816 responses, we had a nearly equal distribution among class status: 29% freshmen, 20% sophomores, 28% juniors, and 23% seniors. This provided the opportunity to identify trends in expectations as a student progresses in his or her collegiate career, such as an increase in expectations of discussion, essay-based exams, and group projects (not shown). Nearly all of these match trends identified in other literature or are generally expected. We do, however, want to highlight one surprising result of particular interest to computing: the overall expectation of technology usage, which is shown in Figure 2. A mean technology expectation was found by averaging the number of components each student expected in the category of "Technology". The mean technology component expectation for all students was 1.73, with the specific value for each class status shown in Figure 2. This indicates that most students only expected one or two [technology components].
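To make this averaging concrete, the sketch below shows one way such a per-student technology count and its mean by class status could be computed. It is an illustrative sketch only: the table layout, column names, and values are assumptions for demonstration and are not the survey's actual fields or data.

# Hypothetical sketch (not the authors' analysis code): one row per respondent,
# with 1/0 indicators for each technology component the student expects.
import pandas as pd

responses = pd.DataFrame({
    "class_status": ["Freshman", "Sophomore", "Junior", "Senior", "Senior"],
    "clickers":     [0, 1, 0, 0, 1],
    "powerpoint":   [1, 1, 1, 1, 1],
    "e_textbooks":  [0, 0, 0, 1, 0],
    "social_media": [0, 0, 0, 0, 0],
    "lms":          [1, 1, 0, 1, 1],   # learning management system
})

tech = ["clickers", "powerpoint", "e_textbooks", "social_media", "lms"]

# Number of technology components each student expects.
responses["tech_count"] = responses[tech].sum(axis=1)

# Overall mean (reported as 1.73 in the paper) and the per-class-status means of Fig. 2.
print(responses["tech_count"].mean())
print(responses.groupby("class_status")["tech_count"].mean())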
Fig. 4. Expectations of STEM and Non-STEM students in selected categories (Clickers, PowerPoint, Homework, Group Projects, Office Hours, Accessible outside office hrs)

E. Breakdown by STEM vs. Non-STEM

Our final breakdown investigation is the expectations of STEM and Non-STEM students. In our data set, 73% of the students belonged to STEM colleges and 27% to Non-STEM colleges. All of the data comes from STEM courses; however, because many of the courses surveyed satisfy general education requirements, a significant portion of the students belonged to Non-STEM colleges and programs.

In Figure 4, there are three pairs of comparisons: technology, assessments, and instructor interactions. Within technology, the difference in PowerPoint is certainly worth noting. Students coming from non-STEM majors expect it noticeably less and may find classes frequently utilizing PowerPoint less interactive. The increased expectation of clickers is clear, but may be an artifact of a larger portion of Non-STEM students in large biology and chemistry general education courses where clickers are heavily used.

Within assessments (category 3 of Table I), we found an overall trend of Non-STEM students having higher expectations than STEM students, of which the most pronounced were group projects (p < 0.05) and homework (p < 0.01), as shown in Fig. 4. The only choice from the assessments [...]
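The paper does not state which statistical test produced these p-values. As a point of reference only, the sketch below runs a chi-square test of independence on a 2x2 table of expects/does-not-expect counts by group, which is one standard way to check such a difference between two groups of respondents. The test choice and the counts are assumptions for illustration and are not the study's data or method.

# Hypothetical sketch: comparing how often Non-STEM vs. STEM students expect a
# component (e.g., group projects). The counts below are invented; the paper
# reports p < 0.05 for group projects without naming the test used.
from scipy.stats import chi2_contingency

#                   expects  does not expect
table = [[ 70,  150],       # Non-STEM respondents (made-up counts)
         [130,  466]]       # STEM respondents (made-up counts)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")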
1) Case Study 1: 200 Level: As our first case study, we consider four different 200-level STEM courses: General Microbiology (BSCI 223), General Chemistry (CHEM 232), Organic Chemistry (CHEM 271 and CHEM 272), and Introduction to Computer Systems (CMSC 216). As can be seen in Figure 6, only about 5% of the students were taking the course to fulfill a core requirement, with the Chemistry courses being slightly higher. This low percentage indicates that the expectations in the course are not obscured by students taking the course from a widely varied background (of colleges at the university). The results presented can give us a closer view of how different majors think, remembering that CMSC 216 is taken by CSE majors only. [Are the expectations of] Computing students sufficiently high? When we offer courses to Non-majors, are we challenging them to perform at their best?

Differences existed in nearly every category we examined, many of which were similar to the comparison of Computing to the aggregate data. Several unique differences, however, also existed between the courses, with the most pronounced shown in Figure 6. There is clear evidence of different teaching methods, with higher expectations of PowerPoint from CMSC and BSCI students and higher expectations of Whiteboard usage from CMSC and CHEM students. We can see a higher clicker expectation [...] had the highest in-class discussion and interaction (the course has a discussion section other than the regular lecture), suggesting that even though all the courses are large, the CMSC course by design pushed for active participation and engagement with the students.

Fig. 6. Comparison of 200-level courses in Computing, Biology and Chemistry (CMSC 216, BSCI 223, CHEM 2**, Avg. STEM; components include PowerPoint, Demos, Interact with you in class, Chalkboard/Whiteboard, Clickers, Group Projects, Core requirement, In-class discussions)

Fig. 7. Comparison of 400-level courses in Computing, Biology and Chemistry (CMSC 433, BSCI 440, BCHM 46*, Avg. STEM; components include Know your name, Accessible outside office hrs, Class participation points, Study guides, Textbooks, In-class discussions, Small group discussions, Demos)
2) Case Study 2: 400 Level: Our second case study, shown in Figure 7, compares three senior level courses in Computer Science (CMSC 433), Biology (BSCI 440), and Bio-Chemistry (BCHM 463 and BCHM 464). With the higher expectations of study guides and textbook usage, students from Biology or Bio-Chemistry may be unprepared for the learning styles of interdisciplinary courses such as Bio-Informatics. This is further exacerbated for the biology students by an expectation for a very different classroom style involving participation points and various forms of discussion. The Bio-Chemistry students, on the other hand, had very similar expectations of classroom learning activities to computing courses. Finally, it seems that both Bio-Chemistry and Biology students expect to have a more personal interaction with their instructors. They expect the instructors to know their names and be accessible outside office hours.

Additionally, students outside of Computing have a significantly higher expectation of accessibility outside of office hours, as seen in Figure 7, but did not differ significantly in any of the timeliness categories (not shown).

V. DISCUSSION

A. Impact on Curricula

The survey was originally designed to help instructors improve their courses. While the results in Section 4 point out several interesting trends, we are looking for a greater impact. We started out by examining the increasing necessity of computing in all STEM courses and even outside of STEM disciplines. As more students take introductory courses, those courses can be designed to capitalize on the different expectations that Non-STEM students have. Further, if departments choose to offer courses specifically designed to interest non-majors, as suggested by Adams and Pruim [1], or higher-level interdisciplinary courses, as suggested by Anthony [3], the courses can be tailored in such a way that the students will leave with a higher satisfaction and level of learning. It is vital to remember that while the students might be novices in Computing, they are not novices in the classroom. As one student responded in the open-ended question: "[Faculty believe] [t]hat we don't want to learn – if we show up to class, we are there to learn – it is not hard to "skip" a class. In that vein, if we are in class, please do not baby us, do not mock us for asking questions, and do not waste your time or ours going into information that is irrelevant. . ."

Other important impact areas are retention rates, increased satisfaction, and, if we pay more attention to the expectations of women and minorities, increased diversity.

B. Study Limitations

While the reported results produced several interesting insights and conclusions, there are some issues that need to be discussed. The instructors administered the survey after the first week of classes, after the syllabi had been distributed and supposedly discussed. Thus, we would expect students to have a decent idea of what things to expect in the course. However, in many courses, categories were often not "expected" at 100% or 0%. This suggests that even though students had jointly experienced a course, there was no unanimous expectation for or against some common things. This may be a self-reporting
problem, or confusion on the students' part. Another limitation is that the data was only collected at a large R-1 school. It would be very exciting and interesting to analyze the expectations at smaller colleges and universities. Another limitation is that a large sector of STEM is missing from our results: we did not get a sizable number of Engineering courses partaking in the survey, although we did get some Engineering students participating.

VI. FUTURE WORK

We are convinced that we have only scratched the surface of a large area of research that we hope to be able to work on further in the short term. The first of these directions is to understand how the survey will influence courses. For example, one faculty member responded on their feedback form saying: "I thought this survey was great at getting a cross-section of what my students expected from a class. . . I was surprised at some of the expectations. . ."

However, since this study was the first, faculty members were not able to get the results quickly enough to make changes to their courses; the collection and turn-around time forced the data to be returned almost mid-semester. With the survey publicly available and ready for use [11], turn-around time in future deployments will not be an issue at all. Thus, we expect faculty to use the data to modify courses.

The data we collected dealt only with STEM courses. We would want to do a similar study for Non-STEM courses. It would be interesting to see what the expectations of STEM students would be in non-STEM courses and how they differ from their expectations in STEM courses. Intuition suggests that the expectations in Humanities courses will differ. Another aspect that would be intriguing to add is a longitudinal study – similar to the medical and pharmaceutical studies – tracking the change of student expectations from the day they step into college until they graduate, to see the evolution of their expectations, similar to the NSSE tracking.

A direct extension of this work would be administering this survey across multiple colleges and universities, and comparing both CSE and other STEM students across liberal arts colleges, small or large, teaching and research institutions. A nation-wide study is our more ambitious goal. We would expect such future research to have a large impact on policies and on our understanding of the different institutions and their students.

ACKNOWLEDGMENT

We gratefully acknowledge UMD's Grad School for funding us. We extend our gratitude to Ms. Grossnickle, Mrs. Shaw and CTE staff. Finally, deepest gratitude goes to the instructors and the students who participated.

REFERENCES

[1] J. C. Adams and R. J. Pruim. Computing for STEM majors: enhancing non-CS majors' computing skills. In Proceedings of the 43rd ACM Technical Symposium on Computer Science Education, pages 457-462, 2012.
[2] W. A. Anderson et al. Changing the culture of science education at research universities. Science, 331(6014):152, 2011.
[3] B. M. Anthony. Operations research: broadening computer science in a liberal arts college. In Proceedings of the 43rd ACM Technical Symposium on Computer Science Education, pages 463-468, 2012.
[4] J. E. Caldwell. Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1):9-20, 2007.
[5] L. Cassel and U. Wolz. The role of interdisciplinary computing in higher education, research and industry. In Proceedings of the 43rd ACM Technical Symposium on Computer Science Education, 2012.
[6] P. J. Collier and D. L. Morgan. Is that paper really due today? Differences in first-generation and traditional college students' understandings of faculty expectations. Higher Education, 55(4):425-446, May 2007.
[7] R. J. Craig and J. H. Amernic. PowerPoint presentation technology and the dynamics of teaching. Innovative Higher Education, 31(3):147-160, Aug 2006.
[8] P. A. Foral et al. Faculty and student expectations and perceptions of e-mail communication in a campus and distance doctor of pharmacy program. American Journal of Pharmaceutical Education, 74(10), Dec 2010.
[9] D. S. Goldberg et al. Engaging computer science in traditional education: The ECSITE project. In Proceedings of the 17th ACM Annual Conference on Innovation and Technology in Computer Science Education, pages 351-356, 2012.
[10] D. S. Goldberg et al. Addressing 21st century skills by embedding computer science in K-12 classes. In Proceedings of the 44th ACM Technical Symposium on Computer Science Education, pages 637-638, 2013.
[11] Graduate Lilly Fellows 2011-2012. CTE UMD Expectations Survey. http://www.cte.umd.edu/Resource/Surveys/.
[12] R. James. Students' changing expectations of higher education and the consequences of mismatches with the reality. In Responding to Student Expectations, Education and Skills. OECD Publishing, 2002.
[13] G. D. Kuh. What we're learning about student engagement from NSSE: benchmarks for effective educational practices. Change: The Magazine of Higher Learning, 35(2):24-32, Mar 2003.
[14] J. Lang. Classroom transparency. The Chronicle of Higher Education, April 2007.
[15] W. Lehmann. University as vocational education: working-class students' expectations for university. British Journal of Sociology of Education, 30(2):137-149, Mar 2009.
[16] S. Lohnes and C. Kinzer. Questioning assumptions about students' expectations for technology in college classrooms. Innovate: Journal of Online Education, 3(5), 2007.
[17] B. Obama. Education. http://www.whitehouse.gov/issues/education (last accessed 5 Apr 2013).
[18] President's Council of Advisors on Science and Technology. Report to the President - Engage to Excel: Producing 1M additional college graduates with degrees in STEM. Technical report, Executive Office of the President, Feb 2012.
[19] M. Prince. Does active learning work? A review of the research. Journal of Engineering Education, 93(3):223-231, 2004.
[20] P. Sander et al. University students' expectations of teaching. Studies in Higher Education, 25(3):309-323, Oct 2000.
[21] K. R. B. Schmitt, A.-H. A. Badawy, et al. Student expectations from CS and other STEM courses: they aren't like CS-majors! or (CS != STEM - CS). J. Comput. Sci. Coll., 28(6):100-108, June 2013.
[22] K. R. B. Schmitt, E. Larsen, et al. A survey tool for assessing student expectations early in a semester. Journal of Microbiology & Biology Education, submitted 2013.
[23] M. D. Shank, M. Walker, and T. Hayes. Cross-cultural differences in student expectations. Journal of Marketing for Higher Education, 7(1):17-32, 1996.
[24] P. M. Timoney. A revolutionary approach to neonatal nurse practitioner education: Preparing the 21st century NNP. In 39th Annual Meeting of the National Organization of Nurse Practitioner Faculties (NONPF), 2013.
[25] T. Tricker. Student expectations: how do we measure up? In Probing the Boundaries of Higher Education, pages 111-114, 2005.
[26] G. P. Trudeau and K. J. Barnes. Shared expectations: Identifying similarities and differences between student and faculty teaching values based on student evaluation of faculty classroom performance. International Business & Economics Research Journal, 1(7), 2011.
[27] University of Illinois. Illinois initiative on transparency in learning and teaching in higher education. http://www.teachingandlearning.illinois.edu/transparency.html (last accessed 6 Apr 2013).
[28] O. Yasar and R. H. Landau. Elements of computational science and engineering education. SIAM Review, 45(4):787-805, Jan 2003.