

I'm Going to Learn What?!? Teaching Artificial Intelligence to Freshmen in an Introductory Computer Science Course

Adrian A. de Freitas and Troy B. Weingart
[email protected], [email protected]
United States Air Force Academy
Colorado Springs, Colorado

Figure 1: A screenshot of CS110's final project. Students programmed a rocket landing simulator in Python, and used a genetic algorithm to train an agent to safely and efficiently land the rocket on a floating barge.

ABSTRACT
As artificial intelligence (AI) becomes more widely utilized, there is a need for non-computer scientists to understand 1) how the technology works, and 2) how it can impact their lives. Currently, however, computer science educators have been reluctant to teach AI to non-majors out of concern that the topic is too advanced. To fill this gap, we propose an AI and machine learning (ML) curriculum that is specifically designed for first-year students. In this paper, we describe our curriculum and show how it covers four key content areas: core concepts, implementation details, limitations, and ethical considerations. We then share our experiences teaching our new curriculum to 174 randomly-selected freshman students. Our results show that non-computer scientists can comprehend AI/ML concepts without being overwhelmed by the subject material. Specifically, we show that students can design, code, and deploy their own intelligent agents to solve problems, and that they understand the importance and value of learning about AI in a general-education course.

CCS CONCEPTS
• Social and professional topics → CS1; • Computing methodologies → Artificial intelligence.

KEYWORDS
computer science education, artificial intelligence, machine learning

ACM Reference Format:
Adrian A. de Freitas and Troy B. Weingart. 2021. I'm Going to Learn What?!? Teaching Artificial Intelligence to Freshmen in an Introductory Computer Science Course. In The 52nd ACM Technical Symposium on Computer Science Education (SIGCSE '21), March 13–20, 2021, Virtual Event, USA. ACM, New York, NY, USA, 7 pages. https://doi.org/10.1145/3408877.3432530

1 INTRODUCTION
In recent years, artificial intelligence has transitioned from a niche technology to a significant part of our everyday lives. AI systems ranging from military applications [23] to self-driving vehicles [10] are now commonplace, and the technology has matured such that it can be utilized by practitioners with little expert knowledge [8].
This trend creates a dilemma for computer science educators. Currently, AI and its subfields (e.g., machine learning) have been thought of as advanced topics, and as such have largely been reserved for computer science majors or those with an explicit "need to know" [24]. As AI becomes more pervasive, however, there is an increasing need for all students to learn 1) how AI works, and 2) the potential ethical, legal, and societal issues that will undoubtedly arise as AI technologies become more widely adopted. Such a foundation not only shows students what AI can and cannot accomplish, but also helps them decide (and eventually influence) how the technology should be integrated into their day-to-day lives.
At the United States Air Force Academy, we are actively exploring ways to teach artificial intelligence and machine learning to a wider student audience. As a first step, we have created a new AI lesson block for CS110, our Introduction to Computing course. Rather than target a small or hand-selected student population, our curriculum is intended for all first-year students regardless of their desired academic major. This gives us a unique opportunity to test our content on a diverse student population and assess the feasibility of including AI topics in a general-education course.
This paper offers three contributions. First, we explore how AI has been taught to non-computer science majors to date. We show how current efforts focus on identifying high-level AI educational outcomes [26, 27, 31], or exposing students to specific AI topics (e.g., robotics, machine learning) [14, 29]. We then use these works to identify four key areas (core concepts, implementation, limitations, ethics) that an introductory AI/ML curriculum should cover.
Second, we present a curriculum for teaching AI at the general-education level that covers the four topics listed above. Our curriculum introduces students to fundamental AI concepts and gives them the opportunity to build their own agents and models via a final programming project (Figure 1). Our AI block concludes with a discussion of AI's current limitations, and gives students an opportunity to observe the unintended consequences that can arise when AI is used to make non-trivial and/or life-or-death decisions.
Third, we provide a retrospective analysis of our experiences teaching CS110 to 174 randomly-selected students. Our findings show that it is feasible to have students design and implement their own AI agents in an introductory-level computer science course. We also analyze students' performance on graded assessments to identify concepts they understood and/or struggled with.
The remainder of this paper is structured as follows. In the next section, we provide a summary of related works and show how our proposed curriculum was shaped by past research in undergraduate AI education. We then describe our curriculum, and highlight the key concepts, learning objectives, and assessment tools we used. In Section 4, we describe our experiences teaching the curriculum to a randomly-selected student population. We then evaluate the effectiveness of the AI block by examining students' performance on graded assignments, and analyzing their qualitative feedback. Finally, we discuss the strengths and limitations of our approach, and conclude with a discussion of future work.

2 RELATED WORKS
The idea of exposing non-computer scientists to AI/ML concepts has been slowly gaining momentum as the technologies gain relevancy and importance. In [31], Wollowski et al. surveyed 37 computer science instructors to discover best practices and techniques for teaching AI to non-majors. Through this process, the authors identified a set of desired learning goals that should be included in an AI curriculum. These goals ranged from a general overview of AI and ML's capabilities, to practical knowledge implementing AI and a discussion of its current limitations and societal impacts. In [18], Goldsmith and Burton argued that ethics be included in this list. Specifically, they highlighted the inherent risks of creating AI systems that make non-trivial decisions, and stressed the need for students to learn ethical analysis so that they can design, train, and deploy these systems in a responsible manner. Lastly, in [26, 27] Sulmont et al. conducted interviews with AI practitioners in order to get their thoughts on what AI/ML topics should be taught to non-majors. The authors' findings echoed many of the desired learning outcomes described above, but also highlighted the need for a curriculum that can help students overcome their preconceived notions (e.g., misconceptions about AI/ML from popular media; their perceived inability to understand AI/ML due to a lack of computer science, math, or statistical skills; their belief that AI is a panacea and can be liberally applied to any problem domain).
While these works helped us identify the broad topics and goals that we wanted to cover in the AI block, they are limited in that they only offer prescriptive advice. Consequently, we were also interested in seeing how AI/ML has actually been taught to non-computer science students. Works such as those detailed in [14], [17], and [9], for example, showed that non-majors could build simple AI systems with minimal programming experience. These works, however, largely focused on robotics and simple web applications, and only discussed other AI topics (e.g., ethics, terminology) to the extent needed to support the primary curriculum. Meanwhile, in [29], Way et al. developed a set of downloadable modules to help students gain practical experience using AI. These modules, however, focused primarily on machine learning algorithms via the open-source Weka toolkit [19], and did not cover other AI topics. Finally, courses such as those proposed in [25], [22], and [7] have all tried to teach AI using a multi-disciplinary approach. While these works cover a wider range of AI topics, they are often designed for students that have little to no formal programming skills. As a result, these works tend to lack a meaningful AI programming component, and instead rely heavily on pre-built demos such as Google's Teachable Machine [13] and AIY (i.e., "AI Yourself") Kits [6] to give students hands-on experience.
Our work sits at the intersection of these two bodies of research. A key goal of our AI block is to provide students with a broad introduction to AI concepts. To achieve this, we took the high-level learning outcomes identified by Wollowski et al., and distilled them into four key topics: core concepts, implementation details, limitations, and ethics. At the same time, our work also seeks to give students practical experience building AI so that they can experience the strengths and limitations of the technology firsthand. To achieve this, we followed the lead of [29] and [9] and created non-trivial programming projects that assess students' abilities to apply AI algorithms to new problems. Through this approach, our AI block is able to cover general AI concepts while giving students hands-on experience when necessary. In doing so, we aim to provide a curriculum that offers students the best of both worlds.

3 OUR AI/ML CURRICULUM
Using prior literature as a guide, we have developed four lessons for CS110 that cover each of the topics identified above. In this section, we briefly describe CS110 and show how we integrated the AI block. We then describe each lesson's learning objectives and provide an overview of the final project.


3.1 CS110 Overview
CS110 is an "Introduction to Computing" course taken by all first-year students at our institution, regardless of their desired academic major. While the course's overarching goal is "to teach students how computers operate and the capabilities that they provide," CS110 is primarily a programming course. Students are introduced to algorithmic reasoning on day one, and apply these skills to solve real-world problems.
CS110 consists of 40 one-hour lessons. For the first 23 lessons, students are introduced to the Python programming language and are taught how to design and write programs using sequential, conditional, and iterative logic. Students are then assigned a final programming project that lets them code a non-trivial software system. While the exact project varies each semester, it typically includes some graphical component. The project is also sufficiently complex (e.g., 200+ lines) such that students cannot complete it without applying some degree of engineering rigor.
The remaining 17 lessons are more conceptual. Here, we discuss the theoretical capabilities of computing and explain how information can be encoded and processed in binary. We also introduce cybersecurity concepts by showing students how coding flaws can either be exploited or addressed from an offensive and defensive standpoint, respectively. In each of these lessons, students' knowledge of programming scaffolds their learning. Whenever possible, we provide students with working code examples that demonstrate the concept being taught (e.g., showing students how the lack of input validation can be used by a malicious user to perform an injection attack). This shows students how the skills they learned can be applied outside of a classroom, and gives them an opportunity to run and modify the code to increase their understanding.
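As an illustration of the kind of working example described above, the short program below contrasts an unsafe, eval-based calculator with one that validates its input. This is a hypothetical sketch written for this summary rather than the actual CS110 handout, and the function names are invented.

# Hypothetical demo of why input validation matters (not the actual CS110
# handout). The unsafe version passes raw user text to eval(), so a malicious
# "expression" such as __import__('os').remove('grades.txt') would be executed.
# The safe version only accepts the characters needed for basic arithmetic.

def unsafe_calculator(expression):
    # DANGEROUS: eval() runs arbitrary Python typed by the user.
    return eval(expression)

def safe_calculator(expression):
    allowed = set("0123456789+-*/(). ")
    if not expression or any(ch not in allowed for ch in expression):
        raise ValueError("invalid characters in expression")
    # With letters and underscores rejected, no names or calls can be injected.
    return eval(expression, {"__builtins__": {}}, {})

print(safe_calculator("3 * (4 + 5)"))   # 27
# safe_calculator("__import__('os')")   # raises ValueError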
To accommodate the AI curriculum, we have compressed the number of cybersecurity lessons in CS110 by roughly half, and added four AI lessons immediately following the programming block. This arrangement lets us present code examples to students while their understanding of programming is still fresh. Additionally, this lesson ordering lets us incorporate AI within the final project to show students how AI/ML can be readily utilized to solve a wide range of automation and optimization problems. This adds a layer of real-world relevance to the final project, one that is often difficult to incorporate into introductory computer science courses.

3.2 AI Block Lessons and Project Description
Our AI block consists of four one-hour lessons followed by a final project. In this section, we describe each lesson's learning objectives and the activities we used to facilitate student learning.

3.2.1 Lesson 1: Introduction to AI. The first lesson of the AI block explores the fundamental question: "What is Artificial Intelligence?" Here, students read about the Turing Test [28], and work as a class to come up with an operational definition of AI. Students are shown the difference between weak and strong AI, and are introduced to the general problem categories (e.g., classification, clustering, data generation) that AIs have traditionally been used to solve.
Through these activities, students learn to:
• Describe the characteristics of artificial intelligence
• Describe the Turing Test, and how it has shaped our understanding of artificial intelligence
• Differentiate between weak and strong AI
• Understand the history and evolution of artificial intelligence
• Articulate the types of problems AI can help us solve
For homework, students administer the Turing Test to five AI systems ranging from chat bots such as Eliza [2, 30] to modern storytelling programs like AI Dungeon [3]. After testing each application, students state whether or not they could tell if the responses they received came from a computer. This exercise forces students to articulate why a system passed or failed the Turing Test. In doing so, they continuously reevaluate what it means for a computer to be "intelligent."

Figure 2: Demo applications used in the "Sense-Think-Act" AI lesson. In Pong (left), students build an agent to control the red paddle. In Spy Hunter (right), students write multiple agents that can navigate the red car through traffic using exhaustive search and heuristic algorithms.

3.2.2 Lesson 2: Sense-Think-Act. In the second lesson, we show students how to implement intelligent agents in Python. Students are taught the general process by which AI agents 1) sense their environment, 2) think about what action(s) they should take, and 3) act. We then have students program their own AI for two games: Pong and Spy Hunter (Figure 2). In Pong, students create an agent that can move a paddle to hit a virtual tennis ball. In Spy Hunter, students are tasked with creating an agent that can safely control a car as it speeds down a highway road. Students learn how to make decisions by looking one or more moves ahead via a depth-first or breadth-first search. They then develop their own heuristic to quickly evaluate game states, and compare the resulting agent's performance to exhaustive search algorithms.
Through these exercises, students learn how to:
• Describe each step in the "Sense-Think-Act" cycle
• Describe how decision trees, finite state machines, and search trees can be used by AIs to make decisions
• Explain the concept of a game state
• Explain the concept of a heuristic


Rather than program from scratch, we provide working code that already implements the graphics and logic for each game. This lets students focus on developing the AI, and lets us cover and evaluate multiple AI strategies during a single class session.
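To make the Sense-Think-Act cycle concrete, the sketch below shows the overall shape of a Pong-style agent: it senses the ball and paddle positions, thinks by scoring each candidate move one step ahead with a distance heuristic, and then acts. The function and field names are illustrative; this is not the actual starter code distributed in class.

# Illustrative sense-think-act agent for a Pong-like game (not the actual
# CS110 starter code). The agent looks one move ahead and uses a heuristic
# (distance between paddle and ball) to pick the best action.

ACTIONS = ("up", "down", "stay")

def sense(game):
    """Read the parts of the game state the agent is allowed to see."""
    return {"ball_y": game["ball_y"], "paddle_y": game["paddle_y"]}

def heuristic(state):
    """Lower is better: how far is the paddle from the ball?"""
    return abs(state["ball_y"] - state["paddle_y"])

def simulate(state, action, step=10):
    """Predict the state one move ahead (a one-ply 'search tree')."""
    moved = dict(state)
    if action == "up":
        moved["paddle_y"] -= step
    elif action == "down":
        moved["paddle_y"] += step
    return moved

def think(state):
    """Choose the action whose predicted state scores best."""
    return min(ACTIONS, key=lambda a: heuristic(simulate(state, a)))

def act(game, action, step=10):
    if action == "up":
        game["paddle_y"] -= step
    elif action == "down":
        game["paddle_y"] += step

# One tick of the game loop: sense -> think -> act.
game = {"ball_y": 140, "paddle_y": 200}
act(game, think(sense(game)))
print(game)  # {'ball_y': 140, 'paddle_y': 190}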
3.2.3 Lesson 3: Introduction to Machine Learning. The third AI lesson steps students through the machine learning process. First, we have students manually label and split a dataset into training and evaluation sets. Next, they select an appropriate algorithm, and train a model using Weka. By examining Weka's output, they are able to evaluate the model and propose modifications (e.g., new features) to reduce errors. We then show students how to tune the model and monitor its performance under real-world conditions.
Through this exercise, students learn how to:
• Describe the goals and motivation behind machine learning
• Define ML terminology (e.g., "model", "instance", "feature")
• Differentiate between supervised and unsupervised learning
• Describe and execute the steps involved in training a machine learning model
For homework, we give students a second dataset that contains the layout of X's and O's in thousands of Tic-Tac-Toe games [1]. The students must then train a classifier that can determine if X has won. As students complete the exercise, they discover that the best model simply counts the number of X's to determine if X is victorious. This assignment highlights the inherent challenges in teaching computers to learn abstract concepts, and illustrates the vital role that human beings play in the machine learning process.
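Lesson 3 itself is taught with Weka's graphical interface, but the same label/split/train/evaluate workflow can be sketched in a few lines of Python. The snippet below uses scikit-learn on the Tic-Tac-Toe Endgame data [1] purely to illustrate the steps students follow; it is not part of the course materials, and the local file name is assumed.

# Illustrative version of the Lesson 3 workflow in Python/scikit-learn.
# The course itself uses Weka's GUI; this sketch mirrors the same steps:
# load labeled data, split it, train a classifier, and evaluate it.
# Assumes the UCI Tic-Tac-Toe Endgame data [1] saved locally as a CSV with
# nine board-cell columns ('x', 'o', 'b') and a 'positive'/'negative' label.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

data = pd.read_csv("tic-tac-toe.csv")                    # hypothetical local copy
X = OrdinalEncoder().fit_transform(data.iloc[:, :-1])    # board cells -> numbers
y = data.iloc[:, -1]                                     # label: "did X win?"

# Split into training and evaluation sets, as students do by hand in class.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = DecisionTreeClassifier(max_depth=5)              # pick an algorithm
model.fit(X_train, y_train)                              # train
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))  # evaluate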
3.2.4 Lesson 4: Limitations and Ethical Considerations. Finally, we discuss the problems that arise when AI/ML systems are used in the real world. First, we talk about the biases that can arise when models are trained using skewed datasets, and show historical examples of how these biases can cause unintentional harm to people [12, 15, 20, 21]. We then transition to the concept of "Explainable AI" [16], and highlight the importance of developing systems that can, in addition to making intelligent decisions, explain their rationale.
Through this lesson, students learn to:
• Understand the ethical implications of using machine learning algorithms in high-risk and/or life-threatening situations
• Describe the concept of Explainable AI
• Compare and contrast ethical frameworks for AI
To assess learning, students create a list of ethical guidelines that they feel would help address the challenges described above. They then compare and contrast these guidelines with those identified by academia [11], industry [5], and the Department of Defense [4]. The goal of this exercise is not to tell students which principles are correct, but rather to help them see how AI ethics are shaped by personal and organizational goals. Students can then pick and choose the perceptual lens(es) they feel are worth adopting.

3.2.5 Final Project. The final project in CS110 is a rocket landing simulator (Figure 1). In this assignment, students write a program that simulates a rocket's trajectory as it launches and steers towards the ocean. The goal is for the student (and eventually an AI) to control the rocket's descent and safely land it on a floating barge.
The final project is split into multiple deliverables. The first set focuses on basic functionality. First, students create a simulator that generates random locations for the rocket and boat. Next, they implement graphics and the physics model, and track performance metrics such as the amount of fuel consumed, velocity at time of landing, etc. Finally, they implement keyboard controls so that they can manually control the rocket's thrusters and land it on the boat.
The next set of deliverables is AI-focused. Using the strategies they learned in the AI block, students design and implement an agent that can land the rocket. Students can design their agent however they wish. A simple agent, for example, might compare the coordinates of the rocket with those of the boat and fire the thrusters accordingly. A more advanced agent, on the other hand, might have fine-grained rules specifying what the rocket should do when it is 1) near the boat, 2) over it, and 3) just about to land.
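A sketch of the "simple agent" described above might look like the following: each tick it compares the rocket's position to the barge's and fires thrusters to close the gap, with an extra rule for braking during the final descent. The state fields and action names are invented for illustration and are not taken from the actual project scaffold.

# Illustrative rule-based landing agent (field and action names are invented;
# this is not the actual CS110 project scaffold). Each tick, the agent compares
# the rocket's position to the barge's and decides which thrusters to fire.

def landing_agent(state):
    """state: dict with rocket_x, rocket_y, boat_x, vert_speed (m/s, +down)."""
    actions = []

    # Rule 1: steer horizontally toward the barge.
    if state["rocket_x"] < state["boat_x"] - 5:
        actions.append("fire_left_thruster")    # push the rocket right
    elif state["rocket_x"] > state["boat_x"] + 5:
        actions.append("fire_right_thruster")   # push the rocket left

    # Rule 2: near the deck, brake hard so the touchdown is gentle.
    if state["rocket_y"] < 50 and state["vert_speed"] > 2:
        actions.append("fire_main_engine")
    # Rule 3: higher up, only brake when falling fast (saves fuel).
    elif state["vert_speed"] > 15:
        actions.append("fire_main_engine")

    return actions

print(landing_agent({"rocket_x": 120, "rocket_y": 40, "boat_x": 100, "vert_speed": 6}))
# ['fire_right_thruster', 'fire_main_engine']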
For the final deliverable, students use a genetic algorithm to tune their agent. Here, students write a custom fitness function that scores an AI's performance. One fitness function, for instance, might give an agent points for landing on the boat, but subtract points for consuming too much fuel. The students use a genetic algorithm that experiments with different AI parameters (e.g., rocket thrust values), and determines which combination will consistently earn the highest scores. They then submit their agent and tuned parameters as part of their final turn-in.
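The sketch below shows what that tuning loop might look like: a fitness function that rewards landing and penalizes fuel use, and a small genetic algorithm that mutates thrust parameters and keeps the best performers. This is an illustrative reimplementation of the idea, not the library provided to students, and run_episode() is a stand-in for the real simulator.

# Illustrative genetic-algorithm tuner (a sketch, not the library handed to
# students). Individuals are thrust parameters; fitness rewards landing on
# the barge and penalizes fuel burned. run_episode() stands in for the real
# rocket simulator used in the project.

import random

def run_episode(params):
    """Placeholder for the rocket simulator: returns (landed, fuel_used)."""
    main, side = params
    landed = abs(main - 0.6) < 0.15 and abs(side - 0.3) < 0.15
    fuel_used = 100 * main + 40 * side
    return landed, fuel_used

def fitness(params):
    landed, fuel_used = run_episode(params)
    return (1000 if landed else 0) - fuel_used   # points for landing, minus fuel

def mutate(params, rate=0.1):
    return tuple(max(0.0, min(1.0, p + random.uniform(-rate, rate))) for p in params)

def evolve(generations=30, pop_size=20):
    population = [(random.random(), random.random()) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)       # rank by fitness
        parents = population[: pop_size // 4]            # keep the top quarter
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - len(parents))]
    return max(population, key=fitness)

best = evolve()
print("best thrust parameters:", best, "fitness:", fitness(best))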
To encourage creativity, the final project includes a competition component. For the five nights leading up to the due date, we ran each student's AI against 10 randomly-generated scenarios. The AIs were then ranked (and awarded extra credit) according to how consistently they landed, how much fuel they expended, and how gently they landed on the pad. Students could watch a live stream of the simulation from their home computer and see how their agent performed relative to the rest of the class. They could then modify their AI and resubmit to improve their standings.

4 VALIDATION
To evaluate the AI block, we conducted a trial run of CS110 during the Spring 2020 semester. For this trial, we randomly selected 174 of the 500+ first-year students scheduled to take CS110 (only 3 of whom were declared Computer and Cyber Science majors). We then split these students into nine "sections" of 20-24 students (the standard classroom size for our institution), and assigned them to seven different instructors. Due to the COVID pandemic, all AI lessons were administered online. Lectures were delivered via prerecorded videos and were supplemented by live question and answer sessions.
In this section, we examine students' performance on graded assignments to see how well they retained the material. We then present findings from our end-of-course survey to see how students rated the new content.


4.1 Student Performance
Table 1 summarizes students' performance on both lab exercises (i.e., homework) and the final project. Overall, students performed as expected. On the first and last labs, for example, students earned a 98.3% average. This was expected, as these assignments evaluated students' understanding of overarching AI concepts. Students could express their opinion (e.g., "I feel this system passed the Turing Test because..."), and were given credit so long as their reasoning was sound. By comparison, students earned 83.1% combined on the Lesson 2-3 labs and final project. These assignments required students to implement AI agents and train models rather than just talk about them. As a result, we were not surprised to see students complete these assignments with varying degrees of success.

Table 1: Student Performance on AI/ML Assignments

Assignment      Description                 Avg Grade (%)
Lesson 1 Lab    Administering Turing Test   99.2%
Lesson 2 Lab    Pong/Spy Hunter Agents      81.3%
Lesson 3 Lab    Training ML Models          82.2%
Lesson 4 Lab    AI Ethics Analysis          97.4%
Final Project   Custom Rocket Landing AI    85.8%

While students generally performed well on take-home assignments, their exam scores fell slightly below expectations. As shown in Table 2, students only answered 71.6% of the AI/ML questions correctly, well below our internal target of 75%. Closer examination shows that student performance varied by topic. Students were able to correctly answer exam questions derived from the first and fourth AI lessons 80% and 70% of the time, respectively. In contrast, students were only able to correctly answer questions from the second and third AI lessons 63% of the time. This suggests that students were comfortable with the overarching AI concepts, but had trouble directly applying them to new problems.

Table 2: AI/ML Exam Questions by Levels of Understanding

Lesson / Topic           Knowledge   Comprehend   Apply
1: What is AI?           2           9            0
2: Sense-Think-Act       0           2            3
3: Machine Learning      3           0            8
4: Limitations/Ethics    0           3            5
Avg Student Score        70.9%       78.9%        63.6%
Avg (All Questions)      71.6%

We also suspect that the lower scores can be explained by the types of questions that we asked on the exam. Since the exam had to be administered online, we were concerned that students might try to use unauthorized resources. To counteract this, we intentionally scaled the test to be more difficult. Knowledge-type questions (e.g., "What is the definition of Weak AI?") were largely eschewed in favor of questions that required the student to either 1) describe what happened in a scenario (thereby demonstrating their comprehension of the material) or 2) state how they would apply AI best practices to solve a specific problem. In Table 2, we break down student performance by problem type. Not surprisingly, students struggled the most with Application-style questions. Surprisingly, however, our results also show that students performed better on Comprehension-type questions than on any other type. This suggests that students' understanding of AI/ML surpasses rote memorization, and lends further credence to the idea that AI/ML concepts are not too difficult for students to understand.
Viewed as a whole, our results show that the AI block is feasible for an introductory-level computer science course. Obviously, further trials need to be conducted in order to refine the content and see if the results we obtained are consistent across semesters. Nevertheless, our initial results show that our proposed curriculum can teach a largely non-computer science student population AI/ML concepts without overwhelming them. This in turn strengthens the argument that AI/ML can and should be included in a well-rounded general-education curriculum.

4.2 End-of-Course Survey Results
In addition to looking at raw scores, we also gave students the opportunity to evaluate the AI block in our end-of-course survey. On the final day of class, we gave all 174 students a link to an online form. Of these, we received 64 responses (37%).

Table 3: End-of-Course survey responses collected from 64 students (1 = "Strongly Disagree"; 5 = "Strongly Agree")

Question                                                                          Mean   Median
1  The AI labs were helpful and helped me master the course material.            3.76   4
2  The AI programming assignments were helpful and helped me master the
   course material.                                                              4.25   4
3  I'm glad I took the [AI Version] of CS110.                                     4.38   5

Our results (Table 3) show that students overwhelmingly preferred being enrolled in the AI version of CS110. Student ratings for the AI block (3.76) closely matched those for the cybersecurity block (3.78), a portion of the course that traditionally receives high ratings. Students also spoke highly of the AI block in the open response portion of the survey. One student stated: "later parts of the course presented new and intriguing information about AI and machine learning that I really enjoyed." Another student spoke highly of the AI block, stating "[I] learned something I would have never though[t] about in my life."
In fairness, not all responses were positive. Some students favored Python over AI, and stated that they would ". . . rather learn how to program". Others stated that they were "not as engaged" with the AI block as they were with the cybersecurity lessons. These responses, however, only accounted for 3% of all feedback. Overall, the general consensus was that the AI content, despite being delivered purely online, was worthwhile. This finding suggests that teaching AI in an introductory level course is useful, and has given us the confidence to make the lessons a permanent part of CS110.

5 DISCUSSION
As this was our first attempt teaching AI/ML in an introductory level course, we learned many lessons along the way. In this section, we identify aspects of the AI block that worked especially well for us, as well as areas that need further refinement.


5.1 What Worked
Tearing Down Mental Barriers. As stated in [26], one of the biggest impediments to teaching AI/ML to freshman students is getting them to believe that they can learn the material. Consequently, we spent a significant amount of time priming students for the AI block. During the programming lessons, for example, we showed students how conditional statements could be chained together to form simple decision-making agents. Likewise, instead of just jumping into topics like neural networks and machine learning, we first talked about how human beings generally learn and teach each other new concepts. By the time students reached the AI block, they had already been informally exposed to many of the learning objectives. This grounded their thinking, and helped them realize that AI is not as complex a topic as it may seem to be.
While it is difficult to know if our efforts were entirely successful, anecdotal evidence suggests that we were able to break down the mental barriers for many students. During the final project, for example, several of our students were surprised to discover that they could implement a working rocket-landing agent in just 25 lines of code. One particularly memorable student, who inspired the title of this paper, watched her AI land the rocket on the boat, only to say (somewhat crudely) out loud "this is [expletive] magical." As students achieved early successes building AIs for Pong and Spy Hunter, they became more confident that they could build more complex systems. This confidence eventually resulted in 153 of our students (88%) successfully turning in a working rocket-landing AI at the end of the course. This accomplishment would not have been possible had we let students hold onto their preconceived notions.
Using Competition as a Motivational Tool. Originally, we had planned on grading each student's rocket-landing AI individually. As the course transitioned to distance learning, however, we realized that the final project offered an opportunity to have students and teachers connect despite being globally separated. Consequently, we made the decision mid-semester to include a competition in the final project, and offered extra credit to the best-performing AIs.
While we were initially concerned that students would not want to participate, we were pleasantly surprised to see them embrace the competition. In all, 125 students (72%) submitted their AI to the competition and watched the nightly video feed. Of these, more than half (56%) chose to resubmit their AI at least once in an attempt to outperform their peers. While the winner of the competition was ultimately one of our A-students, there were a number of B and C students that repeatedly outperformed the top-ranking students in our class. This achievement boosted their confidence, and helped solidify our belief that AI can be effectively wielded by a much larger student audience than previously believed.

5.2 What Didn't Work
Unrealistic Expectations. During our initial excitement to create the AI block, we were overly optimistic about what we could get students to learn. Originally, for example, we had planned on having students analyze confusion matrices in order to identify Type I and Type II errors. We had also planned on having students write the genetic algorithm that they used in the final project from scratch, rather than use a library provided by us.
As the semester progressed, we quickly realized that students simply did not have the time to do everything we wanted. Most students, after creating the simulation portion of the final project, only had a week or so to design, implement, and train their AI before the final turn-in. Additionally, by the time they had reached the exam, most students had only seen a confusion matrix once. While the AI block was intended to give students a broad understanding of AI, we regularly found that our lessons dove too deep. Although we eased some of the learning requirements, the results from the exam (Table 2) clearly show that students had difficulty retaining all of the information presented in class. This suggests that we can further trim the AI block's content to make it more reasonable.
Limited Incentives for Creativity. During the last weeks of the final project, we provided students with an online tutorial that showed how to build a basic rocket landing AI. This tutorial highlighted several strategies to get the rocket to land on the boat, and was taught by all instructors to their respective sections.
In hindsight, this was a mistake. Students regularly seek the path of least resistance and were eager to use the strategies we suggested rather than come up with one on their own. This caused many of the AIs in our competition to behave similarly to one another, and made the final project, as stated by one student, feel "cookie-cutter." Simply put, we need to do a better job incentivizing students to experiment with different AI algorithms. One solution would be to host the competition throughout the latter half of the semester instead of the final week so that students have more time to try out new ideas. Alternatively, we could provide students with a simple AI that they have to outperform in order to get credit. Either approach should encourage students to think outside of the box, and increase the variety of AIs that we see in the competition.
Overemphasis on Games. In order to make the AI content attractive to students, we relied heavily on gaming to frame our lectures. We used simple games (e.g., Pong and Spy Hunter) to show how AI algorithms work, and even designed the final project such that it felt more like a game than a true simulation.
While this approach increased student engagement, it also prevented students from easily seeing how the same AI techniques could be applied to other domains. In future offerings of CS110, we plan on designing final projects that focus on other problems (e.g., medical modeling, optimization). This will illustrate AI's usefulness across a wider spectrum of problems, and make the course appealing to a wider audience.

6 CONCLUSION
In this paper, we described our experiences creating an AI/ML curriculum for first-year students. We presented a four-lesson block that gives students a fundamental understanding of AI core concepts, implementation details, limitations, and ethics. We then taught the AI block to a class of 174 non-computer science majors, and demonstrated that students can both comprehend and apply the material to solve non-trivial problems.
Our work takes a small but important step towards preparing students for a future where AI permeates nearly every facet of our lives. In the future, we hope to improve the AI block even further by giving students more freedom to explore different AI algorithms. It should be possible, for example, to create final projects that utilize deep learning instead of genetic algorithms. Similarly, it should be possible to design lessons that show how AI can be used in non-gaming applications. Through these collective efforts, we hope to contribute towards a future when students no longer see AI as "magical," but rather as a useful technology that comes with its own set of advantages and trade-offs.


7 ACKNOWLEDGMENTS
This work is partly sponsored by the Air Force Office of Scientific Research (AFOSR) under Grant FA9550-20-S-0003 as part of the Dynamic Data and Information Processing portfolio of Dr. Erik Blasch. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the Air Force Academy or the U.S. Government.

REFERENCES
[1] 1989. UCI Machine Learning Repository: Tic-Tac-Toe Endgame Data Set. https://archive.ics.uci.edu/ml/datasets/Tic-Tac-ToeEndgame
[2] 2005. Eliza (elizabot.js). https://www.masswerk.at/elizabot/
[3] 2019. AI Dungeon. https://play.aidungeon.io/main/landing
[4] 2019. AI Principles: Recommendations on the Ethical Use of Artificial Intelligence by the Department of Defense. https://innovation.defense.gov/ai/
[5] 2019. Artificial Intelligence at Google: Our Principles. https://ai.google/principles/
[6] 2019. Do-It-Yourself Artificial Intelligence. https://aiyprojects.withgoogle.com/
[7] 2019. Teaching Artificial Intelligence in the Secondary Classroom. https://csermoocs.appspot.com/ai_secondary/course
[8] 2020. Top 18 Artificial Intelligence Platforms in 2020 - Reviews, Features, Pricing, Comparison. https://www.predictiveanalyticstoday.com/artificial-intelligence-platforms/
[9] Tom Armstrong. 2010. Robotics and Intelligent Systems for Social and Behavioral Science Undergraduates. In Proceedings of the Fifteenth Annual Conference on Innovation and Technology in Computer Science Education (Bilkent, Ankara, Turkey) (ITiCSE '10). Association for Computing Machinery, New York, NY, USA, 194–198. https://doi.org/10.1145/1822090.1822146
[10] Claudine Badue, Rânik Guidolini, Raphael Vivacqua Carneiro, Pedro Azevedo, Vinicius Brito Cardoso, Avelino Forechi, Luan Jesus, Rodrigo Berriel, Thiago Meireles Paixão, Filipe Mutz, et al. 2020. Self-driving cars: A survey. Expert Systems with Applications (2020), 113816.
[11] Nick Bostrom and Eliezer Yudkowsky. 2014. The ethics of artificial intelligence. The Cambridge handbook of artificial intelligence 1 (2014), 316–334.
[12] Joy Buolamwini and Timnit Gebru. 2018. Gender shades: Intersectional Accuracy Disparities in Commercial Gender Classification. In Conference on fairness, accountability and transparency. 77–91.
[13] Michelle Carney, Barron Webster, Irene Alvarado, Kyle Phillips, Noura Howell, Jordan Griffith, Jonas Jongejan, Amit Pitaru, and Alexander Chen. 2020. Teachable Machine: Approachable Web-Based Tool for Exploring Machine Learning Classification. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI EA '20). Association for Computing Machinery, New York, NY, USA, 1–8. https://doi.org/10.1145/3334480.3382839
[14] Andrea Pohoreckyj Danyluk. 2008. Artificial Intelligence for Non-Majors at Multiple Levels. In AAAI Spring Symposium: Using AI to Motivate Greater Participation in Computer Science. 20–25.
[15] Alexiei Dingli. 2018. It's Magic...I Owe You No Explanation! https://becominghuman.ai/its-magic-i-owe-you-no-explanation-explainableai-43e798273a08
[16] Derek Doran, Sarah Schulz, and Tarek R Besold. 2017. What does Explainable AI Really Mean? A New Conceptualization of Perspectives. arXiv preprint arXiv:1710.00794 (2017).
[17] Susan Fox. 2005. Using robotics to introduce AI topics to a wider audience. In Accessible Hands-on Artificial Intelligence and Robotics Education, AAAI Spring Symposium.
[18] Judy Goldsmith and Emanuelle Burton. 2017. Why teaching ethics to AI practitioners is important. In Proceedings of the... AAAI Conference on Artificial Intelligence.
[19] Mark Hall, Eibe Frank, Geoffrey Holmes, Bernhard Pfahringer, Peter Reutemann, and Ian H Witten. 2009. The WEKA data mining software: an update. ACM SIGKDD explorations newsletter 11, 1 (2009), 10–18.
[20] Anna Lauren Hoffmann, Sarah T Roberts, Christine T Wolf, and Stacy Wood. 2018. Beyond fairness, accountability, and transparency in the ethics of algorithms: Contributions and perspectives from LIS. Proceedings of the Association for Information Science and Technology 55, 1 (2018), 694–696.
[21] Sam Levin. 2018. Imprisoned by Algorithms: the Dark Side of California Ending Cash Bail. https://www.theguardian.com/us-news/2018/sep/07/imprisoned-by-algorithms-the-dark-side-of-california-ending-cash-bail
[22] Justin Li. 2019. Experience Report: Explorable Web Apps to Teach AI to Non-Majors. Journal of Computing Sciences in Colleges 34, 4 (2019), 128–133.
[23] Cheryl Pellerin. 2017. Project Maven to deploy computer algorithms to war zone by year's end. US Department of Defense 21 (2017).
[24] Stefan AD Popenici and Sharon Kerr. 2017. Exploring the Impact of Artificial Intelligence on Teaching and Learning in Higher Education. Research and Practice in Technology Enhanced Learning 12, 1 (2017), 22.
[25] Alpay Sabuncuoglu. 2020. Designing One Year Curriculum to Teach Artificial Intelligence for Middle School. In Proceedings of the 2020 ACM Conference on Innovation and Technology in Computer Science Education. 96–102.
[26] Elisabeth Sulmont, Elizabeth Patitsas, and Jeremy R. Cooperstock. 2019. Can You Teach Me To Machine Learn?. In Proceedings of the 50th ACM Technical Symposium on Computer Science Education (Minneapolis, MN, USA) (SIGCSE '19). Association for Computing Machinery, New York, NY, USA, 948–954. https://doi.org/10.1145/3287324.3287392
[27] Elisabeth Sulmont, Elizabeth Patitsas, and Jeremy R. Cooperstock. 2019. What Is Hard about Teaching Machine Learning to Non-Majors? Insights from Classifying Instructors' Learning Goals. ACM Trans. Comput. Educ. 19, 4, Article 33 (July 2019), 16 pages. https://doi.org/10.1145/3336124
[28] Alan M Turing. 1950. Computing Machinery and Intelligence. Mind 59, 236 (1950), 433.
[29] Thomas Way, Mary-Angela Papalaskari, Lillian Cassel, Paula Matuszek, Carol Weiss, and Yamini Praveena Tella. 2017. Machine learning modules for all disciplines. In Proceedings of the 2017 ACM Conference on Innovation and Technology in Computer Science Education. 84–85.
[30] Joseph Weizenbaum. 1966. ELIZA—A Computer Program for the Study of Natural Language Communication Between Man and Machine. Commun. ACM 9, 1 (1966), 36–45.
[31] Michael Wollowski, Robert Selkowitz, Laura E Brown, Ashok Goel, George Luger, Jim Marshall, Andrew Neel, Todd Neller, and Peter Norvig. 2016. A survey of current practice and teaching of AI. In Thirtieth AAAI Conference on Artificial Intelligence.
