
Usage and Knowledge of Online Tools and Generative AI: A Survey of Students


Rahul R. Divekar

Bentley University
Lisette Gonzalez
Bentley University
Sophia Guerra
Bentley University
Natasha Boos
Bentley University

Research Article

Keywords: ChatGPT, AI, Generative AI, Online Tools, Higher Education, Students

Posted Date: August 9th, 2024

DOI: https://doi.org/10.21203/rs.3.rs-4882673/v1

License: This work is licensed under a Creative Commons Attribution 4.0 International License.

Additional Declarations: The authors declare no competing interests.


Usage and Knowledge of Online Tools and
Generative AI: A Survey of Students
Rahul R. Divekar ∗
[email protected]

Lisette Gonzalez
[email protected]

Sophia Guerra
[email protected]

Natasha Boos
[email protected]

Bentley University, Waltham MA, USA

Competing Interests: None


Data Availability: Anonymized data may be made available upon reasonable request and approval of the IRB, if the request is made within 4 years of the original date of data collection, as all data will be deleted after 4 years.

∗ Corresponding Author

Abstract
Artificial Intelligence (AI) tools like ChatGPT are poised to transform
student and educator workflows in higher education. However, there is little documentation on the range of tools students in higher education use, how they use them in coordination with other online tools for learning, and their expertise in using AI tools. We present a mixed-method analysis of
a survey conducted at a doctoral-granting university in the United States
investigating the adoption of AI tools in the context of other technologies.
The findings include how the students use GenAI tools in light of other online technologies, their perception of expertise on the topic, and how they gain expertise in using AI for educational work.
Keywords: ChatGPT, AI, Generative AI, Online Tools, Higher Education, Students

1 Introduction and Literature
Artificial Intelligence (AI) researchers have developed applications for education for a long time, as evidenced by several journals, publications, and focused conferences on the topic of AI in Education, or AIEd. However, the introduction of ChatGPT and other Large Models (LMs) has changed the landscape for two reasons. One, Large Models (LMs), also known as Generative AI (GenAI), are broad tools that can perform several tasks across a huge spectrum of education-related activities, including writing assignments, grading assignments, analyzing data, etc., with non-trivial quality and incredible speed (Brown et al., 2020).
This is in contrast to the prior state, in which AIEd research typically focused on
narrow tasks such as automatic assessments of some types of essays (Ghosh &
Beigman Klebanov, 2024), Intelligent Tutoring Systems for narrow subject ma-
terials (Mousavinasab et al., 2021), Conversational roleplay for language learn-
ing in specific scenarios (Divekar et al., 2022), among others. Two, these tools
no longer live only in research labs with limited access for students and practitioners; rather, many are easily accessible for free to anyone across the world via a simple internet browser. (Duha, 2023) has noted that this is not
the first time technology has transformed education and that we have seen at
least two such transformations before with Google and Wikipedia. However,
when a new technology like this is released and becomes popular within the
education community, it is likely to disrupt existing technologies and workflows.
For example, when search engines like Google.com or online encyclopedias like Wikipedia first became popular, they augmented and, in some cases, replaced traditional information sources like the library or subject-matter experts in universities: a student no longer had to go to the library or to an expert for every question; rather, they could look things up online (Brindley, 2006).
This prompted libraries to pivot to the integration of new technology and begin
efforts to increase digital literacy.
We are at a similar crossroads again, with AI technology becoming a new
source of information for university students. The education community must
decide what changes we have to make in the curriculum, teaching, learning, technical resources, etc. provided to students (Črček & Patekar, 2023) to create a future generation of experts who can use artificial intelligence to their advantage. However, we do not know whether AI technology is being used by students in addition to, or as a substitute for, the information sources of the previous revolution, such as Google and Wikipedia. We also do not yet know what it means to have expertise in using AI tools like ChatGPT from a student's perspective.
We hope to provide initial answers to that through our paper by discussing how
students use AI technologies in relation to other online resources, thereby pro-
viding the groundwork to move the conversation forward in this next iteration
in technology-enabled education.

1.1 Generative AI
AI technologies have been in the making since the 1950s (Toosi et al., 2021).
The primary and simplified concept behind AI is that first a model is provided
with some amount of task-specific numerical data. An AI model is then able to
find the patterns that exist in the data. Then, it can apply the found pattern
to new data and create predictions, outcomes, etc. Natural Language Process-
ing (NLP), a subfield of AI research, applies a similar concept except that it treats natural language (e.g., English, Hindi) words as numbers and performs computations on them to identify patterns and generate new words. In 2017, the Transformer model showed promise to find patterns in large
amounts of text and generate new words based on that pattern in the con-
text of language translation (Vaswani et al., 2017). OpenAI significantly scaled
up the Transformer model to release the first iteration of the Generative
Pre-trained Transformer (GPT) (Radford et al., 2018) and then a second more
capable GPT-2 (Radford et al., 2019). At a huge cost, upwards of 4 million US dollars (Leswing, 2023), they eventually released the next iteration, GPT-3 (Brown et al., 2020), which powered ChatGPT, the first landmark successful
interface and Large Language Model (LLM) that was good at many language
tasks and went far beyond what the initial version of GPT promised. Other
technology companies followed suit. At the time of writing this paper, we now
have a variety of LLMs that are similar to ChatGPT, e.g., Copilot1 , LLaMA2 ,
Gemini 3 , Claude 4 , etc. and their subsequent versions. Similar advances were
made in graphics, image, and audio processing science that allowed models like
DALL-E 3, Midjourney, and Stable Diffusion to generate or edit images and audio
based on text or vice versa. Collectively, these image, sound, and language
models are called large models (LMs).
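
To make the pattern-finding and next-word-generation idea above concrete, the following is a minimal, illustrative sketch (not part of the original study) that generates a continuation with the openly available GPT-2 model via the Hugging Face transformers library; the prompt and generation settings are arbitrary choices for demonstration.

```python
# Illustrative sketch only: text is tokenized into numbers, a Transformer model
# predicts likely continuations, and new words are generated from those predictions.
from transformers import pipeline

# GPT-2 is a small, openly downloadable predecessor of the models discussed above.
generator = pipeline("text-generation", model="gpt2")

prompt = "Artificial intelligence in higher education"
outputs = generator(prompt, max_new_tokens=20, num_return_sequences=1)

# The output contains the prompt followed by the model-generated continuation.
print(outputs[0]["generated_text"])
```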
Other software developers and AI startups used these Large Models (LM)
as a foundation for new products. Specific to education, scite.ai augmented
LLMs with the ability to search academic papers providing an interactive way
to conduct research. Perplexity.ai infused LLMs with the ability to search the
Internet providing a new way to consume information. Elicit helps summarize
research papers, while Writefull and Grammarly help edit text for better flow
and grammar.
In addition, LLMs are not restricted to generating natural language; they are also able to generate programming code. Subsequently, companies like OpenAI and Julius.ai built products that generate programming code on a user's request and run that code behind the scenes on data the user uploaded. In effect, their solutions allow the user to process and analyze data without knowing any programming language such as R or Python. A sketch of this pattern is given below.
1 https://copilot.microsoft.com/
2 https://llama.meta.com/
3 https://gemini.google.com/
4 https://claude.ai/new
5 https://openai.com/index/dall-e-3/
6 https://docs.midjourney.com/
7 https://stability.ai/news/stable-diffusion-3
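
To illustrate the "generate code, then run it" pattern described above, here is a hedged sketch (our illustration, not the authors' method and not the internal implementation of any named product) that asks an LLM for pandas code through the OpenAI Python client; the model name, prompt, and file name are placeholders.

```python
# Illustrative sketch of the "generate code, then run it" pattern; assumes the
# `openai` Python package (v1+) and an OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

request = (
    "Write Python (pandas) code that loads 'grades.csv' and prints the mean "
    "grade per course. Return only the code."
)
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": request}],
)

generated_code = response.choices[0].message.content
print(generated_code)
# A hosted tool would execute `generated_code` in a sandbox against the user's
# uploaded file, so the user never has to write R or Python themselves.
```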

Although there is no research on how many AI tools exist today, at the time
of writing this paper, the authors are aware of several AI tools that can be used
for educational purposes like writing essays, data analysis, creating slides, etc.
that are now also available to students and anyone else. In the discussion above,
we provided a large but non-exhaustive list of AIEd use cases and products.

1.2 Students’ AI Adoption in Higher Education


The International Monetary Fund estimates that about 40% of global employ-
ment is exposed to AI in that AI could either eliminate these jobs or play a
complementary role and boost productivity (Melina et al., 2024). Finding suit-
able employment is one of the main goals of many students. Students are aware
that AI will play an important role in their field and that some profiles might be
replaced by AI (Almaraz-López et al., 2023). Therefore, it stands to reason that
students are at least curious about tools like ChatGPT. Next, we summarize
surveys on AI adoption in university students.
(Sallam et al., 2024) recently conducted a survey in the UAE to investigate
the acceptance of ChatGPT within the university student community. They saw that 85 of their respondents had used ChatGPT, a figure higher than in previously reported studies from other parts of the world synthesized in their literature review. Further, they found that attitude towards the technology was correlated with sex, type of university, and ethnicity. (Chan & Hu, 2023) surveyed students
in Hong Kong and found similar familiarity, positive attitude, and willingness
to use Generative AI within the student community. (Albadarin et al., 2024)’s
systematic literature review of empirical research found that ChatGPT was be-
ing used for various writing tasks, feedback, on-demand answers, explanations,
etc. However, they point out that ChatGPT may have negatively impacted collaboration and innovative tendencies in students.
Students also report using AI for a variety of reasons related to their uni-
versity activities and life. (Crawford et al., 2024) found that AI tools like Chat-
GPT have been used to fill the social connections gap, especially for university
students, with the caveat that when they filled the social need with AI, they
experienced even more human isolation. The authors claim that lack of social
connection could result in a lower sense of belonging on campuses and conse-
quently affect grades and student retention. However, (Al-Zahrani & Alasmari,
2024)’s broader survey has found that the higher education community in Saudi
Arabia uses a variety of AI tools that go beyond just ChatGPT and include
face recognition, speech recognition, etc., for various purposes, including education and entertainment. They report negative experiences similar to those in other reports in the field, i.e., privacy issues, financial costs, etc.

1.3 Educators’ Outlook on AI in Higher Education


University students seem to have widely embraced ChatGPT, forcing educators
to respond in a variety of ways: acceptance and rethinking of learning activities,
rejection and vigilance towards students using ChatGPT, and general panic over
academic dishonesty (Črček & Patekar, 2023).
(Kiryakova & Angelova, 2023) presented a survey to university professors at
Trakia University and found concerns among faculty members that learners will unquestioningly accept the output from ChatGPT as correct, especially
when ChatGPT has a tendency to be incorrect (Islam & Islam, 2024; Rawte
et al., 2023). However, the university professors also showed a positive outlook towards using ChatGPT in their teaching, e.g., to provoke interest in the subject, stimulate critical thinking, etc. (Kiryakova & Angelova, 2023). Further,
(Firat, 2023) found in their qualitative analysis that the evolution of learning and educational systems and the changing role of educators were the most frequent codes in interviews with scholars and students from four countries. In a large-scale analysis of X (formerly Twitter), (Mamo et al., 2024) found that within the higher-education faculty community, while sentiment towards ChatGPT moved in a positive direction over time, negative sentiments such as anger, fear, disgust, and sadness
were also found. Plagiarism, job security, bias, threats to writing, disruption,
assessment threats and disinformation were some of the factors contributing to
the negative sentiments. Some of these factors are directly related to students
using ChatGPT for university work.
While educators worry about the learning outcomes for their students given
the use of AI, they themselves have not shied away from it, as AI has received
tremendous attention from education researchers recently. (Crompton & Burke,
2023) conducted a PRISMA review of 138 articles in AIEd and found an almost
three-fold increase in the number of articles published in 2021-2022 compared
to previous years where research on AI assistants was prominent. Furthermore,
(Mahapatra, 2024) showed that, with some training, ChatGPT can give good
feedback in writing classes and is especially useful for English as a Second Language
(ESL) student writing.
Given the various positive and negative reactions from students and fac-
ulty, (Črček & Patekar, 2023) strongly encourage educators to embrace technol-
ogy, although with clear policies and guidelines on its use to optimize learning.
However, doing so requires knowing how students are using AI. Although most
previous studies have identified students' attitudes towards and adoption of ChatGPT, there is not enough literature on which other AI tools students use and how students use AI in the context of already existing technology. This infor-
mation is crucial to get a snapshot of the online technology stack that students
use to accomplish their goals in universities. Further, we also recognize that AI
will be a tool in the future of work and that teaching students to be experts
at using AI in their domain is likely going to be a similar wave of education as
digital literacy a few years ago. However, it is not clear what it means to be an
expert in using AI. We bring these two perspectives through our study in this
paper. We uniquely contribute to the existing body of literature by identifying
how students use AI tools in the context of other online tools and what they
consider it means to be an expert in the field.
We present a mixed-method analysis of a survey of 68 students at a doctoral-
granting university in the United States. In the results, we show that ChatGPT
is still the most popular AI tool. However, we report that AI tools have not fully
replaced ‘older’ tools such as search engines, Wikipedia, etc. In fact, search en-
gines are still slightly more popular than AI tools within the student community.
In addition, specific to AI tools, we note various uses of AI technology reported
by students, such as writing, brainstorming, and creative uses; we detail these in our report and discuss their implications. We discuss what students see as
expertise while using AI in their field and how students gain that expertise.

2 Methodology

Question | Response Type
How do you currently learn new topics on the Internet? (Select all that apply) | Multi-option with ability to fill in manually
Which AI tools have you used in the past? | Multi-option with ability to fill in manually
On average, how often do you use [Tool]*? | Multi-option
How do you use [Tool]*? | Fill in, large text box
On a scale of 1 to 5, where 1 indicates minimal expertise in using AI tools and 5 denotes advanced proficiency, how would you rate your expertise in utilizing AI for your work? | Likert scale 1-5
Please explain your expertise level and why you believe what you picked (e.g., if you have taken a course on prompting, read articles or tips/tricks related, feel like you always get the answer you want, etc.) | Fill in, large text box
Additionally, we welcome your thoughts on what it means to be an expert in AI usage. Please share your perspective. | Fill in, large text box

* [Tool] was automatically replaced with the name of each AI tool they submitted in the previous question

Table 1: Select Survey Questions

A link to an electronic survey developed using the Qualtrics platform was sent through mailing lists to currently enrolled undergraduate and graduate
students at a doctoral-degree-granting university in the United States. The
email invitation stated the eligibility and ineligibility criteria mentioning that
students currently enrolled or who wanted to enroll in classes with the project
PI were not eligible to avoid biased responses. In addition, they were required
to have some exposure to AI tools such as ChatGPT. Given that our survey
also had a significant qualitative aspect, we set a target size of approximately
60 students with a higher compensation of a $5 gift card to encourage detailed
qualitative responses. The survey was approved by the Institutional Review
Board (IRB). Table 1 shows the list of questions relevant to the study. The
responses to all except the last question were required to complete the survey.
For the sake of brevity, we do not include sections and questions related to
informed consent, eligibility criteria, logistics, and demographics, as they do
not differ from the standards seen in other surveys.

3 Results and Analysis


The responses were analyzed using a mixed-method approach, i.e., some re-
sponses were quantitatively analyzed to find broad patterns and others were
qualitatively analyzed using thematic grouping. Using both approaches, we
now deep dive into the various ways in which AI technologies are being used
by university students, how they view expertise in AI use, and where they gain
expertise.

3.1 Demographics
Table 2 presents a quick view of the age range, enrolled study level, gender, and
grade point average (GPA) of the 68 students who participated in the survey.
From Table 3, we see the count of students who represented majors, minors,
or areas of subject matter expertise. Note that a student typically can major
in more than one subject area alongside one or more minors. All data were
self-reported by participants.

3.2 Quantitative Analysis


3.2.1 Number of tools used
We asked students how they currently learn on the Internet and offered a multi-select box with the option to type in additional responses. We found that, on average, people currently use 3.64±1.21 (min. 1, max. 7) online tools to learn. Fig. 1 shows the distribution.
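
For interested readers, the following is a minimal sketch (illustrative only, not the study's actual analysis script) of how such per-respondent counts and summary statistics could be computed from multi-select survey responses; the file and column names are hypothetical.

```python
# Illustrative sketch: count how many tools each respondent selected and summarize.
import pandas as pd

# Hypothetical export: one row per respondent, multi-select answers joined by ';'.
responses = pd.read_csv("survey_responses.csv")
tool_counts = responses["learning_tools"].str.split(";").apply(len)

print(f"mean = {tool_counts.mean():.2f} ± {tool_counts.std():.2f}")
print(f"min = {tool_counts.min()}, max = {tool_counts.max()}")

# Histogram-style counts (cf. Fig. 1): how many respondents used 1, 2, 3, ... tools.
print(tool_counts.value_counts().sort_index())
```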

3.2.2 Names and Types of Tools Used


We normalized names manually, e.g., edited ‘co-pilot’ and ‘copilot’ to mean the same thing. Fig. 2 shows the names of the tools that students indicated they use while learning on the Internet.

Description Number of Students

Total Valid Responses 68


Age Range
18-20 35
21-23 25
24-26 7
27 or older 1
Enrolled Level of Study
Undergraduate 51
Graduate 17
Gender
Female 38
Male 30
GPA (max. 4)
Above 3.5 45
3-3.5 18
2.5-3 1
Below 2.5 1

Table 2: Survey Demographics

Fig. 1 Histogram of number of tools used to learn on the internet

Majors and Minors Number of Students

Finance 26
Accounting 13
Management 9
Business Analytics 7
Computer Info. Systems 7
Data Analytics 7
Economics 7
Information Design and Corporate Communication 5
Marketing 4
Spanish 2
Business Administration 1
CFA 1
Film and Media Studies 1
Finance and Technology 1
Health studies 1
Human Factors in Information Design 1
International Affairs 1
Mathematical Sciences 1
Public Policy 1
Sociology 1
Sports Business Management 1
Undecided 1

Table 3: Self-reported Majors, Minors and Areas of Expertise

Google is the most used tool, followed closely by ChatGPT, YouTube, and Wikipedia.
Although Fig. 2 shows the frequency of each tool that students indicated
they use, there are multiple tools that can be used similarly and toward the
same goal. For example, Bing and Google are both search engine alternatives
to each other. Therefore, we categorized these tools into types as shown in Table
4. Readers might note that the table shows Search and Multimedia Learning separate from AI even though search and multimedia tools use AI algorithms in their back ends for recommendations and better search results. To clarify, from here on, when we say AI in this paper, we mean broad-application
Generative AI tools like ChatGPT. Based on the categorization shown in Table
4, we counted the frequency of students who indicated using each type of tool
after removing duplicates. This process allowed us to count unique mentions by
tool types; for example, if one student mentioned that they used WSJ and NYT,
we counted it as one mention of the News type. The popularity of types of tools that students use is seen in Fig. 3, where Search, AI, and Multimedia Learning are
the top-3 types of tools students indicated using while learning on the Internet.
However, we also wanted to capture the number of tools students use per tool type; for example, if a student said that they use WSJ and NYT, that counted as two tools for the News type. For each type of tool, on average, each student used close to one tool, as shown in Table 5 (see the sketch after Table 4).

Tool Category Tools under the category

Search Google, Bing, Google Scholar, Brave Web Search


AI ChatGPT, Perplexity, Copilot
Multimedia Learning YouTube
Encyclopedia Wikipedia
Guided Learning Coursera, Khan Academy, Data Camp, Free Code Camp
Social Media TikTok, LinkedIn, Instagram
News Wall St Journal, NY Times, LAT

Table 4: Types of Tools used by Students
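
For readers who want the counting procedure spelled out, here is a hedged sketch (not the authors' code) of the normalization, categorization, and de-duplication steps described above; the data, tool names, and type mapping are illustrative.

```python
# Illustrative sketch: map tools to types, count unique respondent mentions per type,
# and compute the average number of tools used per type per respondent (cf. Table 5).
import pandas as pd

# One row per (respondent, tool) mention, after manual name normalization
# (e.g., 'co-pilot' -> 'copilot').
mentions = pd.DataFrame({
    "respondent": [1, 1, 1, 1, 2],
    "tool": ["google", "chatgpt", "wsj", "nyt", "google"],
})
tool_type = {"google": "Search", "chatgpt": "AI", "wsj": "News", "nyt": "News"}
mentions["type"] = mentions["tool"].map(tool_type)

# Unique mentions per type: WSJ and NYT from the same respondent count once for News.
unique_by_type = mentions.drop_duplicates(["respondent", "type"])["type"].value_counts()
print(unique_by_type)

# Tools per type per respondent: the same respondent's WSJ and NYT count as two News tools.
per_person = mentions.groupby(["type", "respondent"])["tool"].nunique()
print(per_person.groupby("type").agg(["mean", "std"]))
```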

3.2.3 AI Tool Usage


Focusing on AI tools specifically, we asked them which AI tools they had used
in the past to capture familiarity with AI tools. All 68 students mentioned they
had used ChatGPT in the past. In addition, some of them also mentioned that
they have used other tools as seen in Table 6. We also note that only 13 of the
68 students selected/entered an option other than ChatGPT. Out of the 13, only 4 selected/entered more than one additional option, with one person writing in 4 options in addition to ChatGPT.
In the same table, we have added a third column describing each AI tool for readers' ease; the descriptions were compiled by searching the tool's name on Google, visiting the tool's website, or drawing on our own familiarity.
Fig. 2 Popularity of Tools

Fig. 3 Popularity of Tool Types

Tool Type Mean±Std. dev.

Search 1.09±0.29
AI 1.03±0.19
Multimedia Learning 1±0
Encyclopedia 1±0
Guided learning 1.11±0.3
Social Media 1.33±0.6
News N/A as only one person reported us-
ing news articles

Table 5: Avg. Number of tools used per person in each category

However, the landscape of AI tools is rapidly changing, and the tools may differ from the descriptions we have provided here
by the time of publication.

Tool Name Frequency Description

ChatGPT 68 Multi-purpose LLM by OpenAI


Copilot 4 Multi-purpose LLM with search by Microsoft
Bard 3 Multi-purpose LLM by Google
Claude 3 Multi-purpose LLM by Anthropic
Dall-e 2 Image generation model by OpenAI
Automata 1 Marketing and Content Re-purposing Model
Beautiful.ai 1 AI Presentation Maker
Bing AI 1 LLM combined with search by Microsoft
Descript 1 Podcast and Video Editor
Grammarly 1 Writing Assistant with Plagiarism Checks
HyperWrite 1 Writing Assistant
Perplexity 1 LLM Combined with Search
Snapchat AI 1 Multi-purpose Assistant Combined with Social
Media app
Writier 1 Writing assistant

Table 6: Familiarity with AI tools

In addition, we asked them how often they used AI tools for each tool they
said they currently use. Fig. 4 shows responses for how often they use ChatGPT.
Of the few people who use other AI tools, one who uses Perplexity said 2-3 times a week, three reported rare usage of other tools, and one mentioned they use Copilot/DALL-E as often as, and together with, ChatGPT.

Fig. 4 Frequency of use of ChatGPT

3.2.4 AI and Student Expertise


We asked students to rate their expertise in utilizing AI for their academic
work. Fig. 5 shows the response graph, where 5 denotes advanced expertise. We see students responded with a mean of 3.2±0.97, a median of 3, and a range of 1-5.

3.3 Qualitative Analysis of Tool Usage


We asked two types of open-ended questions in our survey to understand how participants used AI tools and whether they thought expertise (if any) was required to use them. We coded their responses and categorized them thematically. Our goal in the analysis was to identify as many themes as we could without numerically counting the occurrence of each theme. Rather, we describe them with qualitative words such as few, many, and most. We present the results below.

3.3.1 Usage: Learning Complex Topics


Learning complex topics was the most common theme in the responses. The
participants learned complex topics in various ways. For example, one partici-
pant mentioned “... asking [AI] to explain a complex topic as if I am a younger
student...” or answer “... daily questions”. Other students more directly con-
nected learning to university activities such as “asking questions on homework”,
“understand concepts taught in class...”, “... do a lot of the tedious groundwork
for assignments”.

Fig. 5 Expertise with AI tools

One participant mentioned that they liked ChatGPT because “... it skips
the part of me scrolling different sites and gathering information” alluding to
ChatGPT’s information distilling capabilities. Several other participants also
mentioned that they use ChatGPT to summarize information or “...get basic
information on a topic” or even organize thinking e.g., “... get a summary from
chatgpt to help me format the text ideas in my head”, “... help think through
ideas”, “... getting a step by step to solving economics problems”. Some even
went to the extent of saying that they use ChatGPT “...as a tutor”.

3.3.2 Usage: Writing


The next most popular code was Writing. We further divided writing into subcodes
as follows.
Communicative Writing Several participants mentioned that they use
ChatGPT to write various communication artifacts such as emails, letters, notes,
resume, cover letters, etc. Some students mentioned that they use ChatGPT
to write more professionally e.g., “... help with writing emails to allow me to
adjust to a more professional tone and expand my vocabulary”
Rephrasing Several students mentioned that ChatGPT helps them rephrase
things or “...provide new ways of wording sentences” possibly by “provid[ing]
word ideas like phrases, synonyms etc” or “...strengthen[ing] my vocabulary in
writing”.
Proofreading Similar to rephrasing, some students used it to proofread
and check for spelling, check grammar, or give them feedback on their writing.

Writing Outline Some students also mentioned they use ChatGPT to ”...
create outlines for papers I need to write.”

3.3.3 Brainstorming
Students mentioned that they used ChatGPT to brainstorm university-related
work such as “... get ideas for essays”, “...suggestions for topics”, “...how to
start writing about something”, “generate ideas...”, “brainstorming ideas...”,
when they “... want some direction”, “...create outlines for papers”, “... to
know the answer for questions which are interesting for me” or even “... help
me organize thoughts”.

3.3.4 Software and Data Analysis


Several students mentioned that they used ChatGPT for “... data cleaning,
code writing”, “... filter excel data”, or to have code explained or debugged for them.

3.3.5 Entertainment
Some students mentioned that they used ChatGPT for “...games, entertain-
ment”, “... writing songs”, or for “... hobbies where it helps me find new music
and ideas”

3.3.6 Low frequency yet unique uses of AI tools


Mental health resource One student mentioned they use ChatGPT for “... some personal advice” while another mentioned that they have used ChatGPT for “... therapy reasons in the past”.
Presentations One student used “... canva prompt for creating and editing
presentations”
Professional Development Students mentioned they use ChatGPT to
prepare for interviews.
Social use One student mentioned that they used ChatGPT to “... look
for date ideas”
Usage: Search Some students used it as an alternative to search engines
e.g., “help me find research papers...” or even simply “...basic search (like a
search engine)”
Usage of other AI tools Usage of other competitor tools was primarily as a substitute for ChatGPT because “Copilot [has] become an add-on to Microsoft Edge” or to “... compare chatgpt responses”.

3.4 Qualitative Analysis of ChatGPT and Expertise


The introduction of ChatGPT has led to several certificate courses and a move-
ment in academia and industry to learn more about AI. We asked students if
they thought they had the expertise as noted previously in the quantitative
analysis section. Here, we discuss their qualitative responses to why they self-
rated their expertise as they did and to whether they had any thoughts on what
it means to be an expert in AI. We thematically analyze those responses and
present them in this section.

3.4.1 Learning by using


Most people indicated that their expertise came from self-learning by using it
longer. They didn’t think that it was “... hard to use chat gpt”; some said
that their “... prompts [got] better and currently it has become much easier to
get the response I want from the AI model.” Some attributed this to the technology becoming better, e.g., “... it’s very user-friendly nowadays and I do not struggle with using it”.
However, when asked what it means to have expertise, students overwhelm-
ingly responded indicating that specialized knowledge is required to drive AI.
For example, “...to be an expert in AI, it means knowing how to optimize the
software to best get results.” We explore different types of specialized knowledge
below.

3.4.2 Technical Knowledge


A few attributed their expertise to having an “...undergraduate [degree] in computer engineering and hav[ing] [learned] the theory behind AI and the algorithms
it uses”. Very few people mentioned that their expertise comes from knowing
the underlying algorithms or the fundamentals of computer science.
When asked what it means for someone to be an expert, many more students
responded indicating technical expertise was necessary e.g., “in order to be an
AI expert I think it takes much more than just basic knowledge of the publicly
known and used models”, “ChatGPT is a great tool and model, but having a
deeper understanding of both how it works and other tools that can do the same
job with less overlap in responses is what I’d say separates experts and normal
users.” However, some students also responded indicating that expertise is in
a successful outcome or application of AI e.g., “being an expert in AI entails
being able to use it professionally to gain knowledge and profit from it, as well
as advance research and development of a business or career.”

3.4.3 Domain Knowledge


We note that most of our participants are from a business background and might
be applying AI tools in their domain. While others may have implied that their
domain expertise contributed to their AI expertise, only one person specifically at-
tributed their success to being “...good at identifying misleading or inaccurate
responses”, indicating that having the knowledge of the domain to which Chat-
GPT is applied is essential to using it effectively.

3.4.4 Conceptualization of Technology
We found two relationship metaphors in how people described their own expertise: one participant called it “... a student-teacher [relationship]
where I have to teach the teacher to teach me in the most efficient way possible”
and the other mentioned “... just talk to Chatgpt(prompt) like [it is] a child
and specify the environment and the answer I am looking for.”

3.4.5 Gaps in Knowledge


Some people mentioned that they did not have any more than general knowledge
and attributed their lack of expertise to it; but that does not seem to have made
the technology hard to use. For example, one participant said “... do not think
it is hard to use chat gpt. But I know it has its limitations but I’m unsure what
all the limitations are”. Some students indicated that to be an expert in AI, “...
requires you to understand both the tools but also the consequences and ethics
behind them”, indicating that consequences and ethics while using ChatGPT
are not common knowledge as per the participant.

3.4.6 Sources of Information


We noticed that some participants attributed their expertise to the information
they consumed about technology. Some students consumed information online
through articles and videos e.g., “watch videos on how to prompt engineer”,
few learned via social media e.g., “...learnt a lot of tricks from Instagram reels”.
Few students had exposure to ChatGPT via their university either through
classrooms e.g., “took a [university] course on AI Marketing” or research projects
e.g., “...last summer I helped a professor with research about ai for [academia].
At the [unknown word] we used chat gpt”. Few had taken a formal course on
prompt engineering.

4 Discussion
4.1 Demographics
We conducted a survey in which 68 students participated. We know from their
demographics that most of these student participants were between 18 and 23
years old enrolled in undergraduate and graduate courses at Bentley University.
We had slightly more participation in the survey from people who identified
as female rather than male, with no participation from any other gender. We
know that most of these participants were in good academic standing, as almost
everyone had a GPA above 3. Furthermore, most of the participants came from
nontechnical business-related areas such as Finance and Accounting. These de-
mographic facts are important, as the results might change if the participant demographics change. For example, we already know that technical students
might be using AI tools differently than non-technical users (Zamfirescu-Pereira
et al., 2023). In addition, our inclusion criteria involved students who were
at least familiar with AI tools. We acknowledge that the findings synthesized
below might not apply in the same way to a broader population of university
students who have never tried an AI tool. Therefore, the rest of the discussion is
in the context of our participants’ background. Tables 2 and 3 show a detailed
view of the demographics data.

4.2 Online Tool Usage


We saw from Fig. 1 that participants on average use 3.64 tools to learn online,
with some participants using up to 7 tools. This shows that as much as Chat-
GPT has the promise of serving education needs, it may not be serving all of
them as students use more than just ChatGPT. Further, we still see from Fig. 2
and 3 that Google and the Search Engine category outdo the use of AI tools like ChatGPT. Considering that this result was seen in a sample that, by inclusion criteria, had to have some AI exposure, we anticipate that wider samples would show an even higher usage of search engines relative to AI tools. Although the inclination
of previous work showing that AI has entered the student learning toolkit may
be correct, it is not the case that AI has completely replaced other online tools.
This finding potentially alleviates some concerns (Kiryakova & Angelova, 2023)
presented that learners will unquestioningly accept the output of ChatGPT, as
participants tend to use multiple tools. In addition, while previous transfor-
mations in education with Google and Wikipedia (Duha, 2023) have prompted
pivots from existing systems such as libraries, we see that ChatGPT has failed
to fully dethrone the search engine as the most used tool to learn, and Wikipedia, YouTube, and other information sources still remain popular.
Closely following the Search and AI tool categories, we see students using
multimedia tools such as YouTube to learn online. This is not far from prior
research that showed YouTube as a favorable learning tool (Maziriri et al., 2020).
It is possible that even in the age where AI can summarize content from various
sources and even YouTube videos, there is still a desire to learn via YouTube
directly, possibly hinting towards the preference of non-textual modalities.
Although participants mainly used search, AI and multimedia learning tools
to learn online, we saw in Table 5 that for each type of tool, participants mainly use a tool from only one brand. For example, for Multimedia Learning, there was no tool other than YouTube that came up in the survey. This is not due to a lack of alternatives, as YouTube faces competition from platforms such as DailyMotion,
Vimeo, etc. For search and AI, the average was slightly higher at 1.09 and 1.03
tools per person. The results indicate that the students are loyal to one tool
per category.

4.3 AI Tool Usage
4.3.1 Popularity and Frequency of Use
For AI tools specifically, ChatGPT is more known than any other tool, with all
participants mentioning that they have used this tool in the past. In general,
the participants indicated mainly familiarity with multi-purpose large-language
models. However, some participants also indicated familiarity with AI tools
designed specifically for narrower use cases, e.g., writing assistants and presentation assistance. An exhaustive list of the AI tools indicated as used by the participants is found in Table 6.
(Sanasintani, 2023)’s survey found that higher education program students
are not only familiar with AI concepts but prefer AI over traditional educational methods and believe that including it in the curriculum is important, as it can im-
prove the quality of learning in higher education. However, in the results section
we reported that only 13 of 68 students selected an option other than ChatGPT.
Out of the 13, only 4 selected more than one additional option, with one person writing in 4 options in addition to ChatGPT. We notice that familiarity with tools beyond the most popular ChatGPT rests within only a small subset of the sample. This indicates that AI familiarity and education are not equally distributed and that familiarity with more than the most popular tool exists only within a very small sliver of the sample.
Further, we also note that while many tools were listed under “tools used in
the past” (see Table 6), far fewer AI tools, and only multipurpose LLMs, were listed when asked about current tool usage (see Fig. 2). This might indicate that narrow-use-case AI tools either did not meet students' needs or that the multipurpose LLMs did a good enough job at such a wide range of tasks that narrow-use-case tools were not necessary. Especially in light of each tool having a price associated with long-term, high-volume use, we see a churn in our sample from narrow-use-case
AI tools to ChatGPT.
When using AI tools specifically, in contrast to (Singh et al., 2023) who
noted in their survey that students did not use ChatGPT for academic purposes frequently, we saw most of the participants use it daily or 2-3 times a week. We believe that the popularity and quality of the tool have increased within the student community since previous research was published, leading to more adoption.

4.3.2 AI tool use cases


Writing Similarly to (Chauke et al., 2024; Črček & Patekar, 2023; Cromp-
ton & Burke, 2024), we see that students use AI tools to generate ideas, para-
phrase, summarize, and proofread text, among other writing activities. This
isn’t surprising as writing is perhaps the most common artifact that students
submit and get graded on.
We caution that when students use AI tools for writing, even in early pro-
cesses such as brainstorming or finding an idea, they might be nudged into one or
another direction (Jakesch et al., 2023), which can hamper their ability to think
creatively. However, AI can eliminate the writer’s block problem by creating
an outline and helping students brainstorm interactively. In addition, students
who do not speak English as their first language might find a great help from
tools like ChatGPT (Yan, 2023). The positive impact on language acquisition
has also been seen in our data, where students mentioned that ChatGPT helped
expand their vocabulary and helped with the tone of their writing.
In addition to regular writing, we note that students use AI tools for writing
software code and even executing the code to perform data analysis.
Learning Many participants mentioned ChatGPT’s ability to distill infor-
mation into coherent texts. They mention that this ability can break down a
topic so it can be explained at an easier level than how it was first introduced to
the students. Some even say that ChatGPT can act as their tutor. However, we
note that in the specific case of a math tutor, (Bastani et al., 2024) found that
ChatGPT can actually harm education by acting as a “crutch”. They found
that while ChatGPT can make tasks easier for humans, when its access was
taken away from students, students performed worse than students who never
had access to ChatGPT. (Crawford et al., 2024) also warn against substituting
AI for humans (e.g., in the case of learning complex topics, brainstorming, etc.)
as it can be detrimental to the student from a social perspective.
Other Our open-ended responses revealed students are using ChatGPT for
various personal, academic, and professional purposes. Given the holistic nature
of university life, students reported using ChatGPT for non-academic issues like
mental health, relationship advice, date ideas, and entertainment. This highlights
the diverse range of university activities where ChatGPT is being utilized.

4.3.3 AI Expertise
How students gained expertise Our finding that most people gained
expertise just by using ChatGPT points towards three things. One, the technology itself and the user experience are simple enough to use; this is similar to (Romero-Rodríguez et al., 2023), who found that the user experience was the
fundamental determinant of ChatGPT acceptance and that ChatGPT has an
easy-to-use interface (Turmuzi et al., 2024). Two, it goes against what some
professors have thought, namely that the technology must be studied before being used
(Kiryakova & Angelova, 2023). Three, several course offerings from universities,
MOOCs, etc. in prompt engineering or LLMs might not resonate with this demographic of students, who think they can gain expertise by simply using the tools, unless a clear additional value can be demonstrated. In addition to usage, we also noted that a few students learned how to use ChatGPT through research or classes at universities. A few students also mentioned that they learned on social media.
We note that the difficulty in using ChatGPT is not the graphical user in-
terface itself but rather crafting the right set of words and providing enough
context to get the appropriate response, especially for advanced use cases. Fur-
thermore, since ChatGPT can generate incorrect information, one must also
have sufficient expertise in the domain to identify responses that may be wrong.
Domain expertise remains crucial as ChatGPT does not provide sources and
students using the tool must be able to identify good and bad information by
themselves or with the help of other tools or experts. In (Ngo, 2023)’s
study, students were aware of this issue; however, since not many mentioned
the need for domain expertise in our study, we show a possible deviation from
the finding.
Metaphors for Learning
Metaphors and anthropomorphizations are often used to conceptualize new
technology (Hurtienne & Blessing, 2007), especially one such as ChatGPT that
is sufficiently complex in its working and can perform a variety of tasks at scale
with non-trivial quality. We found two metaphors without specifically asking
the students to identify them. They were mentioned in the context of expertise
while using AI. In both metaphors, participants used anthropomorphizations
and called ChatGPT a ‘Child’ and a ‘Teacher’ that needs to be taught. In both
cases, the participants indicated that in their relationship with AI, the human acts as the guide rather than the other way around.

5 Assumptions and Limitations


The survey was administered only to people who indicated that they use AI
tools. Most of our participants came from a business-studies background and
had a good GPA. Our findings are limited to the sample described here. In
addition, many of the specific tools mentioned by participants were written in
by participants. Therefore, there could be an undercount from others who also used the same tools but did not remember to write them down. Listing all the tools as options would have alleviated this problem. However, the list of AI tools
on the market is vast and is rapidly growing; it would be impractical to list all
of them. Other standard limitations of a survey such as not being able to ask
follow-up questions, survey fatigue, etc. may also affect results.

6 Statements and Declarations


The study presented in this paper was approved by the Institutional Review
Board (IRB) of Bentley University. To avoid any conflict of interest and to
obtain truthful responses, the survey tool automatically excluded any student
who indicated that they were currently enrolled or were anticipating enrolling
in classes conducted by the PI. Participants were compensated with a USD 5
gift card for their completed and valid responses.
The authors used AI (scite.ai) and search (Google Scholar) to find relevant
papers. They used Writefull to edit the grammar of the paper. No part of the
paper was written by AI. AI did not play a role in any analysis or synthesis of
data (Quantitative or Qualitative).
Data related to this survey may be made available upon reasonable requests
and the approval of the IRB.

7 Conclusion
Our research reveals that while AI tools like ChatGPT are rapidly gaining pop-
ularity among university students, they have not replaced traditional online
learning tools such as search engines and multimedia platforms. On average,
students use 3.64 tools to learn online, with search and AI tools like Google and
ChatGPT being the top-2 tools. Students are actively integrating AI into their
workflows, particularly for tasks such as writing and learning complex topics.
In addition, some are also using it to meet their entertainment, mental health,
social and other personal needs on campus. We have documented various use
cases and listed technology tools, including several AI tools, that are now a part
of students’ learning toolkit.
The concept of expertise in AI remains multifaceted, with students attribut-
ing it to both technical knowledge and practical experience, and most students
saying that they gained expertise just by using the tool rather than through any formal
introduction or education. In addition, we find that knowledge about AI tools
beyond the most popular ChatGPT may be concentrated only within a few
students.

8 Acknowledgements and Contributor Roles


The authors thank the Faculty Affairs Committee and the Valente Center at
Bentley University for providing the grants needed to conduct this research.
Rahul Divekar - PI. Conceptualization, Literature Review, Data Analysis,
Synthesis, Writing, Visualizations, Methodology, Project administration, Fund-
ing acquisition
Lisette Gonzalez - Student. Piloting survey, proofreading and editing survey
instruments, administering survey
Sophia Guerra - Student. Piloting survey, proofreading and editing survey
instruments
Natasha Boos - Student. Piloting survey, proofreading and editing survey
instruments

References
Albadarin, Y., Saqr, M., Pope, N., & Tukiainen, M. (2024). A systematic liter-
ature review of empirical research on ChatGPT in education. Discover
Education, 3 (1). https://doi.org/10.1007/s44217-024-00138-2
Almaraz-López, C., Almaraz-Menéndez, F., & López-Esteban, C. (2023). Com-
parative Study of the Attitudes and Perceptions of University Students
in Business Administration and Management and in Education toward
Artificial Intelligence. Education Sciences, 13 (6). https://doi.org/10.
3390/educsci13060609

Al-Zahrani, A. M., & Alasmari, T. M. (2024). Exploring the impact of artificial
intelligence on higher education: The dynamics of ethical, social, and
educational implications. Humanities and Social Sciences Communica-
tions, 11 (1), 1–12.
Bastani, H., Bastani, O., Sungu, A., Ge, H., Kabakcı, Ö., & Mariman, R. (2024).
Generative ai can harm learning. Available at SSRN 4895486.
Brindley, L. (2006). Re-defining the library. Library hi tech, 24 (4), 484–495.
Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J. D., Dhariwal, P., Nee-
lakantan, A., Shyam, P., Sastry, G., Askell, A., et al. (2020). Language
models are few-shot learners. Advances in neural information processing
systems, 33, 1877–1901.
Chan, C. K. Y., & Hu, W. (2023). Students’ voices on generative AI: perceptions,
benefits, and challenges in higher education. International Journal of
Educational Technology in Higher Education, 20 (1). https://doi.org/
10.1186/s41239-023-00411-8
Chauke, T. A., Mkhize, T. R., Methi, L., & Dlamini, N. (2024). Postgraduate
Students’ Perceptions on the Benefits Associated with Artificial Intel-
ligence Tools for Academic Success: The Use of the ChatGPT AI Tool.
Journal of Curriculum Studies Research, 6 (1), 44–59. https://doi.org/
10.46303/jcsr.2024.4
Crawford, J., Allen, K. A., Pani, B., & Cowling, M. (2024). When artificial in-
telligence substitutes humans in higher education: the cost of loneliness,
student success, and retention. Studies in Higher Education, 49 (5), 883–
897. https://doi.org/10.1080/03075079.2024.2326956
Črček, N., & Patekar, J. (2023). Writing with AI: University Students’ Use of
ChatGPT. Journal of Language and Education, 9 (4), 128–138. https:
//doi.org/10.17323/jle.2023.17379
Crompton, H., & Burke, D. (2023). Artificial intelligence in higher education:
the state of the field. International Journal of Educational Technology in
Higher Education, 20 (1). https://doi.org/10.1186/s41239-023-00392-8
Crompton, H., & Burke, D. (2024). The Educational Affordances and Challenges
of ChatGPT: State of the Field. TechTrends, 68 (2), 380–392. https :
//doi.org/10.1007/s11528-024-00939-0
Divekar, R. R., Drozdal, J., Chabot, S., Zhou, Y., Su, H., Chen, Y., Zhu, H.,
Hendler, J. A., & Braasch, J. (2022). Foreign language acquisition via
artificial intelligence and extended reality: Design and evaluation. Com-
puter Assisted Language Learning, 35 (9), 2332–2360.
Duha, M. S. U. (2023). ChatGPT in Education: An Opportunity or a Challenge
for the Future? TechTrends, 67 (3), 402–403. https://doi.org/10.1007/
s11528-023-00844-y
Firat, M. (2023). What ChatGPT means for universities: Perceptions of scholars
and students. Journal of Applied Learning and Teaching, 6 (1), 57–63.
https://doi.org/10.37074/jalt.2023.6.1.22
Ghosh, D., & Beigman Klebanov, B. (2024, July). Automatic evaluation of ar-
gumentative writing by young students [US Patent 12,046,155].

Hurtienne, J., & Blessing, L. (2007). Metaphors as tools for intuitive interaction
with technology. Metaphorik. de, 12 (2), 21–52.
Islam, I., & Islam, M. N. (2024). Exploring the opportunities and challenges of
ChatGPT in academia. Discover Education, 3 (1). https://doi.org/10.
1007/s44217-024-00114-w
Jakesch, M., Bhat, A., Buschek, D., Zalmanson, L., & Naaman, M. (2023).
Co-writing with opinionated language models affects users’ views. Pro-
ceedings of the 2023 CHI conference on human factors in computing
systems, 1–15.
Kiryakova, G., & Angelova, N. (2023). ChatGPT—A Challenging Tool for the
University Professors in Their Teaching Practice [ChatGPT: Una Her-
ramienta Desafiante para los Profesores Universitarios en su Práctica
Docente]. Education Sciences, 13 (1056), 1–19. https://doi.org/10.3390/
educsci13101056
Leswing, K. (2023, April). Chatgpt and generative ai are booming, but the costs
can be extraordinary. https://www.cnbc.com/2023/03/13/chatgpt-
and-generative-ai-are-booming-but-at-a-very-expensive-price.html
Mahapatra, S. (2024). Impact of ChatGPT on ESL students’ academic writing
skills: a mixed methods intervention study. Smart Learning Environ-
ments, 11 (1). https://doi.org/10.1186/s40561-024-00295-9
Mamo, Y., Crompton, H., Burke, D., & Nickel, C. (2024). Higher education
faculty perceptions of chatgpt and the influencing factors: A sentiment
analysis of x. TechTrends, 68 (3), 520–534.
Maziriri, E. T., Gapa, P., & Chuchu, T. (2020). Student perceptions towards
the use of youtube as an educational tool for learning and tutorials.
International Journal of Instruction, 13 (2), 119–138.
Melina, G., Panton, A. J., Pizzinelli, C., Rockall, E., & Tavares, M. M. (2024).
Gen-ai: Artificial intelligence and the future of work.
Mousavinasab, E., Zarifsanaiey, N., R. Niakan Kalhori, S., Rakhshan, M., Keikha,
L., & Ghazi Saeedi, M. (2021). Intelligent tutoring systems: A system-
atic review of characteristics, applications, and evaluation methods. In-
teractive Learning Environments, 29 (1), 142–163.
Ngo, T. T. A. (2023). The perception by university students of the use of chat-
gpt in education. International Journal of Emerging Technologies in
Learning (Online), 18 (17), 4.
Radford, A., Narasimhan, K., Salimans, T., Sutskever, I., et al. (2018). Improv-
ing language understanding by generative pre-training.
Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., Sutskever, I., et al.
(2019). Language models are unsupervised multitask learners. OpenAI
blog, 1 (8), 9.
Rawte, V., Chakraborty, S., Pathak, A., Sarkar, A., Tonmoy, S., Chadha, A.,
Sheth, A. P., & Das, A. (2023). The troubling emergence of halluci-
nation in large language models–an extensive definition, quantification,
and prescriptive remediations. arXiv preprint arXiv:2310.04988.
Romero-Rodríguez, J. M., Ramírez-Montoya, M. S., Buenestado-Fernández, M.,
& Lara-Lara, F. (2023). Use of ChatGPT at University as a Tool for
Complex Thinking: Students’ Perceived Usefulness. Journal of New Ap-
proaches in Educational Research, 12 (2), 323–339. https://doi.org/10.
7821/naer.2023.7.1458
Sallam, M., Elsayed, W., Al-Shorbagy, M., Barakat, M., Khatib, S. E., Ghach,
W., Alwan, N., Hallit, S., & Malaeb, D. (2024). ChatGPT Usage and
Attitudes are Driven by Perceptions of Usefulness, Ease of Use, Risks,
and Psycho-Social Impact: A Study among University Students in the
UAE, 1–17. https://www.researchsquare.com/article/rs-3905717/latest
Sanasintani, S. (2023). Revitalizing The Higher Education Curriculum Through
An Artificial Intelligence Approach: An Overview. Journal of Social
Science Utilizing Technology, 1 (4), 239–248. https://doi.org/10.55849/
jssut.v1i4.670
Singh, H., Tayarani-Najaran, M.-H., & Yaqoob, M. (2023). Exploring computer
science students’ perception of chatgpt in higher education: A descrip-
tive and correlation study. Education Sciences, 13 (9), 924.
Toosi, A., Bottino, A. G., Saboury, B., Siegel, E., & Rahmim, A. (2021). A brief
history of ai: How to prevent another winter (a critical review). PET
clinics, 16 (4), 449–469.
Turmuzi, M., Suharta, I. G. P., Astawa, I. W. P., & Suparta, I. N. (2024).
Perceptions of primary school teacher education students to the use of
chatgpt to support learning in the digital era. International Journal of
Information and Education Technology, 14 (5).
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N.,
Kaiser, L., & Polosukhin, I. (2017). Attention is all you need. Advances
in neural information processing systems, 30.
Yan, D. (2023). Impact of chatgpt on learners in a l2 writing practicum: An ex-
ploratory investigation. Education and Information Technologies, 28 (11),
13943–13967.
Zamfirescu-Pereira, J., Wong, R. Y., Hartmann, B., & Yang, Q. (2023). Why
johnny can’t prompt: How non-ai experts try (and fail) to design llm
prompts. Proceedings of the 2023 CHI Conference on Human Factors
in Computing Systems, 1–21.
