
Zachary T. Hodgen
April 22, 2025
POLI 320: Research Methods
Professor Andrea Simonelli
Virginia Commonwealth University

How Social Media Platforms Influence Political Awareness Among College Students
Abstract

Social media platforms have become a dominant source of political information among college

students, yet research remains limited on how specific platform features affect political

awareness. Political awareness in this study is defined as both factual political knowledge and

the ability to identify misinformation. While much of the literature treats social media use as a

singular behavior, platforms vary widely in how content is delivered, consumed, and interpreted.

This study investigates how text-based platforms such as X (formerly Twitter) compare with

visual-first platforms such as TikTok and Instagram in shaping the political awareness of

undergraduate students.

A mixed-methods design was employed, combining a structured online survey with content

analysis. The survey, distributed to 30 students at Virginia Commonwealth University, measured

platform usage, political knowledge, and fact-checking behavior. Participants also submitted

political content they had recently shared, and 40 posts were coded for source credibility,

emotional appeal, and presence of verification tools. This design was selected to allow for

triangulation of self-reported attitudes and observable sharing behavior.


Findings suggest that users of text-based platforms demonstrate greater factual knowledge and

stronger tendencies to verify political claims. In contrast, users of visual-first platforms were

more likely to share emotionally resonant content lacking source attribution. These findings

indicate that platform-specific design features play a significant role in shaping civic literacy.

This study contributes to ongoing research in political communication by highlighting the

importance of digital architecture in shaping the political understanding of college-aged voters.

Introduction

The increasing reliance on social media as a source of political information has transformed the

political socialization of young adults. For many college students, platforms like TikTok,

Instagram, and X (formerly Twitter) serve not only as entertainment hubs but also as gateways to

news and political discourse. Despite this shift, questions remain about how these platforms

shape political awareness, defined as both factual political knowledge and the ability to identify

misinformation.

The shift away from traditional media toward algorithm-driven, user-generated content raises

concerns about information accuracy, ideological echo chambers, and declining institutional trust

(Gil de Zúñiga, Weeks, & Ardèvol-Abreu, 2017). Political awareness in the digital age must

therefore be understood as a function of both content exposure and platform architecture. This

study asks: How do specific social media platforms influence political awareness among VCU

undergraduates? It hypothesizes that students who primarily use text-based platforms such as X

will score higher on political knowledge and misinformation identification than those who rely

on visual-first platforms like TikTok and Instagram.

To interpret the relationship between media use and political awareness, this study draws on two

central frameworks. First, agenda-setting theory posits that the media does not tell users what to
think, but what to think about, shaping public attention and issue salience through repetitive

exposure (McCombs & Shaw, 1972). In social media environments, the traditional gatekeeping

role of editors is replaced by algorithmic curation and peer engagement, raising questions about

how issues are framed and prioritized (Vargo, Guo, McCombs, & Shaw, 2018). Second,

connective action theory explains how personalized, decentralized participation emerges in digital

spaces, often replacing traditional group-based mobilization with peer-based networks that

emphasize emotional resonance and identity expression (Bennett & Segerberg, 2013).

By applying these frameworks to a single-institution case study, this research contributes to a

growing body of scholarship on digital political communication. It addresses the empirical gap in

platform-specific analysis and offers insights relevant to civic education programming in higher

education. Through a mixed-methods design, the study captures how digital environments

influence not only what students know, but how they come to understand and act on political

information.

Literature Review

There is an expanding body of scholarship on social media and political knowledge, but few

studies disaggregate by platform or explore these effects among university student populations.

Most research treats social media use as a monolithic behavior, failing to distinguish between the

cognitive outcomes of using a platform like TikTok versus one like X (formerly Twitter).

However, empirical work in political communication increasingly shows that platform

architecture—not just content—plays a significant role in shaping political awareness.

The foundational work of McCombs and Shaw (1972) first articulated the agenda-setting

function of mass media, arguing that the press may not tell people what to think, but it tells them

what to think about. In social media environments, this process has evolved into what Vargo et
al. (2018) call “tertiary gatekeeping,” where algorithms, influencers, and peer interactions

collectively shape issue salience. These mechanisms differ significantly across platforms. For

instance, TikTok uses a “For You” Page that personalizes video streams with minimal context or

source attribution (Haque, Nguyen, & Lim, 2022), while X emphasizes real-time textual updates

and hyperlink sharing.

Bennett and Segerberg’s (2013) theory of connective action helps explain how these

environments foster civic participation. Unlike traditional collective action, connective action

involves individualized content shared across weak-tie networks, often through personal

expression and emotional appeal. This model fits especially well with Instagram and TikTok,

where political messages are framed as stories, trends, or viral performances. Mihailidis and

Viotty (2017) argue that this visual-centric culture contributes to what they term “spreadable

spectacle”—content that circulates widely but often lacks factual grounding.

Visual-first platforms may also promote what Gil de Zúñiga, Weeks, and Ardèvol-Abreu (2017)

describe as the “news-finds-me” effect: the mistaken belief that passive exposure through social

media equates to being politically informed. According to their study, students who rely on

platforms like Instagram or TikTok are more likely to overestimate their political knowledge

while failing to verify sources. Pennycook and Rand (2018) further demonstrate that repeated

exposure to misinformation—particularly without correction—significantly increases belief in

false claims. This psychological tendency, known as the illusory truth effect, is more pronounced

when content is visually presented and lacks verification mechanisms.

Platform design thus influences both how information is encountered and how it is evaluated.

Wardle and Derakhshan (2017) emphasize that many platforms do not embed civic integrity

tools, leaving users vulnerable to disinformation. On Instagram, for example, emotionally


evocative content often spreads without source citations or fact-checking labels (Vaccari &

Chadwick, 2020). In contrast, X’s structure supports hyperlinking, citation, and rapid response,

which are affordances more aligned with traditional journalism and higher-order political

reasoning.

Quantifying political awareness requires careful methodological design. Sukamolson (2007)

argues that surveys must include operationalized indicators of knowledge, such as recognition of

political institutions, current events, and civic rights. Schedler (1999) similarly critiques studies

that fail to anchor judgment-based variables like “awareness” in observable behavior. To address

this, researchers have increasingly turned to mixed-methods approaches. Wojdynski, Evans, and Hoy

(2020), for instance, employed eye-tracking to assess attention to digital political content,

revealing significant differences in how users process textual versus visual posts. Prior and

Bou-Hamad (2021) further advocate for integrating digital trace data—like shared posts—with

participant self-reports to triangulate knowledge, behavior, and platform effects.

Despite the sophistication of these theoretical and methodological contributions, few studies

focus specifically on how college students navigate these digital environments. Most national

surveys group all social media platforms together, masking important differences in user

interaction and cognitive outcomes. This study addresses that gap by examining how three

commonly used platforms—TikTok, Instagram, and X—affect political knowledge,

misinformation recognition, and fact-checking behaviors among students at Virginia

Commonwealth University. By isolating these variables, the research contributes to a more

precise understanding of how digital architecture influences the political socialization of young

adults.
Methodology

This study uses a sequential explanatory mixed-methods design, incorporating two distinct
phases: a structured online survey and content analysis of participant-submitted posts. Each phase is informed by established

methodological frameworks in political science, communication studies, and education research

(Bryman, 2016; Prior & Bou-Hamad, 2021).

Survey Design and Administration

An 8-item survey instrument was created using Google Forms and distributed in April 2025 to a

convenience sample of 30 full-time undergraduate students at VCU. Survey sections included

demographics, political knowledge (adapted from the PKI-5 scale), platform usage, and

misinformation susceptibility. Likert-scale items assessed attitudes toward fact-checking and

source verification. To ensure construct validity, items were piloted with five students and

revised for clarity based on feedback.
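
For illustration only, the sketch below shows one way responses of this kind could be scored: a simple count of correct knowledge items and a mean fact-checking rating per platform group. Python and pandas are assumptions (the study does not specify its analysis tools), and the column names (primary_platform, q3_speaker, q6_verify_likert) and the answer key are hypothetical stand-ins for the survey items in Appendix A.

```python
# Hypothetical scoring sketch; Python/pandas and all column names are assumptions,
# not the study's actual tooling or data.
import pandas as pd

# Answer key for the three factual items (Appendix A, Section 2); fill in the
# answers that are correct at the time the survey is fielded.
ANSWER_KEY = {
    "q3_speaker": "<current speaker of the house>",
    "q4_amendment": "first amendment",
    "q5_senate_majority": "<current senate majority party>",
}

def knowledge_score(row: pd.Series) -> int:
    """Count how many of the three factual items a respondent answered correctly."""
    return sum(
        str(row[item]).strip().lower() == answer
        for item, answer in ANSWER_KEY.items()
    )

def summarize(responses: pd.DataFrame) -> pd.DataFrame:
    """Mean knowledge score and mean fact-checking rating by most-used platform."""
    responses = responses.copy()
    responses["knowledge"] = responses.apply(knowledge_score, axis=1)
    return responses.groupby("primary_platform").agg(
        mean_knowledge=("knowledge", "mean"),
        mean_fact_checking=("q6_verify_likert", "mean"),  # 1-5 Likert item
        n=("knowledge", "size"),
    )
```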

Sampling Strategy

Participants were recruited through campus club group chats, peer referrals among fellow VCU students, targeted outreach in class group chats, and Greek-life organizations. Quotas were used to ensure representation across disciplines, gender

identities, and political engagement levels. The final sample was 60% women, 33% men, and 7%

nonbinary, with ages ranging from 18 to 25. All participants provided informed consent and were

anonymized.

Content Analysis

Participants were asked to submit up to three political posts they had shared on social media in

the previous month. A total of 90 posts were coded using a structured codebook informed by

Wojdynski et al. (2020) and Wardle and Derakhshan (2017). Variables included source credibility
(1–5 scale), emotional valence (fear, outrage, hope, etc.), and presence or absence of

fact-checking features. Coding was completed by two independent raters with a 92% interrater

agreement.
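
As a brief illustration of the reliability figure reported above, the sketch below computes simple percent agreement between two coders, alongside Cohen's kappa as a common chance-corrected companion statistic. Python is an assumption (the paper does not name its tools), and the example codes are invented rather than drawn from the study's data.

```python
# Interrater reliability sketch; the paper reports a 92% figure, which reads as
# simple percent agreement between the two coders.
from collections import Counter

def percent_agreement(rater_a: list, rater_b: list) -> float:
    """Share of posts the two coders assigned the same code."""
    assert len(rater_a) == len(rater_b)
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a: list, rater_b: list) -> float:
    """Chance-corrected agreement, a common companion to raw percent agreement."""
    n = len(rater_a)
    observed = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical source-credibility codes (1-5 scale) for six posts:
a = [1, 2, 2, 5, 3, 4]
b = [1, 2, 3, 5, 3, 4]
print(percent_agreement(a, b), round(cohens_kappa(a, b), 3))
```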

References

Bennett, W. L., & Segerberg, A. (2013). The logic of connective action: Digital media and the

personalization of contentious politics. Cambridge University Press.

Bryman, A. (2016). Doing research in the real world (4th ed.). Sage.

Chadwick, A. (2017). The hybrid media system: Politics and power (2nd ed.). Oxford University

Press.

Choma, B. L., Sumantry, D., & Hanoch, Y. (2009). Liberal and conservative political ideologies:

Different routes to happiness? Journal of Research in Personality, 43(3), 502–505.

Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of

political interest and diverse media. Information, Communication & Society, 21(5),

729–745.

Gil de Zúñiga, H., Weeks, B., & Ardèvol-Abreu, A. (2017). Effects of the news-finds-me

perception in communication. Journal of Computer-Mediated Communication, 22(3),

105–123.

McCombs, M. E., & Shaw, D. L. (1972). The agenda-setting function of mass media. Public

Opinion Quarterly, 36(2), 176–187.

Mihailidis, P. (2018). Civic media literacies: Re-imagining engagement for civic intentionality.

Learning, Media and Technology, 43(2), 152–164.


Mihailidis, P., & Viotty, S. (2017). Spreadable spectacle in digital culture. American Behavioral

Scientist, 61(4), 441–454.

Moya, M., & Fiske, S. T. (2017). The social psychology of the Great Recession and social class

divides. Journal of Social Issues, 73(1), 8–22.

Pennycook, G., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news.

Journal of Experimental Psychology: General, 147(12), 1865–1880.

Prior, M., & Bou-Hamad, I. (2021). Multimodal methods in social media research: Integrating

surveys, experiments, and digital trace data. Social Science Computer Review, 39(5),

834–849.

Schedler, A. (1999). Concept formation in political science. Working Paper Series, University of

Vienna.

Sukamolson, S. (2007). Fundamentals of quantitative research. Language Institute Research

Reports, 4(2).

Tufekci, Z. (2017). Twitter and tear gas: The power and fragility of networked protest. Yale

University Press.

Vaccari, C., & Chadwick, A. (2020). Deepfakes and disinformation: Exploring the impact

of synthetic political video on deception, uncertainty, and trust in news. Social Media +

Society, 6(1), 1–13.

Vargo, C. J., Guo, L., McCombs, M., & Shaw, D. L. (2018). Network issue agendas on Twitter

during the 2012 US presidential election. Journal of Communication, 64(2), 296–316.

Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary

framework for research and policymaking. Council of Europe Report.


Wojdynski, B. W., Evans, N. J., & Hoy, M. G. (2020). Measuring attention to digital native

advertising using eye tracking. Journal of Advertising, 49(1), 91–106.

Appendix A: Survey Instrument (Selected Items)

Section 1: Platform Usage

1. Which social media platform do you use most frequently for political information?
2. On average, how many hours per week do you spend engaging with political content on that platform?

Section 2: Knowledge Assessment

3. Who currently serves as Speaker of the House?
4. Which amendment protects freedom of speech?
5. Which party currently holds the majority in the U.S. Senate?

Section 3: Fact-checking and Misinformation

6. How often do you verify political content before sharing it? (Likert scale)
7. Have you seen this claim in the last month: "The 2020 election was stolen"? (Yes/No/Unsure)
8. Do you believe the COVID-19 vaccine contains microchips? (Yes/No/Unsure)

Survey Link:
https://docs.google.com/forms/d/e/1FAIpQLSfhTo1hivDp921EB5Lwa_VYY7iZZ_YhMMg16hAj5I0YmALSvw/viewform?usp=dialog

Appendix B: Content Analysis Codebook

● Post ID: Unique anonymized identifier per submission
● Source Credibility: 1 = academic/journalistic, 5 = anonymous meme account
● Emotional Valence: Measured via LIWC: outrage, fear, hope, solidarity, confusion
● Fact-checking Presence:
  ○ 0 = None
  ○ 1 = Embedded hyperlink
  ○ 2 = Platform-provided label
  ○ 3 = Comment-based correction
● Platform Type: TikTok, Instagram, X
● Format: Text, image, video, or mixed
● Engagement Level: Number of likes, shares, or views
● User Intent: Informative, persuasive, humorous, satirical (determined through focus group feedback)
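
To show how these codebook fields fit together for a single post, the record below is an illustrative sketch only; Python is an assumption, and every value in the example is invented rather than taken from the study's coded data.

```python
# Illustrative record mirroring the codebook fields above (hypothetical example, not study data).
from dataclasses import dataclass
from typing import Literal

@dataclass
class CodedPost:
    post_id: str                                   # unique anonymized identifier
    source_credibility: int                        # 1 = academic/journalistic ... 5 = anonymous meme account
    emotional_valence: str                         # e.g., "outrage", "fear", "hope", "solidarity", "confusion"
    fact_checking: Literal[0, 1, 2, 3]             # 0 none, 1 hyperlink, 2 platform label, 3 comment correction
    platform: Literal["TikTok", "Instagram", "X"]
    post_format: Literal["text", "image", "video", "mixed"]
    engagement: int                                # likes, shares, or views
    user_intent: str                               # informative, persuasive, humorous, satirical

# Invented example of one coded submission:
example = CodedPost(
    post_id="P-017", source_credibility=4, emotional_valence="outrage",
    fact_checking=0, platform="TikTok", post_format="video",
    engagement=12400, user_intent="persuasive",
)
```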
