Fond Du Lac Band of Lake Superior Chippewa Lawsuit Against Social Media Companies

Minnesota's Fond du Lac Band of Lake Superior Chippewa has joined a lawsuit with other tribal nations alleging social media companies have exacerbated the mental health crisis engulfing Native American teenagers.

Robins Kaplan LLP
Daniel L. Allender (SBN 264651)
2121 Avenue of the Stars, Suite 2800
Los Angeles, CA 90067
Telephone: (310) 229-5414

Timothy Q. Purdon (pro hac vice)
1207 West Divide Avenue, Suite 200
Bismarck, ND 58503
Telephone: (701) 255-3000

Tara D. Sutton (pro hac vice)
800 LaSalle Avenue, Suite 2800
Minneapolis, MN 55402
Telephone: (612) 349-8500

Eric M. Lindenfeld (pro hac vice)
1325 Avenue of the Americas, Suite 2601
New York, NY 10019
Telephone: (212) 980-7400

Frazer PLC
T. Roe Frazer II (pro hac vice)
30 Burton Hills Boulevard, Suite 450
Nashville, TN 37215
Telephone: (503) 242-1072

Attorneys for Plaintiff

SUPERIOR COURT OF THE STATE OF CALIFORNIA
COUNTY OF LOS ANGELES
(UNLIMITED JURISDICTION)

FOND DU LAC BAND OF LAKE SUPERIOR CHIPPEWA,
            Plaintiff,
      v.
META PLATFORMS, INC., META PAYMENTS, INC., SICULUS, INC., FACEBOOK OPERATIONS, LLC, FACEBOOK HOLDINGS, LLC, INSTAGRAM, LLC, SNAP INC., TIKTOK INC., BYTEDANCE INC., GOOGLE LLC, ALPHABET INC., and YOUTUBE, LLC,
            Defendants.

Case No.

COMPLAINT FOR DAMAGES AND DEMAND FOR JURY TRIAL


TABLE OF CONTENTS

I. INTRODUCTION
II. THE PARTIES
   A. PLAINTIFF
   B. DEFENDANTS
      1. Meta
      2. Snap
      3. ByteDance
      4. Google
III. JURISDICTION AND VENUE
IV. FACTUAL ALLEGATIONS
   A. DEFENDANTS’ APPS HAVE CREATED A YOUTH MENTAL HEALTH CRISIS.
   B. DEFENDANTS TARGET CHILDREN AS A CORE MARKET, HOOKING KIDS ON THEIR ADDICTIVE SOCIAL MEDIA PLATFORMS.
      1. Children are uniquely susceptible to Defendants’ addictive apps.
      2. Defendants design their apps to attract and addict youth.
      3. Millions of kids use Defendants’ products compulsively.
      4. Defendants’ unreasonably dangerous products encourage risky “challenges.”
      5. Defendants’ unreasonably dangerous social media apps facilitate and contribute to the sexual exploitation and sextortion of children, and the ongoing production and spread of child sex abuse material online.
   C. META MARKETS AND DESIGNS FACEBOOK AND INSTAGRAM TO ADDICT YOUNG USERS, SUBSTANTIALLY CONTRIBUTING TO THE MENTAL HEALTH CRISIS.
      1. Background and overview of Meta’s products.
         a. Facebook’s acquisition and control of Instagram.
      2. Meta intentionally encourages youth to use its products and then leverages that usage to increase revenue.
      3. Meta intentionally designed product features to addict children and adolescents.
         a. Facebook’s and Instagram’s algorithms maximize engagement, promoting use at levels and frequency that is harmful to kids.
         b. Facebook’s and Instagram’s user interfaces are designed to create addictive engagement.
         c. Instagram’s unreasonably dangerous product features cause negative appearance comparison and social comparison.
         d. Meta has failed to implement effective age-verification measures to keep children off of Facebook and Instagram.
         e. Facebook’s and Instagram’s parental controls are unreasonably dangerous.
         f. Facebook’s and Instagram’s unreasonably dangerous features include impediments to discontinuing use.
      4. Meta has concealed from users, the public, and Congress the harmful effects that Instagram’s and Facebook’s design have on children.
      5. Meta failed to adequately communicate the dangers and harms caused by Instagram and Facebook, or provide instructions regarding safe use.
   D. SNAP MARKETS AND DESIGNS SNAPCHAT TO ADDICT YOUNG USERS, SUBSTANTIALLY CONTRIBUTING TO THE MENTAL HEALTH CRISIS.
      1. Background and overview of Snapchat.
      2. Snap targets children.
         a. Snap has designed its Snapchat product to grow use by children to drive the company’s revenue.
         b. Snap promotes Snapchat to children.
      3. Snapchat is designed to addict children through psychological manipulation.
         a. Snap designed Snapchat to drive compulsive use through a set of social metrics and other manipulation techniques that induce compulsive use.
         b. Snap’s unreasonably dangerous features are designed to promote compulsive and excessive use.
      4. Snap designed Snapchat with features that harm children directly or expose children to harm.
         a. Disappearing “Snaps” and “My Eyes Only” encourage destructive behavior among Snap’s teen users.
         b. Snapchat’s “Snap Map” feature endangers children.
         c. Snapchat’s “Quick Add” feature endangers children.
         d. Snapchat’s Lenses and Filters features promote negative appearance comparison.
      5. Snap has implemented ineffective and misleading parental controls, further endangering children.
      6. Snap facilitates the spread of CSAM and child exploitation.
      7. Snap failed to adequately communicate the harms its product causes or provide instructions regarding safe use.
   E. BYTEDANCE MARKETS AND DESIGNS ITS TIKTOK TO ADDICT YOUNG USERS, SUBSTANTIALLY CONTRIBUTING TO THE MENTAL HEALTH CRISIS.
      1. Background and overview of TikTok.
      2. ByteDance intentionally encourages youth to use its product and then leverages that use to increase revenue.
      3. ByteDance intentionally designed product features to addict children and adolescents.
         a. TikTok’s age-verification measures are unreasonably dangerous.
         b. TikTok’s parental controls are unreasonably dangerous.
         c. ByteDance intentionally designed TikTok’s unreasonably dangerous features and algorithms to maximize engagement using automatic content, time-limited experiences, intermittent variable rewards, reciprocity, and ephemeral content.
         d. ByteDance’s unreasonably dangerous features include impediments to discontinuing use.
         e. ByteDance’s unreasonably dangerous features inflict impossible image standards and encourage negative appearance comparison.
      4. ByteDance facilitates the spread of CSAM and child exploitation.
      5. ByteDance failed to adequately communicate the harms its product causes or to provide instructions regarding safe use.
   F. GOOGLE MARKETS AND DESIGNS YOUTUBE TO ADDICT YOUNG USERS, SUBSTANTIALLY CONTRIBUTING TO THE MENTAL HEALTH CRISIS.
      1. Background and overview of YouTube.
      2. Google intentionally encourages youth to use YouTube and then leverages that use to increase revenue.
      3. Google intentionally designed product features to addict children and adolescents.
         a. Google’s age-verification measures and parental controls are unreasonably dangerous.
         b. YouTube is unreasonably and dangerously designed to inundate users with features that use intermittent variable rewards and reciprocity.
         c. Google’s algorithms are designed to maximize “watch time.”
         d. YouTube’s unreasonably dangerous features include impediments to discontinuing use.
      4. Google facilitates the spread of CSAM and child exploitation.
      5. Google failed to adequately communicate the harm its products cause or provide instructions regarding safe use.
   G. IMPACT OF THE DEFENDANT-CREATED MENTAL HEALTH CRISIS ON PLAINTIFF FOND DU LAC BAND OF LAKE SUPERIOR CHIPPEWA.
V. PLAINTIFF’S CLAIMS
VI. PRAYER FOR RELIEF
VII. JURY DEMAND
I. INTRODUCTION

1. A youth mental health crisis is devastating Indian Country. In a painful echo of centuries of historical trauma—including forced adoption and compulsory boarding schools—suicide now stands as the second leading cause of death for Native American adolescents.[1] According to the Centers for Disease Control and Prevention, “American Indian/Alaska Native youth and young adults have the highest suicide rates of any racial/ethnic group in the U.S.”[2]

[1] Center for Native American Youth, Aspen Institute, Teen Suicide Prevention.
[2] American Psychiatric Association, Suicide Prevention: Native American Youth (Sept. 9, 2019).

2. The statistics are staggering and tell a horrific story: tribal teen suicide rates are 3.5 to 4 times higher than the national average.[3] CDC data from 2011-2020 similarly identifies suicide among younger tribal members as far exceeding the national average, with the most significant disparities among those aged 15-24.[4]

[3] Center for Native American Youth, Aspen Institute, Teen Suicide Prevention; C. Urbanski, As rates of suicide for Native American youth increase, culture is key to prevention, Stanford, The Clayman Institute for Gender Research (May 2023).
[4] Suicide Prevention Resource Center, American Indians and Alaska Natives.

3. The statistics for females are even more shocking—according to CDC data, female tribal teens commit suicide at a rate over five times higher than their white counterparts:[5]

[5] HHS.gov, Mental and Behavioral Health – American Indians/Alaska Natives (2020).
[Chart omitted.]

4. In addition to suicides, tribal teens disproportionately suffer from mental illness. Tribal youth (11-15) report nearly twice the level of depression and more anxiety than white youth.[6] Disordered eating may also be more prevalent among tribal teens than white teens.[7]

[6] Serafani et al., A Comparison of Early Adolescent Behavioral Health Risks Among Urban American Indians/Alaska Natives and Their Peers, Am Indian Alsk Native Ment Health Res. 2017; 24(2): 1–17.
[7] See, e.g., Mikhail et al., A virtual issue highlighting eating disorders in people of Black/African and Indigenous heritage, J Eat Disord. (2021); Nagata, J.M., Smith-Russack, Z., Paul, A., et al., The social epidemiology of binge-eating disorder and behaviors in early adolescents, J Eat Disord 11, 182 (2023); Striegel-Moore et al., Behavioral symptoms of eating disorders in Native Americans: Results from the add health survey wave III, International Journal of Eating Disorders, 44(6), 561–566 (2011).

5. The mental health crisis among Minnesota tribes, including the Fond Du Lac Band, is especially dire. A February 2024 joint report determined that “The suicide rate among Minnesota’s Native Americans is noticeably higher compared to Native Americans nationwide.”[8] And, according to the Minnesota Department of Health, from 2015-2019, suicide rates among tribal adolescents (49.7) were over 4x higher than among white Minnesota adolescents (11.7).[9]

[8] The Suicide Epidemic in Rural Minnesota: How we got here and how we move forward, Center for Rural Policy and Development (February 2024).
[9] MN Department of Health, Adolescent Suicide (2021).

6. Soaring suicide and mental illness have devastated Tribal communities and have pushed already chronically underfunded mental health programs to the breaking point. They have caused widespread damage to the Band’s already vulnerable cultural fabric and preservation efforts and placed further burden on the limited resources available for other societal concerns, such as education and job creation.
7. This lawsuit follows a growing body of scientific research, including Defendants’ own (previously concealed) studies, drawing a direct line from Defendants’ proliferation of “social media” products—including platforms such as Facebook, Instagram, TikTok, and YouTube—to the youth mental health crisis, including among the Fond Du Lac Band.

8. Over the past decade, Defendants have relentlessly pursued a strategy of growth-at-all-costs, recklessly ignoring the impact of their products on children’s mental and physical health and well-being. In a race to corner the “valuable but untapped” market of tween and teen users, each Defendant designed product features to promote repetitive, uncontrollable use by kids.[10]

[10] Georgia Wells & Jeff Horwitz, Facebook’s Effort to Attract Preteens Goes Beyond Instagram Kids, Documents Show, Wall St. J. (Sept. 28, 2021).

9. Recognizing the power of engaging young users, Defendants deliberately tweaked the design and operation of their apps to exploit the psychology and neurophysiology of kids. Because children’s and adolescents’ brains are not fully developed, they lack the same emotional maturity, impulse controls, and psychological resiliency as adults. As a result, they are uniquely susceptible to addictive features in digital products and highly vulnerable to the consequent harms. Knowing this, Defendants wrote code designed to manipulate dopamine release in children’s developing brains and, in doing so, create compulsive use of their apps.

10. Defendants’ strategy paid off. Users of their products now number in the billions, and the frequency of use and time spent by these users have grown exponentially.

11. Yet Defendants’ growth has come at the expense of their most vulnerable users: children and teens around the world whom they cultivated and exploited. Children and teens are the direct victims of the intentional product design choices made by each Defendant. They are the intended targets of the harmful features that pushed them into self-destructive feedback loops.

12. Today, over a third of 13- to 17-year-old kids report using one of Defendants’ apps “almost constantly” and admit this is “too much.” Yet more than half of these kids report that they would struggle to cut back on their social media use.

13. Tribal youth are equally gripped. In 2020, 65.3% of tribal youth (15-24) were on social media 3-7 hours per day, “with 86.0% reporting their primary activity on social media as scrolling, followed by watching videos (75.1%).”[11] The same survey found that “the most popular daily technology use among AI/AN youth involved browsing Instagram (74.0%), sending/receiving snap messages via Snapchat (60.0%), using TikTok (50.4%), and watching videos on YouTube (48.4%).”[12]

[11] Reed et al., Findings from the 2020 Native Youth Health Tech Survey, Am Indian Alsk Native Ment Health Res. (2022).
[12] Id.

14. It is clear that Instagram, Facebook, TikTok, Snapchat, and YouTube have rewired how our kids think, feel, and behave. Disconnected “Likes” have replaced the intimacy of adolescent friendships. Mindless scrolling has displaced the creativity of play and sport. While presented as “social,” Defendants’ products have in myriad ways promoted disconnection, disassociation, and a legion of resulting mental and physical harms.

15. The U.S. Surgeon General recently explained that children versus Big Tech is “just not a fair fight.”[13] “You have some of the best designers and product developers in the world who have designed these products to make sure people are maximizing the amount of time they spend on these platforms. And if we tell a child, use the force of your willpower to control how much time you’re spending, you’re pitting a child against the world’s greatest product designers.”

[13] Allison Gordon & Pamela Brown, Surgeon General says 13 is ‘too early’ to join social media, CNN (Jan. 29, 2023). Exhibits and referenced materials are incorporated in this Master Complaint as if fully stated herein.

16. The Surgeon General’s comments have since been echoed by President Biden himself. In a January 11, 2023, op-ed, President Biden recognized: “The risks Big Tech poses for ordinary Americans are clear. Big Tech companies collect huge amounts of data on the things we buy, on the websites we visit, on the places we go and, most troubling of all, on our children.”[14]

[14] Joe Biden, Republicans and Democrats, Unite Against Big Tech Abuses, Wall St. J. (Jan. 11, 2023).

17. The Fond Du Lac Band, like many other Native American tribes across the country, is at a breaking point. Meanwhile, Defendants profit tremendously from their wrongful conduct. Plaintiff brings this action—including in its parens patriae capacity to protect the health, safety, and welfare of Tribal members—to remedy this wrong, hold Defendants accountable, and achieve comprehensive, long-term planning and funding to drive sustained reduction in the mental health crises its youth experience at the Defendants’ hands.
II. THE PARTIES

A. PLAINTIFF

18. Plaintiff Fond du Lac Band of Lake Superior Chippewa (“Fond du Lac Band” or “the Band”) is a sovereign Indian Tribe that retains its aboriginal rights of self-governance and self-determination, pursuant to the Treaty of LaPointe of September 30, 1854, the Indian Reorganization Act of 1934, the common law of the United States, and as recognized by the United Nations Declaration on the Rights of Indigenous Peoples of September 17, 2007. The Band currently occupies the Fond du Lac Reservation in Northern Minnesota, in Carlton and St. Louis Counties. Ancestors of the Band, the Ojibwe, have resided in the geographic area since approximately 800 A.D. The Fond du Lac Band exercises inherent governmental authority within the Fond du Lac Reservation and on behalf of the health and welfare of the Fond du Lac Band and its members, children, and grandchildren. Members of the Fond du Lac Band affected by the actions and conduct of the Defendants alleged herein primarily live on the Fond du Lac Reservation. The Band exercises inherent sovereign governmental authority within the Band’s Indian Lands and on behalf of the health and welfare of the Band and its members (“Band Members”), their descendants, children, and grandchildren.

19. This action is brought by the Band in the exercise of its authority as a sovereign government and on behalf of the Band in its proprietary capacity and under its parens patriae authority in the public interest to protect the health, safety, and welfare of all Fond du Lac Band Members as well as the non-Band Member inhabitants of its Indian Lands.

B. DEFENDANTS

20. The defendants identified in this section are collectively referred to as “Defendants” throughout this Complaint.

1. Meta

21. Defendant Meta Platforms, Inc. (“Meta Platforms”) is a Delaware corporation and multinational technology conglomerate. Its principal place of business is in Menlo Park, California.

22. Meta Platforms’ subsidiaries include, but may not be limited to, the entities identified in this section, as well as a dozen others whose identity or involvement is presently unclear.
23. Defendant Meta Payments, Inc. (“Meta Payments”) is a wholly owned subsidiary of Meta Platforms that was incorporated in Florida on December 10, 2010, as Facebook Payments Inc.; in July 2022, the entity’s name was amended to Meta Payments Inc. Meta Payments manages, secures, and processes payments made through Meta entities, among other activities, and its principal place of business is in Menlo Park, California.

24. Defendant Siculus, Inc. (“Siculus”) is a wholly owned subsidiary of Meta Platforms that was incorporated in Delaware on October 19, 2011. Siculus constructs data facilities to support Meta Platforms’ products. Its principal place of business is in Menlo Park, California.

25. Defendant Facebook Operations, LLC (“Facebook Operations”) is a wholly owned subsidiary of Meta Platforms that was incorporated in Delaware on January 8, 2012. Facebook Operations is likely a managing entity for Meta Platforms’ other subsidiaries. Meta Platforms is the sole member of this LLC, whose principal place of business is in Menlo Park, California.

26. Defendant Facebook Holdings, LLC (“Facebook Holdings”) was organized under the laws of the state of Delaware on March 11, 2020, and is a wholly owned subsidiary of Meta Platforms. Facebook Holdings is primarily a holding company for entities involved in Meta’s supporting and international endeavors, and its principal place of business is in Menlo Park, California. Defendant Meta Platforms is the sole member of Facebook Holdings.

27. Defendant Instagram, LLC (“Instagram, LLC”) launched an app called Instagram in October 2010. On or around April 7, 2012, Meta Platforms purchased Instagram, LLC for over one billion dollars and reincorporated the company in Delaware. Meta Platforms is the sole member of this LLC, whose principal place of business is in Menlo Park, California.

28. Defendants Meta Platforms, Meta Payments, Siculus, Facebook Operations, Facebook Holdings, and Instagram are referred to jointly as “Meta.”

29. Meta owns, operates, controls, produces, designs, maintains, manages, develops, tests, labels, markets, advertises, promotes, supplies, and distributes digital products available through mobile- and web-based applications (“apps”), including Instagram and Facebook (together, “Meta products”); Messenger; and Messenger Kids. Meta’s apps and devices are widely distributed to consumers throughout the United States.

2. Snap

30. Defendant Snap Inc. (“Snap”) is a Delaware corporation. Its principal place of business is in Santa Monica, California.

31. Snap owns, operates, controls, produces, designs, maintains, manages, develops, tests, labels, markets, advertises, promotes, supplies, and distributes the app Snapchat. Snapchat is widely available to consumers throughout the United States.

3. ByteDance

32. Defendant TikTok Inc. was incorporated in California on April 30, 2015, with its principal place of business in Culver City, California. TikTok Inc. transacts or has transacted business in this District and throughout the United States. At all times material to this Complaint, acting alone or in concert with others, TikTok Inc. has advertised, marketed, and distributed the TikTok social media platform to consumers throughout the United States. At all times material to this Complaint, acting alone or in concert with ByteDance Inc., TikTok Inc. formulated, directed, controlled, had the authority to control, or participated in the acts and practices set forth in this Complaint.

33. Defendant ByteDance Inc. is a Delaware corporation with its principal place of business in Mountain View, California. ByteDance transacts or has transacted business in this District and throughout the United States. At all times material to this Complaint, acting alone or in concert with others, ByteDance has advertised, marketed, and distributed the TikTok social media platform to consumers throughout the United States. At all times material to this Complaint, acting alone or in concert with TikTok, ByteDance formulated, directed, controlled, had the authority to control, or participated in the acts and practices set forth in this Complaint.

34. TikTok Inc. and ByteDance Inc. are referred to jointly as “ByteDance.”

4. Google

35. Defendant Alphabet Inc. (“Alphabet”) is a Delaware corporation with its principal place of business in Mountain View, California.
36. Defendant Google LLC (“Google LLC”) is a limited liability company organized under the laws of the state of Delaware, and its principal place of business is in Mountain View, California. Google is a wholly owned subsidiary of XXVI Holdings Inc. and the managing member of YouTube. Google transacts or has transacted business in this District and throughout the United States. At all times material to this Complaint, acting alone or in concert with others, Google has advertised, marketed, and distributed its YouTube video sharing platform to consumers throughout the United States. At all times material to this Complaint, acting alone or in concert, Google formulated, directed, controlled, had the authority to control, or participated in the acts and practices set forth in this Complaint.

37. Defendant YouTube, LLC (“YouTube LLC”) is a limited liability company organized under the laws of the state of Delaware, and its principal place of business is in San Bruno, California. YouTube is a wholly owned subsidiary of Google. YouTube transacts or has transacted business in this District and throughout the United States. At all times material to this Complaint, acting alone or in concert, YouTube has advertised, marketed, and distributed its YouTube social media platform to consumers throughout the United States. At all times material to this Complaint, acting alone or in concert, YouTube formulated, directed, controlled, had the authority to control, or participated in the acts and practices set forth in this Complaint.

38. Alphabet, Google LLC, and YouTube, LLC (together, “Google”) are alter egos of one another: together and in concert they own, operate, control, produce, design, maintain, manage, develop, test, label, market, advertise, promote, supply, and distribute the app YouTube.

III. JURISDICTION AND VENUE

39. This Court has original jurisdiction over this action pursuant to Article VI, Section 10 of the California Constitution.

40. This Court has general personal jurisdiction over Defendants because each is headquartered and/or has its principal place of business in the State of California and has continuous and systematic operations within the State of California. The Court also has specific personal jurisdiction over Defendants because they actively conduct substantial business in Los Angeles County and the State of California. Defendants have purposefully availed themselves of the privilege of conducting business in this State through the design, development, programming, promotion, marketing, operations, and distribution of their platforms at issue in this lawsuit and have purposefully directed their activities toward the State of California. Defendants have sufficient minimum contacts with the State of California to render the exercise of jurisdiction by this Court permissible under California law and the United States Constitution.

41. Venue is proper in this judicial district pursuant to California Code of Civil Procedure Sections 395 and 395.5 because at least some Defendants reside in this County, their principal places of business are in this County, and a substantial part of the events or omissions giving rise to the claims at issue in this Complaint arose in this County.

IV. FACTUAL ALLEGATIONS

A. DEFENDANTS’ APPS HAVE CREATED A YOUTH MENTAL HEALTH CRISIS.

42. Nearly a decade of scientific and medical studies demonstrate that dangerous features engineered into Defendants’ platforms—particularly when used multiple hours a day—can have a “detrimental effect on the psychological health of [their] users,” including compulsive use, addiction, body dissatisfaction, anxiety, depression, and self-harming behaviors such as eating disorders.[15]

[15] See, e.g., Fazida Karim et al., Social Media Use and Its Connection to Mental Health: A Systematic Review, Cureus Volume 12(6) (June 15, 2020); Alexandra R. Lonergan et al., Protect me from my selfie: Examining the association between photo-based social media behaviors and self-reported eating disorders in adolescence, Int. J. of Eating Disorders 756 (Apr. 7, 2020).

43. Addiction and compulsive use of Defendants’ products can entail a variety of behavioral problems, including but not limited to: (1) a lessening of control, (2) persistent, compulsive seeking out of access to the product, (3) using the product more, and for longer, than intended, (4) trying to cut down on use but being unable to do so, (5) experiencing intense cravings or urges to use, (6) tolerance (needing more of the product to achieve the same desired effect), (7) developing withdrawal symptoms when not using the product, or when the product is taken away, (8) neglecting responsibilities at home, work, or school because of the intensity of usage, (9) continuing to use the product even when doing so interferes and causes problems with important family and social relationships, (10) giving up important or desirable social and recreational activities due to use, and (11) continuing to use despite the product causing significant harm to the user’s physical and mental health.

44. Each Defendant has long been aware of this research but chose to ignore or brush it off.

45. Scientists have studied the impacts of the overuse of social media since at least 2008, with social media addiction recognized in the literature around that time, after a pervasive upsurge in Facebook use.[16] The Bergen Social Media Addiction Scale assesses social media addiction along six core elements: 1) salience (preoccupation with the activity), 2) mood modification (the behavior alters the emotional state), 3) tolerance (increasing activity is needed for the same mood-altering effects), 4) withdrawal (physical or psychological discomfort when the behavior is discontinued), 5) conflict (ceasing other activities or social interaction to perform the behavior), and 6) relapse (resuming the behavior after attempting to control or discontinue it).[17]

[16] Tim Davies & Pete Cranston, Youth Work and Social Networking: Interim Report, The National Youth Agency (May 2008).
[17] Cecilie Andreassen et al., The relationship between addictive use of social media and video games and symptoms of psychiatric disorders: a large-scale cross-sectional study, 30(2) Psychol. of Addictive Behav. 252-262 (2016).

46. Beginning in at least 2014, researchers began demonstrating that addictive and compulsive use of Defendants’ apps leads to negative mental and physical outcomes for kids.

47. In 2014, a study of 10- to 12-year-old girls found that increased use of Facebook was linked with body image concerns, the idealization of thinness, and increased dieting.[18] (This study was sent to Mark Zuckerberg in 2018, in a letter signed by 118 public health advocates.)[19]

[18] Marika Tiggemann & Amy Slater, NetTweens: The Internet and body image concerns in preteenage girls, 34(5) J. Early Adolesc. 606-620 (June 2014).
[19] Campaign for a Commercial-Free Childhood, Letter to Mark Zuckerberg Re: Facebook Messenger Kids (Jan. 30, 2018).

48. In 2016, a study demonstrated that young people who frequently use Defendants’ apps are more likely to suffer sleep disturbances than their peers who use them infrequently.[20] Defendants’ products, driven by intermittent variable rewards (“IVR”) algorithms, as described below, deprive users of sleep by sending push notifications and emails at night, prompting children to re-engage with the apps when they should be sleeping. Disturbed and insufficient sleep is associated with poor health outcomes,[21] including increased risk of major depression—by a factor of more than three—[22] and future suicidal behavior in adolescents.[23] The American Academy of Sleep Medicine has recommended that, in a 24-hour period, children aged 6–12 years should regularly sleep 9–12 hours and teenagers aged 13–18 years should sleep 8–10 hours.[24]

[20] Jessica C. Levenson et al., The Association Between Social Media Use and Sleep Disturbance Among Young Adults, 85 Preventive Med. 36–41 (Apr. 2016).
[21] Id.; National Institute of Mental Health, The teen brain: 6 things to know (2016); R. Sather & A. Shelat, Understanding the teen brain.
[22] E. Roberts & H. Doung, The Prospective Association between Sleep Deprivation and Depression among Adolescents, Sleep, Volume 37, Issue 2 (Feb. 1, 2014).
[23] X. Liu & D. Buysse, Sleep and youth suicidal behavior: a neglected field, Current Opinion in Psychiatry (May 2006).
[24] S. Paruthi, L. Brooks, C. D’Ambrosio, et al., Consensus statement of the American Academy of Sleep Medicine on the recommended amount of sleep for healthy children: methodology and discussion, 12 J Clin Sleep Med 1549–61 (2016).

49. In another 2016 study, 52% of girls said they use image filters every day, and 80% reported using an app to change their appearance before the age of 13.[25] In fact, 77% of girls reported trying to change or hide at least one part of their body before posting a photo of themselves, and 50% believed they did not look good enough without editing.[26]

[25] Anna Haines, From “Instagram Face” to “Snapchat Dysmorphia”: How Beauty Filters Are Changing the Way We See Ourselves, Forbes (Apr. 27, 2021).
[26] Id.

50. In 2017, British researchers asked 1,500 teens to rate how Instagram, Snapchat, and YouTube affected them on certain well-being measures, including anxiety, loneliness, body image, and sleep.[27] Teens rated all three platforms as having a negative impact on body image, “FOMO” (fear of missing out), and sleep. Teens also noted that Instagram and Snapchat had a negative impact on anxiety, depression, and loneliness.

[27] Royal Society for Public Health, #StatusOfMind; see also Jonathan Haidt, The Dangerous Experiment on Teen Girls, The Atlantic (Nov. 21, 2021).

51. In 2018, a Journal of Social and Clinical Psychology study examined a group of college students whose use of Instagram, Facebook, and Snapchat was limited to 10 minutes per day per platform. The study found that this limited-use group showed “significant reductions in loneliness and depression over three weeks” compared to a control group that used social media as usual.[28]

[28] Melissa G. Hunt et al., No More FOMO: Limiting Social Media Decreases Loneliness and Depression, 37 J. of Social & Clinical Psych. (Dec. 5, 2018).

52. In 2018, a systematic literature review of nine studies published in the Indian Journal of Psychiatry concluded that dangerous features in social networking platforms “contribute to increased exposure to and engagement in self-harm behavior, as users tend to emulate self-injurious behavior of others online, adopt self-injurious practices from self-harm videos, or are encouraged and acclaimed by others, thus normalizing self-injurious thoughts and behavior.”[29]

[29] Aksha Memon et al., The role of online social networking on deliberate self-harm and suicidality in adolescents: a systematized review of literature, 60(4) Indian J Psychiatry 384-392 (Oct-Dec 2018).

53. A 2019 survey of American adolescents ages 12-14 found that a user’s displeasure with their body could be predicted based on their frequency of using social media (including Instagram and Facebook) and based on the extent to which they engaged in behaviors that adopt an observer’s point-of-view (such as taking selfies or asking others to “rate one’s looks”). This effect was more pronounced among girls than boys.[30]

[30] Ilyssa Salomon & Christia Spears Brown, The Selfie Generation: Examining the Relationship Between Social Media Use and Early Adolescent Body Image, Journal of Early Adolescence (Apr. 21, 2018).

54. A third study in 2019, of more than 6,500 American adolescents ranging in age from 12 to 15 years old, found that those who used social media for 3 hours or more per day were more likely to suffer from mental health problems such as anxiety and depression.[31] Notably, this association remained significant even after adjusting for demographics, past alcohol and marijuana use, and history of mental health problems.[32]

[31] Kira Riehm et al., Associations between time spent using social media and internalizing and externalizing problems among US youth, 76(12) JAMA Psychiatry (2019).
[32] Id.

55. In 2020, a study of Australian adolescents found that investment in others’ selfies (through likes and comments) was associated with greater odds of meeting criteria for clinical/subclinical bulimia nervosa, clinical/subclinical binge-eating disorder, night eating syndrome, and unspecified feeding and eating disorders.[33]

[33] Alexandra R. Lonergan et al., Protect Me from My Selfie: Examining the Association Between Photo-Based Social Media Behaviors and Self-Reported Eating Disorders in Adolescence, Int’l J. of Eating Disorders (Apr. 7, 2020).

56. In 2020, a longitudinal study investigated whether “Facebook Addiction Disorder” predicted suicide-related outcomes, and found that children and adolescents addicted to Facebook are more likely to engage in self-injurious behavior, such as cutting and suicide.[34]

[34] See, e.g., Julia Brailovskaia et al., Positive mental health mediates the relationship between Facebook addiction disorder and suicide-related outcomes: a longitudinal approach, 00(00) Cyberpsychology, Behavior, and Social Networking (2020); Jean M. Twenge et al., Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time, 6 Clinical Psych. Sci. 3–17 (2018).
57. In 2020, clinical research demonstrated an observable link between youth social media use and disordered eating behavior.[35] The more time young girls spend using Defendants’ products, the more likely they are to develop disordered eating behaviors.[36] And the more social media accounts adolescents have, the more disordered eating behaviors they exhibit.[37]

[35] Simon M. Wilksch et al., The relationship between social media use and disordered eating in young adolescents, 53 Int’l J. Eating Disorders 96–106 (2020).
[36] Id.
[37] Id.

58. Eating disorders often occur simultaneously with other self-harm behaviors such as cutting and are often associated with suicide.[38]

[38] Sonja Swanson et al., Prevalence and correlates of eating disorders in adolescents, 68(7) Arch Gen Psychiatry 717-723 (2011).

59. In a 2021 study, female undergraduates were randomly shown thinspiration (low body mass index and not muscular), fitspiration (muscular and exercising), or neutral photos.[39] Thinspiration and fitspiration images lowered self-esteem, even in those with a self-perceived healthy weight.[40]

[39] Karikarn Chansiri & Thipkanok Wongphothiphan, The indirect effects of Instagram images on women’s self-esteem: The moderating roles of BMI and perceived weight, 00(0) New Media & Society 1-23 (2021).
[40] Id.

60. A 2022 study of Italian adolescent girls (12-17) and young women (18-28) found that Instagram’s image editing and browsing features, combined with an emphasis on influencer interactions, promulgated unattainable body ideals that caused users to compare their bodies to those ideals.[41] These trends were more prominent among adolescent girls, given their higher susceptibility to social pressures related to their bodies and given the physical changes associated with puberty.

[41] Federica Pedalino & Anne-Linda Camerini, Instagram use and body dissatisfaction: The mediating role of upward social comparison with peers and influencers among young females, 19(3) Int’l J of Environmental Research and Public Health 1543 (2022).

61. In 2023, a study of magnetic resonance images demonstrated that compulsive use of Defendants’ apps measurably alters children’s brains.[42] This study measured fMRI responses in 12-year-old adolescents who used Facebook, Instagram, and Snapchat over a three-year period and found that neural patterns diverged. Specifically, those who engaged in high social media checking behavior “showed lower neural sensitivity to social anticipation” than those who engaged in low to moderate checking behavior.[43]

[42] Maria Maza et al., Association of habitual checking behaviors on social media with longitudinal functional brain development, JAMA Ped. (Jan. 3, 2023).
[43] Id.

62. Defendants’ apps have triggered depression, anxiety, eating disorders, self-harm, and suicidality among thousands of children. Defendants have created a crisis.

63. From 2009 to 2019, the rate of high school students who reported persistent sadness or hopelessness increased by 40% (to one out of every three kids).[44] The share of kids who seriously considered suicide increased by 36%, and those who created a suicide plan increased by 44%.[45]

[44] Protecting Youth Mental Health: The U.S. Surgeon General’s Advisory at 8, U.S. Dep’t Health & Hum. Servs. (Dec. 7, 2021).
[45] Id.

64. From 2007 to 2019, suicide rates among youth aged 10-24 in the United States increased by 57%.[46]

[46] Id.

65. From 2007 to 2016, emergency room visits for youth aged 5-17 rose 117% for anxiety disorders, 44% for mood disorders, and 40% for attention disorders.[47]

[47] Charmaine Lo, Children’s mental health emergency department visits: 2007-2016, 145(6) Pediatrics e20191536 (June 2020).

66. By 2019, one in five children aged 3-17 in the United States had a mental, emotional, developmental, or behavioral disorder.[48] Mental health issues are particularly acute among females.[49]

[48] U.S. Surgeon General Issues Advisory on Youth Mental Health Crisis Further Exposed by COVID-19 Pandemic, U.S. Dep’t Health & Hum. Servs. (Dec. 14, 2021); see also Jean M. Twenge et al., Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time, 6 Clinical Psych. Sci. 3–17 (2017) (noting that mental health issues are particularly acute among females).
[49] U.S. Surgeon General Issues Advisory on Youth Mental Health Crisis Further Exposed by COVID-19 Pandemic, U.S. Dep’t Health & Hum. Servs. (Dec. 14, 2021).

67. Many of these injuries can be long-lasting, if not lifelong. For example, the long-term effects of eating disorders can include: (1) dermatological effects to the nails and hair; (2) gastrointestinal illnesses, such as gastroparesis or hypomotility of the colon; (3) impacts to the endocrine system, such as glycolic or metabolic conditions, bone loss, and hormonal conditions; (4) nervous system effects, such as gray matter brain loss or atrophy; (5) skeletal system effects, such as bone loss; (6) cardiovascular effects, such as structural heart damage, mitral valve prolapse, or fluid around the heart; and (7) fertility issues.[50]

[50] See Anorexia Nervosa, Cleveland Clinic; Bulimia Nervosa, Cleveland Clinic.

68. Tribal youth are suffering from the mental health crisis disproportionately. According to the Center for Native American Youth, “Suicide is the second leading cause of death for Native American youth ages 10-24, and Native youth teen suicide rates are nearly 3.5 times higher than the national average.”[51] Over 40% of all Native American suicides involve individuals aged 15-24.[52]

[51] Center for Native American Youth, Aspen Institute, Teen Suicide Prevention.
[52] Almendrala, Native American Youth Suicide Rates Are At Crisis Levels, HuffPost (Oct. 2, 2015).

69. According to CDC data, from 2011-2021, Native American suicides increased by 70.3%—an increase over five times higher than that among Whites.[53]

[53] Heather Saunders & Nirmita Panchal, New KFF reports: Suicide death rates in 2021 were highest among American Indian and Alaska Native (AIAN) people, males, and people who live in rural areas, University of Arizona Center for Rural Health (Aug. 17, 2023).

[Chart omitted.]

70. On December 7, 2021, the United States Surgeon General issued an advisory on the youth mental health crisis.[54] The Surgeon General explained, “[m]ental health challenges in children, adolescents, and young adults are real and widespread. Even before the pandemic, an alarming number of young people struggled with feelings of helplessness, depression, and thoughts of suicide—and rates have increased over the past decade.”[55] Those “mental health challenges were the leading cause of disability and poor life outcomes in young people.”[56]

[54] Id.
[55] Id.
[56] Id.

71. On February 13, 2023, the CDC released new statistics revealing that, in 2021, one in three girls seriously considered attempting suicide.[57]

[57] Azeen Ghorayashi & Roni Caryn Rabin, Teen Girls Report Record Levels of Sadness, C.D.C. Finds, N.Y. Times (Feb. 13, 2023).

72. As discussed herein, each of the Defendants’ products manipulates minor users’ brains by building in stimuli and social reward mechanisms (e.g., “Likes”) that cause users to compulsively seek social rewards. That, in turn, leads to neuroadaptation: a child requires more and more stimuli to obtain the desired dopamine release, along with further impairments of decision-making. It also leads to reward-seeking through increasingly extreme content, which is more likely to generate intense reactions from other users. These consequences are the foreseeable results of Defendants’ engineering decisions.

B. DEFENDANTS TARGET CHILDREN AS A CORE MARKET, HOOKING KIDS ON THEIR ADDICTIVE SOCIAL MEDIA PLATFORMS.

73. Each Defendant designs, engineers, markets, and operates its products to maximize the number of children who download and use them compulsively. Children are more vulnerable users and have more free time on their hands than their adult counterparts. Because children use Defendants’ products more, they see more ads, and as a result generate more ad revenue for Defendants. Young users also generate a trove of data about their preferences, habits, and behaviors. That information is Defendants’ most valuable commodity. Defendants mine and commodify that data, including by selling to advertisers the ability to reach incredibly narrow tranches of the population, including children. Each Defendant placed its app(s) into the stream of commerce and generated revenues through the distribution of those apps at the expense of the consuming public.

74. Addicting youth is central to Defendants’ profitability. Like the cigarette industry a generation earlier, Defendants understand that a child user today becomes an adult user tomorrow.
Indeed, Defendants’ insatiable appetite for growth has created a need for younger and younger users. Defendants’ wrongfully acquired knowledge of their childhood userbase has allowed them to develop product designs to target elementary school-age children, who are uniquely vulnerable. Like Joe Camel of old, Defendants’ recent attempts to capture pre-adolescent audiences include “kid versions” of apps that are “designed to fuel [kids’] interest in the grown-up version.”[58]

[58] Leonard Sax, Is TikTok Dangerous for Teens?, Inst. Fam. Stud. (Mar. 29, 2022).

75. Recognizing the vulnerability of children under 13, particularly in the Internet age, Congress enacted the Children’s Online Privacy Protection Act (“COPPA”) in 1998.[59] COPPA regulates the conditions under which Defendants can collect, use, or disclose the personal information of children under 13. Under COPPA, developers of apps and websites that are directed to or known to be used by children under 13 cannot lawfully obtain the individually identifiable information of such children without first obtaining verifiable consent from their parents.[60] Even apart from COPPA, it is well established under the law that children lack the legal or mental capacity to make informed decisions about their own well-being.

[59] See 15 U.S.C. §§ 6501-6506.
[60] The FTC recently clarified that acceptable methods for obtaining verifiable parental consent include: (a) providing a form for parents to sign and return; (b) requiring the use of a credit card online payment that provides notification of each transaction; (c) connecting to trained personnel via video conference; (d) calling a staffed toll-free number; (e) asking knowledge-based questions; or (f) verifying a photo-ID from the parent compared to a second photo using facial recognition technology. Federal Trade Commission, Complying with COPPA: Frequently Asked Questions (July 2020).

76. COPPA was enacted precisely because Congress recognized that children under age 13 are particularly vulnerable to being taken advantage of by unscrupulous website operators. As a June 1998 report by the FTC observed, “[t]he immediacy and ease with which personal information can be collected from children online, combined with the limited capacity of children to understand fully the potentially serious safety and privacy implications of providing that information, have created deep concerns about current information practices involving children online.”[61] The same report observed that children under the age of 13 “generally lack the developmental capacity and judgment to give meaningful consent to the release of personal information to a third party.”[62]

[61] Privacy Online: A Report to Congress, Federal Trade Commission (1998) at 13.
[62] Id.
1 77. Contemporaneous testimony by the Chairman of the FTC observed that the Internet
2 “make[s] it easy for children to disclose personal information to the general public without their
3 parents’ awareness or consent. Such public disclosures raise safety concerns.” 63 Further, “the
4 practice of collecting personal identifying information directly from children without parental
5 consent is clearly troubling, since it teaches children to reveal their personal information to
6 strangers and circumvents parental control over their family’s information.” 64
7 78. None of the Defendants conduct proper age verification or authentication. Instead, each
8 Defendant leaves it to users to self-report their age. This unenforceable and facially inadequate
9 system allows children under 13 to easily create accounts on Defendants’ apps.
10 79. This is particularly egregious for two reasons. First, Defendants have long been on notice
11 of the problem. For instance, in May 2011, Consumer Reports reported the “troubling news” that
12 7.5 million children under 13 were on Facebook. 65 Second, given that Defendants have developed
13 and utilized age-estimation algorithms for the purpose of selling user data and targeted
14 advertisements, Defendants could readily use these algorithms to prevent children under 13 from
15 accessing their products, but choose not to do so. Instead, they have turned a blind eye to collecting
16 children’s data in violation of COPPA.
17 80. Defendants have done this because children are financially lucrative, particularly when
18 they are addicted to Defendants’ apps.
19 1. Children are uniquely susceptible to Defendants’ addictive apps.

20 81. Young people are not only Defendants’ most lucrative market, but are also those most
21 vulnerable to harms resulting from Defendants’ products. “Everyone innately responds to social
22 approval.” 66 “[B]ut some demographics, in particular teenagers, are more vulnerable to it than
23 others.” 67 Unlike adults, who “tend to have a fixed sense of self that relies less on feedback from
24

25 63
S. 2326, Children’s Online Privacy Protection Act of 1998: Hearing Before the U.S. Sen. Subcom. On
26 Communications, Comm. On Commerce, Science, and Transportation, 105th Cong. 11 (1998) (statement of Robert
Pitofsky, Chairman, Federal Trade Commission).
27
64
Id.
65
Emily Bazelon, Why Facebook is After Your Kids, N.Y. Times (Oct. 12, 2011).
28
66
Von Tristan Harris, The Slot Machine in Your Pocket, Spiegel Int’l (July 27, 2016).
67
Id.
1 peers,” 68 adolescents are in a “period of personal and social identity formation” that “is now reliant
2 on social media.” 69
3 82. To understand the impact Defendants’ apps have on young people, it is helpful to
4 understand some basics about the human brain.
5 83. The frontal lobes of the brain—particularly the prefrontal cortex—control higher-order
6 cognitive functions. This region of the brain is central to planning and executive decision-making,
7 including the evaluation of risk and reward. It also helps inhibit impulsive actions and “regulate
8 emotional responses to social rewards.” 70
9 84. Children and adolescents are especially vulnerable to developing harmful behaviors
10 because their prefrontal cortex is not fully developed. 71 Indeed, it is one of the last regions of the
11 brain to mature. 72 In the images below, the blue color depicts brain development. 73
[Brain-development images not reproduced in this copy.]
22 85. Because of the immaturity of their prefrontal cortex, young people have less impulse

23 control, and less ability to regulate their responses to social rewards, than adults.

24 68
Zara Abrams, Why young brains are especially vulnerable to social media, Am. Psych. Ass’n (Aug. 25, 2022).
25
69
Betul Keles et al., A systematic review: the influence of social media on depression, anxiety and psychological
distress in adolescents, Int'l J. Adolescence & Youth (2020) 25:1, 79–93 (Mar. 3, 2019).
26
70
Zara Abrams, Why young brains are especially vulnerable to social media, Am. Psych. Ass’n (Aug. 25, 2022).
71
Nino Gugushvili et al., Facebook use intensity and depressive symptoms: a moderated mediation model of
27 problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (2022).
72
Id.
28
73
Heiner Boettger, & Deborah Koeltezsch, The fear factor: Xenoglossophobia or how to overcome the anxiety of
speaking foreign languages, Training Language and Culture, 43-55 (June 2020).
1 86. Beginning around the age of 10, the brain also begins to change in important ways.
2 Specifically, the receptors for dopamine multiply in the subcortical region of the brain. 74 Dopamine
3 is a neurotransmitter that is central to the brain’s reward system. 75
4 87. During this developmental phase, the brain learns to seek out stimuli (e.g., Instagram)
5 that result in a reward (e.g., likes) and cause dopamine to flood the brain’s reward pathways. Each
6 time this happens, associations between the stimulus and the reward become stronger. 76 Notably,
7 once the brain has learned to make this association, dopaminergic neurons “shift their … activation
8 from the time of reward delivery to the time of presentation of [a] predictive cue.” 77 In other words,
9 the anticipation of a reward can itself trigger a dopamine rush.
10 88. As New York University professor and social psychologist Adam Alter has explained,
11 product features such as “Likes” give users a dopamine hit similar to drugs and alcohol: “The
12 minute you take a drug, drink alcohol, smoke a cigarette . . . when you get a like on social media,
13 all of those experiences produce dopamine, which is a chemical that’s associated with pleasure.
14 When someone likes an Instagram post, or any content that you share, it’s a little bit like taking a
15 drug. As far as your brain is concerned, it’s a very similar experience.” 78
16 89. When the release of dopamine in young brains is manipulated by Defendants’ products,
17 it interferes with the brain’s development and can have long-term impacts on an individual’s
18 memory, affective processing, reasoning, planning, attention, inhibitory control, and risk-reward
19 calibration.
20 90. Given their limited capacity to self-regulate and their vulnerability to peer pressure,
21 children (including teens) are at greater risk of developing a mental disorder from use of
22 Defendants’ products. 79 Children are more susceptible than adults to feelings of withdrawal when
23 74
Zara Abrams, Why young brains are especially vulnerable to social media, Am. Psych. Ass’n (Aug. 25, 2022).
24
75
Id.
76
See Bryon Adinoff, Neurobiologic processes in drug reward and addiction, 12(6) Harv. Rev. Psychiatry 305-320
25 (2004).
77
Luisa Speranza et al., Dopamine: The Neuromodulator of Long-Term Synaptic Plasticity, Reward and Movement
26 Control, 10 Cells 735 (2021).
78
Eames Yates, What happens to your brain when you get a like on Instagram, Business Insider (Mar. 25, 2017); see
27 also Sören Krach et al., The rewarding nature of social interactions, 4(22) Frontiers in Behav. Neuro., (28 May
2010); Julian Morgans, The Secret Ways Social Media Is Built for Addiction, Vice (May 17, 2017).
28
79
Betul Keles et al., A systematic review: the influence of social media on depression, anxiety and psychological
distress in adolescents, Int'l J. Adolescence & Youth (2020) 25:1, 79–93 (Mar. 3, 2019).
1 a dopamine hit wears off. Depending on the intensity, delivery, and timing of the stimulus, and the
2 severity of its withdrawal, these feelings can include anxiety, dysphoria, and irritability. 80 Children
3 also are more likely to engage in compulsive behaviors to avoid these symptoms, due to their
4 limited capacity for self-regulation, relative lack of impulse control, and struggle to delay
5 gratification.
6 91. This leads to a vicious cycle. Repeated spikes of dopamine over time may cause a child
7 to build up a tolerance for the stimulus. In this process of “neuroadaptation,” the production of
8 dopamine and the sensitivity of dopamine receptors are both reduced. 81 As a consequence, the child
9 requires more and more of the stimulus to feel the same reward. Worse, this cycle can cause
10 decreases in activity in the prefrontal cortex, leading to further impairments of decision-making
11 and executive functioning. 82
12 92. As described further below, each Defendant deliberately designed, engineered, and
13 implemented dangerous features in their apps that present social-reward and other stimuli in a
14 manner that has caused many young users to compulsively seek out those stimuli, to develop
15 negative symptoms when the stimuli are withdrawn, and to exhibit reduced impulse control and
16 emotional regulation.
17 93. In short, children find it particularly difficult to exercise the self-control required to
18 regulate their use of Defendants’ platforms, given the stimuli and rewards embedded in those apps,
19 and as a foreseeable consequence tend to engage in addictive and compulsive use. 83
20 2. Defendants design their apps to attract and addict youth.
21 94. Instagram, Facebook, TikTok, Snapchat, and YouTube employ many similar dangerous

22 product features that are engineered to induce more use by young people—creating an unreasonable

23 risk of compulsive use and addiction. 84 For instance, all five apps harvest user data and use this

24

25
80
George Koob, and Nora Volkow, Neurobiology of addiction: a neurocircuitry analysis, 3 (8) Lancet Psychiatry
760-773 (2016).
26
81
Id.
82
Id.
27
83
Fulton Crews et al., Adolescent cortical development: a critical period of vulnerability for addiction, 86
Pharmacology, Biochemistry and Behavior 189-199 (2007).
28
84
See Kevin Hurler, For Sites Like Instagram and Twitter, Imitation Is the Only Form of Flattery, Gizmodo (Aug.
16, 2022). (“Over the last decade, some of the most popular social media apps have blatantly ripped off features from
1 information to generate and push algorithmically tailored “feeds” of photos and videos. And all
2 five include methods through which approval can be expressed and received, such as likes, hearts,
3 comments, shares, or reposts. This section explains the psychological and social mechanisms
4 exploited by these and other unreasonably dangerous product features.
5 95. First, Defendants’ apps are designed and engineered to methodically, but unpredictably,
6 space out dopamine-triggering rewards with dopamine gaps. The unpredictability is key because,
7 paradoxically, intermittent variable rewards (or “IVR”) create stronger associations (conditioned
8 changes in the neural pathway) than fixed rewards. Products that use this technique are highly
9 addictive or habit forming.
10 96. IVR is based on insights from behavioral science dating back to research in the 1950s by
11 Harvard psychologist B. F. Skinner. Skinner found that laboratory rats respond most voraciously
12 to unpredictable rewards. In one famous experiment, rats that pushed a lever received a variable
13 reward (a small treat, a large treat, or no treat at all). Compared with rats that received the same
14 treat every time, the rats that received only occasional rewards were more likely to exhibit
15 addictive behaviors such as pressing the lever compulsively. This exploitation of neural circuitry is
16 exactly how addictive products keep users coming back.
17 97. Many products that employ IVR are limited by the fact that they deliver rewards in a
18 randomized manner. By contrast, Defendants’ apps are designed to purposely withhold and release
19 rewards on a schedule their algorithms have determined is optimal to heighten a specific user's
20 craving and keep them using the product. For example, TikTok will at times delay a video it knows
21 a user will like until the moment before it anticipates the user would otherwise log out. Instagram’s
22 notification algorithm will at times determine that a particular user’s engagement will be
23 maximized if the app withholds “Likes” on their posts and then later delivers them in a large burst
24 of notifications.
25

26

27

28 some of the other most popular social media apps, in a tech version of Capture the Flag where the only losers are the
users who are forced to persist through this cat-and-mouse game.”).
1 98. Defendants’ use of IVR is particularly effective on and dangerous for adolescents, given
2 the incomplete aspects of their brain maturation described above—including lack of impulse
3 control and reduced executive functions.
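The reward-scheduling mechanism alleged in paragraphs 95 through 97 can be illustrated with a minimal, hypothetical sketch in Python. This is not any Defendant's actual code; the probabilities, the notification-batching logic, and the predicted-logout input are assumptions used only to contrast a fixed reward schedule with an intermittent variable reward schedule timed against a user's expected disengagement.

```python
import random

def fixed_schedule_reward(_action_count: int) -> int:
    """Fixed schedule: every action earns the same small reward."""
    return 1

def intermittent_variable_reward(_action_count: int) -> int:
    """IVR: rewards arrive unpredictably and in unpredictable amounts,
    which conditions stronger stimulus-reward associations than a fixed payoff."""
    roll = random.random()
    if roll < 0.60:
        return 0      # most actions earn nothing (the "dopamine gap")
    if roll < 0.90:
        return 1      # occasionally a small reward
    return 10         # rarely, a large burst (e.g., many "Likes" at once)

def release_withheld_likes(withheld: list[int], predicted_minutes_to_logout: float):
    """Hypothetical engagement-timed release: hold notifications back, then
    deliver them in one burst just before the user is expected to disengage."""
    if withheld and predicted_minutes_to_logout <= 1.0:
        return withheld, []          # burst delivered at the predicted exit point
    return [], withheld              # otherwise keep withholding
```

On the fixed schedule every action earns the same small payoff; on the variable schedule most actions earn nothing and a few earn a large burst, which is the pattern the Skinner-derived research described above associates with compulsive responding.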
4 99. Second, there are multiple types of dopamine neurons that are connected with distinct brain
5 networks and have distinct roles in motivational control. Apart from the dopamine reward loop
6 triggered by positive feedback, other dopamine neurons are impacted by salient but non-rewarding
7 stimuli and even painful-aversive stimuli. 85 Defendants' apps capitalize on this by algorithmically
8 ranking photos and videos that "engage" users because they present a dopamine pay-off, including
9 novel, aversive, and alarming images.
16 101. Third, unreasonably dangerous features in Defendants’ apps manipulate young users
17 through their exploitation of “reciprocity”—the psychological phenomenon by which people
18 respond to positive or hostile actions in kind. Reciprocity means that people respond in a friendly
19 manner to friendly actions, and with negative retaliation to hostile actions. 87 Phillip Kunz best
20 illustrated the powerful effect of reciprocity through an experiment using holiday cards. Kunz sent
21 cards to a group of complete strangers, including pictures of his family and a brief note. 88 People
22 whom he had never met or communicated with before reciprocated, flooding him with holiday
23

24

25 85
J.P.H. Verharen, Yichen Zhu, and Stephan Lammel, Aversion hot spots in the dopamine system, 64 Neurobiology
26 46-52 (March 5, 2020).
27
87
Ernst Fehr & Simon Gächter, Fairness and Retaliation: The Economics of Reciprocity, 14(3) J. Econ. Persps. 159–
81 (2000).
28
88
Phillip R. Kunz & Michael Woolcott, Season’s Greetings: From my status to yours, 5(3) Soc. Sci. Rsch. 269–78
(Sept. 1976).
1 cards in return. 89 Most of the respondents did not even ask Mr. Kunz who he was—they simply
2 responded to his initial gesture with a reciprocal action. 90
3 102. Products like Instagram and Snapchat exploit reciprocity by, for example, automatically
4 telling a sender when their message is seen, instead of letting the recipient avoid disclosing whether
5 it was viewed. Consequently, the recipient feels more obligated to respond immediately, keeping
6 users on the product. 91
7 103. Fourth, Defendants’ apps addict young users by preying on their already-heightened need
8 for social comparison and interpersonal feedback-seeking. 92 Because of their relatively
9 undeveloped prefrontal cortex, young people are already predisposed to status anxieties, beauty
10 comparisons, and a desire for social validation. 93 Defendants’ apps encourage repetitive usage by
11 dramatically amplifying those insecurities.
12 104. Mitch Prinstein, Chief Science Officer for the American Psychological Association, has
13 explained that online and real-world interactions are fundamentally different. 94 For example, in the
14 real world, no public ledger tallies the number of consecutive days friends speak. Similarly, “[a]fter
15 you walk away from a regular conversation, you don’t know if the other person liked it, or if anyone
16 else liked it." 95 By contrast, an unreasonably dangerous design feature like the "Snap Streak" creates
17 exactly such artificial forms of feedback. 96 On Defendants’ apps, friends and even complete
18 strangers can deliver (or withhold) dopamine-laced likes, comments, views, or follows. 97
19 105. The “Like” on Facebook, Instagram, TikTok, and YouTube or other comparable features
20 common to Defendants’ products has an especially powerful effect on teenagers and can
21 89
Id.
22
90
Id.
91
Von Tristan Harris, The Slot Machine in Your Pocket, Spiegel Int’l (July 27, 2016).
23
92
Jacqueline Nesi & Mitchell J. Prinstein, Using Social Media for Social Comparison and Feedback-Seeking:
Gender and Popularity Moderate Associations with Depressive Symptoms, 43 J. Abnormal Child Psych. 1427–38
24 (2015).
93
Susan Harter, The Construction of the Self: Developmental and Sociocultural Foundations, (Guilford Press, 2d ed.,
25 2012) (explaining how, as adolescents move toward developing cohesive self-identities, they typically engage in
greater levels of social comparison and interpersonal feedback-seeking).
26
94
Zara Abrams, Why young brains are especially vulnerable to social media, Am. Psych. Ass’n (Aug. 25, 2022).
95
Id.
27
96
A “Snap Streak” is designed to measure a user’s Snapchat activity with another user. Two users achieve a “Snap
Streak” when they exchange at least one Snap in three consecutive 24-hour periods. When successively longer
28 “Streaks” are achieved, users are rewarded with varying tiers of emojis.
97
Zara Abrams, Why young brains are especially vulnerable to social media, Am. Psych. Ass’n (Aug. 25, 2022).
1 neurologically alter their perception of online posts. Researchers at UCLA used magnetic resonance
2 imaging to study the brains of teenage girls as they used Instagram. They found that girls’
3 perception of a photo changed depending on the number of likes it had generated. 98 That an image
4 was highly liked—regardless of its content—instinctively caused the girls to prefer it. As the
5 researchers put it, teens react to perceived “endorsements,” even if likes on social media are often
6 fake, purchased, or manufactured. 99
7 106. The design of Defendants’ apps also encourages unhealthy, negative social comparisons,
8 which in turn cause body image issues and related mental and physical disorders. Given
9 adolescents’ naturally vacillating levels of self-esteem, they are already predisposed to comparing
10 “upward” to celebrities, influencers, and peers they perceive as more popular. 100 Defendants’ apps
11 turbocharge this phenomenon. On Defendants’ apps, users disproportionately post “idealized”
12 content, 101 misrepresenting their lives. That is made worse by appearance-altering filters built into
13 Defendants’ apps, which underscore conventional (and often racially biased) standards of beauty,
14 by allowing users to remove blemishes, make bodies and faces appear thinner, and lighten skin-
15 tone. Defendants’ apps provide a continuous stream of these filtered and fake appearances and
16 experiences. 102 That encourages harmful body image comparisons by adolescents, who begin to
17 negatively perceive their own appearance and believe their bodies, and indeed their lives, are
18 comparatively worse. 103
19 98
Lauren E. Sherman et al., The Power of the Like in Adolescence: Effects of Peer Influence on Neural and
20 Behavioral Responses to Social Media, 27(7) Psychol Sci. 1027 (2016).
99
Id.
21
100
Jacqueline Nesi & Mitchell J. Prinstein, Using Social Media for Social Comparison and Feedback-Seeking:
Gender and Popularity Moderate Associations with Depressive Symptoms, 43 J. Abnormal Child Psych. 1427–38
22 (2015). (“Upward comparison occurs when people compare themselves to someone they perceive to be superior[ ],
whereas a downward comparison is defined by making a comparison with someone perceived to be inferior[.]”); Jin-
23 Liang Wang, Hai-Zhen Wang, James Gaskin, & Skyler Hawk, The Mediating Roles of Upward Social Comparison
and Self-esteem and the Moderating Role of Social Comparison Orientation in the Association between Social
24 Networking Site Usage and Subjective Well-Being, Frontiers in Psychology (May 2017).
101
Jacqueline Nesi & Mitchell J. Prinstein, Using Social Media for Social Comparison and Feedback-Seeking:
25 Gender and Popularity Moderate Associations with Depressive Symptoms, 43 J. Abnormal Child Psych. 1427–38
(2015).
26
102
Jin Kyun Lee, The Effects of Social Comparison Orientation on Psychological Well-Being in Social Networking
Sites: Serial Mediation of Perceived Social Support and Self-Esteem, Current Psychology (2020).
27
103
Id.; see also Nino Gugushvili et al., Facebook use intensity and depressive symptoms: a moderated mediation
model of problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (2022).
28 (explaining that youth are particularly vulnerable because they “use social networking sites for construing their
identity, developing a sense of belonging, and for comparison with others”); Jin Lee, The effects of social comparison
1 107. Fifth, Defendants’ respective product features work in combination to create and
2 maintain a user’s “flow-state”: a hyper-focused, hypnotic state, where bodily movements are
3 reflexive and the user is totally immersed in smoothly rotating through aspects of the social media
4 product. 104
5 108. As discussed in more detail below, unreasonably dangerous features like the ones just
6 described can cause or contribute to the following injuries in young people: eating and feeding
7 disorders; depressive disorders; anxiety disorders; sleep disorders; trauma- and stressor-related
8 disorders; obsessive-compulsive and related disorders; disruptive, impulse-control, and conduct
9 disorders; suicidal ideation; self-harm; and suicide. 105
10 3. Millions of kids use Defendants’ products compulsively.

11 109. Defendants have been staggeringly successful in their efforts to attract young users to

12 their apps. In 2021, 32% of 7- to 9-year-olds, 106 49% of 10- to 12-year-olds, 107 and 90% of 13- to
13 17-year-olds in the United States used social media. 108 A majority of U.S. teens use Instagram,

14 TikTok, Snapchat, and/or YouTube. Thirty-two percent say they “wouldn’t want to live without”

15 YouTube, while 20% said the same about Snapchat, and 13% said the same about both TikTok and

16 Instagram. 109

17 110. U.S. teenagers who use Defendants’ products are likely to use them every day. Sixty-

18 two percent of U.S. children ages 13-18 use social media daily. 110 And daily use often means

19 constant use. About one-in-five U.S. teens visit or use YouTube “almost constantly,” while about

20

21
orientation on psychological well-being in social networking sites: serial mediation of perceived social support and
22 self-esteem, 41 Current Psychology 6247-6259 (2022).
104
See e.g., What Makes TikTok so Addictive?: An Analysis of the Mechanisms Underlying the World’s Latest Social
23 Media Craze, Brown Undergraduate J. of Pub. Health (2021) (describing how IVR and infinite scrolling may induce
a flow state in users).
24
105
E.g., Nino Gugushvili et al., Facebook use intensity and depressive symptoms: a moderated mediation model of
problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (2022) (collecting sources).
25
106
Sharing Too Soon? Children and Social Media Apps, C.S. Mott Child’s Hosp. Univ. Mich. Health (Oct. 18,
2021).
26
107
Id.
108
Social Media and Teens, Am. Acad. Child & Adolescent Psychiatry (Mar. 2018); see also Victoria Rideout et al.,
27 The Common Sense Census: Media Use by Tweens and Teens, 2021 at 5, Common Sense Media (2022).
109
Victoria Rideout et al., Common Sense Census: Media use by tweens and teens, 2021 at 31, Common Sense
28 Media (2022).
110
Id.
1 one-in-six report comparable usage of Instagram. 111 Nearly half of U.S. teens use TikTok at least
2 “several times a day.” 112 In one study, U.S. teenage users reported checking Snapchat thirty times
3 a day on average. 113
4 111. Social media use among Tribal youth is compulsive. The 2020 Native Youth Health Tech
5 Survey determined that 65.3% of tribal youth (15-24) are on social media 3-7 hours per day, “with
6 86.0% reporting their primary activity on social media as scrolling, followed by watching videos
7 (75.1%).” 114 The same survey found that “the most popular daily technology use among AI/AN
8 youth involved browsing Instagram (74.0%), sending/receiving snap messages via Snapchat
9 (60.0%), using TikTok (50.4%), and watching videos on YouTube (48.4%).” 115
22 112. Teenagers know they are addicted to Defendants’ products: 36% admit they spend too
23 much time on social media. 116 Yet they can’t stop. Of the teens who use at least one social media
24

25
111
Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022).
112
Id.
26
113
Erinn Murphy et al., Taking Stock with Teens, Fall 2021 at 13, Piper Sandler (2021); see also Emily Vogels et al.,
Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022).
27
114
Reed et al., Findings from the 2020 Native Youth Health Tech Survey, Am. Indian & Alsk. Native Ment. Health Res.
(2022).
28
115
Id.
116
Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022).
1 product “almost constantly,” 71% say quitting would be hard. Nearly one-third of this population—
2 and nearly one-in-five of all teens—say quitting would be “very hard.” 117
3 113. Notably, the more teens use Defendants’ apps, the harder it is to quit. Teens who say
4 they spend too much time on social media are almost twice as likely to say that giving up social
5 media would be hard, compared to teens who see their social media usage as about right. 118
6 114. Despite using social media frequently, most young people don’t particularly enjoy it. In
7 2021, only 27% of boys and 42% of girls ages 8-18 reported liking social media “a lot.” 119
8 Moreover, one survey found that young people think social media is the main reason youth mental
9 health is getting worse. 120 The surveyed youth were about twice as likely to name social media as
10 the main reason for declining mental health as they were to name the next most common cause, and
11 more than seven times as likely to name it over drugs and alcohol. 121
12 4. Defendants’ unreasonably dangerous products encourage risky “challenges.”
13 115. Dangerous “viral” challenges are one particularly pernicious result of the unreasonably

14 dangerous design of Defendants’ apps. “Online challenges or dares typically involve people

15 recording themselves doing something difficult, which they share online to encourage others to

16 repeat.” 122 These challenges often generate significant engagement—sometimes millions of likes

17 or views—and resulting social rewards to the users who post videos of themselves carrying out the

18 challenges. Predictably, a substantial portion of online challenges—created for the purpose of

19 generating social rewards—are very dangerous.

20 116. For example, one common social media challenge is the “Blackout Challenge,” where

21 youth are encouraged to make themselves faint by holding their breath and constricting their chest

22 muscles, or by restricting airflow with a ligature around their neck. This challenge is dangerous

23 because, should the participant fail to remove the ligature around their neck prior to fainting, they

24 may strangle themselves. Similarly, an “I Killed Myself” challenge involves participants faking

25 117
Id.
26
118
Id.
119
Victoria Rideout et al., Common Sense Census: Media use by tweens and teens, 2021 at 34, Common Sense
27 Media (2022).
120
Headspace, National Youth Mental Health Survey 2018, National Youth Mental Health Foundation (2018).
28
121
Id.
122
TikTok, Online Challenges.
1 their own deaths to record their family members’ reactions upon believing their loved one has died.
2 This challenge is dangerous because certain methods of participating can kill participants or inflict
3 catastrophic injury on them. Likewise, the game Russian Roulette—in which a participant
4 loads a revolver with a single round, spins the cylinder so that it stops at a random position, and then
5 pulls the trigger with the muzzle pointed at themselves—has taken on new life as a social media challenge.
6 117. Again, these injuries and deaths are a foreseeable consequence of Defendants’ addictive
7 product designs. Many other addictive products cause injury or death because neuroadaptation
8 causes addicts to use increasingly extreme methods to maintain dopamine levels. That compulsive
9 use of social media would do the same was, at all relevant times, foreseeable, particularly as to
10 young users whose abilities to assess risks, make wise decisions, and regulate their responses to
11 perceived social needs are still developing.
12 118. Defendants are perfectly aware of the foreseeable risks to youth presented by their apps’
13 “viral” promotion of dangerous challenges.


14 119. Defendants have encouraged the viral challenge phenomenon even though that
15 encouragement furthers both the dangerous challenges themselves and the broader ecosystem in
16 which those dangerous challenges occur.
17 120. Meta, TikTok, and YouTube use engagement-optimized algorithms to control users’
18 main feeds. Such algorithms spread extreme content as a consequence of its propensity to generate
19 engagement. That unreasonably dangerous design feature foreseeably leads to dangerous
20 challenges spreading easily on these Defendants’ platforms.
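As an illustration only, the engagement-optimized ranking alleged in this paragraph can be sketched as follows in Python. The scoring weights, field names, and the Post structure are hypothetical assumptions, not any Defendant's actual ranking model; the sketch shows only how an objective built around predicted engagement will surface extreme or challenge content whenever that content is expected to hold attention and be reshared.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_watch_seconds: float  # model's estimate of how long this user will watch
    predicted_reshares: float       # expected reshares by this user
    predicted_reports: float        # expected "this is harmful" reports

def engagement_score(post: Post) -> float:
    """Hypothetical objective: heavily reward watch time and reshares; a small
    penalty on predicted reports is easily outweighed by strong engagement."""
    return post.predicted_watch_seconds + 5.0 * post.predicted_reshares - 2.0 * post.predicted_reports

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order a user's main feed purely by predicted engagement, highest first."""
    return sorted(candidates, key=engagement_score, reverse=True)
```

Under such an objective, a dangerous "challenge" video with high predicted watch time and reshares outranks benign content even when the model expects some users to report it.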
21 121. Defendants have further encouraged challenges in other ways. ByteDance regularly
22 creates overlays and filters that facilitate viral challenges. It offers advertisers the ability to launch
23 Branded Hashtag Challenges and promotes them on user feeds. 123 It boasts that challenges are
24 “geared towards building awareness and engagement,” and that “research shows that they can
25 deliver strong results”—i.e., increased return on ad spending—“at every stage of the funnel.” This,
26 in turn, generates advertising revenue for ByteDance.
27

28 123
TikTok for Business, Branded Hashtag Challenge: Harness the Power of Participation, (Mar. 16, 2022).
1 122. Snap also promotes viral challenges through Snapchat’s Snap Spotlight feature. It gives
2 cash prizes to challenge participants whose challenges receive the most views on Snap Spotlight. 124
3 It has also created overlays that encourage challenges—such as a Speed Filter, showing how fast a
4 given user was going at the time they took a particular Snap. 125 Other Defendants have also
5 promoted viral challenges based on a recognition that such challenges drive engagement and
6 advertising revenue.
7 5. Defendants’ unreasonably dangerous social media apps facilitate and
contribute to the sexual exploitation and sextortion of children, and the ongoing
8
production and spread of child sex abuse material online.
9

10 123. It is well documented that sexual predators use Defendants’ products to target and exploit

11 minors. They are drawn to social media because it provides them with easy access to a large pool

12 of potential victims, many of whom are addicted to Defendants’ products. On February 7, 2023,
13 for example, the FBI and international law enforcement agencies issued a joint warning about the

14 global “financial sextortion crisis” which stated: “Financial sextortion can happen anywhere,

15 although it mainly occurs on the digital platforms where children are already spending their screen

16 time, like social media and gaming websites, or video chat applications. On these platforms,

17 predators often pose as girls of a similar age and use fake accounts to target young boys, deceiving

18 them into sending explicit photos or videos. The predator then threatens to release the

19 compromising materials unless the victim sends payment, however, in many cases, the predator

20 will release the images anyway.” 126

21 124. Rather than mitigate the risk of sexual exploitation and harm to children and teens on

22 their products, Defendants have facilitated and exacerbated it by implementing unreasonably

23 dangerous product features that help sexual predators connect with children. Defendants’ products

24 are designed in unsafe ways—including, as to some or all Defendants, flawed age verification, lack

25 of meaningful mechanisms to prevent sham accounts, default-public profiles, matching and

26 recommending connections between adults and minors, promoting unsolicited messages and

27 124
“Snapchat adds ‘challenges’ with cash prizes to its TikTok competitor,” The Verge, Oct. 6, 2021.
28
125
See Lemmon v. Snap, Inc., 995 F.3d 1085, 1085 (9th Cir. 2021).
126
International Law Enforcement Agencies Issue Joint Warning about global financial sextortion crisis, FBI (2023).
1 interactions from adults, and wholly inadequate and ineffective parental controls, among others—
2 that allow children to be easily identified, targeted, accessed, and exploited.
3 125. Compounding the problem, Defendants routinely fail to report abuse. Steve Grocki, the
4 U.S. Department of Justice’s Chief of the Criminal Division’s Child Exploitation & Obscenity
5 Section, put it this way: “A lot of times, someone has come across something problematic and the
6 platform isn’t doing anything[.]” He went on to say, “[t]hese reports can be of great value because
7 they signal where there are big problems and we can flag those issues to Internet companies, such
8 as when products are being exploited by offenders, they aren’t meeting reporting requirements, or
9 when children under the age restriction are accessing inappropriate content.” 127
10 126. By failing to implement adequate age and identity verification on the products,
11 Defendants knowingly and foreseeably place children in the pathways of sexual predators, who
12 utilize their products and exploit their unreasonably dangerous design features. Age verification on
13 Defendants’ products can be unreasonably dangerous for a variety of reasons, including:


14 a. Inaccurate information: Users can easily enter false information, such as a fake date
15 of birth, to bypass age restrictions;
16 b. No measures to prevent sham accounts: Defendants’ products are structured so that
17 users can easily create multiple accounts under different names and ages; and
18 c. Incomplete implementation: Defendants’ products only partially implement age
19 verification, such as requiring users to provide their date of birth but not verifying it
20 through any means.
21 127. To be sure, Defendants possess the technology to identify minors who are posing as
22 adults and adults who are posing as minors, but they do not use this capability to identify violative
23 accounts and remove them from their products.
24 128. For example, by making minors’ profiles public by default, certain Defendants have
25 supplied sexual predators with detailed background information about children, including their
26 friends, activities, interests, and even location. By providing this information, these Defendants put
27 children at risk of being approached and befriended by sexual predators. These young targets had
28 127
Equality Now, Steve Grocki Expert Interview - United States, (Nov. 15, 2021).
1 no idea that their posts, friends, pictures, and general online sharing represented a windfall of
2 information that a predator could use to sexually exploit or abuse them.
3 129. Meta, ByteDance, and Google have been aware for years of the very real risks that
4 public-by-default accounts for minors cause by leaving minors broadly exposed to adult strangers,
5 and yet they have only recently (in response to litigation) begun to create default privacy settings
6 for youth accounts.
7 130. Defendants have also lagged behind in restricting who can reach minor users through
8 Direct Message features.
9 131. Defendants’ parental controls are also unreasonably dangerous in that they do not give
10 parents effective control over their children’s online activity, which can lead to kids connecting
11 with predators. These design features take several forms:
12 a. Limited scope: parental control tools only partially cover a child’s activity, leaving
13 gaps that predators can exploit;


14 b. Inadequate monitoring: parental control tools do not provide real-time monitoring
15 of a child’s activity, meaning that harmful interactions with predators go unnoticed;
16 c. Lack of customization: parents are not able to fully customize their parental control
17 settings to meet the specific needs of their family and children, leaving them with a
18 one-size-fits-all solution that is wholly ineffective in preventing connections with
19 predators; and
20 d. Opt-in format: parental control tools that require parents to affirmatively link to their
21 child’s account are futile when parents are not notified that their minor child has
22 opened an account on the platform in the first place.
23 132. Defendants are well aware that vulnerable young users—whom Defendants induce to
24 spend large amounts of time on their products, through a powerful combination of algorithmic
25 recommendations and addictive features designed to make it hard for a user to disengage—are
26 uniquely susceptible to grooming by seasoned sexual predators. Research shows that young users
27 are heavily reliant on their social connections—exploring and shaping their identity through their
28 social relationships.
1 133. Defendants’ unreasonably dangerous product features have benefited sexual predators in
2 many other ways as well. For example, sexual predators leverage Defendants’ use of ephemeral,
3 “disappearing” technology to assure young users that there is no risk to them sending a sexual photo
4 or video. Trusting young users are then horrified to discover that these videos have been captured
5 by predators and then circulated to their own friends and contacts or other sexual predators. In some
6 severe cases, young users find themselves in the nightmarish scheme known as “sextortion,” where
7 a predator threatens to circulate the sexual images of the minor unless the predator is paid to keep
8 the images under wraps. Law enforcement agencies across the country report that this scheme has
9 become pervasive on Defendants’ products.
10 C. META MARKETS AND DESIGNS FACEBOOK AND INSTAGRAM TO
ADDICT YOUNG USERS, SUBSTANTIALLY CONTRIBUTING TO THE
11 MENTAL HEALTH CRISIS.
12 1. Background and overview of Meta’s products.
13 134. Meta operates and designs Facebook and Instagram, two of the world’s most popular

14 social media products. In 2022, two billion users worldwide were active on Instagram each month,

15 and almost three billion were monthly active users of Facebook. 128 This enormous reach has been

16 accompanied by enormous damage for adolescent users.

17 135. Meta understands that its products are used by kids under 13. It understands that its

18 products are addictive. It understands that addictive use leads to problems. And it understands that

19 these problems can be so extreme as to include encounters between adults and minors—encounters
20 driven largely by a single unreasonably dangerous design feature. 129
20 largely by a single unreasonably dangerous design feature. 129

21 136. Despite this knowledge, and rather than stand up a five-alarm effort to stop the problems

22 created by its products, Meta has abjectly failed at protecting child users of Instagram and

23 Facebook.

24 137. Meta has done next to nothing. And its reason was simple: growth. Taking action would

25 lower usage of (and therefore lower profits earned from) a critical audience segment.

26

27

28
128
Alex Barinka, Meta’s Instagram Users Reach 2 Billion, Closing In on Facebook, Bloomberg (Oct. 26, 2022).
129
This defect, “People You May Know,” allows adults to connect with minors on Facebook and Instagram.
1 138. Meta’s frequent gestures towards youth safety were never serious and always driven by
2 public relations: Meta offered tools to kids and parents, like “time spent,” that it knew presented
3 false data. At the same time, Meta engaged in a cynical campaign to counter-message around the
4 addiction narrative by discrediting existing research as completely made up. Meta knew better.
5 Indeed, both Zuckerberg and Instagram CEO Adam Mosseri heard firsthand from a leading
6 researcher that Instagram and Facebook posed unique dangers to young people.
7 a. Facebook’s acquisition and control of Instagram.

8 139. On or around April 6, 2012, Zuckerberg called Kevin Systrom, one of the co-founders
9 of Instagram, offering to purchase his company. 130
10 140. Instagram launched as a mobile-only app that allowed users to create, filter, and share
11 photos. On the first day of its release in October 2010, it gained a staggering 25,000 users. 131 By
12 April 2012, Instagram had approximately 27 million users. When Instagram released an Android
13 version of its app—right around the time of Zuckerberg’s call—it was downloaded more than a
14 million times in less than a day. 132 Instagram's popularity is so widespread and image-based that a
15 new term has grown up around it for the perfect image or place: "Instagrammable." 133
16 141. On April 9, 2012, just days after Zuckerberg’s overture to Systrom, Facebook, Inc.
17 purchased Instagram, Inc. for $1 billion in cash and stock. This purchase price was double the
18 valuation of Instagram implied by a round of funding the company closed days earlier. 134
19 142. Facebook, Inc. held its initial public offering less than two months after acquiring
20 Instagram, Inc. 135
21 143. Zuckerberg’s willingness to pay a premium for Instagram was driven by his instinct that
22 Instagram would be vital to reaching a younger, smartphone-oriented audience—and thus critical
23 to his company’s going-forward success.
24 130
Nicholas Carlson, Here’s The Chart That Scared Zuckerberg Into Spending $1 Billion On Instagram, Insider
25 (Apr. 14, 2012).
131
Dan Blystone, Instagram: What It Is, Its History, and How the Popular App Works, Investopedia (Oct. 22, 2022).
26
132
Kim-Mai Cutler, From 0 to $1 billion in two years: Instagram's rose-tinted ride to glory, TechCrunch (Apr. 9,
2012).
27
133
Sarah Frier, No Filter 81 (2020).
134
Alexia Tsotsis, Right Before Acquisition, Instagram Closed $50M At A $500M Valuation From Sequoia, Thrive,
28 Greylock And Benchmark, TechCrunch (Apr. 9, 2012).
135
Evelyn Rusli & Peter Eavis, Facebook Raises $16 Billion in I.P.O., N.Y. Times (May 17, 2012).
1 144. This was prescient. Instagram’s revenue grew exponentially from 2015 to 2022. 136 A
2 study conducted in the second quarter of 2018 showed that, over the prior year, advertisers’
3 spending on Instagram grew by 177%—more than four times the growth of ad spending on
4 Facebook. 137 Likewise, visits to Instagram rose by 236%, nearly thirty times the growth in site
5 visits experienced by Facebook during the same period. 138 By 2021, Instagram accounted for over
6 half of Meta’s $50.3 billion in net advertising revenues. 139
7 145. Meta has claimed credit for Instagram’s success since its acquisition. Zuckerberg told
8 market analysts that Instagram “wouldn’t be what it is without everything that we put into it,
9 whether that’s the infrastructure or our advertising model.” 140
10 146. Instagram has become the most popular photo-sharing social media product among
11 teenagers and young adults in the United States. Sixty-two percent of American teens use Instagram, with 10%
12 of users reporting that they use it “almost constantly.” 141 Instagram’s young user base has become
13 even more important to Meta as the number of teens using Facebook has decreased over time. 142
14 147. Facebook’s and Instagram’s success, and the riches they have generated for Meta, have
15 come at an unconscionable cost in human suffering. In September 2021, The Wall Street Journal
16 began publishing internal documents leaked by former Facebook product manager Frances
17 Haugen. 143
18 148. The documents are disturbing. They reveal that, according to Meta’s researchers, 13.5%
19 of U.K. girls reported more frequent suicidal thoughts and 17% of teen girls reported worsening
20

21
136
See Josh Constine, Instagram Hits 1 Billion Monthly Users, Up From 800M in September, TechCrunch (June 20,
2018). (showing meteoric rise in monthly active users over period and reporting year-over-year revenue increase of
22 70% from 2017-2018).
137
Merkle, Digital Marketing Report 3 (Q2 2018).
23
138
Id.
139
Sara Lebow, For the First Time, Instagram Contributes Over Half of Facebook’s US Ad Revenues, eMarketer
24 (Nov. 2, 2021).
140
Salvador Rodriguez, Mark Zuckerberg Is Adamant that Instagram Should Not Be Broken Off from Facebook,
25 CNBC (Oct. 20, 2019).
141
Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022); see also
26 Piper Sandler, Taking Stock With Teens 19 (Fall 2021). (eighty-one percent of teens use Instagram at least once a
month).
27
142
Sheera Frenkel et al., Instagram Struggles with Fears of Losing Its ‘Pipeline’: Young Users, N.Y. Times (Oct. 26,
2021).
28
143
The collection of Wall Street Journal articles are available online via the following link:
https://www.wsj.com/articles/the-facebook-files-11631713039?mod=bigtop-breadcrumb.
1 eating disorders after starting to use Instagram. 144 Over 40% of Instagram users who reported
2 feeling “unattractive” said that feeling began while using Instagram, 145 and 32% of teen girls who
3 already felt bad about their bodies felt even worse because of the app. 146
4 149. Internal Meta presentations from 2019 and 2020 were unsparing in their conclusions
5 about the harms caused by Instagram: “We make body image issues worse for one in three teen
6 girls." "Mental health outcomes related to this can be severe." "Aspects of Instagram exacerbate
7 each other to create a perfect storm.” 147
8 150. Haugen’s revelations made clear to the public what Meta has long known: in an effort to
9 addict kids and promote usage, Meta’s products exploit the neurobiology of developing brains, and
10 all the insecurities, status anxieties, and beauty comparisons that come along with it. In a bid for
11 higher profits, Meta ignored the harms resulting from its overuse-oriented business model, which
12 are widespread, serious, long-term, and in tragic instances fatal.
13 2. Meta intentionally encourages youth to use its products and then leverages that
usage to increase revenue.
14

15 151. Facebook and Instagram owe their success to their unreasonably dangerous design,

16 including their underlying computer code and algorithms. Meta’s tortious conduct begins before a

17 user has viewed, let alone posted, a single scrap of content.

18 152. Meta describes the Instagram product as a “mobile-first experience.” 148 Indeed, the great

19 majority of Instagram users in the U.S. access Instagram through a mobile application for either the

20 iOS or Android operating system.

21 153. In order to use the Facebook or Instagram app, one must first obtain it. On a mobile

22 device, this is accomplished by visiting a store from which the product can be downloaded—either

23 the Apple App Store (for iPhone users) or the Google Play Store (for Android users). Once installed

24 144
Morgan Keith, Facebook’s Internal Research Found its Instagram Platform Contributes to Eating Disorders and
25 Suicidal Thoughts in Teenage Girls, Whistleblower Says, Insider (Oct. 3, 2021).
145
Georgia Wells, Jeff Horwitz, Deepa Seetharaman, Facebook Knows Instagram is Toxic for Teen Girls, Company
26 Documents Show, Wall St. J. (Sept. 14, 2021); Facebook Staff, Teen Girls Body Image and Social Comparison on
Instagram – An Exploratory Study in the U.S., 9 (Mar. 26, 2020).
27
146
Billy Perrigo, Instagram Makes Teen Girls Hate Themselves. Is That a Bug or a Feature?, Time (Sept. 16, 2021).
147
Georgia Wells, Jeff Horwitz, Deepa Seetharaman, Facebook Knows Instagram is Toxic for Teen Girls, Company
28 Documents Show, Wall St. J. (Sept. 14, 2021).
148
Yorgos Askalidis, Launching Instagram Messaging on Desktop, Instagram (Sept. 25, 2020).
1 onto an individual’s smartphone, they can open the app. They are then asked to create a new account
2 by entering an email address, adding a name, and creating a password and username.
3 154. A prospective Instagram or Facebook user is then invited to press a colorful button that
4 says “Sign up.” In small print above this button, the user is informed: “By tapping Sign up, you
5 agree to our Terms, Data Policy and Cookies Policy.” The text of those policies is not presented on
6 the sign-up page. While the words “Terms,” “Data Policy,” and “Cookies Policy” are slightly
7 bolded, the user is not informed that they can or should click on them, or otherwise told how they
8 can access the policies.

23 155. Meta’s Data Policy (rebranded as a “Privacy Policy” in 2022), which applies to a raft of
24 Meta apps, including Facebook and Instagram, 149 indicates Meta collects a breathtaking amount of
25 data from the users of its products, including:
26 a. “[c]ontent that you create, such as posts, comments or audio;”
27

28 149
Meta, Privacy Policy, Meta (Jan. 1, 2023).
1 b. “[c]ontent you provide through our camera feature or your camera roll settings, or
2 through our voice-enabled features;”
3 c. “[I]nformation you've shared with us through device settings, such as GPS location,
4 camera access, photos and related metadata;”
5 d. “[m]essages that you send and receive, including their content;”
6 e. “Metadata about content and messages;”
7 f. “[t]ypes of content that you view or interact with, and how you interact with it;”
8 g. “[t]he time, frequency and duration of your activities on our products;”
9 h. “your contacts' information, such as their name and email address or phone number,
10 if you choose to upload or import it from a device, such as by syncing an address
11 book;”
12 i. information about “What you're doing on your device (such as whether our app is
13 in the foreground or if your mouse is moving);”


14 j. “device signals from different operating systems,” including “things such as nearby
15 Bluetooth or Wi-Fi connections;”
16 k. “[i]nformation about the network that you connect your device to,” which includes
17 “The name of your mobile operator or Internet service provider (ISP), Language,
18 Time zone, Mobile phone number, IP address, Connection speed, Information about
19 other devices that are nearby or on your network, Wi-Fi hotspots you connect to
20 using our products;” and
21 l. “information from . . . third parties, including . . . [m]arketing and advertising
22 vendors and data providers, who have the rights to provide us with your
23 information.”
24 156. While the Data Policy indicates the scope of user information collected by Meta through
25 Facebook and Instagram, it is far less forthcoming about the purposes for which this data is
26 collected, and its consequences for younger users.
27

28
1 157. The Data Policy presents those goals as benign and even positive for its users—“to
2 provide a personalized experience to you” and to “make suggestions for you such as people you
3 may know, groups or events that you may be interested in or topics that you may want to follow.”
4 158. The Data Policy does not inform users that the more time individuals spend using
5 Facebook and Instagram, the more ads Meta can deliver and the more money it can make, or that
6 the more time users spend on Facebook and Instagram, the more Meta learns about them, and the
7 more it can sell to advertisers the ability to micro-target highly personalized ads. 150
8 159. Meta monetizes its users and their data by selling ad placements to marketers. Meta
9 generated $69.7 billion from advertising in 2019, more than 98% of its total revenue for the year. 151
10 160. Given its business model, Meta has every incentive to—and knowingly does—addict
11 users to Facebook and Instagram. It accomplishes this through the algorithms that power its apps,
12 which are designed to induce compulsive and continuous scrolling for hours on end, operating in
13 conjunction with the other unreasonably dangerous features described throughout this
14 Complaint. 152
15 161. Meta’s Data Policy contains no warnings whatsoever that use of its products at the
16 intensity and frequency targeted by Meta creates known risks of mental, emotional, and behavioral
17 problems for children, Instagram’s key audience.
18

19

20
150
Nor does it inform users that Meta has allowed third-party apps to harvest from Facebook “vast quantities of
highly sensitive user and friends permissions.” In re Facebook, Inc., No. 18-md-02843-VC, ECF No. 1104 at 9 (N.D.
21 Cal. Feb. 9, 2023). This has included an app called Sync.Me, which—according to Meta’s internal investigative
documents—“had access to many ‘heavyweight’ permissions,” “including the user’s entire newsfeed, friends’ likes,
22 friends’ statuses, and friends’ hometowns.” In re Facebook, Inc., No. 18-md-02843-VC, ECF No. 1104 at 9 (N.D.
Cal. Feb. 9, 2023). It has included Microstrategy, Inc., which accessed data from “16 to 20 million” Facebook users,
23 despite only being installed by 50,000 people. In re Facebook, Inc., No. 18-md-02843-VC, ECF No. 1104 at 9 (N.D.
Cal. Feb. 9, 2023). And it has included one Yahoo app that made “billions of requests” for Facebook user
24 information, including “personal information about those users’ friends, including the friends’ education histories,
work histories, religions, politics, ‘about me’ sections, relationship details, and check-in posts.” In re Facebook, Inc.,
25 No. 18-md-02843-VC, ECF No. 1104 at 9-10 (N.D. Cal. Feb. 9, 2023).
151
Rishi Iyengar, Here’s How Big Facebook’s Ad Business Really Is, CNN (July 1, 2020).
26
152
See Christian Montag, et al., Addictive Features of Social Media/Messenger Platforms and Freemium Games
against the Background of Psychological and Economic Theories, 16 Int’l J. Env’t Rsch. and Pub. Health 2612, 5
27 (July 2019), (“One technique used to prolong usage time in this context is the endless scrolling/streaming feature.”);
see generally, Ludmila Lupinacci, ‘Absentmindedly scrolling through nothing’: liveness and compulsory continuous
28 connectedness in social media, 43 Media, Culture & Soc’y 273 (2021), (describing the ways that users use and
experience social media apps).
1 162. Instagram’s collection and utilization of user data begins the instant a user presses “Sign
2 Up.” At that point, Instagram prompts a new user to share a substantial amount of personal data.
3 First, Instagram asks the user to share their personal contacts, either by syncing contacts from their
4 phone and/or syncing their “Friends” from Facebook—“We’ll use your contacts to help you find
5 your friends and help them find you.” Next, Instagram asks the new user to upload a photo of
6 themselves. After that, Instagram asks the user to “Choose your interests” in order to “Get started
7 on Instagram with account recommendations tailored to you.” And finally, Instagram invites the
8 new user to “Follow accounts to see their photos and videos in your feed,” offering a variety of
9 recommendations. After sign-up is completed, Instagram prompts the new user to post either a
10 photo or a short video.
11 163. Meta’s collection and utilization of user data continues unabated as a new user begins to
12 interact with its products. Meta’s tracking of behavioral data—ranging from what the user looks at,
13 to how long they hover over certain images, to what advertisements they click on or ignore—helps
14 Meta build out a comprehensive and unique fingerprint of the user’s identity. As the user continues
15 to use the product, Meta’s algorithm works silently in the background to refine this fingerprint, by
16 continuously monitoring and measuring patterns in the user’s behavior. Meta’s algorithm is
17 sophisticated enough that it can leverage existing data to draw educated inferences about even the
18 user behavior it does not track firsthand. Meta’s comprehensive data collection allows it to target
19 and influence its users in order to increase their “engagement” with its apps.
20 164. Meta’s collection and analysis of user data allows it to assemble virtual dossiers on its
21 users, covering hundreds if not thousands of user-specific data segments. This, in turn, allows
22 advertisers to micro-target marketing and advertising dollars to very specific categories of users,
23 who can be segregated into pools or lists using Meta’s data segments. Only a fraction of these data
24 segments come from content knowingly designated by users for publication or explicitly provided
25 by users in their account profiles. Many of these data segments are collected by Meta through covert
26 surveillance of each user’s activity while using the product and when logged off the product,
27 including behavioral surveillance that users are unaware of, like navigation paths, watch time, and
28 hover time. Essentially, the larger Meta’s user database grows, the more time users spend on
1 its products, and the more detailed information Meta can extract from its users, the more
2 money it makes.
3 165. Currently, advertisers can target Instagram and Facebook ads to young people based on
4 age, gender, and location. 153 According to U.S.-based non-profit Fairplay, Meta did not actually
5 cease collecting data from teens for advertising in July 2021, as Meta has claimed. 154
6 166. Meta clearly understands the revenue and growth potential presented by its youngest
7 users, and it is desperate to retain them. Documents obtained by The New York Times indicate that,
8 since 2018, almost all of Instagram’s $390 million global marketing budget has gone towards
9 showing ads to teenagers. 155
10 167. Before the rise of Instagram, Facebook was the social media product by which Meta
11 targeted young users. Until recently, when youth Facebook usage began to drop, this targeting was
12 devastatingly effective.
13 168. While the number of teen Facebook users has declined in recent years, Facebook remains
14 critical to Meta’s strategy towards young users. Meta views Facebook as the nexus of teen users’
15 lives on social media, and as filling a similar role for such users as the career-focused social media
16 product LinkedIn fills for adults.
17 169. To create this cycle, and notwithstanding restrictions under COPPA, Meta has targeted
18 kids as young as six. The centerpiece of these efforts is Messenger Kids (“MK”). 156
19 170. Meta was also eager to market its products to tweens—users aged 10-12. Although Meta
20 employees publicly denied using children as “guinea pigs” to develop product features, internally
21 Meta was intensely interested in children’s use of its apps. 157
22 171. Meta has studied features and designs from its other products to make Instagram as
23 attractive and addictive as possible to young users. Meta’s flagship product Facebook was the
24 original testing ground for many of Instagram’s addicting and unreasonably dangerous features,
25 153
Andrea Vittorio, Meta’s Ad-Targeting to Teens Draws Advocacy Group Opposition, Bloomberg (Nov. 16, 2021).
26
154
Id.
155
Sheera Frenkel, et al, Instagram Struggles With Fears of Losing Its ‘Pipeline’: Young Users, N.Y. Times (Oct. 16,
27 2021).
156
Nick Statt, Facebook launches a version of Messenger for young children, The Verge (Dec. 4, 2017).
28
157
John Twomey, Molly Russell Inquest Latest: Teenager Viewed Suicide Videos of ‘Most Distressing Nature’,
Express (Sept. 23, 2022).
1 which the two products share to this day. This feature overlap is no accident: it represents a
2 conscious strategy adopted by Meta to keep social media users hooked on its “family” of products
3 for their entire lives.
4 172. From the beginning, both the Facebook and Instagram products have exploited
5 vulnerabilities in human psychology to addict users and maximize user time and engagement.
6 Facebook’s first President, Sean Parker, summed up the devastating impact of this product design
7 in a 2017 interview:
8 God only knows what it's doing to our children’s brains. . . . The
thought process that went into building these applications, Facebook
9 being the first of them, . . . was all about: ‘How do we consume as
10 much of your time and conscious attention as possible?’. . . And that
means that we need to sort of give you a little dopamine hit every
11 once in a while, because someone liked or commented on a photo or
a post . . . And that’s going to get you to contribute more content, and
12 that’s going to get you . . . more likes and comments. . . . It’s a social-
validation feedback loop . . . exactly the kind of thing that a hacker
13
like myself would come up with, because you’re exploiting a
14 vulnerability in human psychology. . . . The inventors, creators —
it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s
15 all of these people — understood this consciously. And we did it
anyway. 158
16

17 Tellingly, many tech leaders, including individuals with inside knowledge of the dangerousness of

18 Meta’s social media products, either ban or severely limit their own children’s access to screen time

19 and social media. 159 Such leaders in the field include Tim Cook and former Facebook executives

20 Tim Kendall and Chamath Palihapitiya. 160

21 3. Meta intentionally designed product features to addict children and


adolescents.
22

23 173. Meta designed Facebook and Instagram with unreasonably dangerous design features
24 that users encounter at every stage of interaction with the product. These design features, which
25

26
158
Mike Allen, Sean Parker unloads on Facebook: “God only knows what it’s doing to our children’s brains,” Axios
(Nov. 9, 2017).
27
159
Samuel Gibbs, Apple’s Tim Cook: “I Don’t Want My Nephew on a Social Network”, The Guardian (Jan. 19,
2018); James Vincent, Former Facebook Exec Says Social Media is Ripping Apart Society, The Verge (Dec. 11,
28 2017).
160
Id.
1 have harmed adolescents that use the products, include but are not limited to: (a) recommendation
2 algorithms, fueled by extensive data collection, which are designed to promote use in quantities
3 and frequency harmful to adolescents; (b) product features that prey upon children’s desire for
4 validation and need for social comparison; (c) product features that are designed to create harmful
5 loops of repetitive and excessive product usage; (d) lack of effective mechanisms, despite having
6 the ability to implement them, to restrict children’s usage of the products; (e) inadequate parental
7 controls, and facilitation of unsupervised use of the products; and (f) intentionally placed obstacles
8 to discourage cessation of use of the products.
9 174. Facebook and Instagram have been designed, maintained, and constantly updated by one
10 of the world’s most wealthy, powerful, and sophisticated corporations. Large teams of expert data
11 scientists, user experience (“UX”) researchers, and similar professionals have spent years fine-
12 tuning these products to addict users. Every aspect of the products’ interfaces, each layer of their
13 subsurface algorithms and systems, and each line of underlying code has been crafted by brilliant
14 minds. Every detail—the color of product icons, the placement of buttons within the interface, the
15 timing of notifications, etc.—is designed to increase the frequency and length of use sessions.
16 Therefore, it is impractical to create a comprehensive list of addictive, harm-causing design features
17 in the product until in-depth discovery occurs. Many product features, such as the inner workings
18 of Meta’s algorithms, are secret and unobservable to users. Discovery during this litigation will
19 reveal additional detail about the unreasonably dangerous, addictive, and harmful design of Meta’s
20 products.
21 a. Facebook’s and Instagram’s algorithms maximize engagement, promoting
use at levels and frequencies that are harmful to kids.
22

23 175. Meta has invested its vast resources to intentionally design Facebook and Instagram to

24 be addictive to adolescents, all the while concealing these facts from its users and the public.

25 176. In their original form, Meta’s Facebook and Instagram algorithms ranked content chronologically,

26 meaning that a particular user’s feed was organized according to when content was posted or sent

27 by the people the user followed. In 2009, Meta did away with Facebook’s chronological feed in

28 favor of engagement-based ranking; in 2016, it did the same on Instagram. This “engagement-
1 based” system meant that posts that received the most likes and comments were highlighted first
2 for users. But facing declining engagement, Meta redesigned its algorithms once again in or around
3 early 2018. This change prioritized “meaningful social interaction” (“MSI”), with the goal of
4 showing users content with which they were more likely to engage. The MSI-oriented algorithms
5 purportedly emphasize the interactions of users’ connections (e.g., likes and comments) and give
6 greater significance to the interactions of connections that appear to be closest to users. Meta’s
7 current algorithms consider a post’s likes, shares, and comments, as well as a respective user’s past
8 interactions with posts with similar characteristics, and display the post in the user’s feed if it
9 meets these and certain other benchmarks.
10 177. While Meta has publicly attempted to cast MSI as making time spent on its platforms
11 more “meaningful,” MSI was actually just another way for Meta to increase user engagement on
12 Instagram and Facebook. While the feature increases the likelihood that an interaction will be
13 “meaningful” by Meta’s definition—more likes, comments, and interactions—it does not consider
14 whether recommended content is “meaningful” to the user. This sets up users who may have reacted
15 negatively to upsetting or dangerous posts to see more of the same. That, in turn, can lead to a
16 “horrible feedback loop/downward spiral”—with negative reactions leading the algorithm to
17 present more posts that generate more negative reactions.
18 178. In algorithmically generating users’ feeds, Meta draws upon the vast amount of data it
19 collects about and from its users. Meta’s algorithms combine the user’s profile (e.g., the information
20 posted by the user on the product) and the user’s dossier (the data collected and synthesized by
21 Meta, to which it assigns categorical designations) along with a dossier of similar users. Meta’s
22 algorithms identify and rank recommended posts to optimize for various outcomes, such as
23 time spent by a user or user engagement. More often than not, the effect is that Meta’s
24 algorithms direct users to alarming and aversive material.
25 179. Much of what Meta shows users is content that they did not sign up for. Meta often
26 overrides users’ explicit preferences because they conflict with Meta’s predictions of what will get
27 shared and engaged with.
28
1 180. Through its algorithms, Meta intentionally supplants the content that users have elected
2 to see with content that it believes will drive more use and engagement. Thus, the products that
3 Meta touts as “[g]iv[ing] people the power to build community and bring[ing] the world closer
4 together,” are actually designed in a way that prioritizes not social connection but product use at
5 all costs, even to the detriment of the health and safety of young people. 161 The result, for Meta, is
6 an increase in its bottom line. The result for young users is products that are so addictive that they
7 return again and again, even when the products push posts they’re not interested in.
8 181. Meta knew that its engagement-based ranking algorithm (and its subsequent, iterative
9 MSI ranking algorithm) was structured in a way that meant the content which produces intense
10 reactions (i.e., strong engagement) triggers amplification by the apps. This propels users into the
11 most reactive experiences, favoring posts that generate engagement because they are extreme in
12 nature. Zuckerberg publicly recognized this in a 2018 post, in which he demonstrated the
13 correlation between engagement and sensational content that is so extreme that it impinges upon
14 Meta’s own ethical limits. 162 While Zuckerberg went on to claim that Meta had designed its
15 algorithms to avoid this natural propensity of engagement-based algorithms, his claim to the public
16 is belied by extensive research indicating Meta’s products do amplify extreme material.
17 182. Meta intentionally designed its MSI-focused algorithms to collect and analyze several
18 kinds of data: a user’s profile, content the user reports, content the user posts, content viewed,
19 content engaged with, navigation paths, watch time, hover time (the amount of time a user viewed
20 a piece of content), whether a user mutes or unmutes a video, and whether a user makes a video
21 full screen, among other data. Meta uses this data from adolescents to predict what posts will capture
22 the user’s attention. Meta also tracks and utilizes data from various other sources, such as a user’s
23 off-product activities and the activities on websites that contain Facebook or Instagram like or share
24 buttons. 163
25

26

27
161
Meta, Mission Statement, Meta.
162
Mark Zuckerberg, A Blueprint for Content Governance and Enforcement, Facebook.
28
163
Allen St. John, How Facebook Tracks You, Even When You're Not on Facebook, Consumer Reports (April 11,
2018).
1 183. Meta’s algorithmic ranking is utilized in a variety of product features that are designed
2 by Meta to maximize user engagement.
3 184. For example, the Instagram product consists primarily of a never-ending and user-
4 specific Feed, which Instagram’s data-driven algorithms generate for each user. In the app’s
5 “Home” pane, this feed includes (but is not limited to) photos and videos posted by Instagram users
6 that the user has elected to “follow,” as well as recommended photos and videos. In the app’s
7 “Explore” pane, the feed consists almost exclusively of photos and videos from users the user has
8 not elected to “follow.” In both cases, Instagram’s algorithms evaluate each user’s data to predict
9 what material will maximize their attention and time spent using the product, irrespective of what
10 the user wants to see.
11 185. Other “recommendation” features that are similarly algorithmically powered include
12 Facebook’s Newsfeed, Instagram’s Feed, Instagram Reels, Facebook Reels, Facebook Watch (and
13 its “For You” page), Accounts to Follow, People You May Know (introductions to persons with
14 common connections or backgrounds), Groups You Should Join, and Discover (recommendations
15 for Meta groups to join).
16 186. These product features work in combination to create and maintain a user’s “flow-state”:
17 a hyper-focused, hypnotic state, where bodily movements are reflexive and the user is totally
18 immersed in smoothly rotating through aspects of the social media product. 164
19 187. They also create a phenomenon referred to as “feeding the spiral,” where someone
20 feeling bad sees content that makes them feel bad and they engage with it. Then their Instagram is
21 flooded with it, like an echo chamber screaming their most upsetting thoughts back at them. Meta
22 recognizes that Instagram users at risk of suicide or self-injury are more likely to encounter more
23 harmful self-injury and suicide-related content in their feeds.
24 188. This phenomenon was cast into vivid relief when 14-year-old Molly Russell took her
25 own life after viewing reams of content related to suicide, self-injury, and depression on Instagram
26

27 164
See e.g., What Makes TikTok so Addictive?: An Analysis of the Mechanisms Underlying the World’s Latest Social
28 Media Craze, Brown Undergraduate J. of Pub. Health (2021), (describing how IVR and infinite scrolling may induce
a flow state in users).
1 and several other products. 165 During an official inquest investigating the role that social media
2 products played in her death, a Meta executive said that such content was “safe” for children to
3 see. 166 The coroner rejected this claim, finding instead that Molly “died from an act of self-harm
4 whilst suffering from depression and the negative effects of on-line content” that she had not sought
5 out, but that the products’ algorithms had pushed on her. 167 “The platform operated in such a way
6 using algorithms as to result, in some circumstances, of binge periods of images, video clips and
7 text some of which were selected and provided without Molly requesting them. These binge periods
8 … are likely to have had a negative effect on Molly…. In some cases, the content was particularly
9 graphic, tending to portray self-harm and suicide as an inevitable consequence of a condition that
10 could not be recovered from. The sites normalised her condition focusing on a limited and irrational
11 view without any counterbalance of normality.” 168 The coroner further observed that “[t]here was
12 no age verification when signing up to the on-line platform” and that Molly’s parents “did not have
13 access, to the material being viewed or any control over that material.” 169
14 189. Despite Molly’s death, and notwithstanding Meta’s research into dangerous spirals—at
15 one point dubbed the “Rabbithole project”—the company did nothing to stop harm to its young
16 users.
17 b. Facebook’s and Instagram’s user interfaces are designed to create addictive
engagement.
18

19 190. To further drive user engagement (and thereby drive data collection and advertising

20 revenue), Facebook and Instagram also utilize a series of design features that are carefully

21 calibrated to exploit users’ neurobiology. These features work in tandem with algorithmic ranking

22 to promote addictive engagement.

23 191. First, Meta programs IVR into its products. Behavioral training via intermittent rewards

24 keeps users endlessly scrolling in search of a dopamine release, oftentimes despite their desire to

25 165
Dan Milmo, Social Media Firms ‘Monetising Misery’, Says Molly Russell’s Father After Inquest, The Guardian
26 (Sept. 20, 2022).
166
Ryan Merrifeld, Molly Russell Inquest: Instagram Boss Says Suicidal Posts Shouldn’t Be Banned From App, The
27 Mirror (Sept. 26, 2022).
167
Id.
28
168
Andrew Walker, H.M. Coroner, Regulation 28 Report to Prevent Future Deaths (Oct. 13, 2022).
169
Id.
1 put their phone down and move on to other activities. Children, who are less likely to have adequate
2 impulse control than adults, are more susceptible to being drawn into this engineered “flow state”
3 and more likely to grow dependent on Facebook or Instagram.
4 192. Second, Facebook and Instagram utilize “Likes” to control the release of dopamine in
5 children. This feature, which Meta created for Facebook and “introduced … to the world” in 2010,
6 allows users to indicate that they “Like” a post and visibly tallies the number of “Likes” any given
7 post has earned. 170 Instagram launched in 2010 with the like feature built-in—a user can “Like” a
8 post simply by tapping a heart-shaped button.
9 193. Users never know when a like will come. This conditions them to stay on the app. But it
10 also exacerbates issues of social comparison and feedback seeking, creating detrimental effects on
11 adolescent physical and mental health.
12 194. Meta has expanded the likes feature in both Facebook and Instagram. In December 2016,
13 Meta began allowing users to like comments, not just posts. In February 2022, Meta began allowing
14 users to “Like” Instagram Stories. 171 Expanding the like feature has intensified and multiplied the
15 body of feedback that teen users receive (or don’t receive) on their posts, preying on their desire to
16 seek validation through comparison with others.
17 195. Despite its ability to alleviate the negative impact of likes on younger users, Meta chose
18 only to implement half-measures. Meta created the option for users to hide like counts in May 2021,
19 but it made this an optional setting left off by default. 172 Moreover, the number of likes remains
20 visible to the poster of the content. These changes stop short of resolving the issue of negative
21 social comparison that these score-keeping features inflict.
22 196. Third, Meta has designed its video features in several ways geared to maximizing users’
23 flow state and keeping them immersed in its products for longer periods of time. Video clips on
24 Facebook Reels and Instagram Reels automatically play as a user scrolls and automatically restart
25 once they conclude. Reels cannot be paused and tapping on the video will simply mute its audio.
26

27
170
Ray C. He, Introducing new Like and Share Buttons, Meta (Nov. 6, 2013).
171
Jhinuk Sen, Instagram is adding Likes to Stories so it doesn’t clog up people’s inboxes, Business Today (Feb. 15,
28 2022).
172
Meta, Giving People More Control on Instagram and Facebook, (May 26, 2021).
1 In addition, Meta imposes limits on the length of video content on Reels (currently 90 seconds, and
2 at times as short as 15 seconds). These limits ensure that users do not become bored by long videos
3 and end their sessions.
4 197. Meta designed the comment features of Reels to minimize any disruption to users’
5 heightened flow state. The interface of Reels displays the “Like,” “Comment,” “Save,” and “Share”
6 buttons on the bottom right of the screen. This placement avoids the milliseconds of delay or
7 discomfort that could disrupt the flow state of right-handed users if placed elsewhere on the screen.
8 Furthermore, these buttons are overlaid on top of the continuously playing clips, to eliminate any
9 temporal or visual interruption during which a user might evaluate whether to continue using the
10 product. Likewise, when a user taps to view the comments on a Reel, the video’s audio and the top
11 quarter of the video continue to play behind the comments section. Again, this design feature keeps
12 the user’s attention on the feed.
13 198. In keeping with its study of IVR, Meta knows when to strategically interrupt a user’s
14 flow. Occasionally, while a video is playing, a comment on the video will appear at the bottom
15 of the screen, even without the user tapping to view the comments section. These comments are
16 selected, displayed, and timed intentionally, to retain a user’s attention by drawing them into the
17 comments section.
18 199. Fourth, Meta carefully (and dangerously) calibrates the notifications it sends outside of
19 the Facebook and Instagram apps, to maximize success in drawing back users who are not presently
20 using the products. By default, Facebook and Instagram notify users through text and email about
21 activity that might be of interest, which prompts users to open and reengage with the products.
22 However, Meta intentionally chooses to display only a limited amount of information in
23 notifications, in order to trigger curiosity and manipulate the user to click or tap through to the
24 product. 173
25 200. Meta’s studied manipulation of user engagement through notifications is particularly
26 detrimental to teenagers, who lack impulse control and crave social rewards, and who are therefore
27 more susceptible to falling into compulsive patterns of product use. Those harms are compounded
28 173
Clickbait, Merriam-Webster Dictionary.
1 by the fact that Meta sends push notifications in the middle of the night, prompting children to re-
2 engage with Instagram and Facebook apps when they should be sleeping. Disturbed and insufficient
3 sleep is associated with poor health outcomes. 174
4 201. Meta knows that these notifications are psychologically harmful to young users, despite
5 young users’ high tolerance for notifications.
6 202. For example, an internal Meta document discussing “Problematic Facebook Use” stated
7 that “smartphone notifications caused inattention and hyperactivity among teens, and they reduced
8 productivity and well-being.”
9 203. An internal research document from March 2021 acknowledged that Meta’s “current
10 notification controls do not enable enough agency” in users. In fact, Meta has long known that
11 “notifications with little or no relevance” to the user and “constant updates including Like counts”
12 constitute “rewards [that] are unpredictable or lacking in value.” In June 2018, an internal
13 presentation called “Facebook ‘Addiction’” proposed that Meta reduce such notifications to curb
14 problematic use in users. Even so, Instagram does not offer users a setting to permanently disable
15 all notifications on Instagram at once. At most, users can opt to pause all notifications for up to 8
16 hours at a time. Users seeking to permanently disable all notifications must disable each category
17 of notifications one by one.
18 204. Fifth, the “Stories” feature of both Facebook and Instagram is dangerously designed to
19 create artificial urgency so that users return to the apps. “Stories” was added by Meta in response
20 to the growing popularity of Snapchat with teenagers in 2016. “Stories” appear at the top of a user’s
21 home page upon opening the app and are available to view for only 24 hours, after which they
22 disappear. This creates pressure to use the product daily, or else risk missing out on dopamine-
23 causing stimuli or social interactions. This feature is particularly addicting to adolescent users, who
24 feel increased social pressure to view all their contacts’ stories each day before the content
25 disappears, thus increasing their compulsive usage and potential addiction to the product. 175 The
26

27

28
174
Nat’l Inst. of Mental Health, The Teen Brain: Still Under Construction, 6 (2011).
175
Sarah Lempa, Science Behind Why Instagram Stories So Addicting?, Healthline (April 5, 2021).
1 ephemeral nature of disappearing content is a ploy intended to inspire urgent perusal, and it
2 works. 176
3 205. Sixth, Instagram’s and Facebook’s algorithms are structured to recommend “keywords”
4 or “hashtags” to their young users that lead them to navigate to dangerous content.
5 206. All of the above unreasonably dangerous design features, in addition to the Instagram-
6 specific design features discussed in the section that follows, interact with and compound one
7 another to make Meta’s products addictive and harmful for kids.
8 207. Meta has long been aware of this dangerous and toxic brew, yet Meta has failed to invest
9 in adequate tools to limit the harm its products inflict on users.
10 208. Meta’s failure to prevent compulsive use by children, and the harms resulting therefrom,
11 are a simple function of its misplaced priorities: Profit over safety.
12 209. Meta’s decision to hook teenage users by rewiring their brains has not aged well for some
13 of its former employees. Chamath Palihapitiya, the former Vice President of User Growth at
14 Facebook, admitted that he feels “tremendous guilt” about his contributions to social media, saying
15 “[t]he short-term, dopamine-driven feedback loops that we have created are destroying how society
16 works.” 177
17 c. Instagram’s unreasonably dangerous product features cause negative
appearance comparison and social comparison.
18

19 210. Instagram use by teens is associated with negative impacts on body image, social

20 comparison, eating issues, confidence in friendships, and mental health, including anxiety,

21 depression, and suicidal thoughts. Teen users feel worse about themselves while using Instagram.

22 Some even link their negative feelings to the platform.

23 211. Social comparison is particularly bad on Instagram because, among other things,

24 celebrity and influencer content is pervasive. By manufacturing and emphasizing influence and

25 celebrity, and purposely inundating tween and teen users with those accounts, Meta further exploits

26

27 176
Madiha Jamal, Ephemeral Content — The Future of Social Media Marketing, Better Marketing (March 2, 2021).
28
177
Amy B. Wang, Former Facebook VP says social media is destroying society with ‘dopamine-driven feedback
loops’, Wash. Post (Dec. 12, 2017).
1 and monetizes social comparison. That has come at a direct cost to the mental health of its teen
2 users, who are more susceptible to body dissatisfaction and negative social comparisons.
3 212. Score-keeping features designed into Instagram amplify these problems. Teenage girls
4 are particularly impacted when comparing like counts, follower counts, views, and comments on
5 their posts to those of models, celebrities, and so-called influencers.
6 213. Instagram compounds the foregoing problems with yet another pernicious feature—
7 image “filters” that allow users to engage in selective self-presentation by altering their appearance
8 in photos and videos. These filters allow facial structure alteration, body slimming, skin lightening,
9 skin tanning, blemish clearing, the artificial overlay and augmentation of makeup, and other
10 beautification “improvements.” 178
11 214. These filters have harmed young users in multiple ways, both independently and in
12 concert with Instagram’s other unreasonably dangerous features. 179
13 215. First, the easy accessibility of filters, combined with features such as “Likes,” encourage
14 adolescents to artificially change their appearances. 180 As noted, adolescents naturally seek social
15 validation. When they notice increased interaction and favorable responses to their filter-edited
16 photos (more “Likes” and comments), many are led to believe they are only attractive when their
17 images are edited. 181 These young people begin to prefer how they look using filters, not as they
18 appear naturally. 182 In a 2016 study, 52% of girls said they use image filters every day, and 80%
19 have used an app to change their appearance before age 13. 183
20 216. Second, because Instagram already promotes a high degree of social comparison, youth
21 find themselves comparing their real-life appearances to the edited appearances not only of
22

23

24
178
T. Mustafa, An ‘Instagram Vs Reality’ filter is showing how toxic photo editing can be, Metro (April 2021).
179
Anna Haines, From ‘Instagram Face’ To ‘Snapchat Dysmorphia’: How Beauty Filters Are Changing The Way
25 We See Ourselves, Forbes (Apr. 27, 2021).
180
Tate Ryan-Mosley, Beauty Filters Are Changing the Way Young Girls See Themselves, MIT Tech. Rev. (Apr. 2,
26 2021).
181
Tate Ryan-Mosley, Beauty Filters Are Changing the Way Young Girls See Themselves, MIT Tech. Rev. (Apr. 2,
27 2021).
182
Poojah Shah, How Social Media Filters Are Affecting Youth, Parents (Apr. 28, 2022).
28
183
Anna Haines, From ‘Instagram Face’ to ‘Snapchat Dysmorphia’: How Beauty Filters Are Changing the Way We
See Ourselves, Forbes (Apr. 27, 2021).
1 themselves but of others online. 184 These false and unrealistic body image standards further lead
2 teenagers to develop negative perceptions of their appearance. 77% of girls reported trying to
3 change or hide at least one part of their body before posting a photo of themselves, and 50% believe
4 they did not look good without editing. 185
5 217. Third, the specific changes filters make to an individual’s appearance can cause negative
6 obsession or self-hatred surrounding aspects of their appearance. 186 The filters alter specific facial
7 features such as eyes, lips, jaw, face shape, and slimness, which often require medical intervention
8 to alter in real life. 187 Meta-designed filters, promoted through the algorithm, permeate
9 Instagram and cause adolescent users to negatively compare their real appearances against a false
10 physical reality. 188 In one recent study, even users who reported a higher initial self-esteem level
11 felt they looked 44% worse before their image was edited using a filter. 189 “[W]hen the . . . filter
12 increased the gap between how participants wanted to look and how they felt they actually looked,
13 it reduced their self-compassion and tolerance for their own physical flaws.” 190 As one
14 psychodermatologist has summed it up, “these apps subconsciously implant the notion of
15 imperfection and ugliness, generating a loss of confidence.” 191
16 218. Fourth, Meta has intentionally designed its product to not alert adolescent users when
17 images have been altered through filters or edited. Meta has therefore designed its product so that
18 users cannot know which images are real and which are fake, deepening negative appearance
19 comparison.
20 219. The impact of the negative social and appearance comparison caused by Meta’s
21 unreasonably dangerous product features is profound. Instagram-induced social comparison creates
22 184
See Teen Girls Body Image and Social Comparison on Instagram – An Exploratory Study in the U.S., Wall St. J.
23 (Sept. 29, 2021), (explaining that users forget that Instagram only shows the highlights of people’s lives and is not
depicting reality).
24
185
Anna Haines, From ‘Instagram Face’ to ‘Snapchat Dysmorphia’; How Beauty Filters Are Changing the Way We
See Ourselves, Forbes (Apr. 27, 2021).
25
186
Tonya Russell, Social Media Filters Are Changing How Young People See Themselves, Teen Vogue (Jan. 25,
2022).
26
187
Id.
188
Id.
27
189
Ana Javornik, Ben Marder, Marta Pizzetti, & Luk Warlop, Research: How AR Filters Impact People’s Self-Image,
Harvard Business Review (December 22, 2021).
28
190
Id.
191
Genesis Rivas, 6 Consequences of Social Media Filters, According to Experts, InStyle (Sept. 14, 2022).
1 a schism between the ideal self and the real self, leading to distress and depression. Filters,
2 especially in combination with other product features, cause body image issues, eating disorders,
3 body dysmorphia, and related harms. 192
4 220. The various unreasonably dangerous design features built into Instagram exacerbate each
5 other, creating a perfect storm. Posting for the ‘gram creates a pressure to look perfect. 193 The
6 ability of influencers to monetize their face and body creates a highlight reel norm. And “feeding
7 the spiral” creates compulsive use. 194 Taken together, these three features—all driven by very
8 specific design features of Instagram—create a social comparison sweet spot.
20 221. Meta understands that the social comparison it knowingly enables through appearance
21 filters creates compulsive behavior among child users, especially when paired with other
22 unreasonably dangerous design features such as likes and algorithmic recommendations.
23

24
192
See Sian McLean, Susan Paxton, Eleanor Wertheim, & Jennifer Masters, Photoshopping the Selfie: Self Photo
Editing and Photo Investment Are Associated with Body Dissatisfaction in Adolescent Girls, 48 Int’l J. of Eating
25 Disorders 1132, 1133 (Aug. 27, 2015), (presenting a 2015 study involving 101 adolescent girls, which found that
more time spent editing and sharing selfies on social media raised their risk of experiencing body dissatisfaction and
26 disordered eating habits.); Scott Griffiths, Stuart Murray, Isabel Krug, & Sian McLean, The Contribution of Social
Media to Body Dissatisfaction, Eating Disorder Symptoms, and Anabolic Steroid Use Among Sexual Minority Men,
27 21 Cyberpsychology Behavior, and Soc. Networking 149, 149 (Mar. 1, 2018).
193
Teen Girls Body Image and Social Comparison on Instagram – An Exploratory Study in the U.S., Wall St. J.
28 (Sept. 29, 2021).
194
Id.
1 222. Despite its awareness that the deliberate design of Instagram was drastically damaging
2 teen mental and physical health, Meta ignored the problem, failing to implement its own
3 researchers’ recommendations.
4 223. In fact, Meta executives ignored or declined requests to fund proposed well-being
5 initiatives and strategies that were intended to reduce the Platforms’ harmful features.
6 224. For example, in April 2019, David Ginsberg, then Meta’s Vice President of Research,
7 emailed Zuckerberg proposing investments in well-being on Instagram and Facebook. Ginsberg
8 recommended the investment because “there is increasing scientific evidence (particularly in the
9 US . . . ) that the average net effect of [Facebook] on people’s well-being is slightly negative.”
10 225. As Ginsberg explained, Meta has a “deep understanding around three negative drivers
11 that occur frequently on [Facebook] and impact people’s well-being: [p]roblematic use . . . , [s]ocial
12 comparison . . . , [and] [l]oneliness.” Ginsberg noted that if the investment was not approved, these
13 initiatives would remain under- or unstaffed.


14 226. Nevertheless, Susan Li, a high-level member of Meta’s finance department, responded
15 that Meta’s leadership team declined to fund this initiative.
16 227. In September 2019, Fidji Simo, Head of Facebook, stated to Adam Mosseri, Head of
17 Instagram, that with respect to improving well-being on both Platforms, “the main problem is that
18 we need to increase investment.”
19 228. And in September 2020, Instagram’s Director of Data Science recognized in an internal
20 chat that Meta faced “two workstreams”: “1. Keep regulators away, keep teens engaged” and “2.
21 Make teens safe.” That executive further acknowledged the belief among some in the company that
22 “we only really have bandwidth for 1.”
23 d. Meta has failed to implement effective age-verification measures to keep
children off of Facebook and Instagram.
24

25 229. Meta purports to ban children under the age of 13 from using its products but, at all

26 relevant times, has lacked any reliable form of age verification to prevent such use.

27 230. Other online products employ substantially more effective and reliable age verification

28 schemes before granting children access. These include but are not limited to connecting new users
1 to parents’ accounts, credit card verification, verification by presentation of an identification card (or
2 other government-issued document), or linking a verified undergraduate or professional email,
3 among other methods. Meta chooses not to implement any of these systems, even though they are
4 technologically feasible, used by many companies across the Internet, and could be employed at
5 relatively low cost. Indeed, Meta itself uses an age verification technique for its Facebook Dating
6 product that it claims can verify ages without identifying users—but does not use the same
7 technology at account startup for Facebook or Instagram. 195
8 231. For most of its history, Meta knew that children under the age of 13 were using its apps.
9 And it certainly could have figured this out based on posted photos of elementary school age users.
10 Yet Meta continued to promote and target Facebook and Instagram to children. As long as a new
11 user simply clicked a box confirming that they were at least 13 years old, Meta asked no questions,
12 engaged in zero follow-up, and let the user access the products indefinitely.
13 232. This minimal age verification procedure is toothless. Meta does not as a default matter
14 require users to verify their ages upon signing up to use Instagram or Facebook. Users are only
15 asked to self-report their birthday.
16 233. If the user reports a birthday indicating they are less than 13 years old, they are informed
17 that they cannot create an account. However, after acknowledging this message, users can
18 immediately reattempt to create an account and input an eligible birthday. When a user enters an
19 eligible birthday, there are no restrictions to creating an account, other than having it linked to a
20 cell phone number or an email address. In other words, Meta routinely allows pre-teens to
21 misrepresent their age as 13 or 40 or any other age—without so much as asking for proof. This is
23 analogous to selling alcohol to a teenager who has admitted to being under 21 but then promptly
24 changed his story.
24 234. The upshot is that, in a matter of seconds, and without age verification, identity
25 verification, or parental consent, children of all ages can create a Facebook or Instagram account,
26

27

28 195
Erica Finkle, Meta Director of Data Governance, Bringing Age Verification to Facebook Dating, Meta (Dec. 5,
2022).
1 and immediately become subject to the products’ various addictive and harmful features. 196
2 235. There can be no serious debate about whether Meta has more effective age verification
3 tools at its disposal. Perversely, Meta does employ age verification on Instagram—but only when
4 a user self-reports they are younger than 13. In that case, Meta provides a user with what amounts
5 to an appeal right: “if you believe we made a mistake, please verify your age by submitting a valid
6 photo ID that clearly shows your face and date of birth.”
7 236. That is, instead of asking users to prove they are really over 13, Meta asks them if they
8 are really sure they are under 13. At best, this reflects a completely upside-down view of Meta’s
9 duty of care, using age verification to screen in minor users but not to screen them out. At worst,
10 Meta’s “are you sure you’re really under 13” question invites pre-teens to falsify their identification
11 to gain access to Instagram.
12 237. Similarly, Meta imposes unnecessary barriers to the removal of accounts created by
13 children under 13. Since at least April 2018, Instagram and Facebook both accept reports of
14 accounts created by children under 13. 197 However, before an Instagram or Facebook account is
15 deleted, Meta requires verification that the child is under the age of 13. For example, Instagram’s
16 reporting page states:
17 if you’re reporting a child’s account that was made with a false date
of birth, and the child’s age can be reasonably verified as under 13,
18 we’ll delete the account. You will not get confirmation that the
19 account has been deleted, but you should no longer be able to view
it on Instagram. Keep in mind that complete and detailed reports
20 (example: providing the username of the account you’re reporting)
help us take appropriate action. If the reported child’s age can’t
21 reasonably be verified as under 13, then we may not be able to take
action on the account. 198
22

23 Facebook’s reporting page contains almost identical language. 199 By choosing to implement age
24 verification only before deleting accounts of users suspected to be children, but not when those
25

26 196
Similarly, the absence of effective age verification measures means that adult users can claim to be children—with
27 obvious dangers to the actual children on Meta’s products.
197
Report an Underage User on Instagram, Instagram; Report an Underage Child, Facebook.
28
198
Report an Underage User on Instagram, Instagram.
199
Reporting an Underage Child, Facebook.
1 accounts are first created, Meta makes it more difficult to prove a user is under age 13 than it does
2 for a minor to pretend to be over 13.
3 238. It is unclear how long Meta takes to delete a reported account on average, if it does so at
4 all. Meta has ignored some parents’ attempts to report and deactivate accounts of children under 13
5 years old.
6 239. Zuckerberg has stated that he believes children under 13 should be allowed on
7 Facebook, 200 so Meta’s lax approach to age verification is no surprise.
8 240. Meta’s approach to underage users of its product has consistently been one of feigned
9 ignorance. On October 10, 2021, Senator Marsha Blackburn reported that a young celebrity told
10 Adam Mosseri, the Head of Instagram, that she had been active on Instagram since she was eight. Mosseri
11 replied that he “didn’t want to know that.” 201
12 241. But Meta does know that its age-verification protocols are inadequate to keep minors off
13 Facebook and Instagram. According to a May 2011 ABC News report, “about 7.5 million
14 [Facebook] users in the U.S. are under the age of 13, and about 5 million are under the age of 10.” 202
15 Meta knows through retrospective cohort analyses that “up to 10 to 15% of even 10-year-olds in a
16 given cohort may be on Facebook or Instagram.” 203
17 e. Facebook’s and Instagram’s parental controls are unreasonably dangerous.

18 242. Facebook and Instagram lack adequate parental controls, which hinders parents’ ability

19 to monitor and protect their children from harm.

20 243. Despite its obligations under COPPA, Meta does not require “verifiable parental

21 consent” for minors to use Facebook or Instagram. Meta has chosen to avoid its obligations by

22 purporting to ban children younger than 13, despite knowing that such children continue to access

23 and use its products due to its inadequate age verification methods.

24

25

26
200
Kashmir Hill, Mark Zuckerberg Is Wrong About Kids Under 13 Not Being Allowed on Facebook, (May 20, 2011).
201
Subcommittee: Protecting Kids Online: Testimony from a Facebook Whistleblower Hearing before Subcomm. On
27 Consumer Protection, Product Safety, and Data Security, (Oct. 5, 2021).
202
Ki Mae Heussner, Underage Facebook Members: 7.5 Million Users Under Age 13, ABC (May 9, 2011).
28
203
Subcommittee: Protecting Kids Online: Testimony from a Facebook Whistleblower Hearing before Subcomm. On
Consumer Protection, Product Safety, and Data Security, (Oct. 5, 2021).
1 244. While COPPA requires parental consent only for users under the age of 13, a reasonable
2 company that knows or should have known its products are harmful to adolescents would require
3 parental consent for any minor to use them. But Meta’s lack of parental consent requirement for
4 any underage users robs parents of an important way to protect their children from the harms caused
5 by Instagram and Facebook.
6 245. Those apps largely lack parental controls, despite their ready availability, affordability,
7 and ease of implementation. For example, Meta has chosen not to: (a) require children’s accounts
8 on Facebook and Instagram to be linked to their parents’, as it does with another one of its products,
9 Messenger Kids; 204 (b) send reports of a child’s activity to parents; (c) allow parents to implement
10 maximum daily usage limitations or to prohibit use during certain hours (school, sleep hours, etc.);
11 (d) notify parents about interactions with accounts associated with adults; (e) notify parents when
12 child sexual abuse material is found on a minor’s account; or (f) require parental approval before a
13 minor can follow new accounts.


14 246. Controls like these would enable parents to track the frequency, time of day, and duration
15 of their child’s use, identify and address problems arising from such use, and better exercise their
16 rights and responsibilities as parents. It is reasonable for parents to expect that social media
17 companies that actively promote their products to minors will undertake reasonable efforts to notify
18 parents when their child’s use becomes excessive, occurs during sleep time, or exposes the child to
19 harmful content. Meta could feasibly design Instagram and Facebook to do so at negligible cost.
20 247. Meta creates a foreseeable risk to young users through its unreasonably dangerous
22 products and then attempts to shift the burden of protection from those products onto parents. Just
23 as troublingly, Meta intentionally designs Facebook and Instagram so that children can easily evade
23 their parents’ supervision. Instagram and Facebook allow children to create a limitless number of
25 anonymous accounts without parental approval or knowledge, and also allow kids to block their
26 parents’ profiles. 205 On Instagram, children can post stories to “Close Friends Only” (i.e., to a select
26 204
Loren Chang, Introducing Messenger Kids, a New App for Families to Connect, Meta (Dec. 4, 2017).
27
205
See Caity Weaver and Danya Issawi, ‘Finsta,’ Explained, N.Y. Times (Sept. 30, 2021), (“It is neither an official
designation nor a type of account offered by Facebook. Rather, it is a term many users ascribe to secondary accounts
28 they create for themselves on Instagram, where their identities — and, often, the content of their posts — are
obscured to all but a small, carefully chosen group of followers.”).
1 group of followers), excluding their parents. On Facebook, children can place their parents on a
2 “restricted list” of people who are unable to view their stories.
3 248. Finally, Meta has failed to develop effective reporting tools to deal with abuse directed
4 at children through Instagram and Facebook. Meta does not have a phone number that a parent or
5 child can call to report such abuse in real time. Its online reporting mechanisms lack an immediate
6 response mechanism, regardless of the seriousness of the harm at issue. Some users have found that
7 Meta declined to respond to reports filed through its online reporting tool, citing technical issues.
8 f. Facebook’s and Instagram’s unreasonably dangerous features include
impediments to discontinuing use.
9

10 249. Meta has intentionally, unreasonably, and dangerously designed its products so that

11 adolescent users face significant navigational obstacles and hurdles when trying to delete or

12 deactivate their accounts, in contrast to the ease with which users can create those accounts.
13 250. Currently, to delete or deactivate an Instagram or Facebook account, a user must locate
14 and tap on approximately seven different buttons (through seven different pages and popups) from

15 the main feed. Some young users give up on their attempts to quit because it’s too difficult to

16 navigate through the interface to completion.

17 251. Even if a user successfully navigates these seven pages, Meta still won’t

18 immediately delete their account. Instead, Meta preserves the account for 30 more days. If at any

19 time during those 30 days a user’s addictive craving becomes overwhelming and they access the

20 account again, the deletion process starts over. The user must go through all the above steps again,

21 including the 30-day waiting period, if they again wish to delete their account.

22 252. As an additional barrier to deletion, Meta urges users of both products to deactivate,

23 rather than delete, their accounts. For example, Instagram users who choose to delete their accounts

24 are immediately shown a screen with their profile picture and asked: “Deactivate your account

25 instead of deleting?” The option to deactivate is conspicuously highlighted. Similarly, Facebook

26 displays a screen that automatically selects the option of deactivating rather than deleting a user

27 account.

28
1 253. Meta’s aggressive efforts to prevent users from discontinuing their use of Facebook and
2 Instagram are particularly problematic because unsuccessful efforts to discontinue use are a hallmark
3 of addiction, incorporated as the sixth criterion in the Bergen Social Media Addiction Scale,
4 discussed above.
5 4. Meta has concealed from users, the public, and Congress the harmful effects
that Instagram’s and Facebook’s design have on children.
6

7 254. Meta has engaged in a years-long pattern of concealing critical information about the

8 safety of Instagram and Facebook from the public. While Meta touted the safety of its products, it

9 failed to disclose information it knew concerning the significant risks associated with its products,

10 even though it knew that the public lacked access to this information.

11 255. Meta’s pattern of intentional concealment came to a head in August 2021, just weeks

12 before Frances Haugen dropped her bombshell revelations on the public. On August 4, 2021,
13 Senators Marsha Blackburn and Richard Blumenthal wrote to Mark Zuckerberg. The Senators’
14 letter observed that “[a]n expanding volume of scientific research shows that social media platforms

15 can have a profoundly harmful impact on young audiences” and noted “grave concerns about

16 [Meta’s] apparent effort to ensnare children into social media platforms at earlier and earlier

17 ages.” 206 The letter concluded by asking Zuckerberg six “pretty straightforward questions about

18 how the company works and safeguards children and teens on Instagram.” 207

19 256. Meta’s years-long concealment of its research was revealed just weeks later, when

20 Frances Haugen released internal Meta studies, along with a trove of other internal Meta

21 documents, to the Wall Street Journal.

22 257. On September 21, 2021, Senator Blumenthal confronted a Meta representative about the

23 conspicuous omissions in Meta’s response to his letter:

24 Last month, on August 4, Senator Blackburn and I wrote to Mark


Zuckerberg and asked him specifically about this issue. We asked,
25 and I’m quoting, “Has Facebook’s research ever found that its

206 Letter from Richard Blumenthal, U.S. Senator, to Mark Zuckerberg, Chief Executive Officer of Facebook (Aug. 4, 2021).
207 Subcommittee: Protecting Kids Online: Testimony from a Facebook Whistleblower Hearing before Subcomm. On Consumer Protection, Product Safety, and Data Security (Oct. 5, 2021). See also Letter from Richard Blumenthal, U.S. Senator, to Mark Zuckerberg, Chief Executive Officer of Facebook (Aug. 4, 2021).
1 platforms and products can have a negative effect on children’s and
teens’ mental health or well-being such as increased suicidal
2 thoughts, heightened anxiety, unhealthy usage patterns, negative
3 self-image, or other indications of lower well-being?”

4 It wasn’t a trick question. It preceded the reports in the Journal. We


had no idea about the whistleblower documents that were ultimately
5 revealed.
6 Facebook dodged the question. “We are not aware of a consensus
7 among studies or experts about how much screen time is too much.”

8 We are not aware. Well, we all know now that representation was
simply untrue. 208
9

10 258. Senator Blumenthal went on to ask the witness, Facebook’s Vice President of Privacy &

11 Public Policy, “why did Facebook misrepresent its research on mental health and teens when it

12 responded to me and Senator Blackburn?” After disputing the characterization, Satterfield



13 responded, “The safety and well-being of the teens on our platform is a top priority for the company.

14 We’re going to continue to make it a priority. This was important research.” Senator Blumenthal

15 then went on: “Why did you conceal it?” Satterfield responded, “we didn’t make it public because

16 we don’t, with a lot of the research we do because we think that is an important way of encouraging

17 free and frank discussion within the company about hard issues.” 209

18 259. Meta unilaterally decided to prioritize “free and frank” internal discussion over honest

19 and transparent responses to direct questions from sitting United States Senators. When it “dodged,

20 ducked, sidetracked, [and] in effect misled” Senators Blumenthal and Blackburn, Meta deceived

21 the public via its elected representatives. 210

22 260. Moreover, Satterfield’s “free and frank discussion” excuse has been contradicted

23 publicly by Meta employees.

24

25
208 Richard Blumenthal, Blumenthal Demands Facebook Appear at Next Week’s Consumer Protection Subcommittee Hearing to Explain Coverup of its Platforms’ Negative Impact on Teens and Children (Sept. 21, 2021).
209 Id.
210 Subcommittee: Protecting Kids Online: Testimony from a Facebook Whistleblower Hearing before Subcomm. On Consumer Protection, Product Safety, and Data Security (Oct. 5, 2021); see also Subcommittee: Protecting Kids Online: Testimony from a Facebook Whistleblower Hearing before Subcomm. On Consumer Protection, Product Safety, and Data Security (Oct. 5, 2021) (statement by Senator Brian Schatz to Frances Haugen that he had “a long list of misstatements, misdirections and outright lies from the company”).
1 261. In her testimony before the Senate, Frances Haugen cited evidence that Meta “is so
2 scared of even basic transparency that it goes out of its way to block researchers who are asking
3 awkward questions.” 211 Ms. Haugen further testified that Meta’s culture emphasizes insularity and
4 promotes the idea that “if information is shared with the public, it will just be misunderstood.” 212
5 262. This is consistent with reports from Facebook content moderators that there is a “culture
6 of fear and excessive secrecy” within Meta that “prevent[s] [them] from speaking out.” 213
7 263. Notably, Meta’s pattern of concealment did not end after Frances Haugen came forward.
8 On September 30, 2021, Antigone Davis, Facebook’s Head of Safety, testified before the Senate.
9 Ms. Davis represented that, when Instagram “do[es] ads to young people, there are only three things
10 that an advertiser can target around: age, gender, location. We also prohibit certain ads to young
11 people, including weight-loss ads.” 214 She further testified, “We don’t allow the sexualization of
12 minors on our platform.” 215

13 264. Ms. Davis’s statements were subsequently proven false by Senator Mike Lee. During an
14 October 2021 hearing, Senator Lee explained that a group called the Technology Transparency
15 Project (“TTP”) alerted the U.S. Senate that it had gained Facebook’s approval to target a series of
16 harmful ads to up to 9.1 million users between the ages of 13 and 17. 216 While TTP did not actually
17 run the ads, approval from Meta to do so demonstrates that the company allows harmful targeted
18 advertising toward minors. Senator Lee showed three examples of these Meta-approved ads, shown
19 below: 217
211 Subcommittee: Protecting Kids Online: Testimony from a Facebook Whistleblower Hearing before Subcomm. On Consumer Protection, Product Safety, and Data Security (Oct. 5, 2021).
212 Id.
213 Zoe Schiffer, Facebook Content Moderators Call for Company to Put an End to Overly Restrictive NDAs, The Verge (Jul. 22, 2021).
214 Subcommittee: Protecting Kids Online: Facebook, Instagram, and Mental Health Harms Hearing before Subcomm. On Consumer Protection, Product Safety, and Data Security (Sept. 30, 2021).
215 Subcommittee: Protecting Kids Online: Testimony from a Facebook Whistleblower Hearing before Subcomm. On Consumer Protection, Product Safety, and Data Security (Oct. 5, 2021).
216 Id.
217 These screen captures were taken from a video of the October 5, 2021 Senate Hearing with witness Frances Haugen. See Subcommittee: Protecting Kids Online: Testimony from a Facebook Whistleblower Hearing before Subcomm. On Consumer Protection, Product Safety, and Data Security (Oct. 5, 2021).
[Screen captures of the three Meta-approved advertisements referenced in Paragraph 264.]
12 265. The first ad encourages children to “[t]hrow a skittles party like no other” and displays
13 the suggestion against a background of colorful prescription pills. The second ad promotes an “Ana

14 Tip” instructing the viewer to “visit pro-ana sites to feed your motivation and reach your goal”

15 when feeling hungry. The third ad informs the viewer that they “look lonely” and encourages them

16 to “[f]ind your partner now to make a love connection.”

17 266. Senator Lee stated that based on the Meta Defendants’ approval of these pro-drug, pro-

18 anorexia, pro-sexualization ads targeted to children aged 13 to 17, “[o]ne could argue that it proves

19 that Facebook is allowing and perhaps facilitating the targeting of harmful adult-themed ads to our

20 nation’s children.” 218

21 267. In addition to the litany of misrepresentations and omissions identified above, Meta has

22 repeatedly failed to tell the truth about the age of users on Instagram. In statements to Congress and

23 elsewhere, Zuckerberg has represented that Meta does not allow users under the age of 13 to use

24 the product. For example, in testimony before the U.S. House of Representatives Committee on

25 Energy and Commerce, Zuckerberg stated: “There is clearly a large number of people under the

26

27

218 Id.
1 age of 13 who would want to use a service like Instagram. We currently do not allow them to do
2 that.” 219
3 268. However, as discussed further above, Meta has long known that its product is widely
4 used by children under the age of 13. In fact, Meta knows through retrospective cohort analyses
5 that “up to 10 to 15% of even 10 year-olds in a given cohort may be on Facebook or Instagram.” 220
6 269. Far from acknowledging the serious and unreasonable design features in its products and
7 warning children and parents of the same, Meta has launched advertising campaigns designed to
8 encourage more children to use its products—by touting the purported safety of those products. For
9 example, in a recent television ad, Meta claimed that it “build[s] technology that gives you more
10 control and helps keep you safe” including through its “industry leading AI” and other “tools that
11 can protect—so you can connect.”
12 270. Other advertising campaigns have similarly touted Meta’s AI as being a feature that

13 contributes to its products’ safety—without disclosing the unreasonably dangerous design features
14 identified in this Complaint.
15 271. In another example of advertising that promotes use by children, a Meta 2021 online
16 advertisement actively highlighted the content available for fifth grade children on its Facebook
17 product, highlighting the experience of an art teacher who used Facebook to communicate with
18 students during the pandemic—an experience the video noted was “a lot to unpack for little, tiny
19 people.”
20 272. In contrast to its public claims, Meta’s internal communications reveal that it prioritizes
21 engagement and profits to the detriment of young users’ well-being.
22

23

219 Disinformation Nation: Social Media’s Role in Promoting Extremism and Misinformation Hearing Before H. Energy and Commerce Subcomm. on Communications and Technology 59 (March 25, 2021); see also id. (Zuckerberg: “[O]ur policies on the main apps that we offer generally prohibit people under the age of 13 from using the services.”); see also Transcript of Zuckerberg’s appearance before House committee, Washington Post (April 11, 2018) (when asked if it is correct that children can get a Facebook account starting at age 13, Zuckerberg confirmed that it was correct); see also NewSchools Venture Fund, NewSchools Summit 2011: John Doerr and Mark Zuckerberg on innovation and education (May 24, 2011) (Zuckerberg: “[A]nd so basically, we don’t allow people under the age of 13 on Facebook . . . today we don’t allow people under the age of 13 to sign up”).
220 Subcommittee: Protecting Kids Online: Testimony from a Facebook Whistleblower Hearing before Subcomm. On Consumer Protection, Product Safety, and Data Security (Oct. 5, 2021).
1 273. One such example is Meta making visual filters that simulate facial plastic surgery
2 available to young users on its Social Media Platforms.
3 274. Meta’s leadership (including Instagram’s former Head of Policy Newton) came to
4 understand that Meta was “actively encouraging young girls into body dysmorphia” with these
5 filters. Meta leaders communicated these concerns about the “severe impacts” of these filters on
6 users’ mental health to Zuckerberg.
7 275. Zuckerberg, however, dismissed these concerns, which were raised by multiple
8 employees.
9 276. In November 2019, Margaret Gould Stewart, Meta’s then-Vice President of Product
10 Design and Responsible Innovation, initiated an email conversation, with the subject “[Feedback
11 needed] Plastic Surgery AR Effects + Camera Settings Policies,” addressing recipients including
12 Andrew Bosworth (Meta’s Chief Technology Officer), Mosseri (Head of Instagram), Fidji Simo

13 (then-Head of Facebook), and Newton.


14 277. Gould Stewart described a “PR fire” in mid-October 2019, stemming from “selfie”
15 camera filters on Meta’s Platforms that simulated plastic surgery.
16 278. This included public allegations that Meta was “allowing the promotion of plastic
17 surgery,” including to Instagram’s youngest users.
18 279. Meta’s initial response to the public backlash was to institute a temporary ban on the
19 camera filters.
20 280. Gould Stewart recommended that this ban be made permanent.
21 281. Gould Stewart distributed a briefing memo to these senior leaders detailing the
22 “significant concerns” raised by “global well-being experts . . . about the impact of these effects on
23 body dysmorphia and eating disorders,” especially for teenage girls.
24 282. In a separate communication, Gould Stewart urged Meta’s leadership that “when it
25 comes to products or technology that are used extensively by minors (under 18) I do believe we
26 have an obligation to act more proactively in mitigating potential harm . . . .”
27 283. The briefing memo noted that a potential option to limit the filters to only users who
28 were 18 years old and older would not be effective because Instagram’s age-gating procedures
1 were inadequate, such that “minors will still have access to the filters, especially on IG.” The
2 document reminded Meta’s senior leaders that academic researchers had demonstrated that
3 “Facebook and Instagram use is associated with body image issues and anxiety among users and
4 particularly among women and teenage girls.” The document warned that long-term studies of the
5 effects of such filters “likely will not be available before the potentially damaging impact to user
6 wellbeing manifests.”
7 284. Newton agreed with the recommendation to extend the ban, expressing concern that
8 these filters were “actively encouraging young girls into body dysmorphia and enabling self-view
9 of an idealized face (and very western definition of that face by the way) that can result in serious
10 issues.”
11 285. Newton further noted that “outside academics and experts consulted were nearly
12 unanimous on the harm here.”

13 286. A meeting with Zuckerberg to discuss the matter was then scheduled for April 2, 2020,
14 and a “Cosmetic Surgery Effects Pre-Read” document was prepared and circulated in anticipation
15 of that meeting.
16 287. The “pre-read” detailed Meta’s consultation with “21 independent experts around the
17 world,” finding that “[t]hese extreme cosmetic surgery effects can have severe impacts on both the
18 individuals using the effects and those viewing the images.” Experts told Meta that “[c]hildren are
19 particularly vulnerable” to these impacts, in addition to “those with a history of mental health
20 challenges [and] eating disorders[.]” The memo also included Meta’s review of academic research
21 on the negative effects of edited images on viewers’ satisfaction with their own bodies, as well as
22 anecdotal evidence that “editing one’s own selfie images could activate desire for cosmetic
23 surgery.”
24 288. In addition to noting the experts’ “agree[ment] that these effects are cause for concern
25 for mental health and wellbeing, especially” for women and girls, the memo noted that continuing
26 the ban may have a “negative growth impact” on the company.
27 289. On April 1, 2020, one day before this meeting was to take place, it was canceled.
1 290. Rather than rescheduling the meeting, Zuckerberg vetoed the proposal to ban camera
2 filters that simulated plastic surgery.
3 291. Zuckerberg dismissed concerns about the filters (from the public, from experts, and from
4 his own employees) as “paternalistic.”
5 292. Zuckerberg stressed that there was a “clear[] demand” for the filters, and wrongly
6 asserted that he had seen “no data” suggesting that the filters were harmful.
7 293. In reality, Zuckerberg was provided the “pre-read” document, detailing expert consensus
8 on “the dangers these filters have in advancing unrealistic beauty standards and impacting mental
9 health and body image,” and he continued to receive information from colleagues summarizing the
10 harmful nature of the plastic surgery filters.
11 294. A follow-up memo sent to Zuckerberg before he gave a final order to end the ban noted
12 that the cosmetic surgery filters could have disproportionate impacts for children and teen girls.

13 295. After Zuckerberg rejected the proposal to permanently ban plastic surgery simulation
14 camera filters, Gould Stewart wrote to Zuckerberg, “I respect your call on this and I’ll support it,
15 but want to just say for the record that I don’t think it’s the right call given the risks . . . I just hope
16 that years from now we will look back and feel good about the decision we made here.”
17 5. Meta failed to adequately communicate the dangers and harms caused by
Instagram and Facebook, or provide instructions regarding safe use.
18

19 296. Meta has misrepresented, omitted, and failed to adequately communicate and warn

20 adolescent users and parents regarding the physical and mental health risks posed by Instagram and

21 Facebook. These risks include serious harms such as compulsive use, addiction,

22 eating disorders, anxiety, depression, insomnia, exacerbated executive dysfunction, sexual

23 exploitation by adult users, suicidal ideation, self-harm, and death. Meta knew of these

24 significant risks, but deceptively and fraudulently omitted and downplayed them, misleading consumers and

25 the Band regarding these risks.

26 297. Meta targets adolescent users via advertising and marketing materials distributed

27 throughout digital and traditional media that fail to provide sufficient warnings to potential

1 adolescent consumers of the physical and mental risks associated with using Facebook and
2 Instagram.
3 298. Meta also fails to adequately warn adolescent users during the product registration
4 process. At account setup, neither Instagram nor Facebook contain warning labels, banners, or
5 conspicuous messaging to adequately inform adolescent users of the known product risks and
6 potential physical and mental harms associated with usage. Instead, Meta allows adolescent users,
7 including those under the age of 13, to easily create an account (or multiple accounts) and fully
8 access these products.
9 299. Meta’s failure to warn adolescent users continues even as adolescents exhibit
10 problematic signs of addiction to and compulsive use of Facebook or Instagram. For example, Meta
11 does not warn users when their screen time reaches harmful levels or when adolescents are
12 accessing the product habitually.

13 300. Despite proactively providing adolescent users with countless filtering and editing tools,
14 Meta also does not appropriately warn adolescent users regarding which images have been altered
15 or the mental health harms associated with the heavily filtered images that Meta presents and
16 recommends.
17 301. Not only does Meta fail to adequately warn users regarding the risks associated with
18 Instagram and Facebook, it also does not provide sufficient instructions on how adolescents can
19 safely use the products.
20 302. Meta’s failure to adequately communicate or warn as set forth herein has proximately
21 caused significant harm to the mental and physical well-being of young users.
22 303. Moreover, when making the Instagram app available to Tribal consumers in
23 Apple’s App Store and other online marketplaces, Meta tells consumers that Instagram contains only
24 “infrequent/mild” “profanity and crude humor,” “alcohol, tobacco, and drug use or references,”
25 “sexual content or nudity,” and “mature/suggestive themes.” Meta then claims a “12+” rating in
26 the App Store, which tells consumers the app is suitable for users aged 12 and older. Meta knows
27 and intends that these representations will be conveyed to Tribal consumers.
1 304. These representations are patently false. Meta allows rampant profanity, sexual content
2 and nudity, alcohol, tobacco, and drug use and references, and mature/suggestive themes on the
3 Instagram platform, including readily accessible hardcore pornography. Meta uses human and
4 computer moderators to police the content on Instagram, but those moderators either systematically
5 fail or apply internal policies that allow these types of content to remain on the platform. Whether
6 by design or by failure, Meta’s moderators miss large amounts of mature content on Instagram,
7 leaving young Tribal members regularly exposed to it. Moreover, as discussed extensively above,
8 use of Meta’s platforms by adolescents, even those aged 12 and older, is harmful to the developing brain
9 and, as currently designed, the platforms are not appropriate for use by that age group.
10 D. SNAP MARKETS AND DESIGNS SNAPCHAT TO ADDICT YOUNG USERS,
SUBSTANTIALLY CONTRIBUTING TO THE MENTAL HEALTH CRISIS.
11

12 305. Snap Inc. calls itself “a camera company.” 221 Its “flagship product, Snapchat, is a camera
13 application that was created to help people communicate through short videos and images. [Snap]

14 calls each of those short videos or images a Snap.” 222 Snap’s design of its Snapchat product

15 capitalizes on children’s increasing attachment to quick, instantaneous exchanges. As Snap’s

16 founder and CEO Evan Spiegel has explained, “today… pictures are being used for talking. So

17 when you see your children taking a zillion photos of things that you would never take a picture of,

18 it’s cos [sic] they’re using photographs to talk. And that’s why people are taking and sending so

19 many pictures on Snapchat every day.” 223

20 306. Spiegel’s statement is telling, as Snap has tailored every aspect of its Snapchat product

21 to children rather than adults. Snap designed and implemented dangerous features in Snapchat that

22 exploit children’s need for social acceptance and rewards by pushing its users to maximize their

23 use of and engagement with the app. Snap built Snapchat using manipulative techniques to compel

24 young users to send an ever-increasing number of photographs and videos, and to reward users who

25

26
221 Snap Inc. Form S-1 Registration Statement (hereafter “Form S-1”) at 1 (Feb. 2, 2017); see also Snap – Who We Are, Snap Inc. (“We believe that reinventing the camera represents our greatest opportunity to improve the way people live and communicate.”).
222 Snap Inc. Form S-1 Registration Statement (hereafter “Form S-1”) at 1 (Feb. 2, 2017).
223 Stuart Dredge, What is Snapchat? CEO Evan Spiegel explains it all for parents, The Guardian, June 15, 2015.
1 maximize their engagement with elevated status. Snap also dangerously encourages adolescents to
2 increase engagement on the app indiscriminately, pushing tools to share sensitive material with an
3 ever-expanding group of friends and strangers.
4 307. Snapchat’s design features cause its young users to suffer increased anxiety, depression,
5 disordered eating, sleep deprivation, suicide, and other severe mental and physical injuries. Snap
6 knows or should have known this. Snap intentionally designed Snapchat to prey on the
7 neuropsychology and behavioral patterns of children to maximize their engagement and increase
8 Snap’s advertising revenue. Despite this knowledge, Snap continues to update its product and add
9 features intentionally designed to entice, exploit, and addict kids, including Snap Streaks, trophies,
10 social signifiers and reward systems, quickly disappearing messages, filters, lenses, and games.
11 308. Snap knew, or should have known, that its conduct has negatively affected youth. Snap’s
12 conduct has been the subject of inquiries by the United States Senate regarding Snapchat’s use “to

13 promote bullying, worsen eating disorders, and help teenagers buy dangerous drugs or engage in
14 reckless behavior.” 224 Further, Senators from across the ideological spectrum have introduced bills
15 that would ban many of Snapchat’s features that are particularly addictive to adolescents. 225
16 309. Despite these calls for oversight from Congress, Snap has failed to curtail its use of
17 features such as streaks, badges, and other awards that reward users’ level of engagement with
18 Snapchat. As described in detail below, Snapchat is a product that causes harm to children, the
19 target audience for whom Snap designed and to whom it promoted its product.
20 1. Background and overview of Snapchat.

21 310. Snapchat was created by three college students in 2011 and first released for iPhones in

22 September 2011. Snapchat quickly evolved from its origin as a disappearing-message chat

23 application after Snap’s leadership made design changes and rapidly developed new product

24 features. As a result of its design and implementation of dangerous and addictive features

25

224 Bobby Allyn, 4 Takeaways from the Senate child safety hearing with YouTube, Snapchat and TikTok, National Public Radio (Oct. 26, 2021).
225 See Abigal Clukey, Lawmaker Aims To Curb Social Media Addiction With New Bill, National Public Radio (Aug. 3, 2019); Social Media Addiction Reduction Technology Act, S. 2314, 116th Cong. (2019); Kids Internet Design and Safety Act, S. 2918, 117th Cong. (2021).
1 specifically targeting youths (described below), Snapchat quickly became widely used among
2 children.
3 311. Snap marketed Snapchat as “temporary social media” that would allow users to show a
4 more authentic, unpolished, and spontaneous side of themselves. 226 Snapchat’s central and defining
5 feature, the “Snap,” allows users to send and receive ephemeral, or “disappearing,” audiovisual
6 messages. That feature foreseeably and quickly drove users to exchange sexually explicit “Snaps,”
7 sometimes called “sexts” even though they are photos. Because of its brand identity among
8 millennials as the original ephemeral-messaging app, Snapchat almost immediately became known
9 as the “sexting” app—a fact that Snap was or should have been on notice of from public sources. 227
10 312. Snapchat creates images and GIFs for users to incorporate into their videos and picture
11 postings. Snap has also acquired publishing rights to thousands of hours of music and video which
12 it provides to Snapchat users to attach to the videos and pictures that they send.
ROBINS KAPLAN LLP
ATTORNEYS AT LAW

2. Snap targets children.


LOS ANGELES

13

14 a. Snap has designed its Snapchat product to grow use by children to drive the
company’s revenue.
15

16 313. Within five months of launching, Snapchat had 40,000 users. 228 By May 2012, less than
17 eight months after launching, CEO Evan Spiegel reported that the company was “thrilled” to learn
18 that most of Snapchat’s users were high school students sending “behind-the-back photos of
19 teachers and funny faces” to each other during class. According to Spiegel, Snap’s server data
20 showed peaks of activity during the school day. 229
21 314. Snap immediately focused on increasing the product’s frequency of use. 230 By late 2012,
22 Snapchat had over a million active users sending over 20 million Snaps per day. 231 By 2013,
23

226 Jenna Wortham, A Growing App Lets You See It, Then You Don’t, New York Times (Feb. 9, 2013).
227 Megan Dickey, Let’s Be Real: Snapchat Is Totally Used For Sexting, Bus. Insider (Nov. 30, 2012); Billy Gallagher, No, Snapchat Isn’t About Sexting, Says Co-Founder Evan Spiegel, TechCrunch (May 12, 2012) (describing an interview in which a journalist asked the CEO of Snap about the product’s potential use for sexting).
228 Ken Auletta, Get Rich U, New Yorker (Apr. 30, 2012).
229 Team Snapchat, Let’s Chat, Snapchat Blog at http://blog.snapchat.com (May 9, 2012).
230 Billy Gallagher, You Know What’s Cool? A Billion Snapchats: App Sees Over 20 Million Photos Shared Per Day, Releases On Android, TechCrunch (Oct. 29, 2012).
231 Id.
1 Snapchat users were sending over 60 million Snaps per day. 232 By the end of 2022, this number had
2 risen to over 5 billion Snaps per day. 233
3 315. As Snap continued to quickly add new features to its product, the number of Snapchat’s
4 daily active users (users who open Snapchat at least once during a 24-hour period) rapidly
5 increased. 234 In 2017, Snap reported that its users opened the product more than 18 times a day on
6 average. By 2019, users were opening the product an average of 30 times per day.
7 316. Today, Snapchat is one of the world’s most widely used apps. By its own estimates,
8 Snapchat has 363 million daily users, including 100 million daily users in North America.235
9 Snapchat also “reaches 90% of the 13-24 year old population” in over twenty countries, and reaches
10 nearly half of all smartphone users in the United States. 236
11 317. Snapchat’s explosive growth is driven by its key user demographic, 13-17 year olds. In
12 2022, 59% of US teens used Snapchat and 15% said they used it “almost constantly.” 237

13 318. In 2014, Snap began running advertisements on Snapchat. 238 Snapchat’s entire business
14 model revolves around its advertising revenue. According to internal company records,
15 advertisements were pervasive on Snapchat by 2015 and, by 2018, 99% of Snap’s total revenue
16 came from advertising. Advertising has accounted for 99% of Snap’s revenue each year since
17 2018. 239 In 2022, Snap’s revenue was approximately $4.6 billion. 240
18 319. Snap attracts advertisers by providing them access to the huge universe of Snapchat users
19 and by collecting immense amounts of data on its users, including its pre-teen and teenage users,
20 which it uses to target advertising to those users. Snap makes no secret of this practice, recently
21 acknowledging that it relies “heavily on our ability to collect and disclose data, and metrics to our
22 advertisers so we can attract new advertisers and retain existing advertisers. Any restriction or
23

232 Id.
233 Snap Inc. Q4 2022 Investors Meeting Transcript at p. 7 (Jan. 31, 2023).
234 Snap Inc. Form S-1 Registration Statement (hereafter “Form S-1”) at 91 (Feb. 2, 2017).
235 October 2022 Investor Presentation at 5, Snap Inc. (Oct. 20, 2022).
236 Id.
237 Pew Research Center, Teens, Social Media and Technology 2022 (Aug. 10, 2022).
238 Angela Moscaritolo, Snapchat Adds ‘Geofilters’ in LA, New York, PC Mag. (July 15, 2014).
239 Snap Inc. Form 10-K at 18 (Dec. 31, 2022).
240 Id.
1 inability, whether by law, regulation, policy, or other reason, to collect and disclose data and metrics
2 which our advertisers find useful would impede our ability to attract and retain advertisers.” 241
3 320. Snap’s growth in advertising revenues was driven by changes Snap made to Snapchat
4 that incentivized compulsive and addictive use at the expense of its users’ health.
5 321. Snap understands that its user experience must be immersive and all-encompassing in
6 order to maximize its advertising revenue. Indeed, Snap recently admitted to its investors that its
7 revenue could be harmed by, among other things, “a decrease in the amount of time spent on
8 Snapchat, a decrease in the amount of content that our users share, or decreases in usage of our
9 Camera, Visual Messaging, Map, Stories, and Spotlight platforms.” 242
10 b. Snap promotes Snapchat to children.

11 322. Snap specifically promotes Snapchat to children because they are a key demographic for

12 Snap’s advertising business.



13 323. In its first post on its website, Snapchat observed that “[t]o get a better sense of how

14 people were using Snapchat and what we could do to make it better, we reached out to some of our

15 users. We were thrilled to hear that most of them were high school students who were using

16 Snapchat as a new way to pass notes in class—behind-the-back photos of teachers and funny faces

17 were sent back and forth throughout the day.” 243

18 324. As shown in this capture of a Snapchat feature page created by Snap, Snap uses bright

19 colors, cartoonish designs, and other features that appeal to younger audiences.

241 Id.
242 Id.
243 Team Snapchat, Let’s Chat, Snapchat Blog at http://blog.snapchat.com (May 9, 2012).
[Capture of a Snapchat feature page created by Snap, referenced in Paragraph 324.]
15 325. Similarly, in an October 2019 interview, Snap’s CEO explained that “we’ve seen a lot

16 of engagement with our 13-34 demographic, which for us is strategically a critical demographic,

17 not only because that’s a demographic that enjoys using new products but also because I think they

18 represent, really, the future . . . So that’s obviously been a group that’s been really fun to build for,

19 and really it started because those are our friends.” 244

20 326. Snap touts to advertisers its ability to use Snapchat to reach children. In a December

21 2022 statement to advertisers, Snap claimed that “Snapchat delivers on the emotions that Gen Z

22 seeks and it does so consistently across the platform in areas like Discover, Stories and the

23 Camera.” 245 To prove that, Snapchat “used a neuroscience measurement called Immersion to

24 measure reactions to different brand messaging—specifically brand purpose messaging vs. non-

25 brand purpose messaging. Immersion captures attention and emotional resonance through

26 variations in heart rate rhythm collected by smartwatches.” 246 Per Snapchat, “[a]ny brand or

244 Evan Spiegel, Co-Founder and CEO of Snap, Inc., Goldman Sachs, at 4:43-6:23 (Oct. 2, 2019).
245 Snap for Business, What Does Gen Z Want From Brands? (Dec. 15, 2022).
246 Id.
1 marketer can get on any app and start targeting Gen Z [emphasis added]. After all, Gen Z is digitally
2 native. But to effectively connect and engage with this generation, that takes a different, more
3 intentional type of platform- Snapchat.” 247
4 327. Advertisers have responded, pouring into Snapchat money clearly intended for
5 advertising aimed at children. Brands like candy manufacturer Sour Patch Kids, children’s toy store
6 ToysRUs, and sugary beverage seller Kool-Aid have all run successful advertising campaigns
7 through Snapchat, frequently using augmented reality tools developed in collaboration with
8 Snapchat.
[Examples of Snapchat advertising campaigns by the brands referenced in Paragraph 327.]
22 328. Snapchat’s age verification systems are unreasonably dangerous. For the first two years
23 of its existence, Snap did not even purport to limit user access to those 13 or older. 248 Users were
24 not required to input a date of birth when creating an account. 249
25

26

247 Id.
248 Team Snapchat, iOS Update: Bug Fixes and More!, Snapchat Blog (June 22, 2013).
249 Id.
1 329. In 2013, Snap belatedly introduced age limits (which, as explained below, it does not
2 effectively enforce). At the same time, Snap launched a new feature called “Snapkidz” aimed at
3 and designed to attract children, while hedging against the potential user loss due to the new age
4 limits. The Snapkidz feature allowed children under the age of 13 to take filtered photos, draw on
5 them, save them locally on their devices, send them to others, and upload them to other apps. 250
6 Although this version prevented children from sharing “Snaps” on the product, it nonetheless
7 exposed children to Snapchat’s features, which normalized and acclimatized children to using
8 Snapchat. In addition, nothing prevented children from creating an unrestricted account with a false
9 date of birth on Snapchat and using the product outside the SnapKidz’s limited features. 251
10 330. The SnapKidz feature was discontinued in or around 2016. Snap now purports to prohibit
11 users under the age of 13. But nothing prohibits the minor user from simply altering their birthdate
12 during the same session where they were just denied an account for being an underage user. Snap

13 could have implemented robust, effective age verification protocols. Instead, it has set up its
14 business and product so that nothing is done to verify the age of its users or to enforce its age
15 limitations. Snap could, but intentionally does not, verify the phone number, email address, or
16 birthdate used to create accounts, and it allows users to create multiple accounts using the same
17 email address or phone number.
18 331. Snap’s executives have admitted that Snapchat’s age verification “is effectively useless
19 in stopping underage users from signing up to the Snapchat app.” 252 Not surprisingly, underage use
20 is widespread. As of 2021, 13% of children ages 8-12 use Snapchat. 253
21 332. Once Snapchat is installed on a user’s mobile phone, the product continues to download
22 and install updates, design changes, and new features from Snapchat directly to its users.
23 333. Similarly, the absence of effective age-verification measures means that users who are
24 older can claim to be children—which is an obvious danger to the actual children on Snap’s product.
250 Id.
251 See Larry Magid, Snapchat Creates SnapKidz – A Sandbox for Kids Under 13, Forbes (June 23, 2013); Anthony Cuthbertson, Snapchat admits its age verification system does not work, Independent (Mar. 19, 2019).
252 Isobel Asher Hamilton, Snapchat admits its age verification safeguards are effectively useless, Bus. Insider (Mar. 19, 2019).
253 Victoria Rideout et al., The Common Sense Census: Media Use by Tweens and Teens, 2021 at 5, Common Sense Media.
1 3. Snapchat is designed to addict children through psychological manipulation.
2 334. Once Snap entices children to use its product, it uses a series of product features that are
3 designed to addict children. As laid out below, those features can be broadly grouped into two
4 categories that exploit techniques discussed earlier in this Complaint. The first are social metrics
5 and other similar psychological manipulation techniques. The second are features designed to
6 encourage endless passive consumption of content on the Snapchat product. These features, in
7 tandem with each other and the other harmful features described throughout this section and
8 Complaint, induce addiction, compulsive use, and other severe mental and physical harm to the
9 child users of the Snapchat product.
10 a. Snap designed Snapchat to drive compulsive use through a set of social
metrics and other manipulation techniques that induce compulsive use.
11

12 335. Snapchat includes a variety of social metrics—such as Snapscores, Snap Streaks, and

13 Snap Awards—that reward users when they engage with Snapchat and punish them when they fail
14 to engage with Snapchat. Internal research by Snap has found these psychological manipulation
15 techniques are highly effective at instilling anxiety about not using Snapchat frequently enough—
16 and competitor research has confirmed these features are addictive. In tandem with Intermittent and
17 Variable Rewards (“IVR”), like push notifications and design choices that make it difficult to stop
18 using the Snapchat product, these induce compulsive use of the product by children.
19 336. These manipulation techniques are so effective in part because Snapchat’s disappearing
20 messages themselves create a compulsion to engage with the Snapchat product. Because Snaps
21 typically disappear within ten seconds of being viewed, users feel compelled to reply immediately.
22 Snap activates the psychological desire to reciprocate the social gesture of sending a Snap. 254
23 Snapchat also tells users each time they receive a Snap by pushing a notification to the recipient’s
24 device. These notifications are designed to prompt users to open Snapchat repetitively, increasing
25 the overall time spent on the app.
26

27

254 Nir Eyal, The Secret Psychology of Snapchat, Nir & Far (Apr. 14, 2015).
1 (i) Snapscores
2 337. Snapscores were one of the earliest features of the Snapchat product. Almost as soon as
3 Snapchat launched, Snap gave users the ability to draw and color on Snaps and add a short text
4 caption before sending. An Android version of the app, video sharing, and user profiles with
5 “Snapscores” soon followed. 255
6 338. Originally called “HI score,” Snapscore keeps a running profile score based on a user’s
7 Snapchat activity levels, such as the number of Snaps sent and received or Stories posted. 256 The
8 sole purpose of Snapscore is to increase product use and drive revenue. 257
9

17 339. Although Snap does not disclose precisely how Snapscores work, sending and receiving
18 a Snap increases the score by one point. Interacting with other product features provides additional
19 points. A user’s Snapscore is visible on their profile, serves as a signifier of the user’s “worth,” and
20 encourages users to further engage with Snapchat’s features to increase their score. Snapscores are
21 important to users, especially young users, because they operate as a form of social validation,
22 similar to an Instagram “Like.” Google has reported millions of searches for “How to improve Snap
23

24

25

26
255 Snap Inc. Form S-1 Registration Statement (hereafter “Form S-1”) at 91 (Feb. 2, 2017); Katie Notopoulos, The Snapchat Feature That Will Ruin Your Life, BuzzFeed News (Dec. 5, 2012).
256 Snapchat Support, What is a Snap Score? (“Your Snapchat score is determined by a super-secret, special equation…”).
257 Brad Barbz, *2020 NEW * How To Increase Snapscore By Up To 1000 Per Minute On IOS And Android - Working 2020, YouTube (Dec. 4, 2019).
1 score.” YouTube contains numerous videos with titles like “How to Increase Snapchat Score
2 Fast.” 258
3 340. Snapscores reward users who post videos that are viewed extensively. This encourages
4 many to use Snapchat in harmful and dangerous ways, to increase the virality of their videos and
5 increase their Snapscore. As more users engage with and forward that video to others, its creator is
6 rewarded with an increased Snapscore. Snapchat’s rewards incentivize this dangerous behavior,
7 resulting too often in physical harm or humiliation in the obsessive pursuit of social significance.
8 (ii) Trophies, Charms, and Stickers
9 341. Snap has also designed Snapchat to include user rewards, including trophies and other
10 social recognition signals, similar to “Likes” on other apps. These features are highly addictive and
11 drive compulsive use.
12 342. “Trophies” are emojis awarded for achieving engagement milestones or performing

13 certain activities, such as increasing one’s Snapscore, sending creative Snaps, or posting a live
14 story. A user’s “Trophies” are displayed in a “trophy box” viewable by their friends. Snap designed
15 this feature to encourage users to share their videos and posts with the public, promote greater use
16 of Snapchat, and deepen young users’ addiction to and compulsive use of the product.
17 343. In 2020, Snap phased out Trophies and replaced them with “Charms.” Unlike Trophies,
18 where users were rewarded for unlocking individual accomplishments like sending 1,000 selfies,
19 Charms reward users for achieving certain milestones in their relationship with other users.
20 Typically, the more users interact with one another, the more Charms they unlock in their
21 relationship. Charms are private and viewable only by users’ mutual contacts.
22 344. For example, if two users are at the top of each other’s friends list, meaning they
23 exchange frequent Snaps, they may unlock a “BFF (Best Friends Forever)” Charm. Conversely,
24 the “It’s Been Forever” and “It’s Been a Minute” Charms may be awarded to friends who are
25 infrequently in contact, to prompt their engagement with one another on Snapchat. Although there
26 are a number of different Charms awarded for various reasons, all of them encourage user
27

28
258 FozTech, How to Increase Snapchat Score Fast! (100% Works in 2023), YouTube (Oct. 1, 2019) (How to Increase Snapchat Score Fast has 4.2 million views as of January 10, 2023).
1 interaction, furthering engagement and buy-in to Snap’s reward system. This in turn exacerbates
2 social-comparison harms and undermines self-esteem.
3 345. Snap incorporates other product features that, like Charms and Trophies, serve no
4 functional purpose, but make Snapchat more appealing and lead to excessive use by children and
5 teens. For example, Snap has developed images called “Stickers” for users to decorate the pictures
6 or videos they post. Snap also offers app-specific emojis and animations that users can apply to
7 their photos or videos.
8 346. Snap designed each of these features to function as rewards for increased engagement,
9 exploit underage users’ desire for social validation, and ultimately compel them to use Snapchat
10 excessively. Because many of these rewards and scores are visible to others, these features tap into
11 adolescents’ deep-seated need for acceptance. By exploiting this need, Snap increases time spent
12 engaging with its product and thereby its profits.

13 (iii) Snap Streak


14 347. The “Snap Streak” is unique to Snapchat and is an addictive feature “especially to
15 teenagers.” 259 A Snap Streak is designed to measure a user’s Snapchat activity with another user.
16 Two users achieve a Snap Streak when they exchange at least one Snap in three consecutive 24-
17 hour periods. When the Streak is achieved, users receive a fire emoji next to their profile avatar.
18 Over time, users may be rewarded with additional emojis signifying their Streak. If users reach a
19 Streak of 100 days, for example, each receives a 100 emoji.
20 348. Snap Streak emojis are similar to Charms in that they reward users for interaction and
21 are viewable only by mutual friends.
22 349. It is a matter of common knowledge in the social media industry that the Snap Streak
23 product feature is designed to be addictive. Nonetheless, Snap continues to provide this feature to
24 its adolescent users.
25

259 See Cathy Becker, Experts warn parents how Snapchat can hook in teens with streaks, ABC News (July 27, 2017); Avery Hartmans, These are the sneaky ways apps like Instagram, Facebook, Tinder lure you in and get you ‘addicted’, Bus. Insider (Feb. 17, 2018); see generally Virginia Smart & Tyana Grundig, ‘We’re designing minds’: Industry insider reveals secrets of addictive app trade, CBC (Nov. 3, 2017); Julian Morgans, The Secret Ways Social Media is Built for Addiction, Vice (May 17, 2017).
1 350. Worse still, to manufacture deeper addiction to its product, Snap sends notifications to
2 users with an hourglass emoji when Streaks are about to expire—to create extra urgency, nudge
3 users to keep their Streaks alive, and maintain a system where a user must “check constantly or risk
4 missing out.” 260
5 351. This feature is particularly effective with teenage users since Streaks are “a vital part of
6 using the app and their social lives as a whole.” 261 Some children become so obsessed with
7 maintaining their Streaks that they give their friends access to their accounts when they may be
8 away from their phone for a day or more. 262 Aware of how important maintaining a Snap Streak is
9 to its users, Snap has even launched a special form on its support website allowing users who lost
10 their streak to petition to get it back. 263
11 352. Snap Streaks contribute to feelings of social pressure and anxiety when users lose or
12 break a Streak. Researchers have found that losing a Streak can cause feelings of betrayal for some

13 users, especially girls, who reported “negative” feelings when losing a Streak with one of their
14 friends. 264
15 353. Streaks are important to users. However, these design features do not enhance the
16 communication function of the product. Instead, they exploit users’ susceptibility to social pressure
17 and to the compulsive accumulation of other rewards, including Snap Score points and Charms.
18 (iv) Push Notifications
19 354. In addition to Snapchat’s in-app reward features, Snap also sends push notifications and
20 emails to encourage addictive engagement and increase use. Notifications are triggered based on
21 information Snap collects from, and about, its users. Snap “pushes” these communications to users
22 excessively and at disruptive times of day. Snap has even designed the format of these notifications
23

24

25
260 Lizette Chapman, Inside the Mind of a Snapchat Streaker, Bloomberg (Jan. 30, 2017).
261 Avery Hartmans, These are the sneaky ways apps like Instagram, Facebook, Tinder lure you in and get you ‘addicted’, Bus. Insider (Feb. 17, 2018).
262 Caroline Knorr, How to resist technology addiction, CNN (Nov. 9, 2017); Jon Brooks, 7 Specific Tactics Social Media Companies Use to Keep You Hooked, KQED (June 9, 2017).
263 Snapchat Support, Contact Form, https://support.snapchat.com/en-US/i-need-help?start=5695496404336640.
264 Hristoya et al., “Why did we lose our snapchat streak?” Social media gamification and metacommunication, Computers in Human Behavior Reports, 5, 100172 (2022).
1 to pull users back onto its app by preying on their fear of missing out—never mind the consequences
2 to their health and well-being.
3 (v) Impediments to Discontinuing Use
4 355. Snap has intentionally, unreasonably, and dangerously designed its products so child
5 users face significant navigational obstacles and hurdles when trying to delete or deactivate their
6 Snapchat accounts, despite the ease with which a user can create one. For example, when a user
7 elects to delete their account, they cannot do so on demand. The data and the account are preserved
8 for 30 days. In addition, after initiating the deletion process, the user is presented with a black
9 screen depicting a crying emoji and a message that reads, “Your account will be deactivated, which
10 means friends won’t be able to contact you on Snapchat. You’ll also lose any Chats you’ve saved
11 and Snaps and Chats you haven’t opened.” 265
12 356. This cumbersome process prioritizes user retention and continued use over the well-

13 being of Snapchat’s users.


14 b. Snap’s unreasonably dangerous features are designed to promote
compulsive and excessive use.
15

16 (i) “Stories” and the “Discover” Interface

17 357. In October 2013, Snap added “Stories,” a feature that generates a compilation of its
18 users’ designated photos and videos that expire in 24 hours and can be viewed an unlimited number
19 of times by friends or anyone on Snapchat if the user sets the visibility setting to Everyone. 266
20 Within eight months of launching the Stories feature, users were viewing more Stories per day than
21 Snaps. 267
22 358. Snap’s Stories feature includes a running view count and list of viewers for each Story,
23 both of which provide users with dopamine-triggering feedback that encourages users to make their
24 Stories visible to everyone in order to increase the view count. The view count, view list, and
25

26

27
265 See Snapchat Support, How do I deactivate or delete my Snapchat account?.
266 Darrell Etherington, Snapchat Gets Its Own Timeline With Snapchat Stories, 24-Hour Photo & Video Tales, TechCrunch (Oct. 3, 2013).
267 Ellis Hamburger, Surprise: Snapchat’s most popular feature isn’t snaps anymore, The Verge (Jun. 20, 2014).
1 ephemeral nature of Stories also reinforces the principle of reciprocity and compels users to monitor
2 Stories, so they do not miss out.
3 359. In 2016, Snap updated the Stories feature to include recommendations based on an
4 algorithm that considers “proximity, time, interestingness, or other such metrics.” 268 That same
5 year, Snap introduced ads between Stories and updated Stories to include “Auto-Advance,” a
6 feature that starts a new Story automatically after the preceding one ends. 269 This creates an endless
7 cycle of consumption that Snap knows, or should know, is detrimental to users’ mental health. 270
8 Nevertheless, Snap designed and implemented this feature because it is proven to induce a flow
9 state that increases product use, regardless of whether the use is healthy or enjoyable.
10 Unsurprisingly, one study of over 2,000 UK residents found 68% of respondents who used
11 Snapchat reported that “the platform prevented them from sleeping.” 271
12 360. Since then, Snap has built upon its Stories interface with “Discover,” a feature that

13 showcases a continuous feed of advertisements to Snapchat’s captive audience. Using Discover,


14 users may subscribe to an advertiser’s “channel” and watch its Stories, as well as see what their
15 friends are watching.
16 361. Both Stories and Discover encourage user engagement with Snapchat and increase the
17 amount of time users spend using the product by making the product more addictive at the expense
18 of users’ mental health and well-being.
19 (ii) “Spotlight”
20 362. In November 2020, Snap launched “Spotlight,” a feature that pushes to users “an endless
21 feed” that Snap curates from its 300 million daily Snapchat users. 272 Spotlight functions and appears
22 nearly identical to TikTok, with similar addictive qualities and harms. Snapchat’s Spotlight feature
23 allows users to make videos that anyone can view, and Snap pays users whose Spotlight videos go
24

268 Snapchat, Inc., Content Collection Navigation and Autoforwarding, US 20170289234, USPTO (Mar. 29, 2016).
269 James Vincent, Snapchat will start showing ads between your friends’ stories, The Verge (Jun. 14, 2016); Snapchat, Inc., Content Collection Navigation and Autoforwarding, US 20170289234, USPTO (Mar. 29, 2016).
270 See, e.g., Gino Gugushvili et al., Facebook use intensity and depressive symptoms: a moderated mediation model of problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (2022).
271 Frazer Deans, Curb Your Snapchat Addiction.
272 Salvador Rodriguez, Snap is launching a competitor to TikTok and Instagram Reels, CNBC (Nov. 23, 2020).
1 viral, thus serving as yet another reward system that encourages user engagement. After Snap
2 introduced Spotlight, user time spent on the product increased by over 200%. 273
3 363. In February 2022, Snap CEO Evan Spiegel told investors that users are spending more
4 time on Spotlight than almost any other aspect of Snapchat. A year prior, Snap announced
5 “Spotlight Challenges,” which provided users with cash prizes for creating Spotlight videos with
6 specific lenses, sounds, or topics, further integrating the user into the Snap ecosystem. Snap claims
7 it paid out more than $250 million in cash prizes to Spotlight Challenge participants in 2021
8 alone. 274
9 4. Snap designed Snapchat with features that harm children directly or expose
children to harm.
10

11 364. Snapchat further contains a number of features which foreseeably cause children harm

12 above and beyond harms inherent in addiction and compulsive use.



13 a. Disappearing “Snaps” and “My Eyes Only” encourage destructive behavior


among Snap’s teen users.
14

15 365. As discussed above, Snapchat’s “Snap” feature allows users to send and receive
16 ephemeral, or “disappearing,” audiovisual messages. Prior to sending a Snap, a user can designate
17 the period of time—typically no more than a few seconds—that the recipient will be allowed to
18 view the Snap. According to Snapchat, once the allotted time expires, the Snap disappears forever.
19 366. Snapchat’s limited display time reduces teenagers’ communication apprehension and
20 encourages users to send photos depicting deviant behavior. 275 Sexting is a prime example, but
21 cyberbullying, underage alcohol consumption, and illicit use of narcotics are also commonly the
22 subject of Snaps. A 2016 survey of pre-teens and teens ages 12-17 found that “dick pics” were
23 among the unwanted content that users—predominantly females—received while using
24 the app. 276
273 See Snap Q4 Earnings Beat Estimates, User Growth Aids Top Line, Zacks Equity Research (Feb. 5, 2021).
274 Mia Sato, Snapchat will put ads within stories and share the money with creators, (Feb. 14, 2022).
275 See Vaterlaus et al., "Snapchat is more personal": An exploratory study on Snapchat behaviors and young adult interpersonal relationships, Computers in Human Behavior, 62, 594-601 (2016).
276 Kofoed et al. (2016), A snap of intimacy: Photo-sharing practices among young people on social media, First Monday 21(11).
1 367. Disappearing Snaps do not operate as advertised. Although Snaps are designed to disappear
2 after an allotted time, recipients possess the ability to save or record them at will. This is particularly
3 harmful to adolescents, who rely on Snap’s representations when taking and sending photos, and
4 who only learn after the fact that recipients have the means to save photos or videos. In some cases,
5 this can lead to sexual exploitation.
6 368. Snap could, but does not, warn users, including children and teenagers, that Snaps may
7 not necessarily disappear.
8 369. In addition, and especially for pre-teen users, Snaps are unreasonably dangerous because
9 Snap’s parental controls are ill-equipped to mitigate the risks posed by this feature. As set forth
10 below, even with parental controls activated, parents are unable to view a Snap’s content and
11 therefore cannot adequately protect their children and/or deter their children from engaging in
12 dangerous behavior in conjunction with sending Snaps.
13 370. “My Eyes Only” is yet another unreasonably dangerous design feature of Snapchat. This
14 feature enables and encourages users to hide harmful content from their parents in a special tab that
15 requires a passcode. Content cannot be recovered from “My Eyes Only”—allegedly even by Snap
16 itself. Snap designed “My Eyes Only” knowing it would likely be used to store potentially illegal
17 and injurious photos and images like sexts and CSAM. 277 This dangerous product feature
18 unreasonably increases the risk that Snapchat’s adolescent users, many under age 13, will be
19 targeted and sexually exploited and/or trafficked by child predators.
20 371. The content in “My Eyes Only” self-destructs if a user attempts to access the hidden
21 folder with the wrong code. “My Eyes Only” has no practical purpose or use other than to hide
22 potentially dangerous content from parents and/or legal owners of the devices used to access
23 Snapchat. Moreover, while this information and evidence should be in Snap’s possession and
24 control, it has designed this feature in a way that causes the permanent loss of relevant, material,
25 and incriminating evidence.
277 Salvador Rodriguez, Snapchat Finally Acknowledges the Existence of Sexting With 'Memories': The latest app update includes a tool called "My Eyes Only" that lets you privately store sensitive photos and videos, (Jul. 6, 2016).
1

2 b. Snapchat’s “Snap Map” feature endangers children.

3 372. Snapchat also contains a feature called “Snap Map” that allows users to share their

4 location with their followers (and the public) on an activity-level-based, color-coded heatmap. At

5 all relevant times, this feature has been available to all users, including minors. Although users can

6 disable “Snap Map,” this is not a default setting.

7 373. Snap Map is an unreasonably dangerous feature for underage users because it provides

8 strangers with their locations, exposing children and adolescents to potential harm. Researchers

9 have also found that Snap Map causes feelings of sadness and anxiety for some users, as they

10 jealously view their friends’ locations. 278 For young people especially, such social comparison

11 often leads to distress and depression.

12 374. Snap Map also functions as a social metric. A report by 5Rights, a United Kingdom-
13 based children's online safety advocacy group, highlighted the experience of John, a 14-year-old
14 boy, who explained that "[h]aving more connections on Snapchat makes his Snap Map look more
15 crowded, which he can then show off to people in real life and therefore appear more 'popular.'" 279
c. Snapchat’s “Quick Add” feature endangers children.
16

17 375. Through a feature known as “Quick Add,” Snap recommends new, sometimes random

18 friends, similar to Facebook’s “People You Might Know” feature. Suggestions are formulated using

19 an algorithm that considers users’ friends, interests, and location. Quick Add encourages users to

20 expand their friend base to increase their Snapscore by interacting with an ever-expanding group

21 of friends, which, in addition to increasing their time online, can result in exposure to dangerous

22 strangers. Of particular concern, until 2022, Quick Add’s suggestions included profiles for users

23 Snap knew to be between the ages of 13 and 17, meaning that Quick Add could, and in fact did,

24 recommend that a minor and adult user connect.

25 376. Criminal users interested in selling drugs to minors have utilized the Quick Add feature

26 to find random friends interested in making a purchase.

278 See Dunn et al., "Oh, Snap!": A Mixed-Methods Approach to Analyzing the Dark Side of Snapchat, The Journal of Social Media in Society, 9(2), 69-104 (2020).
279 5Rights Foundation, Pathways: How digital design puts children at risk (July 2021).
1 377. Despite these dangers, Snap designed Quick Add because it increases the odds that users
2 will add friends, send more Snaps, and spend more time using Snapchat.
3 378. In 2022, Snap revised the Quick Add feature to limit the friend suggestions promoted to
4 minor users. For those aged 13 to 17, Quick Add would only suggest friends who shared a certain
5 number of common friends with the minor user. Snap did not disclose how many common friends
6 must be shared by each user to satisfy this safety feature. Further, this modification to the Quick
7 Add feature still does not prohibit the connection of minors with adults.
8 d. Snapchat’s Lenses and Filters features promote negative appearance
comparison.
9

10 379. Snap also incorporates numerous custom-designed lenses and filters, which allow users

11 to edit and overlay augmented-reality special effects and sounds on their Snaps. Many of

12 Snapchat’s lenses and filters change users’ appearance and face, creating unrealistic, idealized
13 versions that cause profound body image issues in teenagers, especially girls.
14 380. Examples of these features include the Smoothing Filter, which blurs facial

15 imperfections and evens out skin tone; Bold Makeup, which adds makeup over the user’s face,

16 blurs imperfections, and evens out skin tone; Sunkissed and Cute Freckles, which adds freckles

17 over the nose and cheeks, blurs imperfections, evens out skin tone, and adjusts skin color; Face and

18 Body Mellow Glow, which smooths the face and body and adjusts skin color; and Fluffy Eyelashes,

19 which alters the shape of the user’s face by lifting their eyes and adding more pronounced cheek

20 bones. The common theme among all of these filters is that they remove the subjects’ perceived

21 blemishes to create the perfect “selfie.”

22 381. A 2017 study found that these features made Snapchat one of the worst social media

23 products for the mental health of children and adolescents, behind only Instagram. 280 In recent

24 years, plastic surgeons have reported an increase in requests for alterations that correspond to

25 Snapchat’s filters. This has led researchers to coin the term “Snapchat Dysmorphia,” in which the

26

280 Kara Fox, Instagram worst social media app for young people's mental health, CNN (May 19, 2017).
1 effect of Snapchat’s filters triggers body dysmorphic disorder. 281 The rationale underlying this
2 disorder is that beauty filters on Snapchat create a “sense of unattainable perfection” that leads to
3 self-alienation and damages a person’s self-esteem. 282 One social psychologist summarized the
4 effect as "the pressure to present a certain filtered image on social media," which "can certainly play
5 into [depression and anxiety] for younger people who are just developing their identities." 283
6 382. Snap also created and promoted “smart filters” that allowed users to stamp date/time,
7 temperature, battery life, altitude, and speed on their Snaps. 284 These filters utilize sensor data on
8 users’ devices to provide the desired filter stamp.
9 383. A particularly dangerous smart filter is the speed filter, which from 2013 to 2021 allowed
10 users to record their real-life speed and overlay that speed onto Snaps. Snap knew, or should have
11 known, that the speed filter served no purpose other than to motivate, incentivize, and/or encourage
12 users to drive at dangerous speeds in violation of traffic and safety laws. Indeed, soon after Snap
13 launched its speed filter, the feature became a viral game for users—particularly teenage users—
14 to capture photos and videos of themselves driving at 100 miles-per-hour or more. Tragically, the
15 quest to capture a 100 mile-per-hour Snap caused a number of fatal vehicle accidents involving
16 teens and young adults. 285
17 384. Snap knew, or should have known, its speed filter created an unreasonable risk of harm
18 to its users and the public. Despite this knowledge, however, as well as pleas from the public to
19 disable the filter, Snap refused to remove the filter from its application until 2021. 286
20 385. By including features like lenses, cartoonish filters, and stamps to attract ever-increasing
21 numbers of children to use and engage with its product, Snap has knowingly created a product that
22 leads to excessive use by children and teens and causes them to suffer harm.
281 Chen et al., Association Between Social Media and Photograph Editing Use, Self-esteem, and Cosmetic Surgery Acceptance, JAMA Facial Plastic Surgery, 2019; see also Nathan Smith & Allie Yang, What happens when lines blur between real and virtual beauty through filters, ABC News (May 1, 2021).
282 Id.
283 Nathan Smith & Allie Yang, What happens when lines blur between real and virtual beauty through filters, ABC News (May 1, 2021).
284 Karissa Bell, Snapchat adds an altitude filter to show how high you are, (Aug. 19, 2016).
285 Did Snapchat play role in deaths of 3 young women?, ABC6 Action News (Feb. 16, 2016); Manpreet Darroch, Snapchat and driving . . . you could be sending your last snap, (Apr. 25, 2016); The Most Dangerous App on Your Phone, DistractedDriverAccidents.com.
286 Bobby Allyn, Snapchat Ends 'Speed Filter' That Critics Say Encouraged Reckless Driving, NPR (June 17, 2021).
1 5. Snap has implemented ineffective and misleading parental controls, further
endangering children.
2

3 386. Snap has also designed and set up Snapchat with inadequate parental controls.
4 387. From Snapchat’s launch in 2011 until August 2022, Snapchat had no parental controls
5 even though its core user base was under the age of 18 and a significant number of those users were
6 under the age of 13.
7 388. In August 2022, Snap introduced the “Family Center.” The features and processes
8 offered through the Family Center are woefully inadequate to protect teen and pre-teen users. The
9 Family Center allows a parent or guardian to install Snapchat on their phone and then link to the
10 child’s account. The parent or guardian can then see who the child user is communicating with.
11 However, the content of these communications remains hidden and still disappears after the allotted
12 time. In addition, the Family Center does not allow a parent or guardian to block minors from
13 sending private messages, control their child’s use or engagement with many of Snapchat’s product
14 features, control their child’s use of Snapchat’s geolocation feature, or control who their child may
15 add to their friend list. Finally, the Family Center fails to help a parent monitor their child’s account
16 when the child has secretly created a Snapchat account without the parents’ knowledge in the first
17 place.
18 6. Snap facilitates the spread of CSAM and child exploitation.

19 389. Despite being marketed to and designed for children, Snapchat includes a number of
20 features that promote and dramatically exacerbate sexual exploitation, the spread of CSAM,
21 sextortion, and other socially maladaptive behavior that harms children. Snap knows or should have
22 known that its product features are unsafe for children and that it fails to implement reasonable,
23 child-protective safeguards. For example, by failing to age-restrict its Discover feature, Snapchat’s
24 algorithm has recommended inappropriate sexual content to adolescent users. By promoting the
25 connection between minors and adults, it is facilitating child exploitation and predation. By failing
26 to implement parental controls that give parents true control over their children’s activity, Snap
27 allows harmful interactions with predators to continue unnoticed.
28
1 390. Like the other Defendants, as a direct consequence of the child exploitation that occurs
2 on its platform, Snapchat is tainted by illegal material that promotes and facilitates the continued
3 sexual exploitation of minors. Snap receives value in the form of increased user activity for the
4 dissemination of CSAM on its product.
5 391. Furthermore, Snapchat’s disappearing-content design, while appealing to minors, makes
6 it more difficult for parents to monitor their children’s social-media activity. This feature also
7 contributes to a sense of impunity for many users, encouraging and fomenting exploitation and
8 predatory behavior, which has been observed in multiple empirical studies. 287 According to these
9 studies, Snapchat users believe their conduct is hidden and accordingly feel empowered to engage
10 in criminal behavior through the product without fear of getting caught.
11 392. These feelings are promoted by design. Snap intends for the product’s disappearing
12 messaging to entice users to share highly personal photos and information that many users would
13 otherwise feel uncomfortable sharing on “higher-stake” apps. 288 In short, this design choice
14 encourages and allows minors to share harmful, illegal, and sexually explicit images while
15 providing predators with a vehicle to recruit victims. Studies have also found that the “close ties”
16 generated between teenagers on Snapchat foster the conditions for grooming and other predatory
17 behavior.
18 393. As a result, Snapchat is one of the go-to products for sexual predators. 289
19 394. In 2014, Snap introduced “Snapcash,” a peer-to-peer mobile payment service. Snapcash
20 provided a way for users to pay for private content with little to no oversight. 290 Snapcash enabled
21 CSAM and other sexual exploitation, as users were paid with Snapcash to send, receive, create,
22 publish, save, accept, or otherwise participate in CSAM. It also enabled predators to extort cash
23 from adolescent users by threatening to disseminate CSAM to other users.
24

25

287 Snapchat by the Numbers: Stats, Demographics & Fun Facts, Omnicore (Mar. 2, 2022).
288 See Evelyn Lopez et al., The Gratifications of Ephemeral Marketing Content, the Use of Snapchat by the Millennial Generation and Their Impact on Purchase Motivation, Global Bus. Rev. (2021).
289 See, e.g., Rebecca Woods, What Are The Dangers Of Snapchat To Avoid?, PhoneSpector (June 16, 2021).
290 Kurt Wagner, Snapchat to Let You Send Money to Friends, Thanks to Square, Vox.
1 395. Snapcash was abruptly removed from Snapchat in 2018 as users were sending sexually
2 explicit photos and using Snapcash for payment. 291
3 396. Snapchat also allows users to voice or video call one another in the app. 292 This feature
4 is dangerous when paired with the many others that permit easy access to minors by predators, such
5 as Quick Add and Snap Map. It allows predators to call and video chat with minors in private, with
6 virtually no evidence of what was exchanged. Predators use this function to identify children
7 willing to add and speak with a stranger, and then prey on the child’s vulnerabilities.
8 397. Collectively, these product features promote communication and conduct with a false
9 sense of intimacy between users and encourage predators to use Snapchat to target children for
10 grooming, sexual exploitation, sextortion, and CSAM.
11 398. In November 2019, a bipartisan group of Senators sent a letter to leading tech companies,
12 including Snapchat. The letter sought answers about the online sexual grooming of children and
13 CSAM detection technologies. 293 The following year, ParentsTogether, a national parent group,
14 delivered a petition from 100,000 parents to Snap demanding that the company do more to “protect
15 children from sexual abuse and exploitation” on Snapchat. 294 The petition listed numerous
16 examples of widespread online sexual grooming of children, including: a high school coach in New
17 Mexico who used Snapchat to extort sexual videos from several girls as young as fourteen; a
18 Cleveland man who posed as a therapist and blackmailed a thirteen-year-old girl into sending him
19 sexual videos and photos; and a Virginia man who was arrested for running a sextortion ring on
20 Snapchat, coercing children into sending sexually explicit material. 295
21 399. In response, Snap announced that by Fall of 2020, it would deploy technology in addition
22 to Microsoft’s PhotoDNA to help stop the spread of CSAM through its product.
23 400. By failing to utilize these technologies until late 2020, Snap harmed adolescent users as
24 its product contributed to child exploitation, sextortion, and the spread of CSAM.
25

291 Christian Hargrave, Snapcash Goes Away After Excessive Feature Misuse, App Developer Magazine (July 25, 2018).
292 Snapchat Support, How to Start a Video Chat on Snapchat.
293 Letter to Sundar Pichai and 36 other Tech Companies by Senate Committee (Nov. 18, 2019).
294 Snapchat: Prevent Pedophiles from Sharing Abuse Videos.
295 Id.
1 401. In addition, while Snapchat allows users to report harmful images or videos, they cannot
2 specifically report CSAM that is sent to a user via direct messaging, including from another user’s
3 camera roll.
4 402. Snapchat’s disappearing messages cannot be reported at all.
5 403. While Snap states that it is using “technology to identify known illegal images and videos
6 of CSAM and report them to NCMEC,” it does not address how Snapchat’s design contributes to
7 the ongoing proliferation of CSAM materials and the sexual exploitation of its adolescent users.
8 404. Utilizing the data and information it collects about Snapchat’s users, Snap could detect,
9 report, and take actions to prevent instances of sexual grooming, sextortion, and CSAM
10 distribution.
11 405. Despite receiving numerous reports regarding how its product’s features contribute to
12 child exploitation, Snap has elected to keep many of these features in place. 296 It has done so
13 because removing them would significantly diminish Snapchat’s popularity and negatively impact
14 profits.
15 406. Notwithstanding these glaring flaws, Snap advertises and promotes its product as safe
16 and fun. Snap’s Vice President of Global Public Policy, Jennifer Stout, stated in written testimony
17 to a Senate Subcommittee that Snap takes “into account the unique sensitivities and considerations
18 of minors when we design products” 297 when, in fact, Snap intentionally designed its product to
19 promote compulsive and excessive use and help underage users conceal information from their
20 parents. Stout claimed that Snap makes it harder for strangers to find minors when, in fact,
21 Snapchat’s “Quick Add” feature is responsible for introducing minors to complete strangers, and
22 its “Snap Map” feature has enabled threats, exploitation, and location of minors by complete
23 strangers. Likewise, Snap’s Head of Global Platform Safety, Jacqueline Beauchere, represented to
24 the public that “Snapchat is designed for communications between and among real friends; it
25

26

296 See, e.g., Zak Doffman, Snapchat Has Become A 'Haven For Child Abuse' With its 'Self-Destructing Messages', Forbes (May 26, 2019).
297 Snap's Senate Congressional Testimony - Our Approach to Safety, Privacy and Wellbeing.
1 doesn’t facilitate connections with unfamiliar people like some social media platforms.” 298 But
2 again, this is not true and/or historically was not the case.
3 407. In addition, Snap knows, or should have known, that its products facilitate and encourage
4 the production, possession, distribution, receipt, transportation, and dissemination of millions of
5 materials that exploit children and violate child pornography laws. Snap further knows, or should
6 have known, that its product facilitates the production, possession, distribution, receipt,
7 transportation, and dissemination of materials that depict obscene visual representations of the
8 sexual abuse of children.
9 7. Snap failed to adequately communicate the harms its product causes or provide
instructions regarding safe use.
10

11 408. Since Snap’s inception, it has misrepresented, omitted, and failed to warn adolescent

12 users about its products’ physical and mental health risks. These risks include, but are not limited
13 to, addiction, compulsive and excessive use, sexual exploitation by adult users, dissociative
14 behavior, social isolation, and an array of mental health disorders like body dysmorphia, anxiety,

15 depression, and insomnia. Snap knew of these significant risks, but deceptively and fraudulently

16 omitted, downplayed, or misled consumers and the Band regarding these risks.

17 409. Snap targets adolescent users via advertising and marketing materials distributed via

18 digital and traditional media, including expensive advertisements placed during high-profile

19 sporting events. Snap fails to warn the targets of these ads—often minors—about the physical and

20 mental risks associated with using Snapchat.

21 410. Snap further fails to adequately communicate dangers or to warn adolescent users during

22 the product registration process. At account setup, Snap’s product contains no warning labels,

23 banners, or conspicuous messaging to adequately inform adolescent users of the known risks and

24 potential physical and mental harms associated with usage of its product. Instead, Snap allows

25 adolescent users to easily create an account (or multiple accounts) and fully access the product.

26 411. Snap’s lack of adequate warnings continues after an adolescent has the Snapchat product.

27 Snap does not adequately inform adolescent users that their data will be tracked, used to help build

298 Snap's Meet Our Head of Global Platform Safety.
1 a unique algorithmic profile, and potentially sold to Snap’s advertising clients, who will in turn use
2 the data to target and profile the user.
3 412. Alarmingly, Snap also does not warn adolescent users, before facilitating adult
4 connections and interactions, that adult predators use its product. It also fails to instruct adolescent
5 users on ways to avoid unknown adults on Snap.
6 413. Snap also fails to warn adolescent users who exhibit problematic signs of addiction or
7 are habitually and compulsively accessing the app. Instead, Snap utilizes push notifications to
8 encourage engagement with Snapchat.
9 414. In addition, despite proactively providing adolescent users with countless filtering and
10 editing tools, Snap does not warn its adolescent users regarding the mental health harms associated
11 with those heavily filtered images.
12 415. Snap also tells Tribal consumers in Apple’s App Store that Snapchat is rated “12+” (for
13 users 12 and older) because it contains only “infrequent/mild” “profanity and crude humor,”
14 “sexual content or nudity,” “alcohol, tobacco, and drug use or references,” and “mature/suggestive
15 themes." Snap knows and intends that all these representations will be conveyed to Tribal consumers.
16 416. These representations are false. Snapchat hosts and displays a vast library of videos with
17 profanity, sex, illegal drugs, and other content parents would not expect to find on a “12+” app.
18 Such content is visible and even recommended to younger users. Perhaps most importantly, as
19 discussed extensively above, use of Snapchat among adolescents even 12 and older is harmful to the
20 developing brain and, as currently designed, not appropriate for use by the age group.
21 E. BYTEDANCE MARKETS AND DESIGNS TIKTOK TO ADDICT YOUNG
USERS, SUBSTANTIALLY CONTRIBUTING TO THE MENTAL HEALTH
22 CRISIS.
23 417. Since its launch, TikTok has grown exponentially. In late 2021, its owner and creator

24 ByteDance publicly stated that TikTok had 1 billion active global users, up from 55 million in early

25 2018 and 700 million in mid-2020. 299

26 418. A large swath of TikTok's user base consists of American children. In July 2020,

27 TikTok reported that more than one-third of its 49 million daily users in the United States were 14

299 Jessica Bursztynsky, TikTok says 1 billion people use the app each month, CNBC (Sept. 27, 2021).
1 or younger. 300 More recently, a 2022 Pew Research Center survey reported that 67% of American
2 teenagers (age 13-17) use TikTok, with most American teenagers (58%) using the product daily.
3 Among teenage TikTok users, a quarter say they use the site or app almost constantly. 301 In another
4 recent report, more than 13% of young users declared they “wouldn’t want to live without”
5 TikTok. 302
6 419. TikTok’s capture of the American youth market is no accident, but instead the result of
7 a carefully executed campaign. Early on, Alex Zhu, one of TikTok’s creators, recognized that
8 “[t]eenagers in the U.S. [were] a golden audience” for this emerging social media product. 303 To
9 cash in on this gold, ByteDance implemented a series of product features designed to attract and
10 addict young users. As Zhu explained in 2019, “[e]ven if you have tens of millions of users, you
11 have to keep them always engaged.” 304 This engagement has come at the cost of young users’
12 health.
13 1. Background and overview of TikTok.

14 420. In 2012, Beijing-based technologist Zhang Yiming paired up with an American venture
15 capitalist, Matt Huang, to launch ByteDance and its first product Jinri Toutiao (“Today’s
16 Headlines”), which utilized A.I. to gather and present world news to users on a single feed.
17 421. Following the success of its first product, ByteDance created Douyin in 2016, a music-
18 based app loosely modeled on the popular app Musical.ly. Musical.ly was a critical hit in the U.S.
19 as American teens gravitated to the platform, which allowed users, including minor users, to create
20 15-second videos of themselves lip-syncing, dancing, or goofing around to popular songs and
21 movie scenes, and then post them to a scrollable feed for other users to see.
22 422. In 2017, ByteDance launched TikTok, a version of Douyin for the non-Chinese market,
23 and acquired Musical.ly, which by then boasted a user base of almost 60 million monthly active
300 Raymond Zhong & Sheera Frenkel, A Third of TikTok's U.S. Users May Be 14 or Under, Raising Safety Questions, N.Y. Times (Aug. 14, 2020).
301 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022).
302 Victoria Rideout et al., Common Sense Census: Media use by tweens and teens, 2021 at 31, Common Sense Media (2022).
303 Paul Mozur, Chinese Tech Firms Forced to Choose Market: Home or Everywhere Else, N.Y. Times (Aug. 9, 2016).
304 Biz Carson, How A Failed Education Startup Turned into Musical.ly, The Most Popular App You've Probably Never Heard Of, Bus. Insider (May 28, 2016) (emphasis added).
1 users, for $1 billion. Nine months later, ByteDance merged its newly acquired app into its existing
2 product, and a global version of TikTok was born.
3 423. Douyin is a version of TikTok that is exclusively for Chinese users. ByteDance’s design
4 of Douyin is profoundly different from TikTok's. Douyin serves its Chinese users educational and
5 patriotic content and limits their use to just 40 minutes per day. 305 TikTok, in sharp contrast, has
6 no usage limits and is designed to encourage addictive and compulsive use. Far from promoting
7 educational content, TikTok’s algorithm instead actively sends its young American users down a
8 harmful rabbit hole of artificially filtered “ideal” body images and dangerous viral challenges.
9 2. ByteDance intentionally encourages youth to use its product and then leverages
that use to increase revenue.
10

11 424. ByteDance has designed and aggressively marketed TikTok, the harmful and addictive

12 version of Douyin, to attract young Americans.


13 425. Like the other Defendants' products, TikTok depends on advertising revenue, which has
14 boomed. TikTok was projected to receive $11 billion in advertising revenue in 2022, over half of

15 which is expected to come from the United States. 306

16 426. The initial iteration of TikTok allowed users to lip sync pop music by celebrities who

17 appealed primarily to teens and tweens (e.g., Selena Gomez and Ariana Grande). It labeled folders

18 with names attractive to youth (e.g., “Disney” and “school”); and included in those folders songs

19 such as “Can You Feel the Love Tonight” from the movie “The Lion King,” “You’ve Got a Friend

20 in Me” from the movie “Toy Story,” and other renditions covering school-related subjects or

21 school-themed television shows and movies. 307

22 427. ByteDance also specifically and intentionally excluded videos that would not appeal to

23 young Americans, instructing TikTok moderators that videos of “senior people with too many

24

25

305 Sapna Maheshwari, Young TikTok Users Quickly Encounter Problematic Posts, Researchers Say, N.Y. Times (Dec. 14, 2022).
306 Jessica Bursztynsky, TikTok says 1 billion people use the app each month, CNBC (Sept. 27, 2021); Bhanvi Staija, TikTok's ad revenue to surpass Twitter and Snapchat combined in 2022, Reuters (Apr. 11, 2022).
307 Complaint for Civil Penalties, Permanent Injunction, and Other Equitable Relief ("Musical.ly Complaint") at p. 8, ¶¶ 26–27, United States v. Musical.ly, 2:19-cv-01439-ODW-RAO (C.D. Cal. Feb. 27, 2019) Dkt. # 1.
1 wrinkles” should not be permitted on users’ “For You” pages because such content was “much less
2 attractive [and] not worth[] . . . recommend[ing].” 308
3 428. Even TikTok’s sign-up process demonstrates that young users are what ByteDance
4 values most. In 2016, the birthdate for those signing up for the app defaulted to the year 2000
5 (i.e., 16 years old). 309
6 3. ByteDance intentionally designed product features to addict children and
adolescents.
7

8 429. TikTok’s growth among young Americans has been further enabled by its unreasonably
9 dangerous age verification and parental control procedures, which allow children under 13
10 unfettered access to the app.
11 a. TikTok’s age-verification measures are unreasonably dangerous.
12 430. When a user first opens TikTok, they are prompted to "Log in to TikTok" or "Sign up"
13 for an account using a phone number or email address. TikTok then asks, “When’s your birthday?”

14 431. ByteDance does not verify the age that TikTok users report. Nor does it use any method

15 to verify that users who acknowledge they are minors have the consent of their parents or legal

16 guardians to use the product. In fact, at least as of 2020, TikTok still had not developed a company

17 position on age verification.

18 432. ByteDance has designed TikTok so users can circumvent TikTok’s age restrictions by

19 using TikTok without creating an account. TikTok allows users, no matter what age, to “browse as

20 [a] guest,” and watch TikTok’s “For You” page while TikTok’s algorithm collects data about that

21 user and their viewing behavior. 310

22 433. ByteDance knows that many U.S. TikTok users under the age of 13 fail to report their

23 birth dates accurately. 311

24

308 Sam Biddle et al., Invisible Censorship: TikTok Told Moderators to Suppress Posts by "Ugly" People and the Poor to Attract New Users, Intercept (Mar. 15, 2020).
309 Melia Robinson, How to Use Musical.ly, The App With 150 million Users That Teens Are Obsessed With, Bus. Insider (Dec. 7, 2016).
310 Browse as Guest, TikTok Support.
311 Jon Russell, Musical.ly Defends its Handling of Young Users, As it Races Past 40M MAUs, TechCrunch (Dec. 6, 2016).
1 434. In 2019, the FTC acted on this admission and alleged that ByteDance failed to comply
2 with COPPA. 312
3 435. TikTok settled the FTC claims, agreeing to a then-record civil COPPA penalty and
4 several forms of injunctive relief intended to protect children who use the product. 313
5 436. To comply with the terms of that settlement, ByteDance created “TikTok for Younger
6 Users,” a “limited app experience” for users under the age of 13. 314 “TikTok for Younger Users”
7 does not permit users to “share their videos, comment on others’ videos, message with users, or
8 maintain a profile or followers.” 315 However, users can still “experience what TikTok is at its core”
9 by recording and watching videos on TikTok. For that reason, experts state the app is “designed to
10 fuel [kids’] interest in the grown-up version.” 316
11 437. Moreover, users under 13 can easily delete their age-restricted accounts and sign up for
12 an over-13 account on the same mobile device—without any restriction or verification—using a
13 fake birthdate.
14 438. The absence of effective age verification measures also means that adult users can claim to
15 be children—with obvious dangers to the children on ByteDance’s product.
16 b. TikTok’s parental controls are unreasonably dangerous.
17 439. Following the FTC settlement, ByteDance created a “Family Pairing” feature on TikTok.
18 The supposed purpose of that feature was to allow parents to link their accounts to their children’s
19 accounts and enforce certain controls (such as screen time limits and restriction of “content that
20 may not be appropriate for all audiences”). 317
21 440. “Family Pairing” also is supposed to allow parents to prevent their children from direct
22 messaging other TikTok users. But ByteDance has designed TikTok’s “Family Pairing” feature so
23 that it is not mandatory for minor users. And to use it, a parent or guardian is forced to create their
24 own TikTok account to pair it with their child’s account. Further, the “Family Pairing” feature is
25

312 See Musical.ly Complaint, at p. 8, ¶¶ 26–27.
313 Natasha Singer, TikTok Broke Privacy Promises, Children's Groups Say, N.Y. Times (May 14, 2020).
314 TikTok for Younger Users, TikTok (Dec. 13, 2019).
315 Dami Lee, TikTok Stops Young Users from Uploading Videos after FTC Settlement, Verge (Feb. 27, 2019).
316 Leonard Sax, Is TikTok Dangerous for Teens?, Inst. Fam. Stud. (Mar. 29, 2022).
317 TikTok Introduces Family Pairing, TikTok Newsroom (April 15, 2020).
1 available only on the TikTok app. It provides no protection when a child accesses TikTok through
2 a web browser. Because this feature requires parents to know the name of their child’s account to
3 pair it, youth can easily evade the protections of the “Family Pairing” feature by creating
4 anonymous accounts, again without parental approval or knowledge.
5 441. ByteDance further stymies parents’ ability to supervise minors’ use of TikTok by
6 permitting minor users to block their parent’s profile, post ephemeral videos called “Stories” that
7 disappear after 24 hours, and post those stories to “Friends Only.”
8 442. ByteDance could, but does not, adopt safety features that notify parents when minors are
9 engaging excessively with the product and are using it during sleeping hours.
10 443. Until January 13, 2021, ByteDance interfered with parental supervision and endangered
11 children by defaulting all accounts, including those registered to children as young as 13, to
12 “public.” That allowed strangers to contact minor users regardless of age or location. ByteDance
13 also intentionally and actively promoted these types of connections by suggesting accounts to
14 follow through the “Find Friends” or “People You May Know” features.
15 444. Today, for users 16 and over, ByteDance still sets the default privacy setting for all
16 registered accounts to “public,” meaning that anyone can view a user’s profile, on or off TikTok,
17 request the user as a friend, or engage with the user’s content. 318
18 c. ByteDance intentionally designed TikTok’s unreasonably dangerous
features and algorithms to maximize engagement using automatic content,
19 time-limited experiences, intermittent variable rewards, reciprocity, and
ephemeral content.
20

21 445. Like each of the other Defendants, ByteDance has designed and coded TikTok with
22 features that foster addictive and compulsive use by youth, leading to a cascade of additional mental
23 and physical injuries.
24 446. One of TikTok’s defining features is its “For You” page (or “FYP”). According to
25 ByteDance, it is “central to the TikTok experience and where most of our users spend their time.” 319
26

318 See, e.g., Lauren E. Sherman et al., The Power of the Like in Adolescence: Effects of Peer Influence on Neural and Behavioral Responses to Social Media, 27(7) Psych. Sci. 1027–35 (July 2016).
319 How TikTok recommends videos #ForYou, TikTok (June 18, 2020).
1 447. TikTok’s FYP uses ByteDance’s powerful machine-learning algorithms to select content
2 to feed users to maximize their engagement and thereby serve ByteDance’s interests—as opposed
3 to simply responding to searches by users. As one industry commentator explained, TikTok uses
4 “a machine-learning system that analyzes each video and tracks user behavior so that it can serve
5 up a continually refined, never-ending stream of TikToks optimized to hold [users’] attention.” 320
6 As another commentator put it, “you don’t tell TikTok what you want to see. It tells you.” 321
7 448. Zhu has remarked that, “[e]ven if you have tens of millions of users, you have to keep
8 them always engaged.” 322 Thus, according to Zhu, TikTok’s algorithms are “focused primarily on
9 increasing the engagement of existing users.” 323
10 449. An internal document titled “TikTok Algo 101,” which TikTok has confirmed is
11 authentic, “explains frankly that in the pursuit of the company’s ‘ultimate goal’ of adding daily
12 active users, it has chosen to optimize for two closely related metrics in the stream of videos it
13 serves: ‘retention’—that is, whether a user comes back—and ‘time spent.’” 324
14 450. “This system means that watch time is key,” explained Guillaume Chaslot, the founder
15 of Algo Transparency. 325 Chaslot noted that “rather than giving [people] what they really want,”
16 TikTok’s “algorithm tries to get people addicted[.]” 326
17 451. To fulfill this goal, the TikTok algorithm responds to a user’s time spent watching and
18 engaging with a video by feeding them similar content. 327 As TikTok describes it, the algorithms
19 populate each user’s FYP feed by “ranking videos based on a combination of factors” that include,
20 among others, any interests expressed when a user registers a new account, videos a user likes,
21 accounts they follow, hashtags, captions, sounds in a video they watch, certain device settings, such
22

23

320 Jia Tolentino, How TikTok Holds Our Attention, New Yorker (Sept. 30, 2019).
321 Drew Harwell, How TikTok Ate the Internet, Wash. Post (Oct. 14, 2022).
322 Biz Carson, How a Failed Education Startup Turned Musical.ly, the Most Popular App You've Probably Never Heard Of, Business Insider (May 28, 2016) (emphasis added).
323 Joseph Steinberg, Meet Musical.ly, the Video Social Network Quickly Capturing the Tween and Teen Markets, Inc. (June 2, 2016).
324 Ben Smith, How TikTok Reads Your Mind, N.Y. Times (Dec. 5, 2021).
325 Id.
326 Ben Smith, How TikTok Reads Your Mind, N.Y. Times (Dec. 5, 2021).
327 Kaitlyn Tiffany, I'm Scared of the Person TikTok Thinks I Am, The Atlantic (June 21, 2021).
1 as their language preferences and where they are located, and finally the likelihood of the user’s
2 interest. 328
3 452. ByteDance has designed TikTok’s algorithm so that certain factors, such as time spent
4 watching a video, are more important to the algorithm than others. For example, TikTok explains
5 that, “whether a user finishes watching a longer video from beginning to end, would receive greater
6 weight than . . . whether the video’s viewer and creator are both in the same country.” 329
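For illustration only, the weighted ranking alleged in the two preceding paragraphs can be sketched in simplified Python. The factor names and weight values below are hypothetical assumptions chosen to mirror the quoted description; they are not ByteDance's actual code, factors, or weights.

    # Hypothetical sketch of engagement-weighted ranking (illustrative only).
    from dataclasses import dataclass

    @dataclass
    class CandidateVideo:
        video_id: str
        p_full_watch: float   # predicted chance the user watches to the end
        p_like: float         # predicted chance the user likes the video
        p_share: float        # predicted chance the user shares the video
        same_country: bool    # weak signal, per the explanation quoted above

    # Assumed weights: completion ("time spent") dominates weaker signals.
    WEIGHTS = {"full_watch": 10.0, "like": 2.0, "share": 1.5, "same_country": 0.1}

    def score(v: CandidateVideo) -> float:
        return (WEIGHTS["full_watch"] * v.p_full_watch
                + WEIGHTS["like"] * v.p_like
                + WEIGHTS["share"] * v.p_share
                + WEIGHTS["same_country"] * float(v.same_country))

    def rank_for_you_page(candidates: list) -> list:
        # Order the feed to maximize expected engagement rather than user well-being.
        return sorted(candidates, key=score, reverse=True)

Under any such weighting, a video the user is predicted to watch to the end will outrank a video that is merely local or familiar, which is the asymmetry the quoted explanation describes.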
7 453. TikTok’s algorithms are designed to begin working the minute a user opens the app. The
8 FYP shows the user a single, full-screen stream of videos, then records how the user reacts. “A
9 second of viewing or hesitation indicates interest; a swipe suggests a desire for something else.” 330
10 With each data point collected, TikTok’s algorithm winnows a mass of content to a single feed,
11 continually refined to keep users engaging often and at length.
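The per-swipe refinement described in the preceding paragraph can be pictured the same way. The signal handling below is again a hypothetical simplification offered only to show how second-by-second behavior can sharpen a profile; it is not the company's implementation.

    # Hypothetical sketch of per-swipe interest refinement (illustrative only).
    from collections import defaultdict

    interest = defaultdict(float)   # topic -> accumulated interest weight

    def record_view(topic: str, seconds_watched: float, swiped_away_quickly: bool) -> None:
        if swiped_away_quickly:
            interest[topic] -= 0.5              # a quick swipe reads as mild disinterest
        else:
            interest[topic] += seconds_watched  # hesitation or rewatching reads as interest

    # Two data points are already enough to start narrowing the feed:
    record_view("dance", seconds_watched=0.5, swiped_away_quickly=True)
    record_view("weight_loss", seconds_watched=12.0, swiped_away_quickly=False)
    # "weight_loss" now dominates the profile, so similar videos are served next.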
12 454. This algorithmic encouragement of continuous scrolling and interaction makes it hard
13 for users to disengage from the app. A recent ByteDance-funded study, which imaged the brains of
14 TikTok and other social media product users, found that those using TikTok engaged with the
15 product about 10 times a minute, twice as often as with peer apps. 331
16 455. ByteDance leverages users’ inability to disengage as a benefit to attract advertisers,
17 rather than taking steps to address the addictive nature of its product. A recent TikTok marketing
18 document observed that “the TikTok audience is fully leaned in.” 332 Marketing research
19 commissioned by TikTok found that compared to other social media sites, TikTok users evidenced
20 a higher rate of engagement per minute. TikTok boasted, "[o]ur algorithm and shorter video formats
21 create continuous cycles of engagement, making TikTok the leading platform for Information
22 Density.” 333
23

24

328 Investigation: How TikTok's Algorithm Figures Out Your Deepest Desires, Wall St. J. (Jul. 21, 2021); see also How TikTok recommends videos #ForYou | TikTok Newsroom.
329 Investigation: How TikTok's Algorithm Figures Out Your Deepest Desires, Wall St. J. (Jul. 21, 2021); see also How TikTok recommends videos #ForYou | TikTok Newsroom.
330 Id.
331 TikTok Ads Break Through Better Than TV and Drive Greater Audience Engagement, TikTok.
332 Id.
333 Id.
1 456. ByteDance also creates images and GIFs for users to incorporate into TikTok videos to
2 keep users returning to the product. And ByteDance has acquired publishing rights to thousands of
3 hours of music and video, which it provides its users to attach to the videos and pictures they post
4 on TikTok.
5 457. TikTok’s powerful machine-learning algorithms dictate the content of each user’s FYP.
6 An estimated 90-95% of the content viewed on TikTok comes from its algorithms (as opposed to
7 user selection), the highest among Defendants’ products. 334
8 458. The algorithm encourages use of the product, regardless of whether that use is enjoyable
9 or healthy. From TikTok’s perspective, it doesn’t matter whether you’re engaging with a video
10 because you’re horrified or angry or upset—the engagement itself is the end goal.
11 459. As the algorithm continues to refine what users see, they are “more likely to encounter
12 harmful content.” 335 Indeed, TikTok’s quest to monopolize user attention often forces users down
13 “rabbit holes” of harmful content. Users end up in these rabbit holes, and become trapped in them,
14 because TikTok has optimized its algorithm’s design for retention and time spent on the app. 336
15 TikTok wants to keep users coming back as often as possible for as long as possible.
16 460. Once users are in a rabbit hole, it is extremely difficult to climb out. One user was shown
17 a few anti-vaccination conspiracy theory videos on his FYP and commented on them in an attempt
18 to refute the videos’ claims. His feed was quickly overtaken with similar videos, and it took him
19 months of intentional interaction with the app to purge this content from his FYP. 337 In general,
20 escaping a rabbit hole requires a user to repeatedly and actively strategize ways to counter the
21 algorithm, pitting individual users’ David against TikTok’s machine-learning Goliath.
22 461. The Wall Street Journal documented the pernicious operation of ByteDance’s
23 algorithms, as shown by a recent experiment. The experimenters used bots, each programmed with
24 various interests such as sports, forestry, dance, astrology, and animals. They did not disclose these
25 interests upon registration with TikTok. Instead, TikTok’s algorithm quickly learned the assigned
26

334 Investigation: How TikTok's Algorithm Figures Out Your Deepest Desires, Wall St. J. (Jul. 21, 2021).
335 Inside TikTok's Algorithm: A WSJ Video Investigation, Wall St. J. (July 21, 2021).
336 Ben Smith, How TikTok Reads Your Mind, N.Y. Times (Dec. 5, 2021).
337 Kaitlyn Tiffany, I'm Scared of the Person TikTok Thinks I Am, The Atlantic (June 21, 2021).
1 interests from the bots’ behavior—that is, “by rewatching or pausing on videos” related to the bot’s
2 programmed interest. 338
3 462. One bot watched 224 videos in 26 minutes, lingering over videos with hashtags for
4 “depression” or “sad.” The algorithm quickly refined its output. Afterward, 93% of the videos
5 TikTok showed that bot were about depression or sadness. One post implored the bot: "Just go.
6 Leave. Stop trying. Stop pretending. You know it and so do they. Do Everyone a favor and
7 leave.” 339
8 463. ByteDance’s choices about how to design and structure its app—including choosing not
9 to implement effective age gating and parental controls, in addition to choosing to design
10 algorithms to maximize engagement through pushing extreme and outrageous content—go far
11 beyond benignly organizing the content of others. Instead, they create an environment and
12 experience suited to ByteDance’s goal of maximizing ad revenues—an environment and
13 experience that is unreasonably dangerous to the children and teens ByteDance targets.
14 464. In a follow-up experiment by the Wall Street Journal, bots were registered as users
15 between 13 and 15 years-old. One of those bots, programmed to pause on videos referencing drugs,
16 lingered briefly on “a video of a young woman walking through the woods with a caption” referring
17 to “stoner girls.” The next day, the algorithm showed the bot a video about a “marijuana-themed
18 cake.” Then, the “majority of the next thousand videos” that TikTok’s algorithm produced “tout[ed]
19 drugs and drug use,” including marijuana, psychedelics, and prescription drugs. 340
20 465. The algorithm immersed another bot—registered as a 13-year-old boy—into a rabbit
21 hole of videos related to bondage and sex, including videos explaining, among other things, “how
22 to tie knots for sex, recover from violent sex acts and discussing fantasies about rape.” 341 The bot
23 simply searched for the term “onlyfans”—a site known for hosting adult entertainment—and
24 watched a handful of videos in the results before returning to the FYP. 342 The algorithm
25 subsequently bombarded the bot with videos about sex and, as the bot lingered on those videos, the
338 Inside TikTok's Algorithm: A WSJ Video Investigation, Wall St. J. (July 21, 2021).
339 Id.
340 Rob Barry et al., How TikTok Serves Up Sex and Drug Videos to Minors, Wall St. J. (Sept. 8, 2021).
341 Id.
342 Id.
1 bot’s feed became almost entirely dominated by sex-related videos. At one point, “more than 90
2 percent of [the] account’s video feed was about bondage and sex.” 343
3 466. The Wall Street Journal concluded “that through its powerful algorithms, TikTok can
4 quickly drive minors—among the biggest users of the app—into endless spools of content about
5 sex and drugs.” 344 In another follow-up experiment, the Wall Street Journal found that once
6 TikTok’s algorithm determined that the bots would rewatch videos related to weight loss, it
7 “speedily began serving more, until weight-loss and fitness content made up more than half their
8 feeds—even if the bot never sought it out.” 345
9 467. Indeed, TikTok’s algorithm recommended over 32,000 weight-loss videos over a two-
10 month period, “many promoting fasting, offering tips for quickly burning belly fat and pushing
11 weight-loss detox programs and participation in extreme weight-loss competitions.” 346
12 468. Alyssa Moukheiber, a treatment center dietitian, explained that TikTok’s algorithm can
13 push children into unhealthy behaviors or trigger a relapse of disordered eating. 347 Indeed, several
14 teenage girls interviewed by the Wall Street Journal reported developing eating disorders or
15 relapsing after being influenced by extreme diet videos TikTok promoted to them. 348
16 469. Their experiences are not unique. Katie Bell, a co-founder of the Healthy Teen Project,
17 explained that “the majority of her 17 teenage residential patients told her TikTok played a role in
18 their eating disorders.” 349
19 470. Others, like Stephanie Zerwas, an Associate Professor of Psychiatry at the University of
20 North Carolina at Chapel Hill, could not even recount how many of her young patients told her that
21 “I’ve started falling down this rabbit hole, or I got really into this or that influencer on TikTok, and
22 then it started to feel like eating-disorder behavior was normal, that everybody was doing that.” 350
23

343 Id.
344 Id.
345 Tawnell D. Hobbs, 'The Corpse Bride Diet': How TikTok Inundates Teens With Eating-Disorder Videos, Wall St. J. (Dec. 17, 2021).
346 Id.
347 Id.
348 Id.
349 Id.
350 Id.
1 471. In December 2022, the Center for Countering Digital Hate (“CCDH”) conducted a
2 similar study, creating TikTok accounts with a registered age of 13 in the United States, United
3 Kingdom, Canada, and Australia. 351 For the first 30 minutes on the app, the accounts paused briefly
4 on videos about body image and mental health and liked them. “Where researchers identified a
5 recommended video matching one of the below categories, they viewed the video for 10 seconds
6 and liked it. For all other videos, researchers would immediately scroll the For You feed to view
7 the next video recommended by TikTok.” 352 TikTok’s algorithm seized on this information and
8 within minutes began recommending content about eating disorders and self-harm.
9 472. The CCDH report further illustrated TikTok’s algorithms at work, noting that, for an
10 account that liked content about body image and mental health, the algorithm recommended similar
11 content every 39 seconds. As the 30 minutes went on, TikTok recommended more videos related
12 to eating disorders, suicide, and self-harm, as the graph below shows.
[Graph from the CCDH report omitted: it charts, across the 30-minute session, the increasing rate at which TikTok recommended eating-disorder, suicide, and self-harm videos to the test account.]
351 Deadly by Design, Center for Countering Digital Hate (Dec. 2022).
352 Tawnell D. Hobbs, 'The Corpse Bride Diet': How TikTok Inundates Teens With Eating-Disorder Videos, Wall St. J. (Dec. 17, 2021).
1 473. TikTok’s rabbit holes are particularly problematic for young people, who lack the
2 necessary impulse control to stop watching. The more the user engages by viewing or hesitating on
3 a particular piece of content, the more TikTok’s algorithms learn about the user. ByteDance uses
4 this feature to exploit the vulnerabilities of children and teenagers, and addict them to its product.
5 474. Indeed, ByteDance admits that its recommendation algorithm creates a “risk of
6 presenting an increasingly homogeneous stream of videos.” 353 As the above-referenced studies and
7 experiments demonstrate, that homogeneous stream often includes harmful content, including posts
8 about depression, self-harm, drugs, and extreme diets.
9 475. ByteDance uses a series of interrelated design features that exploit known mental
10 processes to induce TikTok’s users to use the product more frequently, for more extended periods,
11 and with more intensity (i.e., providing more comments and likes). ByteDance knows or should
12 have known that children, whose brains are still developing, are particularly susceptible to these
13 addictive features.
14 476. TikTok is unreasonably dangerous in part because ByteDance designed the app so users
15 cannot disable the auto-play function on the FYP. 354 As noted above, when a user opens the TikTok
16 app or visits the TikTok website, the product immediately begins playing a video on the user’s
17 FYP. The user may request more videos with a simple upward swipe, and the product will deliver
18 a seemingly endless content stream. If a user does not proceed from a video, it continues to play on
19 an endless loop. The ability to scroll continuously induces a “flow-state” and distorts users’ sense
20 of time.
21 477. The TikTok app interface is designed with only a limited number of buttons and sections
22 of the app for users to navigate, such that the design does not impede “flow.”
23 478. The FYP also leverages principles of IVR to encourage compulsive usage, in the same
24 fashion as Instagram Reels. A user swipes to receive the next video, and each swipe offers the
25 prospect (but not the certainty) of dopamine-releasing stimuli.
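A comparably simplified sketch shows how an auto-playing, endless feed pairs with intermittent variable rewards: each upward swipe is a "pull" with an unpredictable payoff. The 30% figure below is an arbitrary illustration, not a measured or disclosed value.

    # Hypothetical sketch of an endless, auto-advancing feed with variable rewards.
    import random

    HIT_PROBABILITY = 0.3  # assumed chance that any given video strongly appeals to the user

    def endless_feed(videos):
        # Autoplay never stops: when the list runs out, keep serving more.
        while True:
            for video in videos:
                yield video

    def simulated_session(videos, swipes):
        # Each swipe starts a new video automatically; the reward is intermittent.
        feed = endless_feed(videos)
        rewarding = 0
        for _ in range(swipes):
            next(feed)
            if random.random() < HIT_PROBABILITY:
                rewarding += 1
        return rewarding

    # e.g., simulated_session(["clip1", "clip2", "clip3"], swipes=50)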
26

27

353 How TikTok recommends videos #ForYou, TikTok (June 18, 2020).
354 2 Best Ways You Can Turn off TikTok Autoplay, Globe Calls (Dec. 16, 2022).
1 479. The cumulative effect of these features is addictive, compulsive engagement. As
2 researchers at the Brown University School of Public Health explained, “the infinite scroll and
3 variable reward pattern of TikTok likely increase the addictive quality of the app as they may induce
4 a flow-like state for users that is characterized by a high degree of focus and productivity at the task
5 at hand.” 355
6 480. Dr. Julie Albright, a Professor at the University of Southern California, similarly
7 explained that TikTok is so popular because users will “just be in this pleasurable dopamine state,
8 carried away. It’s almost hypnotic, you’ll keep watching and watching.” Users “keep scrolling,”
9 according to Dr. Albright, “because sometimes you see something you like, and sometimes you
10 don’t.” That differentiation, according to Dr. Albright, “is key.” 356
11 481. Aza Raskin, the engineer who designed infinite scroll, described the feature as being “as
12 if [social media companies are] taking behavioral cocaine and just sprinkling it all over your
13 interface, and that’s the thing that keeps you coming back and back and back.” Because the infinite
14 scroll does not “give your brain time to catch up with your impulses . . . you just keep scrolling.” 357
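For illustration only, a minimal sketch (hypothetical Python; the hit rate and names are assumptions) of the variable-ratio schedule described above: each swipe may or may not surface a highly rewarding video, and that unpredictability is what keeps users swiping.

    import random

    def swipe_feed(n_swipes=30, hit_rate=0.3, seed=1):
        # Variable-ratio schedule: each swipe *might* deliver a rewarding video,
        # but the user can never predict which swipe will.
        rng = random.Random(seed)
        return ["hit" if rng.random() < hit_rate else "filler" for _ in range(n_swipes)]

    outcomes = swipe_feed()
    print(outcomes.count("hit"), "rewarding videos scattered unpredictably across", len(outcomes), "swipes")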
15 482. To reinforce this addictive experience, ByteDance intentionally omits the concept of
16 time from their product, stripping information such as when a user uploaded a video from its endless
17 stream of content. In the FYP, there is no way to discern how long ago the video was posted, or
18 when the user who posted the video joined TikTok.
19 483. On at least some phones, TikTok is designed to cover the clock displayed at the top of
20 user’s iPhones, preventing them from keeping track of the time spent on TikTok. 358
21 484. ByteDance has designed the app so that users can see, however, how many times a video
22 was liked, commented on, or shared. So, the only thing users can quantify within the app is the
23 approval or disapproval of others.
24 485. In June 2022, after receiving public criticism regarding its product’s effects on people’s
25 mental health, ByteDance introduced various tools to purportedly encourage users to take a break
26 355
Sophia Petrillo, What Makes TikTok So Addictive? An Analysis of the Mechanisms Underlying the World’s Latest
27 Social Media Craze, Brown Undergraduate J. of Pub. Health (Dec. 13, 2021).
356
John Koetsier, Digital Crack Cocaine: The Science Behind TikTok’s Success, Forbes (Jan. 18, 2020).
28
357
John Koetsier, Digital Crack Cocaine: The Science Behind TikTok’s Success, Forbes (Jan. 18, 2020).
358
Louise Matsakis, On TikTok, There is No Time, Wired (October 3, 2019).
1 from infinite scrolling, such as a “Take a Break” reminder and time-limit caps. ByteDance could,
2 but does not, activate these tools by default. Even for minors who have exceeded 100 minutes of
3 usage in a day, TikTok only “reminds” them upon opening the app that these “Take a Break” tools
4 exist; it does not activate them automatically.
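For illustration, the following sketch (hypothetical Python; the 100-minute threshold comes from the allegation above, everything else is assumed) shows how trivially such a tool could be enabled by default for minors, and how the alleged default-off design behaves instead.

    from dataclasses import dataclass

    @dataclass
    class ScreenTimeSettings:
        is_minor: bool
        take_a_break_enabled: bool = False    # as alleged: off unless the user opts in
        daily_limit_minutes: int = 100

    def next_action(settings: ScreenTimeSettings, minutes_today: int) -> str:
        if settings.take_a_break_enabled and minutes_today >= settings.daily_limit_minutes:
            return "show break reminder"
        if settings.is_minor and minutes_today >= settings.daily_limit_minutes:
            # Minors are merely told on the next app open that the tool exists.
            return "mention that 'Take a Break' exists"
        return "keep autoplaying"

    print(next_action(ScreenTimeSettings(is_minor=True), minutes_today=120))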
5 486. In addition to the unreasonably dangerous infinite scroll, ByteDance has designed
6 TikTok so it has other design features that exploit social psychological impulses to induce children
7 to use TikTok daily and for extended periods of time, adding to the product’s addictive nature.
8 487. Several TikTok features actively encourage users to generate ephemeral photos and
9 videos. This unreasonably dangerous design feature promotes compulsive use, because users risk
10 missing the content posted by their friends and others if they do not check TikTok at least daily.
11 488. A TikTok user can, for example, post expiring “Stories,” short videos that disappear after
12 24 hours. These videos do not otherwise appear in a user’s feed. TikTok’s live stream feature is
13 similar. 359
14 489. A relatively new feature, “TikTok Now,” pushes daily notifications to users to share
15 “authentic, real-time images or 10-second videos at the same time as your friends.” 360 ByteDance
16 designed this feature so that once a user gets the notification, the user has three minutes to post an
17 image or video. That user cannot view friends’ “TikTok Now” posts without sharing one of their
18 own, and posts submitted outside of the three-minute window are marked as “late.” TikTok
19 preserves a user’s history in a calendar view, adding to the pressure to visit the app daily and when
20 notified by TikTok to do so. ByteDance designed these unreasonably dangerous features to increase
21 responsiveness to notifications and keep young users locked into the product, as they do not want
22 to miss out on this perceived social activity.
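For illustration, a minimal sketch (hypothetical Python; only the three-minute window and the post-gating come from the allegations above) of the timing logic that marks posts as “late” and withholds friends’ posts until the user posts.

    from datetime import datetime, timedelta

    WINDOW = timedelta(minutes=3)

    def classify_post(prompt_time: datetime, post_time: datetime) -> str:
        # Posts within three minutes of the daily notification are on time; later posts are flagged.
        return "on time" if post_time - prompt_time <= WINDOW else "late"

    def can_view_friends(user_has_posted_today: bool) -> bool:
        # Viewing friends' posts is gated on posting one's own, pressuring a daily check-in.
        return user_has_posted_today

    prompt = datetime(2023, 1, 1, 17, 0)
    print(classify_post(prompt, prompt + timedelta(minutes=5)))   # "late"
    print(can_view_friends(False))                                # False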
23 490. Like “Snap Streaks,” “TikTok Now” does not enhance the communication function of
24 the product, but simply exploits young users’ susceptibility to persuasive design, teenage social
25 anxiety, and FOMO. ByteDance’s insidious design of “TikTok Now” also employs point scoring
359
Hilary Anderson, Social media apps are ‘deliberately addictive to users’, BBC (July 4, 2018).
360
TikTok Now, TikTok.
1 and competition with others to drive frequent and continuous engagement by children, who
2 otherwise risk checking in late and alienating other peers participating in the exchange.
3 491. Like the other Defendants’ apps, ByteDance designed TikTok to leverage the principle
4 of IVR by encouraging users to like, share, or reshare videos that others have created or posted.
5 Receiving a “Like” or “Reshare” indicates that others approve of that user’s content and satisfies
6 their natural, developmentally predictable desire for acceptance. As discussed above, “Likes”
7 activate the reward region of the brain and release dopamine to create a positive feedback loop. 361
8 Users return to TikTok again and again, hoping for yet another pleasurable experience. 362
9 492. ByteDance also designed TikTok to use reciprocity to manipulate users into using the
10 app. One example is the “Duet” feature, which allows users to post a video side-by-side with a
11 video from another TikTok user. Users utilize “Duet” to react to the videos of TikTok content
12 creators. ByteDance intends the response to engender a reciprocal response from the creator of the
13 original video, inducing them to return to the app.


14 493. Another “core feature” of TikTok that ByteDance has pursued is “challenges,” which
15 are campaigns that compel users to create and post on TikTok certain types of videos, such as
16 performing a dance routine or a dangerous prank. By fostering competition and the social rewards
17 of posting a challenge video, ByteDance incentivizes users to engage with the product
18 continuously.
19 494. Harmful and dangerous interactions are a foreseeable consequence of TikTok’s
20 engagement-maximization design. For example, numerous minor users have injured themselves or
21 others participating in viral pranks to obtain rewards and increase the number of likes, views, and
22 followers.
23 495. One such viral prank, “the Benadryl challenge,” features a user filming themselves
24 taking large quantities of Benadryl to cause hallucinations or induce an altered mental state. Other
25 similar viral challenges include the “NyQuil Challenge,” in which young people are encouraged to
26 eat chicken cooked in NyQuil; the “Milk Crate Challenge,” where adolescents climb atop a stack
27 361
Rasan Burhan & Jalal Moradzadeh, Neurotransmitter Dopamine (DA) and its Role in the Development of Social
28 Media Addiction, 11(7) J. Neurology & Neurophysiology 507 (2020).
362
Id.
1 of milk crates and jump off; the “Penny Challenge,” where young users are encouraged to plug a
2 charger halfway into an outlet while holding a penny against the exposed prongs; and the “Blackout
3 Challenge” where youth are encouraged to make themselves faint by holding their breath and
4 constricting their chest muscles or restricting airflow with a ligature around their neck.
5 496. TikTok challenges have led to serious health complications, seizures, and death, with at
6 least 12 children in the United States dying from the TikTok Blackout Challenge alone. 363
7 497. Nevertheless, ByteDance encourages businesses to create challenges as a form of
8 marketing, explaining that challenges are “geared towards building awareness and engagement,”
9 and “research shows that they can deliver strong results” and increased return on ad spending “at
10 every stage of the funnel.” 364 While ByteDance extolls the revenue potential from challenges,
11 young users continue to face new and serious harms as the challenges’ stakes grow even more
12 extreme and dangerous.
d. ByteDance’s unreasonably dangerous features include impediments to discontinuing use.
15 498. Even if a user escapes the addictiveness of TikTok’s design and decides to delete their

16 account, ByteDance makes doing so a lengthy and complex undertaking. The deletion process is

17 unreasonably and dangerously designed to encourage users to retain their accounts, even if their

18 stated reason for deletion is that the product is endangering their safety or health.

19 499. When a user selects the “Deactivate or delete account” in the “Account” section of the

20 TikTok app, the user is presented an option: “Delete or deactivate?” Deactivating an account will

21 preserve the user’s data, but hide it from the product; deleting, on the other hand, will permanently

22 delete all data associated with the account.

23 500. If a user selects the “Delete account permanently” option, the user is asked “Why are

24 you leaving TikTok?” The user must select from the following list: (1) I’m leaving temporarily; (2)

25 I’m on TikTok too much; (3) Safety or privacy concerns; (4) Too many irrelevant ads; (5) Trouble

26 getting started; (6) I have multiple accounts; or (7) Another reason.

27 363
Quinn Nguyen, Don’t let your kids try these 9 dangerous TikTok trends!; Olivia Carville, TikTok’s Viral
28 Challenges Keep Luring Young Kids to Their Deaths, Bloomberg (Nov. 30, 2022).
364
Branded Hashtag Challenge: Harness the Power of Participation, TikTok for Business (Mar. 16, 2022).
1 501. If a user selects “I’m on TikTok too much,” ByteDance makes a last-ditch effort to retain
2 the user by reminding the user that a limit can be set on the user’s watch time on the product. If a
3 user selects “Safety or privacy concerns,” the user is provided a list of resources to “secure” the
4 account. If the user selects “Another reason,” a written explanation must be provided. The only
5 option that does not provide or require further information is “I have multiple accounts.” ByteDance
6 isn’t worried about users deleting merely one account if they already have multiple others.
7 502. Once a user selects a reason for deletion, the next screen prompts the user to download
8 their TikTok data.
9 503. Before the user continues the deletion, the product requires the user to check a box at the
10 bottom of the screen that says, “[b]y continuing, you reviewed your data request and wish to
11 continue deleting your account.” This contrasts with the process of a user “agreeing” to the Terms
12 of Service and Privacy Policy during the registration process, which does not require a separate
13 confirmation.
14 504. Once the user confirms a desire to continue with the deletion process, the product takes
15 the user to yet another screen, which yet again asks whether the user wants to “delete this account?”
16 The text also explains that the account will be deactivated for 30 days, during which the user may
17 reactivate the account, and after 30 days, the account and data associated with it will be permanently
18 deleted. It goes on to warn that if a user deletes the account, the user will no longer be able to do
19 many things in the app.
20 505. Once a user again confirms that they want to delete their account, TikTok requires
21 validation with a 6-digit code sent to the telephone number or email address associated with the
22 account. Only after the user receives and enters the code may they finally “delete” their account
23 (after waiting 30 days).
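For illustration, the deletion flow alleged above can be summarized as the following sketch (hypothetical Python; the prompts are paraphrased from the allegations and the control flow is an assumption).

    def deletion_flow(reason: str, confirmations: int, entered_code: str, sent_code: str) -> str:
        # Step 1: reason prompt, with retention pitches keyed to certain answers.
        if reason == "I'm on TikTok too much":
            print("counter-offer: set a daily watch-time limit instead")
        elif reason == "Safety or privacy concerns":
            print("show resources to 'secure' the account")
        # Steps 2-4: data-download prompt plus two separate confirmation screens.
        if confirmations < 2:
            return "deletion abandoned"
        # Step 5: 6-digit code sent to the phone number or email on the account.
        if entered_code != sent_code:
            return "deletion abandoned"
        # Step 6: the account is only deactivated; deletion completes after 30 days.
        return "deactivated for 30 days, then deleted"

    print(deletion_flow("I'm on TikTok too much", confirmations=2,
                        entered_code="123456", sent_code="123456"))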
24 506. ByteDance’s account deletion process is inadequate for children attempting to escape its
25 addictive and harmful product. Requiring a child to go through multiple steps, and offering
26 alternatives, as well as a list of things they are giving up, is designed to convince them to change
27 their mind. Moreover, requiring the user to maintain a deactivated account for 30 days, rather than
28 deleting it on demand, increases the chance that an addicted user will relapse and return to the app.
1 507. ByteDance’s intentionally cumbersome and unreasonably dangerous deletion process
2 prioritizes the retention of young users, and ad revenue that they generate, over their well-being.
3 e. ByteDance’s unreasonably dangerous features inflict impossible image
standards and encourage negative appearance comparison.
4

5 508. ByteDance designed TikTok with image-altering filters that harm users. These filters

6 allow children to artificially change their appearance, for example by lightening their skin and eyes,

7 giving them glowing tan skin, or giving them larger lips or fluttering eyelashes.

8 509. Young people often then compare the filtered images to their real-life appearance,

9 developing a negative self-image based on unrealistic, artificial images. 365 Many young girls use

10 image-altering filters every day, harming their mental health. And those filters subconsciously

11 make girls feel imperfect and ugly, “reduc[ing] their self-compassion and tolerance for their own

12 physical flaws.” 366


510. So compelling is the desire to resemble more closely the filtered ideal that there are
14 online tutorials explaining how to recreate certain filters using makeup.

27 365
Anna Haines, From ‘Instagram Face’ To ‘Snapchat Dysmorphia’: How Beauty Filters Are Changing The Way
28 We See Ourselves, Forbes (Apr. 27, 2021 at 1:19 PM EDT).
366
Id.
1 511. Children’s idealization of their filtered image is externally reinforced when the filtered
2 images receive more likes, comments, and other interaction. Young people also compare these
3 interaction “scores” to those of friends and celebrities who use filters, reinforcing the idea that
4 beauty depends on matching a digital ideal.
5 512. But filters, retouch, and other editing tools available on TikTok often alter specific facial
6 features, such as the shape of a person’s eyes and lips, in ways that would require medical
7 intervention to alter in real life. Children, particularly girls, are thus striving for a standard of beauty
8 that is functionally impossible to achieve, with every TikTok filter creating a test that they are
9 doomed to fail.
10 4. ByteDance facilitates the spread of CSAM and child exploitation.

11 513. ByteDance has designed various TikTok features that promote and dramatically

12 exacerbate sexual exploitation, the spread of CSAM, sextortion, and other socially maladaptive
13 behavior that harms children.

14 514. TikTok’s design features enable the spread of this illegal material, and it receives value

15 in the form of increased user activity for disseminating these materials on the product.

16 515. TikTok allows users to add a location to publicly shared videos of themselves. 367 TikTok

17 encourages the use of location services, “prompt[ing] [users] to turn on Location Services when

18 [users] browse the For You feed.”

19 516. By providing access to a child user’s present physical location, ByteDance encourages

20 predators to locate nearby children for purposes of sexual exploitation, sextortion, and CSAM.

21 517. ByteDance designed TikTok with a “Your Private Videos” feature, where users can

22 create and store private videos that are only visible to the user, better known as “Post-in-Private”

23 accounts, where adult predators store, create, post, and share CSAM. Within days of following a

24 small number of “Post-in-Private” accounts, TikTok’s algorithm begins recommending dozens of

25 other “Post-in-Private” accounts to follow, making it easy for predators to view and share even

26 more CSAM. 368

27

28
367
Location Information on TikTok, TikTok.
368
Id.
1 518. These accounts are nominally private, but users can share their usernames and passwords
2 with other users to access these private videos. 369 While ByteDance’s user policy forbids sharing
3 passwords with other users, TikTok’s design means that it is nonetheless very easy to do. 370
4 519. ByteDance designed TikTok to offer two-factor authentication but does not require users
5 to enable it. In fact, when a user creates a new account, the default setting disables the two-factor
6 authentication. 371
7 520. Furthermore, TikTok allows more than one device to be simultaneously logged into a
8 single account, allowing multiple predators to use one “Post-in-Private” account simultaneously.
9 521. ByteDance’s “Post-in-Private” account features also facilitate the grooming of children
10 and adolescents by adult predators. Adult predators can store CSAM videos in “Your Private
11 Videos” and then show them to adolescent users as a grooming tool. Should adult predators
12 convince adolescent users to create CSAM of themselves in the “Post-in-Private” accounts, the
13 “Your Private Videos” feature makes it easy for the videos to be produced, uploaded, and stored.
14 522. Another unreasonably dangerous feature of TikTok is its livestream product, “TikTok
15 LIVE.” Although ByteDance’s policy restricts access for anyone under eighteen to “TikTok LIVE,”
16 TikTok’s design, as discussed above, does not incorporate an age verification protocol, so it is easy
17 for underage users to access this feature. 372
18 523. Within “TikTok LIVE” is another feature called “LIVE Gifts” for “viewers to react and
19 show their appreciation for [] LIVE content in real-time.” 373 TikTok then awards “Diamonds” to
20 LIVE creators based on the popularity of their content. One way for creators to collect “Diamonds”
21 is to receive Gifts from viewers on [their] LIVE videos. Creators awarded “Diamonds” may obtain
22 a Reward Payment in money or in virtual items. 374
23 524. ByteDance’s design of the “LIVE Gifts” and “Diamonds” rewards greatly increases the
24 risk of adult predators targeting adolescent users for sexual exploitation, sextortion, and CSAM.
25 369
Gracelynn Wan, These TikTok Accounts Are Hiding Child Sexual Abuse Material In Plain Sight, Forbes (Nov. 14,
26 2022).
370
TikTok Terms of Service.
27
371
How your email and phone number are used on TikTok, TikTok.
372
What is TikTok LIVE?, TikTok.
28
373
LIVE Gifts on TikTok, TikTok.
374
Id.
1 According to Leah Plunket, an assistant dean at Harvard Law School, “TikTok LIVE” is “the digital
2 equivalent of going down the street to a strip club filled with 15-year-olds.” 375 “Livestreams on
3 [TikTok] are a popular place for men to lurk and for young girls—enticed by money and gifts—to
4 perform sexually suggestive acts.” 376
5 525. Another of TikTok’s unreasonably dangerous features enables predators to communicate
6 privately with youth, with virtually no evidence of what was exchanged. The private messaging or
7 “Direct messaging” feature allows a user to send a direct private message to another user. Predators
8 use these messages to identify children willing to respond to a stranger's message and then prey on
9 the child’s vulnerabilities.
10 526. Although TikTok’s features enable predators, TikTok does not have any feature to allow
11 users to specifically report CSAM. 377
12 527. Federal law mandates that ByteDance report suspected CSAM to NCMEC under 18
13 U.S.C. § 2258A. To limit and avoid its reporting requirements under federal law, ByteDance
14 purposely designed its products—which it knows are used by children, including children under
15 13—not to incorporate modern CSAM detection technology. This technology would be free for
16 ByteDance to implement within its product design.
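Modern detection of known CSAM typically works by fingerprinting uploads and comparing them against hash lists maintained by clearinghouses. The following sketch is purely illustrative (hypothetical Python using SHA-256 as a stand-in; production systems use perceptual hashes that also catch near-duplicates).

    import hashlib

    def fingerprint(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    # Placeholder hash list of known illegal material supplied by a clearinghouse.
    KNOWN_HASHES = {fingerprint(b"example-known-file")}

    def flag_upload(video_bytes: bytes) -> bool:
        # Compare the upload's fingerprint against the known-material list.
        return fingerprint(video_bytes) in KNOWN_HASHES

    print(flag_upload(b"example-known-file"))   # True  -> block the upload and file a report
    print(flag_upload(b"some other upload"))    # False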
17 528. Furthermore, in violation of 18 U.S.C. § 2258A, ByteDance knowingly fails to report
18 massive amounts of material in violation of 18 U.S.C. § 2256 and 18 U.S.C. § 1466A.
19 529. ByteDance knowingly fails to take feasible, adequate, and readily available measures to
20 remove these contraband materials from its product in a timely fashion.
21 530. ByteDance made approximately 596 reports to NCMEC in 2019 and 22,692 reports in
22 2020. 378 However, ByteDance failed to report materials, violating the reporting requirements of 18
23 U.S.C. § 2258A in 2019.
24 531. Users have reported “Post-in-Private” CSAM videos to TikTok, and ByteDance
25 responded that no violations of its policy were found. One user searched for and contacted multiple
26 375
Alexandra Levine, How TikTok Live Became a Strip Club Filled with 15 Year Olds, Forbes (Apr. 27, 2022).
27
376
Id.
377
Canadian Centre for Child Protection, Reviewing Child Sexual Abuse Material Reporting Functions on Popular
28 Platforms.
378
Community guidelines enforcement report, TikTok (2022).
1 TikTok employees to sound the alarm that CSAM was being created and shared within TikTok’s
2 “Post-in-Private” accounts. This user did not receive a single response to her concerns. 379
3 532. ByteDance nonetheless continues to make false representations that they will “take
4 immediate action to remove content, terminate accounts, and report cases to NCMEC and law
5 enforcement as appropriate.” 380
6 533. ByteDance gains revenue for every daily user on TikTok in North America. Each user
7 and their data are worth income, and ByteDance continues to benefit financially from predators
8 who commit sexual abuse against children and/or share CSAM using ByteDance’s product.
9 5. ByteDance failed to adequately communicate the harms its product causes or
to provide instructions regarding safe use.
10

11 534. Since TikTok’s inception, ByteDance has misrepresented, omitted, downplayed, and

12 failed to adequately warn young users about the physical and mental health risks its product poses.
These risks include, but are not limited to, product abuse and addiction, sexual exploitation from
14 adult users, dissociative behavior, damage to body image, social isolation, and a plethora of mental

15 health disorders like body dysmorphia, eating disorders, anxiety, depression, insomnia,

16 ADD/ADHD exacerbation, suicidal ideation, self-harm, suicide, and death. ByteDance knew of

17 these significant risks, but deceptively and fraudulently omitted, downplayed, or misled consumers

18 and the Band regarding these risks.

19 535. ByteDance targets young users via advertising and marketing materials distributed

20 throughout traditional as well as digital media, including other social media products. ByteDance

21 fails to adequately communicate or to provide adequate warnings in advertising and marketing

22 campaigns to potential adolescent consumers of the physical and mental harms associated with

23 using TikTok.

27 379
Gracelynn Wan, These TikTok Accounts Are Hiding Child Sexual Abuse Material In Plain Sight, Forbes (Nov. 14,
28 2022).
380
Protecting Against Exploitative Content, TikTok.
1 536. ByteDance heavily advertises its product on YouTube and Snapchat, where it knows it
2 can effectively reach younger users. In 2019, for example, 80 percent of TikTok’s advertising
3 spending was on Snapchat. 381
4 537. One TikTok ad compiles viral videos featuring people of all ages and sets the video to
5 the pandemic musical hit “Bored in the House,” by a popular TikTok creator. The 15-second video,
6 titled “It Starts On TikTok,” notes, “if it’s in culture, it starts on TikTok.” 382 Zhu highlighted the
7 importance of the U.S. teen market to TikTok, admitting that in China, “teenage culture doesn’t
8 exist” because “teens are super busy in school studying for tests, so they don’t have the time and
9 luxury to play social media apps.” On the other hand, teen culture in the United States is “a golden
10 audience.” 383
11 538. Other advertisements ByteDance places on YouTube promote TikTok as a family-
12 friendly product. For example, one commercial features parents impersonating their children,
13 explaining that “parents roasting their kids is the best kind of family bonding.” 384 Another TikTok
14 ad asks content creators what TikTok means to them. Responses include “family,” “sharing special
15 moments with my daughter,” and a featured appearance by well-known TikTok creator Addison
16 Rae, who says TikTok represents “family and fun.” 385
17 539. ByteDance released another TikTok ad, part of the “It Starts on TikTok” ad campaign,
18 and scheduled it to release on linear TV, digital media, digital out-of-home, radio, and TikTok’s
19 own social channels. 386 The tagline for the campaign was “[l]oving all of you and the things you
20 do. Celebrating you” and featured a series of viral clips of various cheerful scenes depicting people
21 gathered with friends and family of all ages.
22 540. ByteDance is also one of the biggest advertisers on Snapchat. In 2019, ByteDance
23 accounted for 4.4% of Snapchat’s advertising revenue. 387 ByteDance knows that advertising on
24 381
TikTok – Snapchat’s Biggest Advertiser – What’s the Strategy, Media Radar (Feb. 24, 2020).
25
382
TikTok, It Starts on TikTok: Bored in the House, YouTube (Sept. 9, 2020).
383
Paul Mozur, Chinese Tech Firms Forced to Choose Market: Home or Everywhere Else, N.Y. Times (Aug. 9,
26 2016).
384
Family Impressions, Compilation, TikTok’s Official YouTube Page.
27
385
TikTok Creators Share Their Thoughts About TikTok, TikTok’s Official YouTube Page.
386
Todd Spangler, TikTok Launches Biggest-Ever Ad Campaign as Its Fate Remains Cloudy, Variety (Aug. 10,
28 2020).
387
Robert Williams, TikTok is the biggest advertiser on Snapchat, study says, MarketingDive (March 16, 2020).
1 Snapchat is an effective way to reach a young audience. Snap claims that its Snapchat product
2 reaches 90% of people aged 13-24 years old, and 75% of 13-34 year olds in the United States.
3 541. Despite its funny, cheerful ads featuring smiling families and funny images, TikTok, as
4 designed, presents serious risks to young users on the platform, through its distinctive and
5 manipulative product features, including a lack of adequate age and identity verification tools, as
6 well as inadequate parental controls.
7 542. ByteDance fails to adequately communicate or warn young users of these risks beginning
8 with the first stages of the product registration process. At account setup, TikTok contains no
9 warning labels, banners, or conspicuous messaging to adequately inform adolescent users of
10 product risks, potential dangers, and physical and mental harm associated with usage of the product.
11 Instead, ByteDance allows underage users to easily create an account (or multiple accounts) and
12 fully access the product.
13 543. This deceptive conduct continues once a child has TikTok. ByteDance does not suitably
14 inform child users that their data will be tracked, used to help build a unique algorithmic profile,
15 and potentially sold to TikTok’s advertising clients.
16 544. Alarmingly, ByteDance also does not adequately communicate risks to young users
17 before facilitating connections and interactions with adults, or that adult predators use its product.
18 545. ByteDance’s deceptive and unlawful conduct continues even if younger users display
19 signs of addiction or habitual and compulsive use. Besides the disabled-by-default “Take a Break”
20 reminder, ByteDance does not communicate to users when their screen time reaches harmful levels or
21 when young users are accessing the product on a habitual basis.
22 546. Not only does ByteDance fail to adequately communicate to users about the risks
23 associated with TikTok, but it also does not provide sufficient instructions on how children can
24 safely use the product. A reasonable and responsible company would instruct children on best
25 practices and safety protocols when using a product known to contain danger and health risks.
26 547. ByteDance, however, fails to adequately communicate or warn users that:
27 a. sexual predators use its product to produce and distribute CSAM;
28
1 b. adult predators targeting children for sexual exploitation, sextortion, and CSAM are
2 prevalent on ByteDance’s product;
3 c. usage of its product can increase the risk of children being targeted and sexually
4 exploited by adult predators;
5 d. usage of its product can increase risky and uninhibited behavior in children, making
6 them easier targets to adult predators for sexual exploitation, sextortion, and CSAM;
7 and,
8 e. end-to-end encryption and/or the ephemeral nature of ByteDance’s direct messaging
9 product prevents the reporting of CSAM.
10 548. ByteDance failed to adequately communicate or warn parents about all of the foregoing
11 dangers and harms.
12 549. Making matters worse, ByteDance tells Tribal consumers in Apple’s App Store that
13 TikTok is rated “12+” (for users 12 and older) because it contains only “infrequent/mild” “profanity
14 and crude humor,” “sexual content or nudity,” “alcohol, tobacco, and drug use or references,”
15 “cartoon or fantasy violence,” and “mature/suggestive themes.” TikTok knows intends that all these
16 representations will be conveyed to Tribal consumers. These representations are false. As discussed
17 above, TikTok hosts a vast library of videos with profanity, sex, illegal drugs, and other content
18 parents would not expect to find on a “12+” app. Such content is visible and even recommended to
19 users that TikTok knows to be as young as 13 years old. Moreover, as discussed extensively above,
20 use of TikTok by adolescents aged 12 and older is harmful to the developing brain and, as currently
21 designed, TikTok is not appropriate for use by that age group.
22 F. GOOGLE MARKETS AND DESIGNS YOUTUBE TO ADDICT YOUNG
USERS, SUBSTANTIALLY CONTRIBUTING TO THE MENTAL HEALTH
23 CRISIS.
24

25 550. Eric Schmidt, the former CEO of Google and, more recently, Alphabet, YouTube’s
26 corporate parent, has acknowledged the powerful, and purposeful, addictive effect of social
27 media. Social media products are about “maximizing revenue,” Mr. Schmidt said, and the best way

27 media. Social media products are about “maximizing revenue,” Mr. Schmidt said, and the best way

28 to maximize revenue is to “maximize engagement.” As Mr. Schmidt continued, in pursuit of their


1 goal of maximizing engagement to increase revenues, social media products “play into the
2 addiction capabilities of every human.” 388
3 551. Google’s YouTube product is no exception. It includes specific, carefully calibrated
4 features that are known to exploit the mental processes of its users to keep them engaged for as
5 long, as frequently, and as intensely as possible. Google knows that children and teenagers who
6 flock in droves to its YouTube product are particularly susceptible to these features. The impact of
7 YouTube’s addictive power on American youth has been devastating.
8 1. Background and overview of YouTube.

9 552. YouTube is a social media product that allows users to post and consume countless hours
10 of video content about virtually any topic imaginable. YouTube is available without any age
11 verification feature or adequate parental controls, and comes pre-installed in many Smart-TVs,
12 mobile devices, various digital media players like Roku, and video game consoles like PlayStation,
13 Wii, Xbox, and Nintendo.


14 553. YouTube allows users to search for specific video content. It also employs a powerful
15 algorithm that exploits detailed user information to target each individual user with hours upon
16 hours of videos recommended by YouTube.
17 554. A group of design experts and computer scientists created YouTube and launched the
18 product for public use in December 2005.
19 555. Technology behemoth Google quickly recognized YouTube’s huge profit potential. In
20 2006, just a year after YouTube’s launch, Google acquired YouTube for more than $1.65 billion in
21 Google stock. At the time, Google’s acquisition of YouTube was one of the largest-ever tech
22 acquisitions.
23 556. YouTube primarily generates revenue by selling advertising. The more people who use
24 YouTube and spend time on the site, the more ads YouTube can sell. 389 The ads are then embedded
25 or placed within the endless stream of videos recommended to the user by YouTube’s algorithm.
26

27
Issie Lapowsky, Eric Schmidt: Social Media Companies ‘Maximize Outrage’ for Revenue, Protocol (Jan. 6, 2022).
388

28 Mark Bergen, YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant, Bloomberg (Apr. 2,
389

2019).
1 557. By 2012, YouTube users were watching close to four billion hours of video every month.
2 Yet, the average YouTube user spent just fifteen minutes daily engaged with the product. 390 Users
3 “were coming to YouTube when they knew what they were coming to look for.” 391 They employed
4 the product to identify and watch certain video content, and then they were done.
5 558. To drive greater revenue, “YouTube . . . set a company-wide objective to reach one
6 billion hours of viewing a day[.]” 392
7 559. As Susan Wojcicki, YouTube’s CEO explained, the goal of a “billion hours of daily
8 watch time gave our tech people a North Star.” 393
9 560. Google decided that “the best way to keep eyes on the site” was to introduce a feature
10 that would “[recommend] videos, [that were playing] or after one was finished.” 394
11 561. That new product feature uses a recommendation algorithm to identify and push
12 additional videos to users, which YouTube plays automatically, through a feature called “autoplay.”
ROBINS KAPLAN LLP
ATTORNEYS AT LAW
LOS ANGELES

13 Autoplay begins the next video as soon as the previous videos ends, creating a constant stream of
14 content.
15 562. Google’s design changes worked. Today, YouTube “has over 2 billion monthly logged-
16 in users.” 395 And that 2 billion figure does not capture all product usage because YouTube, by
17 design, allows users to consume videos without logging in or registering an account.
18 2. Google intentionally encourages youth to use YouTube and then leverages that
use to increase revenue.
19

20 563. Google knows that children and teenagers use YouTube in greater proportions than older

21 demographics. YouTube now ranks as the world’s most popular social media product for minors.

22 According to one recent report, more than 95% of children ages 13-17 have used YouTube. 396

23

24

25
390
John Seabrook, Streaming Dreams: YouTube Turns Pro, New Yorker (Jan. 16, 2012).
391
Casey Newton, How YouTube Perfected the Feed, Verge (Aug. 30, 2017).
26
392
Mark Bergen, YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant, Bloomberg (Apr. 2,
2019).
27
393
Id.
394
Id.
28
395
YouTube for Press, YouTube.
396
Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022).
1 Nearly 20% of U.S. teens use YouTube almost constantly. 397 Among U.S. teenagers who regularly
2 use social media, 32% “wouldn’t want to live without” YouTube. 398
3 564. Rather than ensuring minors are not inappropriately or excessively using YouTube,
4 Google has sought to dominate their attention.
5 565. YouTube’s age controls are unreasonably dangerous (or non-existent, since registration
6 is not required). In addition, Google has developed and marketed a version of YouTube, YouTube
7 Kids, explicitly targeted at children under 13. Google developed this product to encourage early—
8 and therefore lasting—adoption of YouTube by children.
9 566. Google knows that a robust and committed base of young users is key to maximizing
10 advertising revenue. Indeed, it has aggressively touted its hold on child users to advertisers.
11 567. In 2014, for example, Google pitched its YouTube product to Hasbro, a popular toy
12 manufacturer, and specifically boasted of the product’s immense popularity among children, noting
13 that it was “unanimously voted as the favorite website of kids 2-12” and that “93% of tweens” use
14 the product. 399
15 568. In 2015, Google gave a similar presentation to toy manufacturer Mattel, the maker of
16 Barbie and other popular kids’ toys, highlighting children’s widespread use of YouTube to persuade
17 Mattel to display digital ads on the site. 400
18 569. The FTC has aptly summarized Google’s pitch to advertisers concerning the value of its
19 youth user base. 401 For example, Google boasted that YouTube “is today’s leader in reaching
20 children age 6-11;” “the new ‘Saturday Morning Cartoons’;” “unanimously voted as the favorite
21 website of kids 2-12;” “the #1 website regularly visited by kids;” and used by “93% of tweens.” 402
22

23 397
Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022).
24
398
Victoria Rideout et al., Common Sense Census: Media Use by Tweens and Teens, 2021 at 31, Common Sense
Media (2022).
25
399
Complaint for Permanent Injunction, Civil Penalties, and Other Equitable Relief, FTC v. Google LLC et al., No. 1-
19-cv-02642-BAH, at 6 (D.D.C. Sept. 4, 2019) Dkt. #1-1.
26
400
Id.
401
Google and YouTube Will Pay Record $170 Million for Alleged Violations of Children’s Privacy Law, FTC (Sept.
27 4, 2019). (“YouTube touted its popularity with children to prospective corporate clients”, said FTC Chairman Joe
Simons.)
28
402
Complaint for Permanent Injunction, Civil Penalties, and Other Equitable Relief, FTC v. Google LLC et al., No. 1-
19-cv-02642-BAH, at 3,12, and 6-7 (D.D.C. Sept. 4, 2019) Dkt. #1-1.
1 570. Many of YouTube’s most-viewed videos are kid-focused, and the most subscribed and
2 highest paid YouTubers are children. With over 12 billion views, “Baby Shark Dance,” a video
3 aimed at toddlers, is the most viewed video in the history of YouTube, and it and five other child-
4 focused videos make up the top ten YouTube videos of all time. 403 Child creators also dominate
5 top-earner lists year after year. Ryan Kaji of Ryan’s World (f/k/a Ryan ToysReview), a channel
6 featuring now 12-year-old Ryan Kaji unboxing children’s toys, has been among YouTube’s Top
7 10 most-subscribed channels in the United States since 2016. 404 Ryan started Ryan’s World in 2015
8 when he was only 3. By 2017, his videos had over 8 billion views, and by 2018, he was the highest-
9 earning YouTuber in the world. 405
10 571. As with other defendants, once Google lures children in, it then mines them (and all other
11 users) for a breathtaking amount of data. Google’s current privacy policy, which includes the
12 YouTube product’s data collection, reveals how sweeping this data collection is. It states that
13 Google tracks:
14 a. “information about the apps, browsers, and devices you use to access Google
services . . . include[ing] unique identifiers, browser type and settings, device type
15 and settings, operating system, mobile network information including carrier name
and phone number, and application version number. We also collect information
16 about the interaction of your apps, browsers, and devices with our services,
17 including IP address, crash reports, system activity, and the date, time, and referrer
URL of your request.”
18

19 b. “your activity in our services . . . includ[ing] terms you search for[;] videos you
watch[;] views and interactions with content and ads[;] voice and audio
20 information[;] purchase activity[;] people with whom you communicate or share
content[;] activity on third-party sites and apps that use our services[;] and Chrome
21
browsing history you’ve synced with your Google Account.”
22

23 c. “Your location information [including] GPS and other sensor data from your
device[;] IP address[;] activity on Google services, such as your searches and places
24

25

26
403
Most Viewed Videos of All Time • (Over 700M views) - YouTube.
404
Madeline Berg, The Highest-Paid YouTube Stars of 2019: The Kids Are Killing It, Forbes (Dec. 18, 2019);
27 Madeline Berg, The Highest-Paid YouTube Stars 2017: Gamer DanTDM Takes The Crown With $16.5 Million,
Forbes (Dec. 7, 2017).
28
405
Gamer DanTDM Takes The Crown With $16.5 Million, Forbes (Dec. 7, 2017); Natalie Robehmed & Madeline
Berg, Highest-Paid YouTube Stars 2018: Markiplier, Jake Paul, PewDiePie And More, Forbes (Dec. 3, 2018).
1 you label like home or work[;] [and] information about things near your device, such
as Wi-Fi access points, cell towers, and Bluetooth-enabled devices;” 406
2

3 572. Google’s privacy policy also indicates that, like other Defendants, it purchases data about
4 its users from data brokers, which it euphemistically refers to as “trusted partners” or “marketing
5 partners.” 407
6 573. As with other Defendants, YouTube’s collection and analysis of user data allows it to
7 assemble virtual dossiers on its users, covering hundreds if not thousands of user-specific data
8 segments. This, in turn, allows advertisers to micro-target marketing and advertising dollars to very
9 specific categories of users, who can be segregated into pools or lists using YouTube’s data
10 segments. Advertisers purchase ad real estate space on users’ feeds, which allows them to place the
11 right ads in front of these micro-targeted segments of users, including children, both in the main
12 YouTube frame and in the YouTube Kids product. Only a fraction of these data segments come
13 from content knowingly designated by users for publication or explicitly provided by users in their
14 account profiles. Instead, many of these data segments are collected by YouTube through
15 surveillance of each user’s activity while using the product and even when logged off the
16 product. 408
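For illustration, a minimal sketch (hypothetical Python; the field names and records are invented) of how data segments allow an advertiser to select a narrow pool of users, including minors, for ad placement.

    users = [
        {"id": 1, "inferred_age": 14, "interests": {"gaming", "snacks"}},
        {"id": 2, "inferred_age": 35, "interests": {"finance"}},
        {"id": 3, "inferred_age": 12, "interests": {"toys", "gaming"}},
    ]

    def build_segment(users, max_age=None, interest=None):
        # Select the pool of users whose dossiers match the advertiser's criteria.
        pool = []
        for u in users:
            if max_age is not None and u["inferred_age"] > max_age:
                continue
            if interest is not None and interest not in u["interests"]:
                continue
            pool.append(u["id"])
        return pool

    # An advertiser buying placements against young gaming enthusiasts:
    print(build_segment(users, max_age=17, interest="gaming"))   # [1, 3]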
17 574. As with Meta, Google’s data policy does not inform users that the more time individuals
18 spend using YouTube, the more ads Google can deliver and the more money it can make, or that
19 the more time users spend on YouTube, the more YouTube learns about them, and the more it can
20 sell to advertisers the ability to micro-target highly personalized ads.
21 575. Google’s secret virtual dossiers on its users, including child users, fuel its algorithms.
22 The company relies on this data—including data plainly reflecting use by children—to train its
23 algorithms. A Google engineer explained in a 2014 presentation:
24 What do I mean by a training example? It’s a single-user experience.
On YouTube, perhaps it’s that one [Thomas the Tank Engine]
25 webpage my son saw six months ago, along with all the
recommendations that we showed him. We also record the outcome
26
to know whether the recommendations we made are good or whether
27 406
Information Google Collects.
28
407
Id.
408
About Targeting for Video Campaigns, Google.
1 they’re bad. That’s a single training exercise. On a large property,
you can easily get into hundreds of billions of these. 409
2

3 The engineer illustrated this with a slide, excerpted below, presenting how algorithmic analysis
4 both structured the format of recommendations of Thomas the Tank Engine YouTube videos and
5 provided information to inform algorithmic training through user engagement:
6 576. Through these and other efforts, YouTube has delivered massive amounts of advertising
7 revenue to Google. In 2021 alone, YouTube generated about $29 billion in revenue selling ads on
8 its site. 410
9 3. Google intentionally designed product features to addict children and
adolescents.
10

11 577. Google devised and continues to employ interrelated product features to increase usage
12 and maximize engagement by teenagers and children. Simply put, YouTube’s product features are
13 engineered to induce excessive use and to addict adolescents and children to the product.
14 a. Google’s age-verification measures and parental controls are unreasonably
dangerous.
15

16 578. Google’s strategy to entrench minor users begins with access. The company purports to

17 impose a minimum age requirement and claims to verify the age of its users. But those features are

18 unreasonably dangerous, as they do little to prevent children and teenagers from using the product.

19 579. Anyone with access to the Internet, regardless of age, can use YouTube and access every

20 video available through the product without registering an account or verifying their age. YouTube

21 does not even ask for age information before allowing users to consume YouTube videos.

22 580. A user needs an account to post content or like (or comment) on videos. But to get one,

23 a user needs only enter a valid email address and a birthday. Google does nothing to verify the

24 birthday entered by users in the U.S.—and the product freely permits users to change their birthdays

25 in their account settings after creating an account.

26

409
Alex Woodie, Inside Sibyl, Google’s Massively Parallel Machine Learning Platform, Datanami (Jul. 17, 2014).
410
Andrew Hutchinson, YouTube Generated $28.8 Billion in Ad Revenue in 2021, Social Media Today (Feb. 2, 2021); Jennifer Elias, YouTube Is a Media Juggernaut That Could Soon Equal Netflix in Revenue, CNBC (Apr. 27, 2021).
1 581. YouTube’s unreasonably dangerous age verification feature means that Google fails to
2 protect children from other product features discussed below that Google knows to be harmful to
3 kids.
4 582. For example, for users 13-17, Google claims to disable YouTube’s autoplay feature.
5 However, that measure is virtually meaningless because children can use YouTube without logging
6 into any account or by logging in but misreporting their age.
7 583. Even if children use YouTube Kids, that product contains many of the same
8 unreasonably dangerous design features YouTube does, including a harmful, manipulative
9 algorithm, as alleged below.
10 584. Google cannot credibly claim that it is unaware of the fact and extent of youth usage of
11 YouTube. Google’s system can “identify children as being much younger than 13.” 411 According
12 to Tracking Exposed, YouTube can rapidly identify a user as a child. 412
13 585. Google engineers have publicly admitted YouTube’s algorithm tracks user age. As
14 Google engineers outlined in a 2016 paper on YouTube’s recommendation system, “[d]emographic
15 features are important for providing priors so that the recommendations behave reasonably for new
16 users. The user’s geographic region and device are embedded and concatenated. Simple binary and
17 continuous features such as the user's gender, logged-in state and age are input directly into the
18 network as real values normalized to [0; 1].” 413
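The passage quoted above describes a concrete input pipeline: categorical features such as region and device are embedded and concatenated, while age is scaled into [0, 1] and fed to the network directly. A simplified sketch of that construction follows (hypothetical Python; the embedding values and the 100-year scale are assumptions, and real embeddings are learned rather than fixed).

    import numpy as np

    REGION_EMBED = {"US": np.array([0.2, -0.1]), "CA": np.array([0.0, 0.3])}
    DEVICE_EMBED = {"ios": np.array([0.5, 0.1]), "android": np.array([-0.2, 0.4])}
    MAX_AGE_YEARS = 100.0

    def build_network_input(region, device, is_female, logged_in, age_years):
        # Embed and concatenate the categorical features; pass the binary flags and
        # the age, normalized to [0, 1], into the network as real values.
        dense = np.array([float(is_female), float(logged_in), age_years / MAX_AGE_YEARS])
        return np.concatenate([REGION_EMBED[region], DEVICE_EMBED[device], dense])

    print(build_network_input("US", "ios", is_female=True, logged_in=True, age_years=13))

Even in this toy form, the age input makes plain that the system has a usable estimate of how old each user is.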
19 586. The Tracking Exposed Report indicated that there was “strong evidence” that Google’s
20 systems continue to refine and develop a more precise estimate for under 18 users, but the product
21 does not “redirect them to YouTube Kids.” 414
22 b. YouTube is unreasonably and dangerously designed to inundate users with
features that use intermittent variable rewards and reciprocity.
23

24 587. Google uses a series of interrelated design features that exploit known mental processes

25 to induce YouTube’s users to use the product more frequently, for more extended periods, and with

26

27
411
Tracking Exposed, Report: Non-Logged-In Children Using YouTube at 6 (Apr. 2022).
412
Id.
28
413
Paul Covington et al., Deep Neural Networks for YouTube Recommendations, Google (2016).
414
Tracking Exposed, Report: Non-Logged-In Children Using YouTube at 6, 19 (Apr. 2022).
1 more intensity (i.e., providing more comments and likes). Google knows children and adolescents,
2 whose brains are still developing, are particularly susceptible to these addictive features.
3 588. Google designed its product so that when children and teenagers use it, they are
4 inundated with interface design features specifically designed to dominate their attention and
5 encourage excessive use. Every aspect of how YouTube presents the format of a given page with a
6 video is structured to ensure unimpeded viewing of the videos, alongside download, like, and share
7 buttons, plus recommendations for more videos to watch. The organization of these features is
8 carefully calibrated to adjust to the space constraints of a user’s device, such that minimal effort is
9 needed to watch a video unimpeded. YouTube even has an ambient mode that uses dynamic color
10 sampling so that the YouTube product adapts to the video being watched and the user is not
11 distracted by the video’s borders. 415
12 589. Like the other Defendants, Google has designed YouTube with features that exploit
13 neuropsychology to maximize the time users (including children) spend using the product.
14 590. IVR features, such as notifications and likes, compel YouTube content creators and
15 consumers, particularly children, to use the product habitually and excessively. For example, in
16 order to create and upload content to YouTube, a user under 13 may submit a fictitious birthdate in
17 order to gain access to posting privileges. Once the young user has a logged–in account, they are
18 capable of receiving notifications and likes. For example, the logged in user can subscribe to
19 various YouTube channels, which in turn will send them notifications from various channels they
20 follow. Similarly, young content creators who upload videos to YouTube are able to track the likes
21 received by the video. These features psychologically reward creators who upload videos to
22 YouTube. As explained above, receiving a “Like” shows others’ approval and activates the brain’s
23 reward region. 416 Thus, users’ ability to like content encourages creators to use the product
24 compulsively, seeking additional pleasurable experiences.
25

26

27
415
YouTube rolling out black dark theme, ‘Ambient Mode,’ and other video player updates (Oct. 24, 2022).
416
See, e.g., Lauren E. Sherman et al., The Power of the Like in Adolescence: Effects of Peer Influence on Neural and Behavioral Responses to Social Media, 27(7) Psych. Sci. 1027–35 (July 2016).
1 591. Another YouTube design feature is the design Google engineers deploy to induce “flow”
2 state among users, which as described above is unreasonably dangerous to children because it
3 induces excessive use and poses a risk of addiction, compulsive use, and sleep deprivation.
4 592. YouTube uses two design features that induce flow state. The first is its panel of
5 recommended videos. YouTube recommends videos both on the home page and on each video page
6 in the “Up Next” panel. 417 This panel pushes an endless stream of videos that YouTube’s algorithm
7 selects and “suggests” to keep users watching by teasing a pipeline of upcoming content.
8 593. The second feature is autoplay, which complements the Up Next panel and seamlessly
9 takes users through the list of upcoming videos without users having to affirmatively click on or
10 search for other videos. This constant video stream—comprised of videos recommended by
11 YouTube’s algorithm—is the primary way Google increases the time users spend using its product.
12 This endless video succession induces users to enter a flow state of consumption, which is
13 particularly dangerous for children.


14 594. In an April 2021 letter to YouTube CEO Susan Wojcicki, the House Committee on
15 Oversight and Reform criticized the autoplay feature:
16 This places the onus on the child to stop their viewing activity, rather
than providing a natural break or end point. Without that natural
17
stopping point, children are likely to continue watching for long
18 periods of time. 418

19 595. This unreasonably dangerous design feature is particularly acute for Google’s recently
20 launched YouTube Shorts. YouTube Shorts enables users to create short videos up to sixty seconds
21 in length, in a full-screen format popularized by TikTok and copied by Instagram Reels. As in Reels
22 and TikTok, Shorts are presented in an algorithmically generated feed; users can watch new videos
23 by swiping up on their smartphones. Instead of presenting videos chronologically, they are
24 organized in a manner to drive the most watch time, as dictated by the algorithm. Indeed, Google
25 hired TikTok’s North American head, Kevin Ferguson, and other TikTok engineers to develop
417
Recommended Videos, YouTube.
418
Letter from Rep. Raja Krishnamoorthi, Chairman, Subcomm. on Economic and Consumer Policy, to Susan Wojcicki, CEO, YouTube (Apr. 6, 2021).

1 YouTube Shorts. 419 And much like those other products, the ability to scroll continuously through
2 YouTube Shorts content induces a “flow-state,” distorting users’ sense of time and facilitating
3 extended use.
4 596. An important target audience for YouTube Shorts is children. For example, YouTube
5 Shorts features content, such as child “influencers,” that appeals to children. YouTube Shorts also
6 includes unreasonably dangerous design features similar to those in other Defendants’ short-form products,
7 including the ability to scroll continuously through YouTube Shorts, inducing a “flow-state” that
8 distorts users’ sense of time and facilitates extended use, and dangerous exploitation of “social
9 comparison” techniques by promoting misleadingly idealized portrayals from influencers and
10 others who are rewarded for posting popular material.
11

12
ROBINS KAPLAN LLP
ATTORNEYS AT LAW
LOS ANGELES

13

14

15

16

17

18

19

20 597. Almost immediately upon launch, Google began marketing YouTube Shorts to children.
21 For example, Google launched an advertisement featuring images of children and teenagers (like
22 in the screenshot below) engaging with the YouTube Shorts product.
23 598. Similarly, another advertisement for Shorts explains how creators on YouTube can keep
24 revenue generated by their Shorts viewership, while an image of a video creator young enough to
25 be in braces appears on screen. 420
26 599. Shorts is one of YouTube’s interrelated design features that exploit known mental
27

28
419
Richard Nieva, In the Age of TikTok, YouTube Shorts Is a Platform in Limbo, Forbes (Dec. 20, 2022).
420
Made on YouTube: New ways to join YPP, Shorts Monetization & Creator Music.
1 processes to induce YouTube users to use the product more frequently, for more extended periods,
2 and with more intensity (i.e., providing more comments and likes). Not surprisingly, given its
3 copycat origin, the unreasonably dangerous design features in Shorts replicate the same ones in
4 TikTok and Instagram Reels, discussed above. Google knows or should have known that children,
5 whose brains are still developing, are particularly susceptible to such addictive features.
6 600. YouTube has monetized users’ susceptibility to IVR by allowing creators who obtain
7 more than a thousand subscribers with four-thousand valid public watch hours to qualify for the
8 YouTube Partner Program. Once a creator obtains this elite status, they are rewarded with “Super
9 Chat” and “Super Stickers”—special images or distinct messages that other users can purchase and
10 place on a creator’s channel. 421 Paid messages, including the amount donated, are visible to all
11 users. And the more a user pays for these promotions, the more prominently and the longer the image is
12 displayed. Both features are intended to allow a user to show support for, or connect with, their
13 favorite YouTube creators. Similar to the “Likes” feature, this paid support activates the reward
14 center of the content creator’s brain and releases dopamine while the creator is generating revenue
15 for YouTube.
16 c. Google’s algorithms are designed to maximize “watch time.”

17 601. YouTube began building its algorithms in 2008. 422
18 users spent watching YouTube videos. 423
19 602. These algorithms select videos that populate the YouTube homepage, rank results in user
20 searches, and push videos for viewers to watch through the “Up Next” feature.
21 603. YouTube designed its algorithms to manipulate users and induce them to use YouTube
22 excessively.
23 604. A former YouTube engineer explained that when he designed YouTube’s algorithm,
24 YouTube wanted to optimize for one key metric: “watch time.” 424 The engineer elaborated that
421 YouTube Partner Program: How to Make Money on YouTube.
422 Cristos Goodrow, On YouTube's Recommendation System, YouTube (Sept. 15, 2021).
423 Ben Popken, As Algorithms Take Over, YouTube's Recommendations Highlight a Human Problem, NBC (Apr. 19, 2018).
424 William Turton, How YouTube's Algorithm Prioritizes Conspiracy Theories, Vice (Mar. 5, 2018).
1 “[i]ncreasing users’ watch time is good for YouTube’s business model” because it increases
2 advertising revenue. 425
3 605. In 2012, the YouTube Head of Content Creator Communications similarly explained:
4 “When we suggest videos, we focus on those that increase the amount of time that the viewer will
5 spend watching videos on YouTube, not only on the next view, but also successive views
6 thereafter.” 426
7 606. The current algorithm uses deep-learning neural networks, a type of software that returns
8 outputs based on data fed into it. 427 The VP of Engineering at YouTube explained that it is
9 “constantly evolving, learning every day from over 80 billion pieces of information [Google] calls
10 signals.” 428 Those signals include “watch and search history . . . , channel subscriptions, clicks,
11 watch time, survey responses, and sharing, likes, and dislikes.” 429 They also include user
12 demographic information like age and gender. 430
13 607. Google’s algorithm also “uses data from your Google Account activity to influence your
14 recommendations.” 431
15 608. The algorithm “develops dynamically” to predict which posts will hold the user’s
16 attention. 432 That is, it can also determine which “signals” are more important to individual users.
17 For example, if a user shares every video they watch, including those they rate low, the algorithm
18 learns to discount the significance of the user’s shares when recommending content. 433
19 609. Besides the algorithm’s self-learning capability, Google also consistently refines the
20 algorithm, updating it “multiple times a month.” 434
425 Jesselyn Cook & Sebastian Murdock, YouTube Is a Pedophile's Paradise, Huffington Post (Mar. 20, 2020).
426 Eric Meyerson, YouTube Now: Why We Focus on Watch Time, YouTube (Aug. 10, 2012).
427 Alexis C. Madrigal, How YouTube's Algorithm Really Works, Atlantic (Nov. 8, 2018); Paul Covington et al., Deep Neural Networks for YouTube Recommendations, Google (2016).
428 Cristos Goodrow, On YouTube's Recommendation System, YouTube (Sept. 15, 2021).
429 Cristos Goodrow, On YouTube's Recommendation System, YouTube (Sept. 15, 2021).
430 Paul Covington et al., Deep Neural Networks for YouTube Recommendations, Google (2016).
431 Manage Your Recommendations and Search Results, Google.
432 Cristos Goodrow, On YouTube's Recommendation System, YouTube (Sept. 15, 2021).
433 Id.
434 Nilay Patel, YouTube Chief Product Officer Neal Mohan on The Algorithm, Monetization, and the Future for Creators, Verge (Aug. 3, 2021).
1 610. In 2017, the former technical lead for YouTube recommendations explained that “one of
2 the key things [the algorithm] does is it’s able to generalize.” 435 While older iterations “were pretty
3 good at saying, here’s another [video] just like” ones the user had watched, by 2017, the algorithm
4 could discern “patterns that are less obvious,” identifying “adjacent relationships” of “similar but
5 not exactly the same” content. 436
6 611. Over time, the algorithm became increasingly successful in getting users to watch
7 recommended content. By 2018, YouTube Chief Product Officer Neal Mohan said that the
8 YouTube algorithm was responsible for more than 70% of users’ time using the product. 437 That
9 is, more than 70% of the time users spent on YouTube came from recommendations Google's
10 algorithm pushed to them rather than from videos users identified through independent searches.
11 612. The algorithm also keeps users watching for longer periods. For instance, Mohan
12 explained that mobile device users watch for more than 60 minutes on average per session “because
13 of what our recommendations engines are putting in front of [them].” 438


14 613. The algorithm is particularly effective at addicting teenagers to the product. In 2022,
15 Pew Research Center found that “[a]bout three-quarters of teens visit YouTube daily, including
16 19% who report using the site or app almost constantly.” 439
17 614. A software engineer explained that the algorithm is “an addiction engine.” 440 He raised
18 concerns with YouTube staff, who said they had no intention to change the algorithms. After all,
19 the engineer explained, the algorithm works as intended: “it makes a lot of money.” 441
20 615. Since users watch more than one billion hours of YouTube videos daily and
21 approximately 70% of the time is spent on videos pushed to users by YouTube’s “recommendation
22 engine,” Google’s algorithms are responsible for hundreds of millions of hours users spend
23 watching videos on YouTube each day. 442
435 Casey Newton, How YouTube Perfected the Feed, Verge (Aug. 30, 2017).
436 Id.
437 Joan E. Solsman, YouTube's AI Is the Puppet Master over Most of What You Watch, CNET (Jan. 20, 2018).
438 Id.
439 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022).
440 Mark Bergen, YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant, Bloomberg (Apr. 2, 2019).
441 Id.
442 See Joan E. Solsman, YouTube's AI Is the Puppet Master over Most of What You Watch, CNET (Jan. 10, 2018).
1 616. The videos pushed out to users by Google’s “recommendation engine” are more likely
2 to be addictive and more likely to lead to harm. For example, “fear-inducing videos cause the brain
3 to receive a small amount of dopamine,” which acts as a reward and creates a desire to do something
4 over and over. 443 That dopaminergic response makes it more likely that a user will watch the
5 harmful video, which the algorithm interprets as signaling interest and preference. Former Google
6 engineers told the Wall Street Journal that “[t]he algorithm doesn’t seek out extreme videos . . . but
7 looks for clips that data show are already drawing high traffic and keeping people on the site. Those
8 videos often tend to be sensationalist.” 444 An investigation by Bloomberg put it simply: “In the race
9 to one billion hours, a formula emerged: Outrage equals attention.” 445
10 617. Google’s algorithm makes it more likely for children to encounter harmful content by
11 pushing them down “rabbit holes,” which “[lead] viewers to incrementally more extreme videos or
12 topics, which . . . hook them in.” 446 For example, a user might “[w]atch clips about bicycling, and
13 YouTube might suggest shocking bike race crashes.” 447 In this way, the algorithm makes it more
14 likely that youth will encounter content that is violent, sexual, or encourages self-harm, among
15 other types of harmful content.
16 618. YouTube’s “recommendation engine” creates a vicious cycle in its ruthless quest to grow
17 view time. Users who get pushed down rabbit holes then become models for the algorithm. And
18 the algorithm consequently emphasizes that harmful content, disproportionately pushing it to more
19 users. That is, because Google designed the algorithm to “maximize engagement,” uncommonly
20 engaged users become “models to be reproduced.” 448 Thus, the algorithms will “favor the content
21 of such users,” which is often more extreme. 449
443 Josephine Bila, YouTube's Dark Side Could be Affecting Your Child's Mental Health, CNBC (Feb. 13, 2018).
444 Why is YouTube Suggesting Extreme and Misleading Content (Feb. 7, 2018); see also Josephine Bila, YouTube's Dark Side Could be Affecting Your Child's Mental Health, CNBC (Feb. 13, 2018).
445 Mark Bergen, YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant, Bloomberg (Apr. 2, 2019).
446 Max Fisher & Amanda Taub, On YouTube's Digital Playground, an Open Gate for Pedophiles, NY Times (June 3, 2019).
447 Id.
448 Guillaume Chaslot, The Toxic Potential of YouTube's Feedback Loop, Wired (Jul. 13, 2019).
449 Id.
1 619. The algorithm also makes extreme content less likely to get flagged or reported. As
2 Guillaume Chaslot explained, the algorithm becomes “more efficient” over time “at recommending
3 specific user-targeted content.” 450 And as the algorithm improves, “it will be able to more precisely
4 predict who is interested in [harmful or extreme] content.” 451 So “problems with the algorithm
5 become exponentially harder to notice, as [harmful] content is unlikely to be flagged or
6 reported.” 452
7 620. Even on YouTube Kids, Google’s product designed for children under 13 years old,
8 researchers from the Tech Transparency Project found that the product’s algorithm fed children
9 content related to drugs and guns, as well as beauty and diet tips that risked creating harmful body
10 image issues. For example, the researchers found videos speaking positively about cocaine and
11 crystal meth; instructing users, step-by-step, how to conceal a gun; explaining how to bleach one’s
12 face at home; and stressing the importance of burning calories. 453
450 Id.
451 Id.
452 Id.
453 Guns, Drugs, and Skin Bleaching: YouTube Kids Poses Risks to Children, Tech Transparency Project (May 5, 2022).
1 621. Amy Kloer, a campaign director with the child safety group Parents Together, spent an
2 hour on her preschool-age child’s YouTube Kids account and found videos “encouraging kids how
3 to make their shirts sexier, a video in which a little boy pranks a girl over her weight, and a video
4 in which an animated dog pulls objects out of an unconscious animated hippo’s butt.” 454 Another
5 parent recounted how the YouTube Kids autoplay feature led her 6-year-old daughter to "an animated
6 video that encouraged suicide.” 455
7 622. These are not isolated examples. According to Pew Research Center, 46% of parents of
8 children 11 or younger report that their children encountered videos that were inappropriate for their
9 age. 456 And kids do not “choose” to encounter those inappropriate videos—YouTube’s algorithm—
10 its “recommendation engine”—directs and pushes them there. Again, YouTube’s algorithm is
11 responsible for 70% of the time users spend using the product. 457
12 623. Other reports have confirmed that YouTube’s algorithm pushes users towards harmful
13 content. In 2021, the Mozilla Foundation studied 37,000 YouTube users, finding that 71% of all
14 reported negative user experiences came from videos recommended to users by Google’s
15 algorithm. 458 And users were 40% more likely to report a negative experience from a video
16 recommended by YouTube’s algorithm than from one they searched for. 459 Importantly, videos
17 that elicited those negative experiences “acquired 70% more views per day than other videos
18 watched by [study] volunteers.” 460
19 624. Those unreasonably dangerous design features combine to compel children and
20 teenagers to overuse a product that feeds them harmful content, which in turn can adversely affect
21 mental health. One 10-year-old girl in the Mozilla Foundation study who sought “dance videos,
22 ended up encountering videos promoting extreme dieting.” 461 Her mother explained that “[s]he is
454 Rebecca Heilweil, YouTube's Kids App Has a Rabbit Hole Problem, Vox (May 12, 2021).
455 Id.
456 Brooke Auxier et al., Parenting Children in The Age of Screens, Pew Rsch. Ctr. (July 28, 2020).
457 Joan E. Solsman, YouTube's AI Is the Puppet Master over Most of What You Watch, CNET (Jan. 20, 2018).
458 YouTube Regrets: A Crowdsourced Investigation into YouTube's Recommendation Algorithm, Mozilla Foundation 13 (July 2021).
459 Id.
460 Id.
461 Id.
1 now restricting her eating and drinking.” 462 Another middle-schooler compulsively consumed
2 YouTube videos every day after she came home from school. 463 Eventually, she became depressed
3 and “got the idea to overdose online.” 464 Three weeks later, she “down[ed] a bottle of Tylenol.”
4 She landed in rehab for digital addiction due to her compulsive YouTube watching. 465
5 625. Those experiences are not unique. Mental health experts have warned that YouTube is a
6 growing source of anxiety and inappropriate sexual behavior among kids under 13 years old.
7 Natasha Daniels, a child psychotherapist, described treating children between 8 and 10 years old,
8 who were “found doing sexual things: oral sex, kissing and getting naked and acting out sexual
9 poses.” 466 This kind of behavior “usually indicates some sort of sexual abuse.” 467 Previously,
10 Daniels would typically “find a child who has been molested himself or that an adult has been
11 grooming the child from abuse.” 468 But “in the last five years, when I follow the trail all the way
12 back, it’s YouTube and that’s where it ends.” 469
13 626. Daniels has also seen increased rates of anxiety among children using YouTube. And
14 because of that anxiety, those children “exhibit loss of appetite, sleeplessness, crying fits, and
15 fear.” 470 Ultimately, she says, “YouTube is an ongoing conversation in my therapy practice, which
16 indicates there’s a problem.” 471
17 627. One study determined that using Google’s product was “consistently associated with
18 negative sleep outcomes.” 472 Specifically, for every 15 minutes teens spent using YouTube, they
19 were 24% less likely to get seven hours of sleep. According to Dr. Alon Avidan, director of the
20 UCLA Sleep Disorders Center, YouTube is particularly sleep disruptive because its
462 Id.
463 Lesley McClurg, After Compulsively Watching YouTube, Teenage Girl Lands in Rehab for 'Digital Addiction', PBS (May 16, 2017).
464 Lesley McClurg, After Compulsively Watching YouTube, Teenage Girl Lands in Rehab for 'Digital Addiction', PBS (May 16, 2017).
465 Lesley McClurg, After Compulsively Watching YouTube, Teenage Girl Lands in Rehab for 'Digital Addiction', PBS (May 16, 2017).
466 Josephine Bila, YouTube's Dark Side Could be Affecting Your Child's Mental Health, CNBC (Feb. 13, 2018).
467 Id.
468 Id.
469 Id.
470 Id.
471 Id.
472 Meg Pillion et al., What's 'app'-ning to adolescent sleep? Links between device, app use, and sleep outcomes, 100 Sleep Med. 174–82 (Dec. 2022).
1 recommendation algorithm and autoplay features make it “so easy to finish one video” and watch
2 the next. 473 Similarly, a signal that the YouTube algorithm relies on is the ‘time of day’ a user is
3 watching—a signal that, when used to maximize the duration of engagement with the YouTube product,
4 induces sleep deprivation. 474
5 628. Sleep deprivation is, in turn, associated with poor health outcomes. For example,
6 “insufficient sleep negatively affects cognitive performance, mood, immune function,
7 cardiovascular risk, weight, and metabolism.” 475
8 629. Compulsively consuming harmful content on YouTube can also harm brain
9 development. According to Donna Volpitta, Ed.D., "[c]hildren who repeatedly experience stressful
10 and/or fearful emotions may under develop parts of their brain’s prefrontal cortex and frontal lobe,
11 the parts of the brain responsible for executive functions, like making conscious choices and
12 planning ahead.” 476
13 630. Google’s algorithm also promotes the creation of and pushes children towards extremely
14 dangerous prank or “challenge” videos, which often garner thousands of “Likes,” adding to the
15 pressure children feel to participate. 477 For example, the YouTube algorithm repeatedly pushed 10-
16 year-old MDL plaintiff K.L.J. to videos of a viral prank called the “I Killed Myself Prank,” in
17 which children pretend to have committed suicide to record their loved ones’ reactions. When
18 K.L.J. eventually participated in the prank and tried to pretend to hang himself, he accidentally
19 hanged himself, suffering brain damage as a result.
20 by which Google, like other Defendants, fosters excessive, addictive use of YouTube in turn foster
21 watching “challenge” videos.
22 631. Even though Google knew or should have known of these risks to its youth users,
23 Google’s product lacks any warnings that foreseeable product use could cause these harms.
473 Cara Murez, One App Is Especially Bad for Teens' Sleep, U.S. News & World Rep. (Sept. 13, 2022).
474 YouTube, How YouTube Works.
475 Jessica C. Levenson et al., The Association Between Social Media Use and Sleep Disturbance Among Young Adults, 85 Preventive Med. 36–41 (Apr. 2016).
476 Josephine Bila, YouTube's Dark Side Could be Affecting Your Child's Mental Health, CNBC (Feb. 13, 2018).
477 See, e.g., ViralBrothers, Revenge 9 – Cheating Prank Turns into Suicide Prank, YouTube (June 11, 2014).
1 632. And despite all the evidence that YouTube’s design and algorithms harm millions of
2 children, Google continues to manipulate users and compel them to use the product excessively, to
3 enhance Google’s bottom line. As a result, young people are confronted with more and more
4 extreme videos, often resulting in significant harm.
5 d. YouTube's unreasonably dangerous features include impediments to
6 discontinuing use.

7 633. As with other Defendants, Google has intentionally, unreasonably, and dangerously

8 designed its products so that adolescent users face significant navigational obstacles and hurdles

9 when trying to delete or deactivate their accounts, in contrast to the ease with which users can create

10 those accounts.

11 634. First, because YouTube is accessible without a user needing to log in, YouTube users

12 cannot prevent themselves from being able to access YouTube by deleting their YouTube account.
13 635. Second, YouTube accounts are linked to a user's broader Google account. These
14 accounts are structured such that, for a user to delete a YouTube account, the user must also delete

15 the user’s entire Google account. This means that if a YouTube user uses Google’s other products

16 those accounts will be lost as well. This structure holds hostage user data—if a child needs to keep

17 their email account through Google (for instance, if that is a requirement of their school), they

18 cannot delete their YouTube account, even if they want to. If a user stores family photos in Google

19 Photos, but wants to delete their YouTube account, they must choose between keeping storage for their

20 photos and deleting their YouTube account. Similarly, if a user has purchased books or movies

21 through Google’s digital market Google Play, the user’s copy of those books or movies will be

22 deleted if the user deletes their Google account to rid themselves of YouTube. Google explicitly

23 threatens users with this consequence on the page where users can delete their account, listing every

24 associated account Google will delete and providing examples of the kinds of content that will be

25 deleted if a user does not back down from their desire to delete their YouTube account.

26 636. Third, Google intentionally designed its product so that, to delete a user's Google

27 account, a user must locate and tap on six different buttons (through six different pages and popups)

28 starting from YouTube's main feed. This requires navigating away from
1 YouTube and into the webpages of other Google products. As with Meta, users are still able to
2 recover their accounts after deletion—though unlike Meta, Google does not tell users when their
3 accounts will become unrecoverable, warning only that they will become so soon after deletion.
4 4. Google facilitates the spread of CSAM and child exploitation.

5 637. Various design features of YouTube promote and dramatically exacerbate sexual
6 exploitation, the spread of CSAM, sextortion, and other socially maladaptive behavior that harms
7 children.
8 638. Google is required to comply with COPPA and obtain verifiable parental consent before
9 collecting personal information from children. It fails to do so. In 2019, the FTC and New York
10 Attorney General alleged in a federal complaint that Google and YouTube violated COPPA by
11 collecting personal information from children without verifiable parental consent. 478
12 639. Google and YouTube collected persistent identifiers that they used to track viewers of
13 child-directed channels across the Internet without prior parental notification, in violation of
14 Sections 1303(c), 1305(a)(1), and 1306(d) of COPPA. 479
15 640. Google and YouTube designed the child-centered YouTube Kids product. Despite its
16 clear knowledge of this channel being directed to children under 13 years old, Google served
17 targeted advertisements on these channels. 480
18 641. Google pays its users to create content because it benefits from increased user activity
19 and receives something of value through its YouTube Partner Program. 481
20 642. Google allows users to monetize its product to generate revenue for itself and its users,
21 including users that violate laws prohibiting the sexual exploitation of children.
22 643. According to its own guidelines, Google prohibits using its social media product in ways
23 that “[endanger] the emotional and physical well-being of minors.” 482
478 Fed. Trade Comm'n, Google and YouTube Will Pay Record $170 Million for Alleged Violations of Children's Privacy Law (2022).
479 Id.
480 Id.
481 YouTube Partner Program overview & eligibility.
482 Child safety policy - YouTube help, Google.
1 644. Google represents that YouTube “has strict policies and robust operations in place to
2 tackle content and behavior that is harmful or exploitative to children.” 483
3 645. Google maintains that its guidelines prohibit images, videos, and comments that put
4 children at risk, “including areas such as unwanted sexualization, abuse, and harmful and dangerous
5 acts.” 484
6 646. While Google “may place an age restriction on the video,” 485 its product fails to
7 implement proper age-verification mechanisms to prevent minor users from accessing age-
8 restricted content, as discussed above.
9 647. Google fails to prevent collages of images and videos of children showing their exposed
10 buttocks, underwear, and genitals from racking up millions of views on its product, which are then
11 promoted and monetized by displaying advertisements from major brands alongside the content. 486
12 648. Through Google’s product, videos of minors revealing their “bathing suit hauls,” playing
13 in pools, beaches, waterparks, or performing gymnastics are recommended, shown, and promoted
14 to child predators who interact with these videos, including commenting to share “time codes for
15 crotch shots,” to direct others to similar videos, and to arrange to meet up on other social media
16 products to share and exchange CSAM. 487
17 649. Multiple YouTube channels dedicated to pre-teen models, young girls stretching, and
18 teen beauty are routinely oversexualized and manipulated by predators. 488
19 650. Google’s product recommends and promotes abusive behaviors towards children and
20 victimizes unsuspecting minors on a mass scale.
21 651. When users search for images and videos of minors, Google’s algorithm pushes
22 additional videos featuring exclusively children, and this recommended content often includes
23 promoted content for which Google receives value from advertisers.
483 Id.
484 Id.
485 Id.
486 K.G. Orphanides, On YouTube, a network of pedophiles is hiding in plain sight, WIRED UK (2019).
487 Id.
488 Id.
1 652. Users of Google’s product who search for images and videos of minors are further
2 inundated with comments from other predators that provide hyperlinks to CSAM and opportunities
3 to share CSAM on other products. 489
4 653. On average, Google pays its creators $0.50 to $6.00 per 1,000 views of any video they
5 create, including materials depicting minors in violation of 18 U.S.C. §§ 2252, 2252A, 1591, 1466,
6 and other criminal statutes. 490
7 654. Google actively participates in, and receives value for, the creation of content on its product in
8 violation of 18 U.S.C. §§ 2252, 2252A, 1591, 1466, and other criminal statutes.
9 655. Google actively participates in, and receives value for, the creation of content on its product in
10 violation of laws prohibiting the sexual exploitation of children.
11 656. Google maintains that it is “dedicated to stopping the spread of online child exploitation
12 videos.” 491 Yet, it fails to implement proper safeguards to prevent the spread of illegal contraband
13 on its product.
14 657. The troves of data and information about its users that Google collects enable it to detect,
15 report as legally required, and take actions to prevent instances of sexual grooming, sextortion, and
16 CSAM distribution, but it has failed to do so. Google continues to make false representations its
17 “teams work around-the-clock to identify, remove, and report this content.” 492
18 658. Google has proprietary technology, CSAI Match, that is supposed to combat CSAI
19 (Child Sexual Abuse Imagery) content online. This technology allows Google to identify known
20 CSAM contraband being promoted, shared, and downloaded on the YouTube product. Google’s
21 CSAI Match can identify which portion of the video matches known and previously hashed CSAM
22 and provide a standardized categorization of the CSAM. When a match is detected by Google using
23 CSAI Match, it is flagged so that Google can “responsibly report in accordance to local laws and
24 regulations.” 493
489 Id.
490 How YouTube creators earn money - how YouTube works, YouTube.
491 YouTube.
492 Google's efforts to combat online child sexual abuse material.
493 Google's efforts to combat online child sexual abuse material.
1 659. Despite this, Google routinely fails to flag CSAM and regularly fails to adequately report
2 known content to NCMEC and law enforcement and fails to take down, remove, and demonetize
3 CSAM.
4 660. Separate from CSAM detection, Google also implements an automated system called
5 Content ID “to easily identify and manage [its] copyright-protected content on YouTube.” 494
6 Videos uploaded to YouTube are “scanned against a database of audio and visual content that’s
7 been submitted to YouTube by copyright owners,” and Google can block, monetize, and track that
8 material automatically. 495 Google only grants Content ID to copyright owners who meet its own
9 specific criteria, and these criteria categorically exclude CSAM victims. Google fails to use Content
10 ID systems to block, remove, demonetize, or report CSAM on its product.
11 661. In 2018, Google launched “cutting-edge artificial intelligence (AI) that significantly
12 advances [Google’s] existing technologies,” which Google claimed “drastically improved”
13 detection of CSAM that is distributed by its YouTube product. 496 These claims were false, and
14 misled parents and children into believing its product is safe for minors. Google failed to drastically
15 improve the frequency of CSAM detection, reports, and takedowns on its product.
16 662. Google claims that it will “continue to invest in technology and organizations to help
17 fight the perpetrators of CSAM and to keep our products and our users safe from this type of
18 abhorrent content.” 497 In reality, it fails to do so. Google fails to invest in adequate age verification
19 and continues to fail to remove CSAM from its product.
20 663. Google knows or should know that YouTube facilitates the production,
21 possession, distribution, receipt, transportation, and dissemination of millions of materials that
22 depict obscene visual representations of the sexual abuse of children, or that violate child
23 pornography laws, each year.
24 664. Google knowingly fails to take adequate and readily available measures to remove these
25 contraband materials from its product in a timely fashion.
494 How Content ID Works – YouTube Help, Google.
495 Id.
496 Nikola Todorovic, Using AI to help organizations detect and report child sexual abuse material online, Google (2018).
497 Id.
1 665. In violation of 18 U.S.C. § 2258A, Google knowingly fails to report massive amounts of
2 material in violation of 18 U.S.C. § 2256 and 18 U.S.C. § 1466A.
3 666. YouTube is polluted with illegal material that promotes and facilitates the sexual
4 exploitation of minors, and Google receives value in the form of increased user activity for the
5 dissemination of these materials on its products.
6 667. Google failed to report materials in violation of the reporting requirements of 18 U.S.C.
7 § 2258A. 498
8 668. Google knows that its product is unsafe for children and yet fails to implement
9 safeguards to prevent children from accessing its product.
10 669. Further, there is effectively no way for users to report CSAM on Google’s YouTube
11 product. YouTube does not allow users to specifically report any material posted on its product as
12 CSAM or child pornography. 499
13 670. YouTube Mobile does not provide any way to report users, including users who share
14 CSAM on its product. On the desktop, a viewer can report a user, but Google has made the reporting
15 function difficult to access. Furthermore, reporting requires a viewer to have a Google account and
16 be logged in to the account to make the report. 500
17 5. Google failed to adequately communicate the harm its products cause or
18 provide instructions regarding safe use.

19 671. Since YouTube’s inception, Google has misrepresented, omitted, downplayed, and

20 failed to adequately warn adolescent users about the physical and mental health risks its product

21 poses. These risks include, but are not limited to, product abuse, addiction, and compulsive use;

22 sexual exploitation from adult users; dissociative behavior; damage to body image; social isolation;

23 impaired brain development; and a plethora of mental health disorders like body dysmorphia, eating

24 disorders, anxiety, depression, insomnia, ADD/ADHD exacerbation, suicidal ideation, self-harm,

498 NCMEC, 2019 CyberTipline reports by Electronic Service Providers (ESP).
499 Canadian Centre for Child Protection, Reviewing Child Sexual Abuse Material Reporting Functions on Popular Platforms.
500 Id.
1 and death. Google knew of these significant risks, but deceptively and fraudulently omitted and
2 downplayed them, misleading consumers and the Band regarding these risks.
3 672. Google targets adolescent users via advertising and marketing materials distributed
4 throughout digital and traditional media products. Its advertising and marketing campaigns fail to
5 provide adequate warnings to potential adolescent consumers of the physical and mental risks
6 associated with using YouTube.
7 673. Google further fails to adequately communicate or warn adolescents during the product
8 registration process. At account setup, Google’s product contains no warning labels, banners, or
9 conspicuous messaging to adequately inform adolescent users of the known risks and potential
10 physical and mental harms associated with usage of its product. Instead, Google allows adolescents
11 to easily create an account (or multiple accounts), and to access YouTube with or without an
12 account.
13 674. Google’s deceptive conduct continues once an adolescent uses YouTube. Google does
14 not adequately inform adolescent users that their data will be tracked, used to help build a unique
15 algorithmic profile, and potentially sold to Google’s advertising clients.
16 675. Google’s deceptive conduct continues even as adolescents exhibit problematic signs of
17 addictive, compulsive use of YouTube. Google does not adequately communicate or warn users
18 when their screen time reaches harmful levels or when adolescents are accessing the product on a
19 habitual and uncontrolled basis.
20 676. Not only does Google fail to adequately communicate or warn users regarding the risks
21 associated with YouTube, it also does not provide adequate instructions on how adolescents can
22 safely use its product. A reasonable and responsible company would instruct adolescents on best
23 practices and safety protocols when using a product known to pose health risks.
24 677. Google also fails to adequately warn users that:
25 a. sexual predators use YouTube to produce and distribute CSAM;
26 b. adult predators targeting young children for sexual exploitation, sextortion, and
27 CSAM are prevalent on YouTube;
1 c. usage of YouTube can increase the risk of children being targeted and sexually
2 exploited by adult predators; and,
3 d. usage of YouTube can increase risky and uninhibited behavior in children, making
4 them easier targets to adult predators for sexual exploitation, sextortion, and
5 CSAM.
6 678. Finally, Google tells Tribal consumers in Apple’s App Store that YouTube is rated “12+”
7 (for users 12 and older) because it contains only “infrequent/mild” “profanity and crude humor,”
8 “sexual content or nudity,” “alcohol, tobacco, and drug use or references,” “mature/suggestive
9 themes,” “realistic violence,” “simulated gambling,” “medical/treatment information,” “cartoon or
10 fantasy violence," and "horror/fear themes." Google knows and intends that all these representations
11 will be conveyed to Tribal consumers. These representations are false. As discussed above,
12 YouTube hosts a vast library of videos with profanity, sex, illegal drugs, and other content parents
13 would not expect to find on a “12+” app. Such content is visible and even recommended to younger
14 users. Moreover, as discussed extensively above, use of YouTube by adolescents, even those 12 and
15 older, is harmful to the developing brain and, as currently designed, YouTube is not appropriate for use by that age group.
16 G. IMPACT OF THE DEFENDANT-CREATED MENTAL HEALTH CRISIS ON
17 PLAINTIFF FOND DU LAC BAND OF LAKE SUPERIOR CHIPPEWA.

18 679. Defendants’ conduct has created a public health crisis in Plaintiff’s communities. There

19 has been a surge in the proportion of youth in Plaintiff’s community who are anxious, depressed,

20 or suicidal.

21 680. The increases in anxiety and depression among Plaintiff's youth have also contributed to other

22 problems, including delinquency and behavioral problems at home and in school.

23 681. The increase in Tribal youth who report suffering from anxiety and depression—and the

24 connection to Defendants’ platforms—have not gone unnoticed. Tribal youth leaders estimate that

25 a significant percentage of students are negatively impacted by social media exposure, that the Band's

26 youth feel unable to manage their time and unable to control their impulse to use social media, and

27 that they express fear and anxiety about what other students post about them on social media.

1 682. The Band has limited resources to put toward urgent priorities, including efforts to stem
2 the social media addiction and mental health crisis, which has had a particularly harmful impact on
3 the Band’s youth.
4 683. More and more of the Band’s resources are needed to combat these problems, leaving a
5 diminished pool of already-scarce resources to devote to positive societal causes like education,
6 cultural preservation, and other social programs.
7 684. Significant further resources will be required now and in the future to continue to
8 respond to the threat posed by Defendants’ products and to the addictive habits, mental health
9 issues, and delinquent behavior they have already caused. Specifically, the Band requires additional
10 resources to: hire additional personnel, including counselors and medical professionals to address
11 mental, emotional, and social health issues; develop additional resources to address mental,
12 emotional, and social health issues; increase training for Tribal leaders and members to identify
13 youth exhibiting mental, emotional, and social distress; educate Tribal leaders and members about
14 the harms caused by Defendants’ wrongful conduct; and develop lesson plans to teach youth about
15 the dangers of using Defendants’ platforms.
16 685. The Band cannot, by itself, fully address the existing youth mental health crisis in its
17 communities. Fully addressing the harms to the Band caused by Defendants’ conduct will require
18 a comprehensive approach. Without the resources to fund measures such as those described
19 herein, the Band will continue to be harmed by the ongoing consequences of Defendants’ conduct.
20 686. The costs that the Band has incurred and will incur in the future in responding to the
21 harm caused by Defendants’ conduct and in providing the public services described in this
22 Complaint are recoverable pursuant to the causes of action raised by the Band.
23 687. Defendants’ misconduct alleged herein is not a series of isolated incidents, but instead
24 involves a sophisticated and intentional effort that has caused a continuing, substantial, and long-
25 term burden to the Band and its members.
26 688. Additionally, the public nuisance created by Defendants and the Band’s requested relief
27 in seeking abatement further compel Defendants to compensate the Band for the substantial
1 resources it will need to continue to expend to address the mental health crisis created by
2 Defendants’ misconduct.
3 689. The creation and maintenance of the youth mental health crisis directly harms the Band
4 by imposing costs on its members and territory. As a result of Defendants’ misconduct, the Band
5 has been, and will be, forced to go far beyond what a governmental entity would be expected to
6 pay to enforce the laws to promote the general health and welfare of the Band and its members.
7 This includes providing new programs and services in direct response to the damage caused by
8 Defendants’ misconduct.
9 690. Defendants’ actions and omissions have substantially, unreasonably, and injuriously
10 interfered with the functions and operations of the Band and have affected the public health, safety,
11 and welfare of the Band’s community. Without the youth mental health crisis that Defendants
12 caused, more time, money, and resources could have been used for the Band’s goal of increasing
13 the health and welfare of its members.


14 V. PLAINTIFF’S CLAIMS
15 COUNT I:
Public Nuisance
16 (Against all Defendants)
17 691. Plaintiff realleges and incorporates by reference each preceding and succeeding
18 paragraph as though set forth fully at length herein.
19 692. Minnesota Statutes, section 609.74 provides, in relevant part:
20 Whoever by an act or failure to perform a legal duty intentionally does any of the
21 following is guilty of maintaining a public nuisance, which is a misdemeanor: (1)
22 maintains or permits a condition which unreasonably annoys, injures or endangers
the safety, health, morals, comfort, or repose of any considerable number of
23 members of the public; or . . . (3) is guilty of any other act or omission declared by
law to be a public nuisance and for which no sentence is specifically provided.

25 693. The Band, its members, and especially its youth have a public right to be free from
26 interference with the public safety, health, comfort, or repose. The Band is empowered by law and
27 equity to allege a claim of, and seek redress for, a public nuisance. Plaintiff, in its capacity as a
1 sovereign and as parens patriae, has an important and unique interest in protecting the health and
2 safety of its community.
3 694. Through the deceptive and unlawful conduct described throughout this Complaint, each
4 Defendant has intentionally maintained or permitted, or was at the very least a substantial factor in
5 maintaining or permitting, a public nuisance that has annoyed, injured, and endangered—and will
6 continue to unreasonably annoy, injure, and endanger—the common right of public safety, health,
7 comfort, or repose of the Band and its members.
8 695. Each Defendant unlawfully committed an act or omitted to perform a duty by failing to
9 use reasonable care in the design, promotion, and operation of their platforms.
10 696. Each Defendant’s acts or omissions created or substantially contributed to the youth
11 mental health crisis, thereby annoying, injuring, and endangering the comfort, repose, health, and
12 safety of others, including the Band and its members.
13 697. Each Defendant’s acts or omissions offend decency and render the Band and its members
14 insecure.
15 698. Defendants’ conduct has affected and continues to affect a substantial number of people
16 in the Band’s communities and is likely to continue causing significant harm. Defendants’ conduct
17 is ongoing and continues to produce permanent and long-lasting damage.
18 699. This harm to youth mental health and the corresponding impacts on the public health,
19 safety, and welfare of Plaintiff’s community outweigh any social utility of Defendants’ wrongful
20 conduct.
21 700. The rights, interests, and inconvenience to Plaintiff’s community far outweigh the rights,
22 interests, and inconvenience to Defendants, who have profited tremendously from their wrongful
23 conduct.
24 701. But for Defendants’ actions, Plaintiff’s youth would not use social media platforms as
25 frequently or continuously as they do today or be deluged with exploitive and harmful content to
26 the same degree, and the public health crisis that currently exists as a result of Defendants’ conduct
27 would have been averted.
1 702. Defendants' conduct affects the entire Band. Defendants have a duty to, but have failed
2 to, abate the nuisance they created.
3 703. Logic, common sense, justice, policy, and precedent indicate Defendants’ unfair and
4 deceptive conduct has caused the damage and harm complained of herein. Defendants knew or
5 reasonably should have known that their design, promotion, and operation of their platforms would
6 cause youth to use their platforms excessively, that their marketing methods were designed to
7 appeal to youth, and that their active efforts to increase youth use of their platforms were causing
8 harm to youth, including youth in Plaintiff’s communities.
9 704. Thus, the public nuisance caused by Defendants was reasonably foreseeable, including
10 the financial and economic losses incurred by Plaintiff.
11 705. Defendants’ conduct was a substantial factor in bringing about the public nuisance. By
12 designing, marketing, promoting, and operating their platforms in a manner intended to maximize
13 the time youth spend on their respective platforms—despite knowledge of the harms to children
14 and teens from their wrongful conduct—Defendants directly facilitated the widespread, excessive,
15 and habitual use of their platforms and the public nuisance affecting Plaintiff’s youth. By seeking
16 to capitalize on their success by refining their platforms to increase the time youth spend on their
17 platforms, Defendants directly contributed to the public health crisis and the public nuisance
18 affecting Plaintiff’s youth.
19 706. Defendants’ conduct is especially injurious to Plaintiff because, as a direct and proximate
20 result of Defendants' conduct creating or assisting in the creation of a public nuisance, the Band
21 and its members have sustained and will continue to sustain substantial injuries.
22 707. Plaintiff has had to take steps to mitigate the harm and disruption caused by Defendants’
23 conduct.
24 708. As a direct result of the public nuisance Defendants have caused, the Band and its
25 members have directly and proximately suffered harm. Plaintiff requests all the relief to which it is
26 entitled in its own right and relating to the special damage or injury it has suffered, including in its
27 parens patriae capacity to protect the health, safety, and welfare of all members of the Band. This
28 includes actual and compensatory damages in an amount to be determined at trial and an order
1 providing for the abatement of the public nuisance that Defendants have created or assisted in the
2 creation of, and enjoining Defendants from future conduct contributing to the public nuisance
3 described above. All Defendants are jointly and severally liable for the public nuisance.
4 COUNT II:
5 Negligence and Gross Negligence
6 (Against all Defendants)

7 709. Plaintiff incorporates by reference all preceding paragraphs.

8 710. Defendants had a legal duty to act with the exercise of ordinary care or skill to prevent

9 injury to another under the common law and statute.

10 711. Defendants owed the Band and its members a duty not to expose them to unreasonable risk

11 of harm, and to act with reasonable care as a reasonably careful person and/or company would act

12 under the circumstances.


13 712. Defendants also voluntarily undertook a legal duty by designing, marketing, and

14 distributing unreasonably dangerous products that appeal to youth and by issuing public

15 misrepresentations and omissions regarding the risks of their products.

16 713. At all times relevant to this litigation, Defendants owed a duty to the Band and its

17 members to exercise reasonable care in the design, research, development, testing, marketing,

18 supply, promotion, advertisement, operation, and distribution of their social media products,

19 including the duty to take all reasonable steps necessary to design, research, market, advertise,

20 promote, operate, and/or distribute their platforms in a way that is not unreasonably dangerous to

21 users, including children.

22 714. At all times relevant to this litigation, Defendants owed a duty to the Band and its

23 members to exercise reasonable care in the design, research, development, testing, marketing,

24 supply, promotion, advertisement, operation, and distribution of their social media platforms,

25 including the duty to provide accurate, true, and correct information about the risks of using

26 Defendants’ platforms; and appropriate, complete, and accurate warnings about the potential

27 adverse effects of extended social media use, in particular, social media content Defendants

28 directed via their algorithms to users.


1 715. At all times relevant to this litigation, Defendants knew or, in the exercise of reasonable
2 care, should have known of the hazards and dangers of their respective social media platforms and
3 specifically, the health hazards their platforms posed to youth, especially extended or problematic
4 use of such platforms.
5 716. Accordingly, at all times relevant to this litigation, Defendants knew or, in the exercise
6 of reasonable care, should have known that use of Defendants’ social media platforms by the
7 Band’s youth could cause the Band and its members injuries and thus created a dangerous and
8 unreasonable risk of injury to the Band and its members.
9 717. Defendants also knew or, in the exercise of reasonable care, should have known that
10 users of Defendants’ social media platforms were unaware of the risks and the magnitude of the
11 risks associated with the use of Defendants’ platforms including but not limited to the risks of
12 extended or problematic social media use and the likelihood that algorithm-based recommendations
13 would expose child and adolescent users to content that is violent, sexual, or encourages self-harm,
14 among other types of harmful content.
15 718. There was an extremely high likelihood of Defendants’ behavior foreseeably causing a
16 substantial injury to Plaintiff, which in fact occurred.
17 719. As such, Defendants, by action and inaction, representation and omission, breached their
18 duty of reasonable care, failed to exercise ordinary care, and failed to act as a reasonably careful
19 person and/or company would act under the circumstances in the design, research, development,
20 testing, marketing, supply, promotion, advertisement, operation, and distribution of their social
21 media platforms, in that Defendants designed, researched, developed, tested, marketed, supplied,
22 promoted, advertised, operated, and distributed social media platforms that Defendants knew or
23 had reason to know would negatively impact the mental health of users, particularly youth, and
24 failed to prevent or adequately disclose these risks and injuries.
25 720. Despite their ability and means to investigate, study, and test their social media platforms
26 and to provide adequate warnings, Defendants have failed to do so. Defendants have wrongfully
27 concealed information and have made false and/or misleading statements concerning the safety and
28 use of Defendants’ social media platforms. Defendants breached their duty of care by:
1 a. designing, researching, developing, marketing, supplying, promoting, advertising,
2 operating, and distributing their social media platforms without thorough research
3 and testing;
4 b. failing to sufficiently study and conduct necessary tests to determine whether or not
5 their social media platforms were safe for youth users;
6 c. failing to use reasonable and prudent care in the research, design, development,
7 testing, marketing, supply, promotion, advertisement, operation, and distribution of
8 their social media platforms so as to avoid the risk of encouraging extended social
9 media use;
10 d. designing their social media platforms to maximize the amount of time users spend
11 on the platform and causing excessive and problematic use of their platforms,
12 particularly among youth, through the use of algorithm-based feeds, social
13 reciprocity, and IVR;


14 e. failing to implement adequate safeguards in the design and operation of their
15 platforms to ensure they would not encourage excessive and problematic use of their
16 platforms;
17 f. designing and manufacturing their platforms to appeal to minors and young people
18 who lack the same cognitive development as adults and are particularly vulnerable
19 to social rewards like IVR and social reciprocity;
20 g. failing to take adequate steps to prevent their platforms from being promoted,
21 distributed, and used by minors under the age of 13;
22 h. failing to provide adequate warnings to child and adolescent users or parents who
23 Defendants could reasonably foresee would use their platforms;
24 i. failing to disclose to, or warn, Plaintiff, users, and the general public of the negative
25 mental health consequences associated with social media use, especially for children
26 and adolescents;
1 j. failing to disclose to Plaintiff, users, and the general public that Defendants’
2 platforms are designed to maximize the time users, particularly youth, spend on
3 Defendants’ platforms and cause negative mental health consequences;
4 k. representing that Defendants’ platforms were safe for child and adolescent users
5 when, in fact, Defendants knew or should have known that the platforms presented
6 acute mental health concerns for young users;
7 l. failing to alert users and the general public, including Plaintiff’s members, of the
8 true risks of using Defendants’ platforms;
9 m. advertising, marketing, and recommending Defendants' platforms while concealing
10 and failing to disclose or warn of the dangers known by Defendants to be associated
11 with, or caused by, youth use of Defendants’ platforms;
12 n. continuing to design, research, develop, market, supply, promote, advertise, operate,
13 and distribute Defendants’ platforms with knowledge that Defendants’ platforms are
14 unreasonably unsafe, addictive, and dangerous to youth mental health;
15 o. failing to modify Defendants’ algorithms, which are used to recommend content to
16 users, in a manner that would no longer prioritize maximizing the amount of time
17 users spend on Defendants' platforms over the safety of their youth users;
18 p. failing to adequately modify Defendants’ algorithm-based recommendations to
19 filter out content that exposes child and adolescent users to material that is violent,
20 sexual, or encourages self-harm, among other types of harmful content; and
21 q. committing other failures, acts, and omissions set forth herein.
22 721. Defendants knew or should have known that it was foreseeable that the Band and its
23 members would suffer injuries as a result of Defendants’ failure to exercise reasonable care in
24 designing, researching, developing, testing, marketing, supplying, promoting, advertising,
25 operating, and distributing Defendants’ platforms, particularly when Defendants’ platforms were
26 designed, developed, operated and marketed to maximize the time youth spend on Defendants’
27 platforms. The Band and its members did not know and could not have known the nature and extent
28 of the injuries that could result from the intended use of Defendants’ social media platforms.
1 722. Defendants’ negligence helped to and did produce, and was the proximate cause of, the
2 injuries, harm, and losses that the Band and its members suffered and will continue to suffer, as
3 detailed above. Such injuries, harm, and losses would not have happened without Defendants’
4 negligence as described herein.
5 723. The mental health crisis caused and/or significantly contributed to by Defendants has
6 produced major behavioral disruption in Plaintiff’s communities, and Plaintiff has had to
7 take steps to mitigate the harm and disruption caused by Defendants’ conduct.
8 724. Defendants’ conduct was also grossly negligent because Defendants acted recklessly,
9 willfully, and wantonly. Each knew of the substantial risk of harm that their platforms posed to
10 users’ mental health, particularly that of children and adolescents, yet engaged in that conduct anyway.
11 725. Defendants have acted with oppression, fraud, and malice, actual and presumed.
12 726. Defendants’ conduct, as described above, was intended to serve their own interests
13 despite having reason to know and consciously disregarding a substantial risk that their conduct
14 might significantly injure the rights of others, including the Band and its members, and/or
15 Defendants consciously pursued a course of conduct knowing that it created a substantial risk of
16 significant harm to others, including the Band and its members. Defendants regularly risk the health
17 of their platforms’ users with full knowledge of the significant dangers those platforms pose.
18 Defendants consciously decided not to redesign their platforms or to warn or inform the unsuspecting public, including
19 Plaintiff or its members.
20 727. As a result of Defendants’ negligence and gross negligence, the Band and its members
21 suffered harm to their real and personal property along with other economic losses, including the
22 costs associated with past, present, and future efforts to address, pay for, and/or eliminate the youth
23 mental health crisis, in an amount to be proven at trial.
24 COUNT III:
Deceptive Trade Practices
25
Minn. Stat. § 325D.43 et seq.
26 (Against all Defendants)

27 728. Plaintiff incorporates by reference all preceding paragraphs.

28 729. Minnesota Statutes section 325D.44, subdivision 1 provides in part:

1
A person engages in a deceptive trade practice when, in the course of business,
2 vocation, or occupation, the person:
3 (5) represents that goods or services have sponsorship, approval, characteristics,
ingredients, uses, benefits, or quantities that they do not have or that a person has a
4 sponsorship, approval, status, affiliation, or connection that the person does not
have;
5 (7) represents that goods or services are of a particular standard, quality, or grade,
or that goods are of a particular style or model, if they are of another; and
6 (14) engages in any other conduct which similarly creates a likelihood of confusion
7 or of misunderstanding.
730. Defendants are “persons” within the meaning of Minnesota Statutes section 325D.44.
8 731. Defendants’ platforms are “goods” or “services” within the meaning of Minnesota
9 Statutes section 325D.44.
10 732. In numerous instances in the course of business, vocation, or occupation, Defendants
11 violated Minnesota Statutes section 325D.44, subdivision 1(5), 1(7), and 1(14) by representing that
12 goods or services have sponsorship, approval, characteristics, ingredients, uses, benefits, or
13 quantities that they do not have, representing that goods or services are of a particular standard,
14 quality, or grade, or that goods are of a particular style or model, if they are of another, and engaging
15 in deceptive acts, practices, and omissions that caused a likelihood of confusion or of
16 misunderstanding among the Band and its members in connection with their advertising, marketing,
17 promotion, and other representations regarding their goods or services. Those acts, practices, and
18 omissions include, but are not limited to:
19 a. Misrepresenting, directly or indirectly, expressly or by implication, that Defendants’
20 social media platforms are appropriate for those aged 12+ when they are not;
21 b. Misrepresenting, directly or indirectly, expressly or by implication, that Defendants’
22 social media platforms are not psychologically or physically harmful for young
23 users and are not designed to induce young users’ compulsive and extended use,
24 when they are in fact so designed;
25 c. Misrepresenting, directly or indirectly, expressly or by implication, that Defendants’
26 social media platforms are less addictive and/or less likely to result in psychological
27 and physical harm for young users than their social media platforms are in reality;
1 d. Misrepresenting, directly or indirectly, expressly or by implication, that the
2 incidence or prevalence of negative or harmful user experiences on their social
3 media platforms was lower than it actually was;
4 e. Misrepresenting, directly or indirectly, expressly or by implication, that Defendants
5 prioritized young users’ health and safety over maximizing profits, when in fact they
6 subordinated young user health and safety to goals of maximizing profits;
7 f. Misrepresenting, directly or indirectly, expressly or by implication, that Defendants
8 prevent younger users from using their platforms;
9 g. Misrepresenting, directly or indirectly, expressly or by implication, that Defendants’
10 collection of user data was not for the purpose of causing those users to become addicted
11 to the social media platforms, when it was; and
12 h. Making other false and deceptive representations set forth in this Complaint.
13 733. The Band and its members relied on the Defendants’ misrepresentations and omissions
14 in using and allowing their children to use Defendants’ social media platforms.
15 734. The Band and its members have been damaged in the form of money or property as a
16 result of Defendants’ deceptive acts and practices described throughout this complaint and
17 incorporated herein by reference.
18 735. Defendants’ violations directly and proximately caused the Band to suffer economic
19 damages in an amount to be determined in this litigation.
20 COUNT IV:
Unfair or Unconscionable Acts
21
Minn. Stat. § 325D.43 et seq.
22 (Against all Defendants)

23 736. Plaintiff incorporates by reference all preceding paragraphs.


24 737. Minnesota Statutes section 325D.44, subdivision 1(13) prohibits any person from
25 engaging in “unfair methods of competition” or “unfair or unconscionable acts or practices.” Minn.
26 Stat. § 325D.44, subd. 1(13).
27 738. An unfair method of competition or an unfair or unconscionable act or practice is any
28 method of competition, act, or practice that: (1) offends public policy as established by the statutes,
1 rules, or common law of Minnesota; (2) is unethical, oppressive, or unscrupulous; or (3) is
2 substantially injurious to consumers.
3 739. In numerous instances in the course of business, vocation, or occupation, Defendants
4 violated Minnesota Statutes section 325D.44, subdivision 1(13) by engaging in unfair or
5 unconscionable acts, practices, and omissions that were unethical, oppressive, or unscrupulous
6 and/or substantially injurious to consumers, as described in detail throughout this Complaint. Those
7 acts, practices, and omissions include, but are not limited to, designing and marketing their
8 platforms to appeal to minors and young people who lack the same cognitive development as adults
9 and are particularly vulnerable to social rewards like IVR and social reciprocity; and continuing to
10 design, research, develop, market, supply, promote, advertise, operate, and distribute their
11 platforms with knowledge that they are unreasonably unsafe, addictive, and dangerous to youth
12 mental health.
13 740. These acts, practices, and omissions caused young users’ compulsive and unhealthy use
14 of and addiction to Defendants’ platforms. At all relevant times, Defendants had a thorough
15 understanding of the mental and physical harms and addiction suffered by young users of their
16 platforms. Instead of taking adequate measures to mitigate these damaging effects, Defendants
17 turned a blind eye to them, and persisted in exploiting young users’ psychological vulnerabilities.
18 Defendants’ acts, practices, and omissions alleged herein are unethical, oppressive, and
19 unscrupulous, including because they constitute knowing decisions causing unnecessary and
20 unjustified harm to young users for Defendants’ financial gain.
21 741. Defendants’ acts, practices, and omissions alleged herein also have caused and continue
22 to cause substantial injury to consumers that could not be reasonably avoided. Young users could
23 not have reasonably avoided injuries resulting from Defendants’ acts, practices, and omissions,
24 including because Defendants misrepresented and failed to disclose the dangerous nature of their
25 platforms and because Defendants utilized psychologically manipulative engagement-inducing
26 features, knowing that young users are especially susceptible to those psychologically manipulative
27 tactics.
1 742. Due to Defendants’ unfair and unconscionable acts, practices, and omissions described
2 in this Complaint, the Band and its members are suffering, have suffered, and will continue to suffer
3 substantial injury.
4 743. Defendants’ unfair and unconscionable acts, practices, and omissions described in this
5 Complaint constitute multiple separate violations of Minnesota Statutes section 325D.44,
6 subdivision 1(13).
7 VI. PRAYER FOR RELIEF
8 Plaintiff demands judgment against each of the Defendants to the full extent of the law,
9 including but not limited to:
10 1. Entering an Order that the conduct alleged herein constitutes a public nuisance;
11 2. Entering an Order that Defendants are jointly and severally liable;
12 3. Entering an Order requiring Defendants to abate the public nuisance described
13 herein and to deter and/or prevent the resumption of such nuisance;


14 4. Enjoining Defendants from engaging in further actions causing or contributing to
15 the public nuisance as described herein;
16 5. Awarding equitable relief to fund prevention education and treatment for excessive
17 and problematic use of social media;
18 6. Awarding actual, compensatory, and punitive damages;
19 7. Awarding reasonable attorneys’ fees and costs of suit;
20 8. Awarding pre-judgment and post-judgment interest; and
21 9. Such other and further relief as the Court deems just and proper under the
22 circumstances.
23 VII. JURY DEMAND
24 Plaintiff demands a trial by jury on all issues so triable.
1 Dated: July 24, 2024 ROBINS KAPLAN LLP
2

3 By: s/ Daniel L. Allender


Daniel L. Allender (SBN 264651)
4 2121 Avenue of the Stars, Suite 2800
Los Angeles, CA 90067
5 [email protected]
Telephone: (310) 229-5414
6
Timothy Q. Purdon (pro hac vice)
7 1207 West Divide Avenue, Suite 200
Bismarck, ND 58503
8 [email protected]
Telephone: (701) 255-3000
9
Tara D. Sutton (pro hac vice)
10 800 LaSalle Avenue, Suite 2800
Minneapolis, MN 55402
11 [email protected]
Telephone: (612) 349-8500
12
Eric M. Lindenfeld (pro hac vice)
13 1325 Avenue of the Americas, Suite 2601
New York, NY 10019
14 [email protected]
Telephone: (212) 980-7400
15
FRAZER PLC
16
T. Roe Frazer II (pro hac vice)
17 30 Burton Hills Boulevard, Suite 450
Nashville, TN 37215
18 [email protected]
Telephone: (503) 242-1072
19

20 ATTORNEYS FOR FOND DU LAC BAND OF
LAKE SUPERIOR CHIPPEWA