
Meta and YouTube Face Jury Deliberations Over Liability for Harm to Children
Key Takeaways
- Jury to deliberate on liability for child harm by Meta and YouTube.
- Closing arguments followed nearly a month of testimony, including from Mark Zuckerberg.
- Allegations claim Instagram and YouTube foster youth addiction through design.
Historic Trial Overview
A historic trial in Los Angeles has reached its final stage as Meta and YouTube face jury deliberations over liability for harm caused to children through their platform designs.
“Kaley would look at Instagram until she fell asleep”
The lawsuit, filed by 20-year-old Kaley (K.G.M.), alleges that she became addicted to social media during childhood after starting to use YouTube at age six and Instagram at age nine, despite both companies' stated policies prohibiting users under 13.

Kaley claims this addiction led to severe mental health consequences including depression, anxiety, body dysmorphia, and suicidal thoughts, as well as social withdrawal and difficulty engaging offline.
The case represents the first major jury trial against social media companies for platform design causing harm to children, with the outcome potentially affecting approximately 1,500 similar lawsuits against Meta, Google, and other tech giants.
This landmark proceeding comes amid growing scrutiny of how tech companies design their platforms to maximize user engagement, particularly among young users.
Executive Testimony
The trial featured significant testimony from top tech executives who defended their companies' practices while acknowledging internal concerns about platform addictive qualities.
Meta CEO Mark Zuckerberg, accompanied by four personal security guards, testified under oath, repeatedly stating that his company has always had policies prohibiting users under 13, though he grew frustrated when pressed about internal documents showing Meta executives discussed growing usage among children.

Instagram head Adam Mosseri testified that even 16 hours of Instagram use did not strike him as an addiction, instead referring to such extensive usage as merely "problematic."
Kaley's attorney Mark Lanier pointed to internal Meta and YouTube documents that he said illustrated a clear internal understanding of their platforms' potentially addictive nature, stating "when you're making money off of kids, you have to do it responsibly."
The plaintiff's testimony described how she created multiple accounts on both platforms to drive likes and validation, spending hours scrolling and watching videos while withdrawing from offline activities.
Legal Arguments
The central legal question before the jury is whether Meta and YouTube were negligent in creating and tweaking their products to encourage excessive use, particularly among children.
“The trial in Los Angeles over a case filed by a woman who accuses Instagram and YouTube of harming her mental health through the addictive design of these apps entered its final stage on Thursday, after nearly a month of testimony that included Mark Zuckerberg, the founder and CEO of Meta”
The two companies pursued different defense strategies: Meta argued that Kaley's mental health struggles stemmed from her personal life and upbringing rather than platform use, while YouTube maintained that it is not a social media platform and that its features are not addictive.
YouTube's attorney Luis Li emphasized that Kaley and her mother originally did not bring any claims against YouTube when filing the lawsuit, and that she attested under oath that she had no specific claims against the video platform.
The jury must decide if the platforms' negligence was a "substantial factor" in causing Kaley's harm, though it does not have to be the only factor.
This legal standard differs from previous lawsuits, which typically failed due to Section 230 of the Communications Decency Act, a provision that protects online platforms from liability for user-generated content.
The case represents a shift from arguing about specific harmful content to examining whether the platforms themselves were negligently designed.
Broader Implications
The trial has broader implications for social media regulation and the accountability of tech companies in protecting young users.
Kaley's attorneys argued that she was preyed upon as a vulnerable user, while the companies claimed she turned to their platforms as a coping mechanism for existing mental health struggles.

Kaley testified that she began experiencing anxiety and depression around age 10, and developed body dysmorphia after using Instagram filters that altered her appearance.
When asked if she had suffered such feelings prior to social media use, she responded "No, I didn't."
The case highlights the tension between tech companies' business models that maximize engagement and the potential harm this can cause to developing minds.
With about 1,500 similar lawsuits pending, the jury's decision could establish a legal framework for holding social media companies accountable for platform design choices that contribute to youth mental health crises.
Jury Deliberations
The jury is now deliberating on several key questions that will determine whether Meta and YouTube are liable for Kaley's alleged harm.
“Jury to begin deliberations in landmark social media addiction trial: after a month of hearings, 12 jurors are set to decide whether social media companies should be liable for harm caused to children using their platforms”
Jurors must decide the case against each platform independently, treating it "as if it were a separate lawsuit," according to Judge Carolyn B. Kuhl.

Because this is a civil case, only nine of the 12 jurors need to agree on each count.
If the jury finds either or both platforms liable, they will then decide on the amount of damages to award Kaley.
In his closing arguments, Lanier asked jurors to consider "What is a lost childhood worth?" while using visual aids like a cupcake with minimal baking soda to illustrate the concept of "substantial factor" negligence.
The jury's decision could set a precedent for future cases by establishing whether social media companies can be held legally responsible for the addictive design features that may contribute to youth mental health problems, potentially forcing major changes in how tech companies design and moderate their platforms for young users.