Social Media Platforms on Trial in LA: Are They Damaging Kids’ Brains on Purpose?


ER Editor: Readers may also be interested in this NBC report from 2 days ago —

Instagram CEO Adam Mosseri defends platform in landmark trial over social media harms

********


G. CALDER for THE EXPOSE

Some of the world’s largest social media companies are now defending themselves in a major US court battle over allegations that their platforms were designed in ways that intensify engagement among children while disregarding foreseeable psychological risks.

Features such as algorithmic feeds, streak systems, and constant notifications were deliberately engineered to maximise time spent on the apps, even when that engagement was found to cause anxiety, compulsive use patterns, and emotional distress. The companies in question are Instagram, TikTok, Snapchat and YouTube. So far, they deny that their products are addictive, reject claims that they are liable for user behaviour, and maintain that they simply provide tools for connection and creativity.

Adam Mosseri, head of Instagram, testifies in the Los Angeles trial

The Case Brought Against Social Media

The litigation forms part of the consolidated federal case known as the Social Media Adolescent Addiction / Personal Injury Products Liability Litigation (MDL No. 3047), currently proceeding in the U.S. District Court for the Northern District of California.

The multidistrict litigation (MDL) consolidates hundreds of lawsuits filed by families, school districts, and state entities. Plaintiffs allege that platforms including Meta Platforms, TikTok, Snap Inc., and YouTube were designed in ways that contribute to compulsive use among minors, leading to anxiety, depression, eating disorders, and other psychological harms.

Multidistrict litigation is a procedural mechanism used in U.S. federal courts to consolidate similar cases for pretrial proceedings. MDL No. 3047 brings together personal injury claims alleging that social media companies knowingly designed features that increase engagement among adolescents while failing to warn users of associated risks.

The complaints focus on product design rather than isolated incidents. Plaintiffs argue that algorithmic feeds, infinite scrolling, autoplay functions, push notifications, and streak-based reward systems were engineered to maximise time spent on the platforms. The legal theory resembles earlier product liability cases involving tobacco or opioids, where companies were accused of designing products with foreseeable dependency risks.

The defendants argue that their platforms are protected by Section 230 of the Communications Decency Act and that they cannot be held liable for user-generated content. The litigation is expected to test the boundaries of that protection.

State-Level Lawsuits Add Pressure

In addition to the MDL, multiple U.S. state attorneys general have filed separate lawsuits. In 2023, more than 40 states sued Meta, alleging that Instagram and Facebook were designed to exploit young users’ psychological vulnerabilities. The suits reference internal research disclosed by whistleblower Frances Haugen, which suggested that Meta was aware of negative mental health effects among teenage users.

Other states have brought actions against TikTok, claiming that the app’s algorithmic design promotes excessive use among minors and exposes them to harmful content.

These actions increase legal and regulatory scrutiny beyond the federal court proceedings.

How Each Social Media Platform Exploits Children

TikTok, Snapchat, and Instagram are among the platforms under scrutiny. Here’s how they trap young people:

TikTok

TikTok has become emblematic of algorithm-driven consumption. Its “For You” feed learns from minimal user interaction and rapidly personalises content streams. The speed at which the algorithm adapts is central to its appeal.

Critics argue that this model produces compulsive viewing cycles, especially among adolescents whose reward systems are highly sensitive to novelty and social validation. Studies have shown that TikTok’s recommendation engine can quickly funnel users toward emotionally charged or body-focused content once interest is detected.

The company says it provides screen-time management tools and moderation systems. However, these safeguards exist within a business model that rewards extended engagement.

Instagram

Instagram, owned by Meta Platforms, has faced sustained criticism over its impact on teenage mental health. Internal research disclosed in 2021 indicated that Instagram worsened body image concerns for a segment of teenage girls.

Features such as visible follower counts, algorithmic ranking, and performance-based metrics reinforce social comparison. For adolescents navigating identity formation, these systems can turn self-presentation into a continuous public evaluation exercise.

Meta has introduced tools such as private accounts for minors and limits on messaging from adults. Whether these changes meaningfully address the underlying engagement incentives remains contested.

Snapchat

Snapchat operates differently but faces similar criticism. Its “streak” feature rewards consecutive days of messaging between users, creating an obligation to maintain daily contact.

While marketed as playful, streak systems introduce a gamified pressure dynamic. Teenagers report anxiety over losing streak counts, which can feel socially punitive.

Snap states that its platform is focused on close friend communication and supports wellbeing initiatives. Yet the behavioural hooks remain built into the product.

Head of Instagram Calls It “Problematic”

Adam Mosseri, who has led Instagram for eight years, is the first high-profile executive to testify in the case, which began this week in Los Angeles. Defence lawyers have argued that the lead plaintiff (known as KGM) was harmed by other factors in life, not by Instagram.

YouTube is also named in the suit, while Snapchat and TikTok both reached settlements ahead of the trial.

Mosseri agreed early on in his testimony that Instagram should do everything within its power to help keep users safe on the platform, especially young people. However, he then said he did not think it was possible to say how much Instagram use was “too much”.

“It’s important to differentiate between clinical addiction and problematic use,” he added. “I’m sure I’ve said that I’ve been addicted to a Netflix show when I binged it really late one night, but I don’t think it’s the same thing as clinical addiction.” He repeatedly acknowledged that he is not an expert in addiction.

When Mosseri was asked what he thought of KGM’s longest single day of use on the platform, a shocking 16 hours, he said “that sounds like problematic use”, declining to characterise it as addiction.

Meta, which owns Instagram, along with other social media companies including YouTube, Snapchat, and TikTok, is facing thousands of similar cases brought by families, state prosecutors, and school districts across the US.

The Key Legal Framing: Does It Count as Addiction?

A central issue in the litigation is terminology. Plaintiffs frequently use the language of addiction, while platform representatives consistently avoid describing their products as addictive. Instead, they speak of excessive or “problematic” use. The MDL title itself includes the term “Adolescent Addiction,” but addiction in this context is argued as a behavioural design outcome rather than a medically classified substance dependency.

The science of behavioural addiction is complex. Unlike substance dependency, it does not involve ingestible chemicals. However, neuroscientific research demonstrates that social media engagement activates dopamine-related reward pathways, particularly in adolescents whose impulse control mechanisms are still developing.

The World Health Organization has formally recognised gaming disorder as a behavioural addiction. While social media disorder does not carry identical diagnostic status, studies published in peer-reviewed journals have linked heavy usage to increased anxiety, depression, disrupted sleep, and body dissatisfaction. While correlation does not prove causation, plaintiffs argue that the companies’ own research demonstrates awareness of risks.

The legal reluctance to use the word addiction does not eliminate the behavioural patterns associated with compulsive use. Did the companies design their platforms in ways that foreseeably intensified psychological harm?

Their Business Models Depend on This Exact Behavioural Outcome

Social media platforms are funded by advertising revenue. That revenue increases when user engagement increases. Time spent on the platform translates directly into monetisable data and impressions.

Incentives therefore align with attention capture rather than moderation. Features such as infinite scroll, autoplay video, and push notifications are not accidental. They are optimisation tools.

The legal case now unfolding asks whether companies can continue to rely on this model while disclaiming responsibility for its foreseeable psychological effects on minors.

Final Thought

The Social Media Adolescent Addiction MDL represents one of the most significant legal challenges the industry has faced. The outcome will not eliminate social media use among young people, but it could reshape how platforms design products for minors.

At stake is whether engagement-optimised business models can coexist with meaningful child protection. The court will ultimately decide whether design choices intended to maximise attention also created foreseeable harm.

For an industry built on monetising user engagement, that question cuts to the core of its operating model.

Source

Featured image source: https://levinlaw.com/newsroom/judge-upholds-schools-social-media-claims/

