Inside a landmark trial testing whether social media platforms engineered addiction.
Can someone be addicted to Instagram?
That question now sits at the centre of a major California courtroom. Adam Mosseri, head of Instagram, testified under oath that the app is not “clinically addictive.” He acknowledged “problematic use.” He rejected the label of addiction.
The testimony comes during one of the first bellwether trials targeting Meta, along with other platforms, over claims that social media companies knowingly built features designed to hook young users.
Families allege those features harmed teens’ mental health. Meta disputes that claim.
The legal and cultural stakes could not be higher.
What’s Happening & Why This Matters
A Landmark Addiction Trial
The case centres on a now 20-year-old plaintiff identified in court as Kaley. She alleges that Instagram’s design contributed to depression, body image issues, and suicidal ideation. Her lawsuit forms part of more than 1,500 similar claims across the country.
This trial serves as a test case. Courts use bellwether trials to measure how juries respond before handling larger waves of litigation.
Mosseri is the first executive to testify in person.
On the stand, he draws a distinction. He says social media addiction does not exist as a clinical diagnosis. He argues users can experience “problematic use,” much like watching television longer than intended.
“It’s relative,” he explains, noting that he is not a doctor.

Psychologists do not formally classify social media addiction in the Diagnostic and Statistical Manual of Mental Disorders. That fact shapes Meta’s legal defence.
But researchers continue to study compulsive use patterns among teens. Lawmakers worldwide have raised concerns about addictive design.
The courtroom debate blends science, psychology, and product strategy.
Plaintiffs Argue Design Drives Dependency

The plaintiffs’ legal team focuses on design features.
They point to infinite scroll, autoplay, and the like button. They describe these tools as engineered feedback loops. Lawyer Mark Lanier likens them to “digital casinos.”
Lanier presses Mosseri on internal Meta communications. Emails reveal that some employees compared Instagram to a drug. One internal message reportedly stated: “IG is a drug.” Another joked that social media platforms were “pushers.”
Another internal exchange references dopamine and adolescent neurobiology. The plaintiffs' team argues executives ignored internal warnings.
Mosseri disputes the context.
He says Instagram tests new features for younger users before launch. He adds that the platform attempts to balance safety with free expression.
Mosseri rejects claims that Instagram prioritises teen profit.
“We make less money from teens than any other demographic,” he testifies, arguing that teens click fewer ads and have limited spending power.
The courtroom stages a collision between product philosophy and psychological risk.
Beauty Filters and Body Image Concerns
The trial also revisits Instagram’s beauty filters.
Plaintiffs argue these filters contribute to body dysmorphia and increased cosmetic surgery interest among teens. Internal Meta documents from 2019 show a dispute over banning facial-distortion filters. Some internal emails note expert concerns about harm.
Initially, Instagram banned filters that distorted facial features. Later, the company modified that policy. It banned filters that explicitly promoted cosmetic procedures but allowed others that enhanced lips or slimmed noses, while reducing recommendations for those filters.
Did Instagram knowingly maintain tools that amplify insecurity?
Or did it take reasonable steps within competitive market pressures?
Mosseri testifies that neither stock price nor compensation influenced his decisions. When questioned about his pay, he confirms a base salary of nearly $900,000, with total compensation in some years exceeding $10 million through bonuses and stock awards.
Plaintiffs argue that growth incentives create product bias.
Meta denies that connection.
Safety Features vs. Systemic Design

Instagram has rolled out new teen safety features in recent years. These include “teen accounts” with stricter privacy defaults and content restrictions.
However, outside reviews question their effectiveness. Fairplay, a nonprofit advocating for reduced tech influence on children, previously reported that many teen safety tools were either ineffective or no longer fully functional.
The legal strategy avoids arguing about individual posts. Instead, plaintiffs claim systemic addictive design.
The strategy attempts to circumvent Section 230, the federal law that shields platforms from liability for user-generated content. Judge Carolyn Kuhl limits testimony related to specific content exposure under that statute.
The focus is on architecture, not posts.
Families Seek Accountability
Parents fill the courtroom gallery.
Some say they lost children to suicide or exploitation linked to online interactions. John DeMay, whose son died after a sextortion scam, expresses frustration before testimony begins. He says internal documents and public testimony reveal priorities families have long suspected.
Other families describe years of advocacy with limited legislative progress.
They turn to courts.
One parent notes that financial consequences may push change faster than congressional hearings.
The courtroom is both a legal forum and a moral one.
The Scientific Grey Zone
The debate over “clinical addiction” falls in a grey area.
Addiction traditionally refers to substance dependence. Behavioural addictions exist, such as gambling disorder. Social media addiction lacks a formal diagnostic status.
Yet behavioural patterns mirror addiction markers. Compulsion. Withdrawal symptoms. Tolerance. Escalating use.
Mosseri draws a technical distinction. Plaintiffs draw a practical one.
Is a diagnosis required for harm to exist?
That question may shape jury perception.
The Policy Implications
The outcome may reshape platform governance.
If juries find that design choices caused substantial harm, social media companies may face structural redesign mandates.
If courts reject the addiction framing, platforms may continue refining voluntary safety features without deeper architectural change.
Meanwhile, regulatory discussions expand globally. European lawmakers explore stricter youth protections. U.S. senators continue pressing executives on child safety.
Public trust is fragile. Testimony in the Instagram addiction trial forces companies to defend their internal philosophy under oath. That transparency alone changes the landscape.
TF Summary: What’s Next
Adam Mosseri denies that Instagram is “clinically addictive,” distinguishing between addiction and problematic use. Plaintiffs argue internal documents and design features show intentional engagement engineering that harms teens. The jury must weigh scientific nuance against lived experience.
MY FORECAST: Regardless of the verdict, product design scrutiny intensifies. Teen safety defaults expand. Internal research transparency increases. Courts, not Congress, drive the next phase of platform accountability.