Instagram Leader Says App Is Not Clinically Addictive in High-Profile Court Case
The head of Instagram, Adam Mosseri, told a court that he does not believe the platform can cause clinical addiction, offering key testimony in a closely watched lawsuit examining the mental health impact of social media on young users.

Mosseri appeared in court as part of legal proceedings against Meta Platforms, the parent company of Instagram. The case was brought by a young woman identified as Kaley, who alleges the platform was designed with features intended to keep minors engaged for prolonged periods, contributing to psychological harm. The trial is the first among hundreds of similar claims to move forward, making it a significant test of how courts may address allegations against major digital platforms.

Testimony Focuses on User Behavior and Platform Design

During questioning by plaintiff attorney Mark Lanier, Mosseri acknowledged that individuals may spend more time on Instagram than they feel comfortable with but rejected the idea that such behavior constitutes medical addiction. He described excessive engagement as a matter of personal experience rather than a clinical condition, emphasizing that usage patterns differ widely between individuals.

Mosseri also dismissed claims that Instagram intentionally prioritizes teen engagement for profit. He stated that younger users contribute less to advertising revenue compared with other groups, arguing that financial incentives do not drive efforts to attract minors to the platform.

The testimony provided a rare look into how company leadership frames user engagement, safety responsibilities, and platform design choices.

Ongoing Debate Over Youth Well-Being

The case revisits earlier concerns about the influence of social media on adolescents. Internal research disclosed in 2021 by whistleblower Frances Haugen suggested that some platform features could negatively affect body image among teenage girls. Attorneys representing the plaintiff argue that design elements such as continuous scrolling, automated video playback, and visible approval metrics may reinforce compulsive behavior patterns.

Court discussions also examined Instagram’s use of appearance-altering filters. Mosseri explained that filters promoting cosmetic procedures were restricted, while others that modify facial features remained available but were no longer actively recommended. Internal company communications presented during the trial indicated past internal debate about whether such tools could contribute to body image concerns among younger users.

Lanier further questioned whether executive compensation structures tied to company growth could influence product decisions. Mosseri responded that financial considerations did not guide safety-related choices.

Wider Implications for the Tech Industry

Attorneys representing Meta argue that the plaintiff’s mental health challenges were shaped by personal circumstances rather than platform use. The company maintains that it has implemented new safety measures, including stronger privacy settings for teenagers and improved age verification methods.

The proceedings are taking place in Los Angeles, where families who say they have been affected by online harms gathered outside the courthouse in support of stricter accountability for technology companies.

Legal arguments are also shaped by federal protections limiting liability for user-generated content, commonly associated with Section 230 of the Communications Decency Act. Judge Carolyn Kuhl, who is presiding over the case, instructed attorneys to avoid arguments focused on specific content shared on the platform, narrowing the scope of the trial.

The outcome of the case may influence future litigation involving social media companies and could help define how courts interpret corporate responsibility in relation to digital engagement and youth well-being.