In a crucial development in the effort to protect children from addiction to social media platforms, a US court allowed a lawsuit to advance.
The lawsuit accused social media companies of intentionally harming children through addictive platform designs.
The consolidated lawsuit, filed on behalf of hundreds of minors across the US, alleges that companies including Meta (Facebook), Snap, TikTok, and YouTube specifically designed their platforms to “hook” young users, resulting in mental health issues such as anxiety and depression.
Judge Yvonne Gonzalez Rogers, in the order issued in California, said, “Defendants are alleged to target children as a core market and designed their platforms to appeal to and addict them.”
What social media platforms said
Rejecting the allegations, the companies sought to dismiss the lawsuit, arguing the law shielded them from liability. They cited Section 230 of the Communications Decency Act (1996), which protects online platforms from legal action over user-generated content.
The judge, after a detailed analysis, observed “the parties’ ‘all or nothing’ approach to the motions to dismiss does not sufficiently address the complexity of the issues facing this litigation.”
The pleas claimed that never-ending feeds, push notifications, algorithmic recommendations, and other design features make the platforms addictive to children.
The plaintiffs countered that intentional design choices on the social media platforms, not the content itself, are to blame for the mental health harms.
Section 230 is not applicable: Court
The court agreed that Section 230 did not bar product liability claims focused on design defects such as ineffective parental controls, lack of age verification, absence of time limits, and barriers to account deletion.
However, the court dismissed the claims about using algorithms to recommend accounts and some notification features.
“The same applies here. The Court must consider the specific conduct through which the defendants allegedly violated their duties to plaintiffs,” Judge Rogers added.
“It is these detailed, conduct-specific allegations that require analysis,” she asserted.
Companies ignore users’ mental health
The pleas argued that the companies were aware of the mental health effects on children but did little to address safety concerns.
Now, with the case making decisive progress in the US court, internal documents and data from the tech companies relating to their knowledge of the potential harms have come to light.
“The parties’ all or nothing approach does not fairly or accurately represent the Ninth Circuit’s application of Section 230 immunity. Rather, the Court has conducted an analysis of the actual functionality defects alleged in the complaint,” the court observed.
Lawsuit seeks strict action
The lawsuit seeks to bring social media platforms under product liability law, urging the court to treat the platforms as defective products that require improved designs and warnings.
A turning point for social media
The tech companies have long enjoyed immunity over user content, but this time they are being questioned over algorithms, recommendation systems, and other operational choices embedded in their platforms.