Last week, juries in New Mexico and California returned verdicts against Meta and YouTube: $375 million in civil penalties and $6 million in damages for an individual plaintiff, respectively. These rulings are a harbinger of legal action to come, as states and individuals take to the courts in response to social problems they associate with social media platforms.
The California trial hinged on the plaintiff’s claim that four social media companies had intentionally designed their products to be addictive, and that her own social media addiction seriously harmed her mental health. The suit originally named Google, Meta, TikTok, and Snapchat; TikTok and Snapchat settled before trial, and the verdict covered Meta and Google’s YouTube. The trial was the first in a series of related lawsuits in California and will be followed by eight more bellwether trials.
In New Mexico, a trial with a similar premise concluded last week. The state argued that Meta had enabled the exploitation of children through the design of its product, hiding what developers knew about the platform’s dangers to children. The case was brought by the attorney general of New Mexico under the state’s consumer protection laws. As in the California case, the primary argument was that the harms experienced were design features of the platform and that outside content displayed on the platform was part of the “product” itself.
These trials mark a watershed moment in legal action against social media companies because they treat the platforms as products for which the companies are liable, rather than as outlets or forums for speech and outside content. Elizabeth Nolan Brown, senior editor at Reason Magazine, observed another shift, toward “…an embrace of powerlessness and corporate blame when it comes to tech habits, and a rejection of ideas like personal and parental responsibility. We’ve imbued these platforms with an almost magical status, while expecting their proprietors to perform superhuman feats of saving people from themselves.”
While Meta has already announced its intention to appeal the rulings, the shift in both legal argument and cultural paradigm around social media will undoubtedly have lasting consequences. There will be no shortage of lawsuits, like those waiting in the wings in California. A barrage of litigation means that social media companies will functionally lose the security of Section 230, which shields platforms from liability for speech they did not produce and for the moderation decisions they make. While a highly litigious environment is far from ideal for America’s tech companies, the effects on consumers are where the rubber meets the road. There will be a chilling effect on free speech as platform policies and legal regulation respond to the new normal, making decisions about the content that children, and adults for that matter, can see and post.
It’s also worth noting the projects for which groups like Meta and Google are responsible: AI breakthroughs, cutting-edge research into medical and educational technology, and, quite relevantly, tools to help parents monitor their children’s online behavior. It is in the interest of the American people and economy to encourage our companies to continue leading in innovations that save lives.
Child exploitation is a profound evil and tragedy. It can lead to a lifetime of mental health struggles, trauma, and anguish. Unfortunately, social media is quite often a component in these accounts, which makes the growing scrutiny of its use deeply reasonable.
It would be comforting if the solution were simple: a hefty lawsuit, a blanket ban, or the assignment of total responsibility to one company. However, the reality is far more complicated. Social media is an avenue for the expression of free speech, parents are rightly in the best position to make decisions about their children’s social media consumption, and casting all blame and burden upon tech companies will inevitably halt progress that could help address the aforementioned harms, alongside a host of other troubles.