Legal Lapses: Gonzalez v. Google LLC

Swasti Singhai, Final Focus Editor

Art by Ella Jiang

In just the past decade, technology has advanced at unprecedented rates: network speeds have increased tenfold, social media users have grown from 970 million to approximately 3 billion, and the datasets used by artificial intelligence algorithms have expanded from 2 zettabytes to 41.

Yet laws lag far behind. While personal data is easily accessible through the Internet, privacy laws remain incomplete, and with algorithms dictating many functions in the online sphere, the question of who bears liability is unsettled. The case of Gonzalez v. Google LLC brings that question before the courts.

The November 2015 Paris terrorist attacks tragically killed 137 people; among them was Nohemi Gonzalez, a 23-year-old American student. Her family filed a lawsuit against Google, arguing that YouTube’s recommendation system was partially responsible for her death because it had pointed users toward ISIS’ recruitment videos, ultimately facilitating the terrorist group’s growth. The lawsuit also argued that Google placed paid advertisements before ISIS-created content, thus sharing the resulting ad revenue with ISIS.

Google defended itself in the lower courts with Section 230 of the Communications Decency Act, passed as part of the Telecommunications Act of 1996. The first part of the law provides website platforms with immunity: they cannot be held liable for information published by third-party users. The court found that platforms and online services should be treated merely as distributors of content rather than publishers, who are held to a higher legal standard when it comes to regulating content.

The second part of Section 230 gives immunity from civil liability to platforms, like YouTube, that remove or restrict content. A civil liability is the legal responsibility of “paying money for damage to another person’s health, business, or property.” Even if content is constitutionally protected, under this section, providers can remove the material as long as they act “in good faith” while doing so.

The lower court ruled in favor of Google, and the Ninth Circuit Court of Appeals upheld that decision. The family eventually appealed to the US Supreme Court, which agreed to hear the case.

Their petition argued that the YouTube algorithm is not covered by Section 230, stating, “Whether Section 230 applies to these algorithm-generated recommendations is of enormous practical importance. Interactive computer services constantly direct such recommendations, in one form or another, at virtually every adult and child in the United States who uses social media.” 

While the Supreme Court’s decision will be released next summer, the Ninth Circuit judges’ opinions, although ruling in Google’s favor, point toward reforming the evidently outdated Section 230. Judge Morgan Christen acknowledged that Section 230 “shelters more activity than Congress envisioned it would,” and Judge Marsha Berzon said that algorithms should fall outside the immunity provided by Section 230, a revision Congress could make.

In the past year, Supreme Court Justice Clarence Thomas has also questioned the law’s broad protections. But narrowing the immunity Section 230 provides would force tech companies to step up content moderation, dramatically reducing speech online. Doing so may have unintended consequences: red states and blue states, driven by political agendas, would have vastly different interpretations of what should and should not be regulated.

Section 230 has clearly been unable to account for the exponential developments in technology. The precedent set by the court in this case will be critical in redefining speech regulation online.