This post is part of a series of resources produced by our Student Research Fellows in Summer 2023. The content does not necessarily reflect the official position of the organization.
Section 230 of the Communications Decency Act (CDA) is a hot topic of debate in the legal community. Much of the discussion has revolved around the relationship between Section 230 and social media platforms like Facebook and Twitter (recently rebranded as X), as well as big tech companies like Google. Politicians across party lines seek to revise the law in order to hold these companies liable for speech that third-party users post on their platforms.
However, debates surrounding Section 230 frequently ignore important public institutions that could be drastically impacted by any change to the law, like libraries.
What is Section 230?
Section 230 of the 1996 CDA sought to encourage the growth and development of the internet by providing protection and reducing liability for internet service providers. The provision was largely successful in shaping today’s internet, where computer services and content providers are encouraged to thrive without fear of legal repercussions for content disseminated on their platforms. Without this essential safeguard, many of the platforms we know today might never have existed, deterred from entering the digital arena by the threat of costly and damaging liability for third-party content.
Libraries are directly implicated in the text of the law itself. Section 230 concerns protections afforded to “interactive computer services,” including “systems operated or services offered by libraries or educational institutions.” Section 230 states, "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Thus, providing these services exempts the provider from liability for the content being posted. Section 230 goes on to give protection to interactive computer services for the removal of any content they deem “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”
For libraries, Section 230 means the protected ability to moderate content posted through their servers and on their platforms (e.g., chatrooms, question and answer forums, discussion boards). It also means protection from legal liability when content is posted or removed. This may seem vastly different from what social media platforms do in hosting massive amounts of third-party content, but these functions are crucial to libraries' ability to fulfill their goals in serving the public.
While attention has been drawn to Supreme Court cases concerning Section 230 through discussion of Google and Twitter, libraries face a real issue: any ruling or legislative reform regarding Section 230 will directly impact their daily operation and ability to function.
How does Section 230 Impact Libraries?
Libraries are havens of information, education, and entertainment as well as spaces that foster research, learning, creativity, and conversation. They serve as critical resources for access. Through libraries, everyone has a way to connect to the internet, obtain information, and use online forums for their research, creative endeavors, and self-expression.
Libraries thrive when they are able to engage openly with the communities they serve, but they can only do so when there is protection from legal repercussions created by citizen contributions.
Inherently, libraries also breed intellectual debate, a type of disagreement that is fundamental to academia, government, medicine, and many other institutions that rely on libraries for their daily work. Libraries encourage this discussion and meeting of the minds so long as conversation remains conducive for everyone involved. Libraries need protection from repercussions for the speech users promulgate on library servers, and they need to retain the ability to moderate harmful speech. Section 230 provides this protection. Without it, libraries could be vulnerable to liability for content posted on their servers or removed by their moderators.
For instance, say your local library is hosting an online event educating the public on local book bans. During the Q&A following the presentation, one of the attendees posts an inappropriate comment, riddled with hate speech and offensive language, in the chat. The moderators of the event quickly remove the content. Under Section 230, the library is protected from being held responsible for that attendee’s comment, as well as for the comment’s removal. The library cannot face legal action from other attendees for being exposed to the attendee’s speech, nor can the attendee bring legal action against the library for having their speech removed.
Beyond the burden of time and energy that litigation requires, it also demands resources and funds that libraries simply do not have. If this protection is eroded, institutions like libraries will be forced into costly risk-mitigation efforts that would stanch the flow of ideas and open communication these institutions prize.
Discussion of Section 230 in the context of libraries took place at the Association of Research Libraries (ARL) 2021 Spring Association Meeting. The consensus of the ARL discussion was that “moderating content and providing platforms for third-party speech has been a core role of research libraries for centuries.” Meddling with Section 230 would be disruptive to the library community, which has been effectively handling speech issues through institutional policies and codes of conduct without government interference and would like to continue doing so.
What’s the Issue with Section 230?
One advantage Section 230 holds over its multitude of enemies is their lack of consensus about what changes should be made.
From the right, calls for Section 230 reform stem from fears of threats to free speech online, specifically threats to the ability of conservative views to be heard. These arguments paint “Big Tech,” and the protection Section 230 affords it, as a villain seeking to suppress conservative viewpoints on its platforms.
In 2021, conservative-leaning states such as Texas and Florida battled Section 230 with state laws targeting social media platforms’ ability to moderate user content, with the claimed intention of protecting conservative viewpoints from unfair moderation by liberal tech companies. These laws were challenged in federal court, and the circuits reached differing decisions, leaving no consensus regarding the laws’ constitutionality.
Now, despite urging from the Biden administration, the Supreme Court has not yet agreed to decide whether these laws are constitutional.
From the left, critics of Section 230 argue the law is overbroad and does not effectively prevent hate speech and disinformation.
The Biden administration is also not a friend to Section 230. Biden has made numerous calls for bipartisan reform, most notably in a Wall Street Journal op-ed that proclaimed a “need [for] Big Tech companies to take responsibility for the content they spread.”
Yet Biden has given no indication of what such bipartisan reform should look like, nor has he addressed the impact reform could have outside of Silicon Valley on research institutions like libraries. Meanwhile, the Supreme Court dealt with two notable cases involving Section 230 this past term: Gonzalez v. Google and Twitter v. Taamneh.
The cases raise a shared question: whether an internet platform can be held liable for “failing to take meaningful or aggressive action to prevent” a terrorist organization from using its services, and whether that failure amounts to “aiding and abetting in those actions.” That is, can internet platforms be held liable for content posted by their users? And, in the context of a post claiming responsibility for a terrorist attack, are these platforms responsible for assisting terrorists?
The Court said no. It found Twitter and Google did nothing more than transmit information, the very thing they are designed to do. It is important to note, though, that the Court’s unwillingness to hold platforms liable for terrorist activity does not mean it will never decide in future cases that increased accountability for hosting third-party content is necessary. Because Section 230 currently sweeps in libraries and public institutions alongside big tech companies, increased accountability for one will impact the other unless lawmakers carve out specific exceptions.
Further, as Justice Ketanji Brown Jackson pointed out in her concurrence, the decision is incredibly narrow. Both cases reached the Supreme Court on appeal from the motion-to-dismiss stage, the first move in often lengthy litigation. The decisions turn entirely on whether the specific allegations made in the complaints were sufficient to state a claim. The Court found they were not, but that is not necessarily predictive of how it will rule in future litigation concerning Section 230.
New cases concerning Section 230 arise constantly, many of which will be resolved without ever being heard by the Supreme Court. As in the Greek legend, champions of Section 230 cut down the hydra’s vicious heads as they attack the law, and more sprout in their place.
For now, so long as politicians and the courts are busy bickering, Section 230 remains untouched and libraries remain safe.
References
Association of Research Libraries. (n.d.). Section 230. Association of Research Libraries. https://www.arl.org/section-230/
Biden, J. (2023, January 11). Republicans and Democrats, unite against big tech abuses. Wall Street Journal. https://www.wsj.com/articles/unite-against-big-tech-abuses-social-media-privacy-competition-antitrust-children-algorithm-11673439411
Blair, E. M. (2023, May). From America Online to America, online: Reassessing Section 230 immunity in a new internet landscape. Journal of Intellectual Property Law, 30, 305. https://digitalcommons.law.uga.edu/cgi/viewcontent.cgi?article=1507&context=jipl
Brannon, V.C. (2022, September 22). Free speech challenges to Florida and Texas social media laws. Congressional Research Service. https://crsreports.congress.gov/product/pdf/LSB/LSB10748
Brief of amici curiae Electronic Frontier Foundation, American Library Association, Association of Research Libraries, Freedom to Read Foundation, and the Internet Archive in support of respondent, Reynaldo Gonzalez, et al. v. Google LLC, No. 21-1333 (U.S. Supreme Court 2023)
Cummings, J. (2021, August 27). Beyond social media: the full context of Section 230. Educause. https://er.educause.edu/articles/2021/8/beyond-social-media-the-full-context-of-section-230
Electronic Frontier Foundation. (n.d.). Section 230. Electronic Frontier Foundation. https://www.eff.org/issues/cda230
Fung, B. (2023, January 11). Biden urges Congress to pass bipartisan tech legislation in WSJ op-ed. CNN Business. https://www.cnn.com/2023/01/11/tech/biden-congress-tech-legislation/index.html
Kern, R. (2022, September 8). White House renews call to “remove” Section 230 liability shield. Politico. https://www.politico.com/news/2022/09/08/white-house-renews-call-to-remove-section-230-liability-shield-00055771
Klosek, K. (2021, June). Section 230 of the Communications Decency Act: research library perspectives. Association of Research Libraries. https://www.arl.org/wp-content/uploads/2021/07/2021.07.06-Issue-Brief-Section-230-of-Communications-Decency-Act.pdf
Knox, O. (2023, January 12). Biden calls for changing big tech moderation rules. But not how. The Washington Post. https://www.washingtonpost.com/politics/2023/01/12/biden-calls-changing-big-tech-moderation-rules-not-how/
Legal Information Institute. (n.d.). 47 U.S. Code § 230 - Protection for private blocking and screening of offensive material. Cornell Law School. https://www.law.cornell.edu/uscode/text/47/230
Liptak, A. (2023, August 13). Biden administration urges justices to hear cases on social media laws. The New York Times. https://www.nytimes.com/2023/08/14/us/supreme-court-social-media-texas-florida.html
Office of the Attorney General. (n.d.). Department of Justice’s review of Section 230 of the Communications Decency Act of 1996. Department of Justice. https://www.justice.gov/archives/ag/department-justice-s-review-section-230-communications-decency-act-1996
Ortutay, B. (2023, February 21). What you should know about Section 230, the rule that shaped today’s internet. PBS. https://www.pbs.org/newshour/politics/what-you-should-know-about-section-230-the-rule-that-shaped-todays-internet
Pittman, F.P., Anderson, H. & Oltean, J. (2023, June 15). Supreme Court declines to reconsider foundational principles of internet platform liability. White & Case. https://www.whitecase.com/insight-alert/supreme-court-declines-reconsider-foundational-principles-internet-platform-liability
Sye, D. (2021, June 22). The uncertain fate of Section 230. The Office for Intellectual Freedom of the American Library Association. https://www.oif.ala.org/the-uncertain-fate-of-section-230/
Twitter, Inc. v. Taamneh, No. 21-1496 (U.S. Supreme Court 2023). https://www.oyez.org/cases/2022/21-1496
Wheeler, T. (2023, January 31). The Supreme Court takes up Section 230. Brookings. https://www.brookings.edu/articles/the-supreme-court-takes-up-section-230/
About the Author
Bella Wetherington is preparing to start her second year of law school at Georgetown University Law Center and hopes to use her legal education in the areas of media, entertainment, technology, and the arts.