Editor’s Note: For another viewpoint, see Point: Biden Should Revoke Section 230 Before We Lose Our Democracy.
Section 230 of the Communications Decency Act prevents digital intermediaries from being treated as the “publisher or speaker” of their users’ speech and blocks litigation over platforms’ decisions to remove speech they deem violent, obscene, or otherwise objectionable. With some exceptions, platforms are under no obligation to remove speech, but they cannot be required to carry speech, either. The law applies universally to digital intermediaries; Facebook is not liable for its users’ speech, and the New York Times is not liable for its comments section. By properly placing responsibility for harmful or unlawful speech with the speaker, Section 230 lets companies offer publishing tools to everyone without assuming their users’ legal risks.
In the 25 years since its passage, this prescient rule has paid tremendous dividends. Americans are served by a dizzying array of publishing intermediaries, allowing us to communicate in real time via text, audio, and video. We have created forums for work, worship, play, and romance, serving every imaginable niche interest and minority. Of course, not all interconnection has been positive. Extremists and criminals use the internet too. Some argue that amending or repealing Section 230 would compel platforms to suppress extremist speech and criminal activity.
However, exposing platforms to broad liability for user speech would lead to the removal of much more than dangerous speech.
Platforms already make extensive use of their ability to remove unwanted speech, filtering spam, threats, advertisements for illegal goods, foreign propaganda, and even speech that is simply off-topic. Popular platforms review millions of posts a day, often with the assistance of imperfect software. At this scale, some innocent speech will inevitably be misunderstood, mislabeled, and removed. Over the past few years, major platforms’ rules have become more stringent and expansive, prompting concerns about censorship and bias.
Demanding that platforms assume liability for their users’ speech will at best exacerbate the accidental removal of innocent speech. However, it also runs the risk of limiting who can speak online at all. Digital intermediaries usually review speech after publication. Speech may be flagged by other users, human moderators, or algorithms, and placed in a queue for adjudication. Section 230 allows platforms to remain open by default and worry about excluding misuse when it occurs, giving a voice to everyone with an internet connection.
In contrast, newspapers and other traditional publishers filter, edit, and modify submissions before publication. While this allows them to safely assume full ownership of the speech they publish, it dramatically limits who can speak. Editing is a laborious and time-consuming process. Even if a newspaper wanted to publish every letter to the editor, it would have neither the space nor the time to do so. This model produces consistently high-quality speech, but it tends to favor some perspectives over others, offering only a narrow slice of elite sentiment.
Repealing Section 230 would make social media more like traditional media by making it exclusive. With limited resources to review speech before publication, platforms would have to determine whose perspectives should be prioritized. There is little reason to think their selections would differ greatly from those of newspapers. If replies and responses had to be reviewed as well, social media would lose most of its interactivity, becoming another conduit through which speech is passively received.
Without Section 230, platform moderators would not become more deliberate; they would simply remove more. The threat of costly litigation does little to inspire thoughtful decision making: moderators will act quickly to eliminate any source of legal risk. When Congress amended Section 230 in 2018 to expose platforms to liability for speech promoting prostitution or sex trafficking, Craigslist did not moderate its personal advertisements page more cautiously; it shut the page down.
Indeed, without Section 230’s protections, many smaller forums would simply shut down or look to be acquired by larger firms. Could the operators of V8Buick.com, a forum for antique car collectors with 38,000 users, afford even a single years-long defamation lawsuit? For communities like that one, the easiest way to avoid legal liability would be acquisition.
Beyond suppressing speech, repealing Section 230 would suppress competition, concentrating activity on large platforms such as Facebook. Without Section 230, Facebook, but not V8Buick.com, could afford to litigate controversies over user speech.
Repealing Section 230 is a drastic step that would upend the internet, punishing successful firms and internet users for the behavior of an antisocial minority. Heaping legal liability on platforms will not render them more thoughtful or judicious. It will cause some to close, and others to exclude all but the most inoffensive sentiments.