
Communications of the ACM

Law and technology

Internet Immunity and the Freedom to Code



The Internet's freedom to code is in jeopardy. In 1996, Congress enacted 47 U.S.C. § 230 ("Section 230"), which generally shields Internet services from liability for third-party content. In practice, for over two decades, Section 230 has legally immunized coders' decisions about how to gather, organize, and publish third-party content.

Section 230 has become a political target for all sides, but reforming it would impair coding freedom. In this Law and Technology column, I explain how Section 230 came into existence, the effects it has had, and why technologists should rally behind it to preserve their ability to build the next generation of Internet services.


Section 230's Origins and the Moderator's Dilemma

Two Seminal Cases. Two court rulings from the 1990s laid the foundation for Section 230. (For more about Section 230's history, see Jeff Kosseff's excellent book, The Twenty-Six Words That Created the Internet; https://bit.ly/2G8ATH7.)

In 1991, in Cubby v. CompuServe, CompuServe defeated a defamation claim for carrying a third-party publication called Rumorville. The court said CompuServe could be liable if it knew or should have known about the defamation. However, CompuServe lacked that knowledge because Rumorville uploaded its content directly to CompuServe's servers, without any human pre-screening by CompuServe, and CompuServe had not been told about the defamation.

Despite CompuServe's win, the Cubby ruling was not great for other online services that publish third-party content. First, CompuServe passively hosted Rumorville and exercised no editorial control over it. While passive hosting might work for some professionally produced content, the rough-and-tumble universe of user-generated content usually requires more active management.

Second, if Cubby had notified CompuServe of the alleged defamation, CompuServe would have had to remove the content to avoid liability. Defamation is easy to allege—and difficult for Internet services to evaluate. Thus, under Cubby's rule, Internet services would have routinely received take-down notices claiming third-party content was defamatory, and the services would have honored the notices regardless of their legitimacy. Indeed, we have seen analogous problems with the Digital Millennium Copyright Act's notice-and-takedown scheme for claiming that users infringed copyright.

A 1995 decision, Stratton Oakmont v. Prodigy, delivered even worse news to the Internet industry. Prodigy advertised itself as a "family-friendly" service and operated popular message boards. A user posted messages that allegedly defamed the plaintiff (the investment bank unfavorably portrayed in the 2013 movie The Wolf of Wall Street). The court held that Prodigy was the legally responsible publisher of user-submitted posts because Prodigy had removed other user postings from its message boards and touted itself as family-friendly.

The Stratton Oakmont decision created a paradox called the "Moderator's Dilemma." According to Stratton Oakmont, moderating user content increased the service's potential legal liability for any harmful content it missed. Accordingly, services had to moderate user-submitted content perfectly or accept liability for any mistaken decisions. Alternatively, it might be legally wiser for Internet services to passively host user content—like CompuServe's passive distribution of Rumorville—than to do any moderation at all. Following Cubby and Stratton Oakmont, the Internet community was not sure which approach was better.

The Online Pornography Overreaction. In 1995, sensational (and largely overblown) news stories reported that children could easily access pornography online. Congress responded to the ensuing panic by creating a new crime that would send online service operators to jail if they allowed children to access pornography.

Two Congressmen, Reps. Cox and Wyden, envisioned a different approach. Instead of banning online pornography, they thought online services would voluntarily curb user-submitted pornography—if the services did not face the Moderator's Dilemma created by the Stratton Oakmont decision. Thus, Cox and Wyden proposed shielding online services from liability for third-party content, with the hope that online services would feel legally secure enough to perform content moderation duties that benefit everyone. That proposal became Section 230.

Though the criminal liability provisions and Section 230's immunity were intended as alternatives, Congress combined them into a single law called the Communications Decency Act ("CDA"). In 1997, the U.S. Supreme Court struck down the CDA's criminal provisions, leaving Section 230 in place.


What Section 230 Does

Section 230 gives technologists enormous freedom to design and implement user-generated content services. As a federal appeals court explained in 2016 while ruling in favor of Section 230's immunity: "[the plaintiff's] claims challenge features that are part and parcel of the overall design and operation of the website ... Features such as these, which reflect choices about what content can appear on the website and in what form, are editorial choices that fall within the purview of traditional publisher functions."

This legal standard facilitates innovation in several ways.

First, services may freely experiment with new ways of gathering, sorting, and presenting user-generated content. Under different liability rules, those experiments would expose the services to liability for any harmful content they missed, discouraging experimentation and innovation. For example, plaintiffs have argued that user-generated content sites should face liability for different ways they algorithmically promote or excerpt user content. For now, Section 230 forecloses those arguments.
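To make the stakes concrete, here is a minimal sketch, in Python, of the kind of "editorial" code at issue. Everything in it is an invented illustration (the Post fields, the scoring weights, the excerpt length are all assumptions), not any real service's ranking algorithm; it simply shows that promoting and excerpting user content are design decisions expressed in code.

# Hypothetical sketch: the sort of content-promotion code that plaintiffs
# have targeted and that courts have treated as an immunized editorial
# function. All names, fields, and weights are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    text: str
    likes: int
    created: datetime

def promote(posts: list[Post], now: datetime, excerpt_len: int = 80) -> list[str]:
    """Rank posts by engagement discounted by age, then excerpt the winners."""
    def score(post: Post) -> float:
        hours_old = (now - post.created).total_seconds() / 3600
        return post.likes / (1.0 + hours_old)  # newer, better-liked posts rank higher
    ranked = sorted(posts, key=score, reverse=True)
    # Deciding which posts appear, in what order, and in what form is the
    # "traditional publisher function" the courts describe.
    return [post.text[:excerpt_len] for post in ranked]

now = datetime(2019, 6, 1, 12, 0)
posts = [
    Post("alice", "A long essay about local politics...", likes=40,
         created=now - timedelta(hours=30)),
    Post("bob", "Breaking: road closure downtown this afternoon.", likes=12,
         created=now - timedelta(hours=1)),
]
print(promote(posts, now))  # bob's fresher post outranks alice's older one

Change the scoring function or the excerpt length and the service presents user content differently; under Section 230, those design choices do not, by themselves, turn the service into the liable publisher of the underlying content.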

Second, Section 230 helps innovative services launch before they have been perfected, so services can error-correct and fine-tune their technology in response to actual usage. For example, new online services can launch without replicating Google's $100M+ investment in filtering technology or hiring tens of thousands of content reviewers as Facebook has done. Without Section 230, online services would need to deploy industrial-grade content controls at launch, which would significantly raise the costs of entry and keep many innovative services from reaching the market at all.

Third, Section 230 permits diverse industry practices to emerge. If the law required mistake-free content moderation, the industry would gravitate toward a single liability-minimizing technological approach to moderation. Instead, Section 230 enables services to choose among a virtually infinite number of content moderation techniques, allowing each service to optimize its moderation for its specific audience's needs. Thus, Facebook can tightly restrict hate speech, while Reddit can tolerate subreddits that span a wide ideological spectrum. Similarly, services competing for an identical audience can deploy different solutions that become additional points of competitive differentiation.
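As a rough illustration of that diversity, the sketch below models moderation as a pluggable policy, so two hypothetical services can apply different rules to the same post. The policies, blocklist terms, and function names are all invented for illustration; no real service's filter works this simply.

# Hypothetical sketch: moderation as an interchangeable policy, letting each
# service tune its rules to its own audience. All rules here are invented.
from typing import Callable

# A policy inspects a post and returns True if the post may stay up.
ModerationPolicy = Callable[[str], bool]

def strict_policy(post: str) -> bool:
    """A family-friendly service: remove anything on its blocklist."""
    blocklist = {"hateterm1", "hateterm2"}  # placeholder terms
    return not any(term in post.lower() for term in blocklist)

def permissive_policy(post: str) -> bool:
    """A wide-open debate forum: remove only direct threats."""
    return "i will hurt you" not in post.lower()  # crude illustrative rule

def publish(post: str, policy: ModerationPolicy) -> str:
    return post if policy(post) else "[removed by moderators]"

post = "a spirited but lawful political argument"
print(publish(post, strict_policy))      # each service applies its own rules...
print(publish(post, permissive_policy))  # ...rather than one mandated filter

Under a regime demanding mistake-free moderation, every service would converge on whichever single policy minimized liability; Section 230 is why this policy layer can instead remain a point of product differentiation.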

Due to Section 230, industry "best practices" for content moderation did not get set in stone during the Internet's earliest days. Instead, best practices for user-generated content continue to iterate, and those iterations potentially deliver ever-increasing social benefits from content moderation.

Furthermore, because of Section 230, lawyers typically do not define the product specifications for new user-generated content services; technologists and marketing experts do. This means coders can code without waiting for legal clearance. In a less favorable legal environment, that would be reversed.


Section 230's Imperiled Future

Section 230 was a bipartisan law, and it garnered significant bipartisan support for its first two decades. That has changed: Section 230 has few friends today, and both Democrats and Republicans have publicly targeted it. Congress's gridlock is notorious, but Section 230 reform has the potential to attract enough bipartisan cooperation to break through that gridlock.

We have already seen one bipartisan incursion into Section 230. In 2018, Congress passed FOSTA, a law that reduced Section 230's immunity for the advertising of commercial sex. Congress was repeatedly warned that FOSTA would not actually solve the problems it targeted and that it would resurrect the Moderator's Dilemma that Section 230 had negated. Despite those warnings, Congress overwhelmingly passed FOSTA. Worse, its sponsors still consider the law a success, even as evidence grows that FOSTA has not helped victims and has eliminated or degraded free speech on the Internet.

FOSTA is part of a growing global trend of imposing criminal liability on online service executives for not adequately restricting harmful user content. Australia recently enacted similar liability, and the U.K. has proposed doing so as well. The risk of criminal liability devastates entrepreneurship because entrepreneurs are willing to risk money, but they will not risk their personal liberty.


Reducing Section 230's immunity would invariably impair the freedom to design and innovate. (The First Amendment might provide some backup protection, but not enough.) For example, to reduce online service "bias," some conservative regulators favor "must-carry" obligations that would force online services to treat all content equally (despite decades of conservative opposition to must-carry obligations in other media). Such a rule would be a policy disaster because it would enable spammers, trolls, and miscreants to overwhelm services with their anti-social content and behavior. It would also inhibit new technological ways to filter and present content because the law would dictate a single option.

Often, regulators assume that services can just "nerd harder," that is, magically solve an impossible technological task. For example, Europe's Copyright Directive requires online services to ensure that no copyright-infringing material appears on their services, and a proposed European regulation would require online services to remove terrorist-related content within one hour. In theory, "nerd harder" rules encourage technologists to innovate even more. In practice, they either eliminate user-generated content outright or lead to a single industrywide standard dictated by legal considerations, stifling innovation.


Conclusion

In the 1990s, Lawrence Lessig (now of Harvard Law) distinguished "East Coast code," legislation produced by regulators, from "West Coast code," software produced by technologists. Both types of code can regulate online behavior, but technologists often prefer West Coast code because it is more adaptable and usually spurs, rather than inhibits, additional innovation.

Section 230 is an example of East Coast code, but with a twist: it prevents other East Coast code from controlling user-generated content. Section 230 has sidelined thousands of state and local regulators, preventing them from imposing their code on the Internet.

All that is at risk now. Across the globe and in the U.S., regulators are aggressively attempting to shape the Internet to their specifications. This regulatory frenzy will take more control out of the hands of the West Coast coders and put it into the hands of the East Coast coders.

The intervention of East Coast coders will not help the Internet reach its technological and social potential. Section 230 gave technologists the freedom to develop the modern Internet. Not every Internet service has embraced Cox and Wyden's hope that services would voluntarily undertake socially responsible content moderation. Still, Section 230 demonstrated that protecting the freedom to code lets technologists develop remarkable solutions that produce extraordinary social utility and, concomitantly, substantial private financial benefits. Regulators should look for more opportunities to do that, starting by protecting Section 230 from becoming the next victim of East Coast code.


Author

Eric Goldman ([email protected]) is a professor at Santa Clara University School of Law, Santa Clara, CA, USA.


Copyright held by author.
Request permission to (re)publish from the owner/author

The Digital Library is published by the Association for Computing Machinery. Copyright © 2019 ACM, Inc.


 
