Beyond the Sidewalk's End: The Internet without Section 230
===========================================================

(1) No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1). (47 U.S.C. § 230(c))

With the Supreme Court set to rule on Google’s liability for algorithmically promoted terrorism content under Section 230 (archive) of the Communications Decency Act, I began imagining what may become of the Internet without its protections. Occasionally, during moments of exasperation and frustration when I see the horrors wrought when commoners (archive) are allowed access to computers with an Internet connection, I fantasize about an alternate reality. In this admittedly elitist fantasy, computers remained primarily tools of education, science and engineering, and the Internet looks more like it did before a small handful of websites came to dominate the majority of traffic and content generation: a diverse collection of smaller, individual websites and semi-isolated networks, with creative control solely in the hands of the operators and direct members of the surrounding communities. The technical burden of publishing content on the Internet fell mostly to the individual creating it, rather than to publishing platforms like YouTube and Facebook that do most of the technical work for you, at the cost of having to abide by their rules and submit to their moderating discretion.

In short: you had to know what you were doing (archive).

With the survival of Section 230’s protections uncertain, a return to these norms may no longer be out of the question.

Without Recommendation Engines
------------------------------

If providers can be held liable for the consequences of exposing users to content promoted by recommendation engines (“the algorithm”), the sophistication of these mechanisms may become significantly hobbled, or they may be done away with altogether. Content on social media platforms may be presented only in chronological order. Questions would arise about what constitutes a recommendation engine: are search engines just a type of recommendation engine? Certain platforms like YouTube could become extremely difficult to use unless users already know what specific content they want to access.
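
To make the distinction concrete, here is a minimal Python sketch contrasting the two modes of presentation. The `engagement_score` field is a hypothetical stand-in for whatever relevance signals a real recommender actually uses; none of this reflects any particular platform’s implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    engagement_score: float  # hypothetical stand-in for a recommender's relevance signals

def recommended_feed(posts: list[Post]) -> list[Post]:
    # A recommendation engine ranks content by predicted relevance or engagement,
    # which is exactly the kind of curation a liability ruling could put at risk.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # A purely chronological feed makes no content-based judgment at all:
    # newest first, regardless of what the post contains.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```

The first function is where the editorial judgment lives; the second is the fallback a platform might retreat to.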

Section 230 has always protected the ability of online services and their users to highlight relevant, useful content from others, connecting people to the material they enjoy. Legal risk for recommending or organizing content would reduce useful services like showing the best job listings, listing the most relevant products, or displaying the most helpful videos of recipes, songs, or sources of news, entertainment and information.

——Google, Gonzalez v Google and the future of an open, free and safe internet (archive)

I think Google’s assessment of the possible outcomes of a ruling against them is reasonable. Even more so, such a ruling could precipitate a wave of new lawsuits seeking to further chip away at Section 230’s afforded protections. The following is speculation on how the topology of what we currently know as the Internet may change in response.

Everything or Nothing
---------------------

If platforms can no longer afford the risk of allowing their users to publish their own content without either the strictest moderation or a complete lack of it, they will split into several distinct types: unmoderated badlands where only illegal content is removed, private websites and networks, and Club Penguin (archive).

As documented in The Twenty-Six Words that Created the Internet (archive), a 1995 ruling by a New York state court judge against the online service Prodigy established a precedent that a computer service “provider” can be held liable as a “publisher” if it exercises any editorial control over the content it hosts, even content submitted by its users.

By actively utilizing technology and manpower to delete notes from its computer bulletin boards on the basis of offensiveness and “bad taste”, for example, PRODIGY is clearly making decisions as to content (see, Miami Herald Publishing Co. v. Tornillo, supra), and such decisions constitute editorial control…. PRODIGY’s conscious choice, to gain the benefits of editorial control, has opened it up to a greater liability than CompuServe and other computer networks that make no such choice.

——Stratton Oakmont, Inc. v. Prodigy Services Co., 23 Media L. Rep. 1794 (N.Y. Sup. Ct. 1995) (archive)

If Section 230’s protections are undone, it is reasonable to predict that this precedent will again become relevant and applicable.

Section 230 supports free expression online, enabling websites to offer a wide range of information and allow a diversity of voices to be discovered and heard. Without Section 230, some websites would be forced to overblock, filtering content that could create any potential legal risk, and might shut down some services altogether. That would leave consumers with less choice to engage on the internet and less opportunity to work, play, learn, shop, create, and participate in the exchange of ideas online.

——Google, Gonzalez v Google and the future of an open, free and safe internet (archive)

Survival
--------

Without the assistance of major platforms to publish and promote your content, each website and Internet service effectively becomes its own publisher of its own operator’s content, unless you opt to publish on a platform alongside offensive content. Websites will diversify, and so will the infrastructure hosting and connecting them; sites and their infrastructure will become tightly coupled. Special-interest communities will form around constructing and sharing common infrastructure, and only members of those communities will be able to publish on it.

Even worse, connectivity itself may become balkanized. The potential liability of serving problematic content will make not just the hosting infrastructure but the connectivity infrastructure highly localized, requiring you to be both a member of the relevant communities to access their content at all and geographically close enough to an access point, since peering between these communities will be limited.

An extremely prescient example of this recently took place with the attempted (and ultimately failed) removal of the Kiwi Farms (archive) from the Internet. A long-running DDoS attack has targeted the site for months. It has suffered frequent outages as a result, and its owner Joshua Moon has struggled (but succeeded) to keep it online due to pressure on infrastructure providers to discontinue not just DDoS protection but also hosting and even layer 3 connectivity. Tier 1 providers such as CenturyLink and Arelion blackholed the site’s IP address range; access to the site occasionally depended on who your ISP was. None of this was done at the request or order of any legal proceeding or judgment; it was exclusively private companies providing important infrastructure (archive) exercising their right to moderate the content they serve. Section 230 also protects this, as does the United States’ lack of net neutrality rules.
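
As a rough illustration of why reachability came to depend on your ISP, here is a toy Python model. It is not an accurate simulation of BGP route propagation, and the prefixes are RFC 5737 documentation addresses rather than the site’s real ranges; the point is only that once enough of an ISP’s upstream transit providers blackhole a prefix, that ISP’s customers have no path to it.

```python
import ipaddress

# Toy model: each upstream transit provider may have null-routed ("blackholed")
# certain prefixes, dropping traffic to them entirely. Provider names are
# illustrative; the prefixes are documentation addresses, not real ones.
BLACKHOLED = {
    "CenturyLink": {ipaddress.ip_network("198.51.100.0/24")},
    "Arelion":     {ipaddress.ip_network("198.51.100.0/24")},
    "OtherTier1":  set(),
}

def reachable(dst: str, isp_upstreams: list[str]) -> bool:
    """A destination stays reachable only if at least one of your ISP's
    upstreams still carries a usable route to it."""
    addr = ipaddress.ip_address(dst)
    return any(
        all(addr not in prefix for prefix in BLACKHOLED[upstream])
        for upstream in isp_upstreams
    )

# Whether you can load the site depends entirely on who your ISP buys transit from.
print(reachable("198.51.100.7", ["CenturyLink", "Arelion"]))  # False: every path drops it
print(reachable("198.51.100.7", ["Arelion", "OtherTier1"]))   # True: one upstream still routes it
```

Real-world routing and blackholing are far messier than this, but the core point stands: decisions by a handful of transit providers determine whether the rest of the Internet can see you at all.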

I consider Joshua’s perspective on this valuable because of the unique challenges he has faced in keeping his site reachable from the clearnet. Irrespective of any deserved condemnation of the legally hosted content of this particular site and its operator, the normalization of lower-level infrastructure providers engaging in content moderation will inevitably affect the Internet’s topology.

The Internet is also very fragile. It has many moving parts and drives the politics of our world today. For this reason, it will die, and it will die very soon. Our big-I Internet is being broken apart and will soon become many little-i internets. Every large country or trade union will have its own local and strictly regulated internet. Connections between internets will occur on the vestigial remains of the big-I Internet, requiring special business permits to access.

——Joshua Moon, Where the Sidewalk Ends: The Death of the Internet (archive)

I believe the fragmentation of the Internet’s topology could become even more extreme than this. It’s not so unreasonable to speculate that fear of legal liability could produce results similar to those produced by fear of negative PR and business loss (archive).

I may yet see my strange regressive fantasy come to pass.
