Republicans Float Section 230 Legislation

Online platforms as distributors, liable for illegal content? Taking a cue from a Supreme Court Justice, that’s what a group of Republicans is proposing.

A group of Republicans has proposed a new approach to reforming 47 U.S.C. 230: a bill making clear that online platforms can still be considered distributors of material, which would open them to potential liability for distributing illegal material they knew about or had constructive knowledge of.


The Republican Study Committee follows Justice Thomas and seeks to peel back Section 230 protection for knowingly distributing illegal content.

Cocktail Party

The Republican Study Committee (RSC), a group of conservative House Republicans, has taken an approach to reforming 47 U.S.C. 230 (aka Section 230) inspired by Supreme Court Justice Clarence Thomas. The bill would target illegal content online by tweaking the statute to make clear that online fora like Facebook, Google, Twitter, Reddit, etc. are distributors of content, leaving them open to liability if they know or should know about illegal content on their platforms. It is not clear Democrats would support this bill, however, for they have additional goals in reforming Section 230 to which Republicans have thus far been cool.

Meeting

This bill represents a new approach to Section 230 reform, for it would extend to the online world the legal doctrine traditionally applied to distributors of written materials in the physical world. The RSC is proposing to do so in the name of taking on online entities such as Pornhub, which has reportedly hidden behind Section 230 in order to slow-walk the removal of illegal content such as child pornography, non-consensual pornography, and footage of rape. The politics of the bill are favorable in that it sets aside the usual Republican tropes about bias and censorship in order to address what everyone across the spectrum agrees is a major problem. However, it is not clear that Democrats would allow a Republican bill to pass without appending some of their priorities on Section 230. And if this were to happen, passage of such legislation in the Senate would become very difficult, for Republicans seem opposed to the Democratic emphasis on what might be called hate speech and abuse directed at women and minorities.

Geek Out

The Republican Study Committee (RSC) issued its proposal to reform Section 230 that would peel back the liability shield Twitter, Facebook, Google, and others have when they host content they know, or should know, is illegal. The RSC claims it “has served as the conservative caucus of House Republicans and a leading influencer on the Right since its original founding in 1973” and is the organization to which the most conservative Republicans of the House belong. Moreover, it is reputedly the single largest ideological caucus in the Congress, with over 150 of the 211 House Republicans being members. Therefore, any proposal the RSC puts forth may well have the support of the majority of Republicans in the House. Needless to say, RSC proposals do not often have Democratic support, which is obviously necessary for any bill to pass both the House and Senate.

And yet, turning to the substance of the bill, the RSC’s bill may represent a new approach to Section 230 modification. The “Stop Shielding Culpable Platforms Act” (H.R. 2000) would clarify Section 230 to ensure that platforms would still face legal liability as distributors, which they currently do not under case law. The bill adds a new subsection to 47 U.S.C. 230(c)(1) making clear that even though providers cannot be treated as publishers or speakers, they can be treated as distributors. This distinction would have legal consequences, for distributors have traditionally been liable for making illegal content available under either of two conditions: 1) they knew the material was illegal; or 2) they should have known the material was illegal (known in the legal field as “constructive knowledge”). Therefore, if a platform were informed it was hosting illegal content, such as child pornography, and it did not expeditiously remove this content, it could be sued under the distributor doctrine. Presumably, distributing defamatory material a platform knows, or should know, is false would also open it to liability.

In its one-page summary of the bill, the RSC asserted:

it has recently been alleged that Twitter left up a child pornography video despite being notified by the victim up until federal officials demanded its removal. Although Pornhub recently took steps to scrub its platform of illegal content, including child pornography, rape, and other illegal activity, it was previously reported that its executive believed Section 230 would shield them.

The RSC added:

However, it is completely absurd that online platforms simultaneously spend vast amounts of time censoring the viewpoints of conservatives—including banning President Trump—in the name of protecting the public while they knowingly share illegal and harmful content. All the while they are relying on Section 230 to protect them.

The Stop Shielding Culpable Platforms Act would correct this inequity by clarifying that Section 230 does not shield online platforms when they knowingly share such content. It does so by ensuring that Section 230 is not inappropriately interpreted to prevent platforms from being treated as a distributor of content.

The RSC made clear that it was working from Justice Clarence Thomas’ statement appended to the Supreme Court of the United States’ (SCOTUS) denial to hear a case in which one company claimed Section 230 provided legal protection from allegations of antitrust and anti-competitive behavior. SCOTUS opted not to hear Malwarebytes, Inc. v. Enigma Software Group USA, and in his statement on the denial of the petition for a writ of certiorari, Thomas explained the background:

This case involves Enigma Software Group USA and Malwarebytes, two competitors that provide software to enable individuals to filter unwanted content, such as content posing security risks. Enigma sued Malwarebytes, alleging that Malwarebytes engaged in anticompetitive conduct by reconfiguring its products to make it difficult for consumers to download and use Enigma products. In its defense, Malwarebytes invoked a provision of §230 that states that a computer service provider cannot be held liable for providing tools “to restrict access to material” that it “considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” §230(c)(2). The Ninth Circuit relied heavily on the “policy” and “purpose” of §230 to conclude that immunity is unavailable when a plaintiff alleges anticompetitive conduct.

One supposes the most ardent defender of Section 230 would agree that antitrust and anti-competitive behavior falls outside the liability shield. But Thomas uses the case as a soapbox from which to propound his views on the proper reading of Section 230 that the RSC subsequently turned into legislation, in part.

Thomas recited the case law on Section 230, noting where decisions have strayed from the text of the provision and other case law, particularly in ignoring the long-recognized distinction between the strict liability a publisher or speaker faces for transmitting illegal content and the lesser level of liability for distributors doing the same knowingly or when they should have known. Thomas said a 1995 decision, Stratton Oakmont, Inc. v. Prodigy Services Co., blurred the distinction between publishing and distributing and impelled Congress to enact Section 230. Thomas asserted his view on the proper scope of liability protection for platforms under 47 U.S.C. 230:

Taken at face value, §230(c) alters the Stratton Oakmont rule in two respects. First, §230(c)(1) indicates that an Internet provider does not become the publisher of a piece of third-party content—and thus subjected to strict liability— simply by hosting or distributing that content. Second, §230(c)(2)(A) provides an additional degree of immunity when companies take down or restrict access to objectionable content, so long as the company acts in good faith. In short, the statute suggests that if a company unknowingly leaves up illegal third-party content, it is protected from publisher liability by §230(c)(1); and if it takes down certain third-party content in good faith, it is protected by §230(c)(2)(A).

Thomas decried that “[c]ourts have discarded the longstanding distinction between ‘publisher’ liability and ‘distributor’ liability.” Thomas also took issue with a common reading of Section 230 that has “departed from the most natural reading of the text by giving Internet companies immunity for their own content.” He claimed that when online platforms publish their own content (as opposed to the third-party content referenced in 230(c)(1)), they should not have liability protection. The game Thomas seems to be hunting in making this argument is that Twitter ought not to have liability protection for appending corrections to false statements made by people like former President Donald Trump, for such additions are content the platform has created and are not inside the legal shield of Section 230. Of course, lower courts disagree with Thomas and have held that a publisher may alter content, for surely if I submit an op-ed piece to The New York Times, it is clear the paper may change, edit, or shorten the piece if it chooses to run it. Thomas disagrees with this notion and instead sees Section 230 this way:

Taken together, both provisions in §230(c) most naturally read to protect companies when they unknowingly decline to exercise editorial functions to edit or remove third-party content, §230(c)(1), and when they decide to exercise those editorial functions in good faith, §230(c)(2)(A).

Put another way, platforms can only leave up all posted material or take it down, with nothing in between, except for illegal material they know about or should know about. This is something of a desired outcome for Republicans because Twitter could no longer correct tweets. It could either take them down or leave them up. And this is where the RSC’s bill is relevant. The organization explicitly followed Thomas’ lead by proposing legislation that would make platforms distributors, subjecting them to liability for illegal content they know or should know about.

However, the RSC would not stop at this reform, as detailed in a January 2021 memorandum. The RSC would follow Thomas’ advice and revise Section 230 to enshrine the notion that platforms have legal protection only to either take down or leave up material:

Address the Conflation of C1 and C2:  As previously noted, Section 230(c)(1) (C1) and Section 230(c)(2) (C2) immunities are often conflated despite the fact that C1’s immunity is supposed to be confined to instances where an online platform passively allows content to remain on its website while C2 immunity is specifically designed to apply to instances of content moderation. Congress could amend Section 230 to clarify that C1 should not be interpreted to protect content moderation decisions.  [House Judiciary Committee] Ranking Member [Jim] Jordan’s “Protect Speech Act,” for instance, would address this important issue.

The RSC proposed further Section 230 changes that aim to remove legal protection for Facebook and others for “producing content”:

Clarify the Definition of Internet Content Provider: Congress could attempt to modify the definition of internet content provider to ensure that online platforms appreciably involved in the production of content hosted on their site are treated as internet content providers, and thus lose their immunity under Section 230. Such a reform could have significant ramifications in terms of imposing publisher liability on online platforms that fail to act passively toward content on their sites. However, this reform could make Section 230 function more closely to what its drafters intended. Moreover, it would take a large step toward addressing conservative concerns with social media company biases. Social media companies that undertake expansive content moderation policies could then be considered content providers rather than merely an “interactive computer service.”

These policies are several bridges too far given the current legislative landscape and may be more about messaging.

But in a related development, last week SCOTUS rejected as moot a lawsuit brought by the Knight First Amendment Institute against former President Donald Trump because he blocked its members on Twitter. The United States Court of Appeals for the Second Circuit had ruled against Trump, holding that blocking the plaintiffs violated the First Amendment. SCOTUS dodged the case because Trump is no longer President and hence there is no case or controversy. Nonetheless, Thomas decided to weigh in on issues adjacent to Section 230, notably the First Amendment. He voiced his disagreement with the Second Circuit over its holding that Trump’s tweets constituted a public space and hence he could not block people over their viewpoints. But Thomas went on to ponder whether online platforms are akin to common carriers like railroads, telegraph companies, and phone companies, which have been regulated as such. Thomas also proposed treating online platforms as public accommodations, another special class of private entities that are regulated differently than other entities. The upshot of Thomas’ musings seems to be finding a legal construct to achieve the broadly held Republican goal of requiring platforms to carry all viewpoints, which is the most charitable reading. A more cynical reading is that Republicans want to force platforms to carry conservative content and to bar these platforms from “censoring” them.

As mentioned earlier, Democrats in the House, Senate, and White House would have to buy into these proposals. Perhaps the RSC’s bill may work for some Democrats, but they may well want some of their preferred policies in return for signing onto it. For example, they may want language exempting from legal protection any content that violates civil rights law, or they may want to address the underlying algorithms, to name two approaches some House Democrats have floated. However, these policies may not pass muster with Republicans, meaning support for a bill could fall apart.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Charisse Kenion on Unsplash
