Republican Section 230 Proposal


As part of their legislative package of bills to take on “Big Tech,” House Energy and Commerce Committee Ranking Member Cathy McMorris Rodgers (R-WA) and House Judiciary Committee Ranking Member Jim Jordan (R-OH) unveiled a draft bill that will likely be the House Republicans’ negotiating position and rallying cry on 47 U.S.C. 230 (aka Section 230). Rodgers, Jordan, and many other Republicans in Congress have claimed that social media platforms like Facebook and Twitter have long targeted conservative speech and have stymied and censored content posted by conservatives. They have made these claims despite almost all available, objective evidence showing the opposite (see here and here).

Rodgers and Jordan summarized their bill in their press release:

Preserving constitutionally protected speech, led by Republican Leaders Cathy McMorris Rodgers (R-WA) and Jim Jordan (R-OH), to remove liability protections for companies who censor constitutionally protected speech on their platforms, require appeals processes, and transparency for content enforcement decisions.   

This bill is mostly messaging, for I see no way it can be enacted given Democratic control of the House, Senate, and White House. As a result, Rodgers and Jordan are planting a flag for Republicans to rally around, to fire up their base, and to give Republicans a position from which they can push back against Democratic proposals to revamp Section 230, such as Representatives Jan Schakowsky (D-IL) and Kathy Castor’s (D-FL) bill, the “Online Consumer Protection Act” (H.R.3067) (see here for more detail and analysis); Senate Intelligence Committee Chair Mark Warner (D-VA), Senator Mazie Hirono (D-HI), and Senator Amy Klobuchar’s (D-MN) “Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act” (S.299) (see here for more detail and analysis); or Representative Yvette Clarke’s (D-NY) discussion draft, the “Civil Rights Modernization Act of 2021” (see here for more detail and analysis).

Big picture, Rodgers and Jordan’s bill would strip big social media platforms of significant legal protection and open them to suit if they were to keep moderating content as they presently do. It bears parenthetical mention that Republicans generally oppose a private right of action against technology companies for data breaches and privacy violations, and this bill likewise does not establish a private right of action for people to sue “Big Tech.” Rather, the Federal Trade Commission and state attorneys general would be empowered to enforce the new law.

This is a draft bill, and so there are brackets in various places where Rodgers and Jordan will need to go back and fill in the blanks.

From the start, Rodgers and Jordan make clear their bill is aimed squarely at the social media platform giants. Section 230 would be amended to strip all liability protection from “covered entities” for content others post and for moderating content. Only the biggest of companies would be covered entities, for in order to qualify an “interactive computer service” (the term of art from Section 230) would need to reap $3 billion or more in annual revenue and also have 300 million monthly users. This group would likely only encompass Facebook, Twitter, Instagram, Snapchat, YouTube, and possibly a handful of others. Consequently, were this bill enacted, these companies could be treated as publishers of content on their platforms and face liability the same way newspapers and television stations do for the content they publish. Moreover, these companies could also be sued for moderating content. However, it bears some emphasis that the portions of this section defining “covered entities” have the $3 billion and 300 million user figures in brackets, suggesting these numbers and the class of entities subject to these provisions might change.
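To make the scale of that carve-out concrete, here is a minimal sketch, in Python, of how the bracketed thresholds might be expressed. The $3 billion and 300 million figures are placeholders in the draft, and the function and variable names are mine, not the bill’s.

```python
# Illustrative only: the draft's bracketed "covered entity" thresholds
# expressed as a simple check. Both figures are placeholders that may change.
COVERED_REVENUE_THRESHOLD = 3_000_000_000   # annual revenue in USD (bracketed in the draft)
COVERED_USER_THRESHOLD = 300_000_000        # monthly users (bracketed in the draft)

def is_covered_entity(annual_revenue_usd: int, monthly_users: int) -> bool:
    """Return True if a platform would meet both bracketed thresholds."""
    return (annual_revenue_usd >= COVERED_REVENUE_THRESHOLD
            and monthly_users >= COVERED_USER_THRESHOLD)

# A platform with $5 billion in revenue and 400 million monthly users would qualify;
# a smaller service would not.
print(is_covered_entity(5_000_000_000, 400_000_000))  # True
print(is_covered_entity(500_000_000, 50_000_000))     # False
```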

However, Rodgers and Jordan do not intend for the aforementioned changes to Section 230 to take effect immediately. Rather, they have designed them to operate as a Sword of Damocles over the heads of “Big Tech” through a five-year transitional rule that would keep most of the current statute in place. And yet, this transitional rule expires at the end of five years, requiring an affirmative act of Congress and the White House to extend it. Clearly, the message here to social media companies is to behave or watch their liability protection vanish, and by behave, given their public comments (Jordan having been much more insistent on this point), they mean social media platforms would keep their hands off of conservative speech and figures while presumably cracking down on speech conservatives do not like.

Digging into the transitional rule language, a few things should be stressed. First, the revised Section 230 language would apply only to covered entities, and smaller platforms (e.g. Parler) would still operate under the current Section 230. Second, the legal protection for users of interactive computer services under 47 U.S.C. 230(c)(2) is deleted. Therefore, users could be sued on account of moderation decisions platforms make under the revised, parallel Section 230. Third, (c)(2)(A) is revised to remove liability protection for moderating constitutionally protected speech, and Facebook and YouTube, among others, could be sued for removing speech protected under the First Amendment. However, this provision is a bit confused, for it retains legal protection for removing or limiting the availability of content the platform has an objectively reasonable belief is “obscene, lewd, lascivious, filthy, excessively violent, or harassing.” Some of this speech is likely protected under the First Amendment, and so United States courts would be drawn into this issue quickly.

Fourth, at present, Section 230 does not require an “objectively reasonable belief” that content falls within the categories the statute provides legal protection for moderating. Right now, under current law, this is a subjective matter. Per this revised language, “Big Tech” could no longer moderate or remove content based on its subjective views. Fifth, Rodgers and Jordan dispense with the term “otherwise objectionable” in (c)(2)(A), another category of content platforms and users currently have legal protection in moderating. This category has often rankled Republicans at hearings because they believe liberal values and mores drive decisions to moderate content as “otherwise objectionable.”

Sixth, the definition of “information content provider” would be changed in this new parallel Section 230. Currently this term is defined as:

any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.

Rodgers and Jordan would add language expanding what constitutes being “responsible in whole or in part” to include “those instances in which an information content provider utilizes an algorithm to amplify, promote, or suggest content to a user unless a user knowingly and willfully selects an algorithm to display such content.” As a result, the algorithms all the big platforms use without user control to push up or push down content would make them responsible in whole or in part for content that could give rise to a lawsuit. Therefore, platforms would need to provide a means for users to select the algorithm they want, much like Twitter allows users to toggle between posts promoted through the platform’s algorithm or posts listed in chronological order.
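As a rough illustration of the kind of user-selectable ordering the draft seems to contemplate, the sketch below, in Python, contrasts a chronological feed with an engagement-ranked one and lets the user’s explicit choice decide which applies. All names are hypothetical; the bill itself does not prescribe any particular implementation.

```python
from datetime import datetime
from typing import Callable

# Hypothetical sketch: the draft's amplification language suggests a user must
# knowingly and willfully select the algorithm that orders content, much like
# Twitter's toggle between ranked and chronological timelines.

Post = dict  # e.g. {"id": 1, "created_at": datetime(...), "engagement": 42}

def chronological(posts: list[Post]) -> list[Post]:
    # Newest first, no amplification.
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)

def engagement_ranked(posts: list[Post]) -> list[Post]:
    # Amplifies high-engagement content; under the draft, applying this without
    # an explicit user selection could make the platform "responsible in part."
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

FEED_ALGORITHMS: dict[str, Callable[[list[Post]], list[Post]]] = {
    "chronological": chronological,
    "engagement": engagement_ranked,
}

def build_feed(posts: list[Post], user_choice: str) -> list[Post]:
    # The user's explicit, stored choice drives the ordering.
    return FEED_ALGORITHMS[user_choice](posts)

posts = [
    {"id": 1, "created_at": datetime(2021, 7, 1), "engagement": 90},
    {"id": 2, "created_at": datetime(2021, 7, 2), "engagement": 10},
]
print([p["id"] for p in build_feed(posts, "chronological")])  # [2, 1]
print([p["id"] for p in build_feed(posts, "engagement")])     # [1, 2]
```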

Rodgers and Jordan’s bill imposes additional requirements and restrictions on large social media platforms. All such companies “shall implement and maintain reasonable and user-friendly appeals processes for decisions about content on such covered company’s platforms.” Moreover, if a covered entity “edits, alters, blocks, or removes” content, it must immediately contact the person or entity that posted such content and state clearly why the action was taken citing the specific provisions of the company’s content policies. Additionally, this notice must also explain the appeals process. Covered entities will have a specific number of days in which to respond, allow users to make their case, and make a final determination. The sections on the specific number of days in which platforms have to respond and make a decision are empty and bracketed.
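Purely as an illustration, the notice and appeal requirements described above might translate into a record like the following Python sketch. The field names are mine, and the response and decision windows are left unset because the draft leaves those day counts bracketed and blank.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# Illustrative only: a record capturing the notice a covered company would have
# to send when it "edits, alters, blocks, or removes" content under the draft.

@dataclass
class ContentEnforcementNotice:
    content_id: str
    action: str                           # "edited", "altered", "blocked", or "removed"
    policy_provisions_cited: list[str]    # the specific content-policy provisions relied on
    explanation: str                      # clear statement of why the action was taken
    appeal_instructions: str              # how the poster can appeal the decision
    sent_at: datetime = field(default_factory=datetime.utcnow)
    response_deadline_days: Optional[int] = None   # bracketed and blank in the draft
    decision_deadline_days: Optional[int] = None   # bracketed and blank in the draft
```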

Republicans have also picked up the flag of former President Donald Trump regarding suspensions and permanent bans in light of his bans from Facebook, Twitter, and YouTube. Covered entities would need to establish and operate a process informing users when and why they are banned or suspended. This process would have an appeals procedure under which companies would need to respond in a certain timeframe, allow users to make their cases, and make a decision within a specified timeframe. Again, the timeframe portions of the bill have been left blank.

Covered entities must file their “content enforcement decisions and appeals decisions” quarterly with the Federal Trade Commission (FTC). Specifically, these companies would need to provide this information to the FTC:

(1) Content that such covered company altered, flagged, or removed from such covered company’s platforms.

(2) The number of user accounts suspended based on content enforcement decisions.

(3) The number of user accounts permanently banned based on content enforcement decisions.

(4) For all content enforcement decisions, the number of such decisions related to conservative content and conservative accounts.

(5) For all content enforcement decisions, the number of such decisions related to liberal content and liberal accounts.

(6) The number of appeals filed for content decisions and corresponding decisions on such appeals.

(7) The number of appeals filed for suspensions or permanent bans of accounts and corresponding decisions on such appeals.
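Purely as an illustration, the seven reporting categories above might map onto a simple quarterly record like the Python sketch below. The field names are mine, not the bill’s, and this assumes the FTC would accept counts rather than the underlying content itself.

```python
from dataclasses import dataclass

# Illustrative only: one way a covered company's quarterly FTC filing could be
# structured, following the draft's seven enumerated categories.

@dataclass
class QuarterlyContentReport:
    quarter: str                               # e.g. "2021-Q3"
    content_altered_flagged_or_removed: int    # (1)
    accounts_suspended: int                    # (2)
    accounts_permanently_banned: int           # (3)
    decisions_on_conservative_content: int     # (4)
    decisions_on_liberal_content: int          # (5)
    content_appeals_filed: int                 # (6)
    content_appeals_decided: int               # (6) corresponding decisions
    account_appeals_filed: int                 # (7)
    account_appeals_decided: int               # (7) corresponding decisions
```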

Of course, how each platform defines conservative and liberal content and accounts will likely differ. For example, would one platform consider white nationalist content conservative? Or racist material? Or terrorist material? Or libertarian material? And, surely any FTC effort to define conservative and liberal would lead to a political food fight. Nonetheless, the FTC will post these quarterly filings absent privileged and confidential material.

Of greater importance, the FTC would be able to punish violators of the Rodgers/Jordan bill and seek civil fines for first offenses as they would be deemed violations of a rule prohibiting unfair or deceptive practices. State attorneys general would also be allowed to enforce the act, including injunctive relief and restitution of the sort Republicans apparently oppose for the FTC generally under Section 13(b).

The Rodgers/Jordan bill would preempt all state laws.

