A conservative Justice on the Supreme Court opines on Section 230, and then the FCC announces it will proceed with rulemaking to clarify Section 230.
Scrutiny of social media platforms that benefit from the liability shield in 47 U.S.C. 230 (Section 230) continues to intensify, particularly from Republicans who seem intent both on cowing Facebook, Twitter, and others into not taking down Republican and conservative misinformation, disinformation, and lies, and on using the alleged but unproven claim that conservatives face bias on social media as a campaign issue. This week, unlike recent ones, a Justice of the Supreme Court of the United States all but asked a lower court to send up a Section 230 case so that the scope of the law can be decided. It is almost as if Republicans see no other pressing technology issue before them.
The Supreme Court opted not to hear a case, Malwarebytes, Inc. v. Enigma Software Group USA, in which one of the parties had tried to claim that Section 230 shielded it from antitrust claims, a very creative application of the law. In his statement on the denial of the petition for a writ of certiorari, Justice Clarence Thomas explained the background:
This case involves Enigma Software Group USA and Malwarebytes, two competitors that provide software to enable individuals to filter unwanted content, such as content posing security risks. Enigma sued Malwarebytes, alleging that Malwarebytes engaged in anticompetitive conduct by reconfiguring its products to make it difficult for consumers to download and use Enigma products. In its defense, Malwarebytes invoked a provision of §230 that states that a computer service provider cannot be held liable for providing tools “to restrict access to material” that it “considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” §230(c)(2). The Ninth Circuit relied heavily on the “policy” and “purpose” of §230 to conclude that immunity is unavailable when a plaintiff alleges anticompetitive conduct.
Thomas then recites the case law on Section 230, noting where decisions have strayed from the text, and concludes by asserting:
- Paring back the sweeping immunity courts have read into §230 would not necessarily render defendants liable for online misconduct. It simply would give plaintiffs a chance to raise their claims in the first place. Plaintiffs still must prove the merits of their cases, and some claims will undoubtedly fail. Moreover, States and the Federal Government are free to update their liability laws to make them more appropriate for an Internet-driven society.
- Extending §230 immunity beyond the natural reading of the text can have serious consequences. Before giving companies immunity from civil claims for “knowingly host[ing] illegal child pornography,” Bates, 2006 WL 3813758, *3, or for race discrimination, Sikhs for Justice, 697 Fed. Appx., at 526, we should be certain that is what the law demands.
- Without the benefit of briefing on the merits, we need not decide today the correct interpretation of §230. But in an appropriate case, it behooves us to do so.
Thomas’ statement served as a clarion call for conservatives who envision themselves oppressed by platforms like Twitter and Facebook, even though the most popular content on those platforms consistently comes from those on the right.
Thereafter, Federal Communications Commission (FCC) Chair Ajit Pai announced that “[t]he Commission’s General Counsel has informed me that the FCC has the legal authority to interpret Section 230…[and] [c]onsistent with this advice, I intend to move forward with a rulemaking to clarify its meaning.” Pai namechecked Thomas’ statement, in which the Justice “pointed out that courts have relied upon ‘policy and purpose arguments to grant sweeping protections to Internet platforms’ that appear to go far beyond the actual text of the provision.”
Working along a parallel track is pressure on the Senate committee that oversees the FCC to vet, hold a hearing on, and approve Trump’s nominee for the FCC. Commissioner Michael O’Rielly was lukewarm on the EO, and his term at the FCC was expiring. And so, in typical Trump Administration fashion, the White House decided that the policy was not the problem. Personnel was. Consequently, Nathan Simington of the National Telecommunications and Information Administration (NTIA) was nominated to replace O’Rielly, and the Senate Commerce, Science, and Transportation Committee is set to take up the nomination on 10 November.
In May, after Twitter fact-checked two of his tweets making false claims about mail voting in California in response to the COVID-19 pandemic, President Donald Trump signed a long-rumored executive order (EO) seen by many as a means of cowing social media platforms: the “Executive Order on Preventing Online Censorship.” The EO directed federal agencies to act, and one has, by asking the FCC to start a rulemaking, which the agency has now initiated. However, at least one pending lawsuit seeks to enjoin action on the EO and could conceivably block implementation.
In the EO, the President claimed:
Section 230 was not intended to allow a handful of companies to grow into titans controlling vital avenues for our national discourse under the guise of promoting open forums for debate, and then to provide those behemoths blanket immunity when they use their power to censor content and silence viewpoints that they dislike. When an interactive computer service provider removes or restricts access to content and its actions do not meet the criteria of subparagraph (c)(2)(A), it is engaged in editorial conduct. It is the policy of the United States that such a provider should properly lose the limited liability shield of subparagraph (c)(2)(A) and be exposed to liability like any traditional editor and publisher that is not an online provider.
Consequently, the EO directs that “all executive departments and agencies should ensure that their application of section 230(c) properly reflects the narrow purpose of the section and take all appropriate actions in this regard.”
Following the directive in the EO, on 27 July the NTIA filed a petition with the FCC, asking the agency to start a rulemaking to clarify alleged ambiguities in 47 U.S.C. 230 regarding the limits of the liability shield for the content others post online versus the liability protection for “good faith” moderation by the platform itself.
The NTIA asserted “[t]he FCC should use its authorities to clarify ambiguities in section 230 so as to make its interpretation appropriate to the current internet marketplace and provide clearer guidance to courts, platforms, and users…[and] urges the FCC to promulgate rules addressing the following points:
- Clarify the relationship between subsections (c)(1) and (c)(2), lest they be read and applied in a manner that renders (c)(2) superfluous as some courts appear to be doing.
- Specify that Section 230(c)(1) has no application to any interactive computer service’s decision, agreement, or action to restrict access to or availability of material provided by another information content provider or to bar any information content provider from using an interactive computer service.
- Provide clearer guidance to courts, platforms, and users, on what content falls within (c)(2) immunity, particularly section 230(c)(2)’s “otherwise objectionable” language and its requirement that all removals be done in “good faith.”
- Specify that “responsible, in whole or in part, for the creation or development of information” in the definition of “information content provider,” 47 U.S.C. § 230(f)(3), includes editorial decisions that modify or alter content, including but not limited to substantively contributing to, commenting upon, editorializing about, or presenting with a discernible viewpoint content provided by another information content provider.
- Mandate disclosure for internet transparency similar to that required of other internet companies, such as broadband service providers.
NTIA argued:
- Section 230(c)(1) has a specific focus: it prohibits “treating” “interactive computer services,” i.e., internet platforms, such as Twitter or Facebook, as “publishers.” But this provision only concerns “information” provided by third parties, i.e., “another information content provider,” and does not cover a platform’s own content or editorial decisions.
- Section (c)(2) also has a specific focus: it eliminates liability for interactive computer services that act in good faith “to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”
In early August, the FCC asked for comments on the NTIA petition, with comments due by 2 September. Over 2,500 comments have been filed, and a cursory search turned up numerous form-letter comments drafted by a conservative organization and then submitted by its members and followers.
© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.