A Bipartisan Pair of Senators Revise Their 230 Legislation

Senators Schatz and Thune take a process-oriented approach to reforming Section 230, aiming to regularize content moderation and make it easier for people to understand how to flag content and appeal moderation decisions.

First the shameless pitch. You can receive all the content posted on my blog by subscribing to my free (for now) newsletter, the Wavelength, which covers the major events in the world of technology policy, politics, and law. I spent more than a decade working on these issues in Washington DC as both a staffer in Congress and a registered lobbyist, so I have a perspective not found elsewhere.

Subscribe today.

A Democratic and a Republican senator have revised and released a bill to narrow the 47 U.S.C. 230 liability shield, requiring platforms to take down illegal content and activity and regularizing the processes for handling content that may violate a platform’s policies.

Twitter

Instead of targeting certain content, the PACT Act targets all illegal content and lays out the process platforms must use to weigh complaints about content that may violate terms of service.

Cocktail Party

Another Section 230 bill has dropped. This time it is one from last year that has been refreshed and tweaked. The backdrop, however, is the same: reforming the liability shield the Facebooks, Twitters, YouTubes, and others enjoy. Everyone agrees there is a problem with the online world, but the two parties describe the problem, and prescribe solutions, very differently.

And so, a bipartisan Senate bill that focuses more on the process by which content moderation occurs sounds like it could thread the various policy and political needles. Perhaps, but other stakeholders are still drafting their bills, meaning Section 230 legislation is unlikely to move soon. Moreover, the bill imposes serious costs on large platforms, which would need to establish sizable operations to handle and adjudicate complaints, and this alone gives them an incentive to fight the legislation.

Meeting

Senator Brian Schatz (D-HI) and Senate Minority Whip John Thune (R-SD) are focusing on the process social media platforms use to moderate content, seeking to make it uniform for large platforms. Their bill establishes a timeline under which platforms would need to remove illegal content and activity. The bill would require these platforms to publish understandable use policies that make clear what content and activity is acceptable on each platform, and what is not. Theoretically, with clearer rules of the road, users would know what may run afoul of a platform’s policies. In the event content is taken down or flagged, there would be a process for the person or entity that posted the content to appeal the decision.

While the bill would give people more opportunities to contest takedown decisions or to flag content that possibly violates the use policies, the bill may do little to police some of the online harm some think needs addressing. For example, a person spewing anti-Asian American racism that does not make any credible threats of violence or violate any statutes would only need to worry about violating the content policy of the platform in question. Admittedly, some platforms like Facebook are taking down racist material, but platforms may claim it is not their place to police or censor free speech. Therefore, if a large platform decides to take a hands-off, absolutist approach to content, then racist, sexist, or abusive material may not be taken down no matter how many people flag it. Additionally, the bill’s enforcement mechanism in this sort of case is the Federal Trade Commission (FTC) punishing platforms that do not follow the process rules for handling complaints. The substance of those decisions is outside the FTC’s purview.

Moreover, Schatz and Thune seem to be using the traditional distributor theory of legal liability as their basis for handling illegal content and activity. Under this doctrine, distributors are held to a lesser standard than publishers in that actual or constructive knowledge is needed before they are liable. This bill requires a court ruling on the legality of content or activity before a platform must take down said content or face legal liability; this is the language narrowing Section 230’s shield. And so, to have illegal content or activity removed, someone will need to litigate and obtain a judgment, whether that be a private citizen, an advocacy organization, a state attorney general, or the federal government. It is unclear whether a litigant could obtain a ruling on a class of content (e.g., child pornography or revenge porn) that platforms would then need to take down. The language added to the revised PACT Act suggests that may be hard, for the amount and type of information about the content and its whereabouts necessary to satisfy the notice requirement has been heightened.

Geek Out

Senator Brian Schatz (D-HI) and Senate Minority Whip John Thune (R-SD) have reintroduced and revised the “Platform Accountability and Consumer Transparency (PACT) Act” (S.797), “bipartisan legislation to update Section 230 of the Communications Act,” as explained in their press release. The revisions change some significant parts of the bill, but the basic framework remains the same.

Under the PACT Act, so-called “interactive computer services” (ICS) (the term of art used in Section 230 that encompasses many online platforms) would need to draft and publish “acceptable use polic[ies]” that inform users of what content may be posted, break down the process by which the platform reviews content for compliance with its policy, and spell out the process people may use to report potentially policy-violating content, illegal content, and illegal activity.

Section 230’s liability shield would be narrowed with respect to illegal content and activity. If a provider knows of illegal content or activity but does not remove it within 24 hours, it would lose the shield from lawsuits. So, if Facebook fails to take down a posting urging someone to assassinate the President, a federal crime, within 24 hours of being notified it was posted, it could be sued. However, Facebook and similar companies would not have an affirmative duty to locate and remove illegal content and activity, and they could continue to enjoy Section 230 liability protection for either type of content on their platforms so long as no notice is provided. And yet, Section 230 would be narrowed overall by a provision making clear that all federal criminal and civil laws and regulations are outside the liability protection; currently, this carveout only pertains to federal criminal statutes. And state attorneys general would be able to enforce federal civil laws if the lawsuit could also be brought on the basis of a civil law in the attorney general’s state.

The Federal Trade Commission (FTC) would be explicitly empowered to act under the bill. Any violations of the process by which an online platform reviews notices of potentially policy-violating content, appeals, and transparency reports would be treated as violations of an FTC regulation defining an unfair or deceptive act or practice, allowing the agency to seek civil fines for first violations. But this authority is circumscribed by a provision barring the FTC from reviewing “any action or decision by a provider of an interactive computer service related to the application of the acceptable use policy of the provider.” This limitation would seem to allow an online platform to remove content on its own initiative if it violates the platform’s policies without the FTC being able to review such decisions. This would provide ample incentive for Facebook, Twitter, Reddit, and others to police their platforms so that they could avoid FTC action. The FTC’s jurisdiction would also be widened to include non-profits, which would be subject to the agency’s scrutiny over how they handle removing content based on user complaints in the same way for-profit entities are.

There are more wrinkles to the PACT Act introduced last year, so see here for more detail and analysis on that bill.

Schatz and Thune have changed the PACT Act in a number of key ways. First, they changed the Findings and Policy sections to reflect new emphases. For example, one of the findings was changed to articulate the “compelling government interest in having providers of interactive computer services provide information to the public about their content moderation policies and practices because of the impact those policies may have on the speech interests of their consumers.” Last year’s bill stressed the lack of protection for Americans online save for federal criminal statutes. Similarly, another change places emphasis on the need for Americans to be able to understand the terms of service of the platforms they use rather than on the lack of legal protection online. Another new finding posits that the U.S. government should hold platforms accountable for failing to respond to people’s content moderation concerns.

The new PACT Act introduces a third class of entities beyond “small business providers” and the companies large enough to be subject to the provisions of the bill: “individual providers.” This new class consists of interactive computer services (ICS) that have fewer than 100,000 unique monthly visitors and less than $1 million in revenue. These entities would not need to have a live company representative to whom people could complain about content, nor an online complaint system. They would not need to meet the new obligation of larger platforms to review and respond to complaints about possibly policy-violating content. Nor would individual providers need to notify people when their content is removed or allow them an appeal process, subject to certain exceptions, some of which were changed from the original bill (more on that later). Nor would individual providers need to issue biannual transparency reports.

The revised bill expands the universe of small business providers by raising the annual revenue limit from $25 million to $50 million. The deadlines for responding to illegal content and activity and to policy-violating content would be changed from a reasonable period of time to four days for illegal material after proper notice (i.e., the same deadline as larger ICS) and 21 days for complaints alleging that content violates the ICS’s acceptable use policies. Small business providers would still not need to have live company representatives, nor would they need to issue transparency reports.

Another significant change is that ICS would get more time to take down illegal content and activity. The new bill allows ICS four days (instead of 24 hours) to remove illegal content or illegal activity once they receive proper notice. There is new language giving ICS ten days to take down illegal content or activity after a default judgment or stipulated settlement is entered, once the ICS receives proper notice.

The new PACT Act modifies the threshold for ICS to respond to complaints about content that may violate their policies, adding a “good faith” requirement, presumably to allow ICS to ignore voluminous, bothersome, and repeated complaints. This seems like a wise tweak to keep the complaint system from being turned into a weapon for trolls and others to abuse. On the other hand, depending on how “good faith” is read, ICS may enforce this qualification in ways that minimize their responsibilities to respond to complaints.

The required complaint adjudication process is modified to allow ICS a bit more time to deal with “extraordinary investigation[s].” ICS would be allowed “reasonable extension[s]” to deal with complaints requiring extraordinary investigation. However, neither term is defined, which may lead ICS to read them in ways that give themselves more time to respond in these cases. The FTC would still have authority to police how ICS adhere to their complaint processes regarding policy-violating content, but not the application of the policies themselves (i.e., the decisions on whether content complies with an acceptable use policy).

The situations under which ICS would not need to notify people or give them a right of appeal when policy-violating content is taken down are expanded. Notably, if an ICS “reasonably believes” such notice may risk imminent harm to people or impede a law enforcement investigation, no notice is needed. The previous bill made this exception to the notice and appeal process only when the ICS actually knew of a law enforcement investigation, so the new bill significantly widens this loophole. Additionally, notice and a chance to appeal would not need to be provided if a law enforcement agency makes such a request of the ICS, even if the request is not based on a reasonable belief that notice would impede an investigation.

Both versions of the PACT Act require ICS (but not individual providers and small business providers) to publish transparency reports providing data on their flagging, complaint, and removal processes, and other aspects of their systems. But the frequency of these reports is changed from every three months to every six months. Additionally, ICS no longer need to report on the flagging of spam and fraudulent activity through “internal automated detection tool[s].” However, the reporting requirements are expanded to include the number of times content is flagged as illegal or as violating policy by what may be understood as non-users (i.e., government agencies, researchers, and other ICS). ICS must also report the number of times they declined to act on complaints about policy-violating content that required a reasonable extension of the time to respond because of an extraordinary investigation. The previous PACT Act’s mandate that ICS include information on the tools, practices, and techniques used to enforce acceptable use policies has been changed to allow for a vaguer description instead of language that could have been read to require disclosure of the actual tools, practices, and techniques. Another modification would allow ICS to add any information they think aids transparency, and there is a new duty to protect the privacy of content providers.

The first PACT Act carved out a number of classes of online entities that would usually qualify as ICS:

A provider of an ICS that is used by another interactive computer service for the management, control, or operation of that other ICS, including for services such as web hosting, domain registration, content delivery networks, caching, back-end data storage, and cloud management

Consequently, companies like Amazon Web Services would not need to comply with the new PACT Act nor face increased liability under the revisions to Section 230. This language remains the same in the new version, but new language is added exempting broadband internet access services, meaning companies like Verizon FIOS, Comcast, and other internet service providers (ISPs).

Additionally, there is new language making clear the PACT Act does not affect Rule 65 of the Federal Rules of Civil Procedure (which governs injunctions and temporary restraining orders), the All Writs Act (i.e., the late 18th Century law the Federal Bureau of Investigation tried to use to get Apple to help unlock encrypted phones), or intellectual property law.

Schatz and Thune also change the proposed Section 230 carveout to reflect the aforementioned changes to the time ICS have to remove illegal content and activity after notice of a court judgment (four days) and after notice of default judgments and stipulated settlements (ten days). There is a change, however, that would essentially allow ICS to try to get default judgments and stipulated settlements vacated if the plaintiff did not make out a prima facie case (i.e., a legal term meaning the plaintiff did not allege all the elements required to prevail on a claim). This new language may get at instances where a party wins a lawsuit mainly because it was uncontested. This new provision would shield the ICS from liability while it seeks to win these sorts of cases. It is unclear if ICS would face liability if they ultimately lose such cases.

The type of information that qualifies as notice sufficient to trigger an ICS’s obligation to take down illegal content and activity or face liability is changed slightly. The new bill requires “information reasonably sufficient to allow” the ICS to find the illegal material, whereas the previous bill required mere identification of the illegal content or activity. Additionally, after receiving proper notice of illegal content or activity, an ICS must also notify the content provider before removing the content.

Likewise, the above-mentioned expansion of the class of ICS not subject to the bill (i.e., broadband internet access service providers) would be carried over into the Section 230 modification.

Moreover, Section 230 would also be altered to make clear that nothing in Section 230 limits federal or state enforcement of federal criminal and civil statutes and federal regulations.

Finally, the effective date of the PACT Act would be changed from 12 months to 18 months after becoming law.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Sara Kurfeß on Unsplash
