House E&C Republicans signal areas of tech policy they may cooperate with Democrats on.
The minority on the primary committee of jurisdiction over the technology industry has decided to stake out some legislative territory in the form of a staff memorandum on options Members may pursue in legislation. While this is purportedly a staff memorandum, it is unlikely the Ranking Member, Representative Cathy McMorris Rodgers (R-WA), would allow something this substantive to be released if she did not agree with the exercise. This is not to say McMorris Rodgers’ preferred options are detailed in the memorandum, for there are a range of options, some of which are mutually exclusive. That being said, issuing a staff memorandum gives McMorris Rodgers plausible deniability should the proposals prove unpopular with Republicans on the committee, in the House or Senate, or in the wider policy ecosystem in Washington.
Additionally, the public release of the memorandum serves to continue and develop Republican positions, and to allow the minority to be for something as opposed to merely being against what the majority wants. This dynamic, along with the fact that this memorandum could have been circulated only among Republican offices, suggests that one of the purposes of the memorandum is messaging.
Republican staff take further aim at 47 U.S.C. 230 (Section 230). Over half the proposals deal with peeling back or removing the protection companies like Twitter, Facebook, Reddit, Google, and others enjoy from lawsuits about content on their platforms. Staff pick up Republican claims that these companies have an anti-conservative bias as shown through “censorship,” “shadow banning,” and “cancel culture.” Staff then offer a range of policy options with a few discrete goals: 1) getting platforms to not moderate conservative content; 2) addressing illegal content; and 3) bringing transparency and regularity to decisions to moderate content or to suspend or remove people from platforms.
Otherwise, staff expand on a new approach to technology issues Republicans spoke about at a recent hearing: the mental health of children. They offer policy prescriptions for this as well as ways to safeguard the online world for children through a possible expansion of a current statute. Finally, staff also fold in provisions to address public safety and law enforcement.
There may be some areas of common concern that could yield legislation. Democrats may be open to modifying Section 230 to ensure the large platforms have transparent, comprehensible moderation processes that allow appeals of adverse action. Likewise, Democrats could be very interested in legislation to better protect children, but they would likely want more robust proposals than what the Republican staff offer.
The minority on the House Energy and Commerce Committee have followed up on their “Big Tech Accountability Platform” with “Staff Legislative Concepts,” which is framed as the beginning of legislative drafting. Staff explained “this outreach is to start a considerate, inclusive, and transparent process as staff consider and develop legislation under our Platform…[and] is to first focus on legislative concepts and then work with stakeholders and interested parties as we develop legislative text.”
Nonetheless, Republican staff wanted to stress three points:
1. We will protect free speech: Republicans worked hard to repeal the Federal Communications Commission’s Fairness Doctrine and we will not advocate for a new one.
2. We will be mindful of small businesses and entrepreneurship: Any policy we pursue will balance these essential interests to preserve competition.
3. We will promote American tech leadership and innovation: We will continue to promote American global leadership while working to address issues here at home.
The first item would seem to preempt calls for some version of the Fairness Doctrine online that would possibly require counter views or some sort of fact checking to be appended to posts and content. The second point may signal a new wrinkle in Republican rhetoric and thinking on “Big Tech” with respect to anti-trust and competition issues. They may argue they are acting on behalf of small businesses and entrepreneurs “Big Tech” are currently harming, and that Republicans are not anti-business per se. Finally, Republican staff seem to be working in the national security and economic competition angle, given that many in Washington are focused on the U.S.-People’s Republic of China (PRC) relationship.
Next, staff concede “[r]esolving these complex issues will not be easy.” Staff make an interesting assertion: “[w]e must embrace our conservative principles to find a viable solution consistent with the First Amendment that enables individuals to express themselves freely and protects the right of private companies to control their property.” This sounds like staff trying to walk back some Republicans from their attack on platforms’ alleged violations of the First Amendment, perhaps in recognition that taking this argument too far may lead to other assertions that could impinge on property rights. At present, the Supreme Court of the United States (SCOTUS) has generally not found that people have First Amendment free speech rights with respect to private actors such as Twitter or Facebook. And so, this sentence may recognize that the Republican push to establish actual First Amendment rights on online platforms, or to pressure the platforms to provide them absent a court ruling, may cross comfort lines in dictating to private companies how they should conduct their businesses. Thus far, this potential tension in the Republican party has been papered over.
Not surprisingly, Republican staff start with 47 U.S.C. 230 (Section 230) given the hue and cry among Republicans to take on “Big Tech’s” “censoring” and use of “cancel culture.” Staff seem to be following the lead of SCOTUS Justice Clarence Thomas in his commentary appended to the court’s refusal to take a suit against former President Donald Trump for blocking people on Twitter. The United States Court of Appeals for the Second Circuit had ruled against Trump, holding the blocking of the plaintiffs violated the First Amendment. SCOTUS dodged the case because Trump is no longer President and hence there is no case or controversy. Nonetheless, Thomas decided to weigh in on Section 230 adjacent issues, notably the First Amendment. He voices his disagreement with the Second Circuit over its holding that Trump’s tweets constituted a public space and hence he could not block people over their viewpoints. But Thomas goes on to ponder whether online platforms are akin to common carriers like railroads, telegraph companies, and phone companies, which have been regulated as such. Thomas also proposes treating online platforms as public accommodations, another special class of private entities that are regulated differently than other entities. The upshot of Thomas’ wonderings seems to be finding a legal construct to achieve the broadly held Republican goal of requiring platforms to carry all viewpoints, which is the most charitable reading. A more cynical reading is that Republicans want to force platforms to carry conservative content and to bar these platforms from “censoring” them.
Hence, Republican staff offer one set of options on Section 230 under the heading of “Limit the Right of Exclusion:”
- Define Big Tech companies as places of public accommodation and prohibit discrimination based on political affiliation and/or viewpoint.
- Alternatively, define Big Tech companies as places of public accommodation and limit liability protections to content moderation processes that provide a measure of due process to users.
Apparently, Thomas’ notion of turning online platforms into common carriers was not appealing. Is this because the Federal Communications Commission (FCC) may then get jurisdiction over them? Is this because, if this were the case, Democrats and other stakeholders might join the dots and argue that if online platforms cannot discriminate against different viewpoints, internet service providers (ISP) may not discriminate against content flowing across their networks (aka net neutrality)? It is not clear, and I am obviously speculating.
Staff go on to propose a second approach on which Republicans may seek to draft legislation to reform Section 230 under a separate heading “Require Reasonable Moderation Practices:”
- Require Big Tech companies to implement and maintain reasonable moderation practices to address illegal drug sales; child exploitation, including child pornography and trafficking; targeted harassment or bullying of users under the age of 18; terrorism; counterfeit products and materials sales; and all other illegal content on their platforms.
- Failure to implement and maintain such reasonable moderation practices is a violation of Section 5 of the Federal Trade Commission (FTC) Act.
- Such companies may be liable for content decisions related to content included above but may assert liability protections if they implement and maintain reasonable moderation practices.
This second tranche sounds very much like the “Platform Accountability and Consumer Transparency (PACT) Act,” (S.797), a bill recently reintroduced by Senator Brian Schatz (D-HI) and Senate Minority Whip John Thune (R-SD). Instead of targeting certain content, the PACT Act targets all illegal content and lays out the process platforms must use to weigh complaints about content that may violate terms of service. (see here for more detail and analysis.) A process based approach to reforming Section 230 may be the best hope for those looking to revise this liability shield.
A third set of options for Section 230 is offered by Republican staff under the heading of “Limit Liability to Protected Speech:”
- Modify Section 230 to only provide liability protection for moderation of speech that is not protected by the First Amendment or specifically listed in the statute.
This would open platforms to lawsuits for moderating speech traditionally protected by the First Amendment, which is first and foremost political speech. And so, the Republican goal of pushing the platforms not to “censor” conservative viewpoints and content could be achieved. It would also serve to stop what some may call a judicial expansion of Section 230 that protects platforms in ways never envisioned when Congress passed this language and not supported by a reasonable interpretation of the text.
Republican staff offered a fourth option titled “Remove Liability Protections,” that would merely “[r]emove liability protection under Section 230 for content moderation decisions made by Big Tech companies that discriminate based on political affiliation or viewpoint.” This may be the most direct but least graceful way to achieve Republican policy goals. It almost goes without saying this would not be supported by Democrats.
Under a fifth heading, staff float the idea of requiring an appeal process for moderation of First Amendment speech, suspensions, and deplatformings. Platforms would be required to have appeals processes that meet certain requirements. This is similar to the PACT Act, but that bill would mandate a process for people to complain about content that violates the platform’s terms of service and for those whose content is taken down to appeal those moderation decisions. This policy proposal goes beyond the PACT Act.
The sixth tranche would remove “Big Tech” companies from Section 230 altogether while maintaining its protections for smaller companies and new entrants. It is unclear whether this would necessarily require that when companies reached a certain size, they be cast out of Section 230. An ancillary proposal would be to remove liability protection for “companies engaged in targeted behavioral advertising,” which would strike at the business model of the large platforms. There have been proposals to peel back Section 230 protection for all paid advertisements, but Republican staff seem to be proposing going after only targeted behavioral advertising.
The seventh Section 230 proposal would set a five-year expiration date for this language, which would create a process by which Congress would have to debate and pass a new authorization. This may be a bit of a Trojan Horse, for some Members may sell this as a means by which Congress can regularly revisit and adjust Section 230 as needed. However, if there were enough opposition to Section 230, a reauthorization could be blocked and the liability protection would cease to exist.
The eighth set focuses on the content moderation processes of platforms and would make them subject to FTC and state attorneys general enforcement powers against unfair and deceptive practices. Again, this covers much the same ground that the PACT Act does and may prove the most likely policy path for Section 230 legislation.
Interestingly, staff are not proposing that Members offer legislation seeking to address the underlying algorithms that amplify content. This is an approach a number of Democrats have taken to Section 230 by seeking to ensure that the algorithms the platforms use do not result in the promotion of hate speech, violent content, or discriminatory material. One may conclude Republicans are not keen on this approach. Likewise, there is nary a mention of civil rights violations online. At least one Democratic bill would remove liability protection under Section 230 for material that violates federal civil rights law.
Next, Republican staff turn to a policy area where they may find agreement with Democrats much more easily. There is a proposal to expand the Children’s Online Privacy Protection Act (COPPA) by expanding the statute “to focus the FTC where protections for teens and younger children have been compromised.” This language does not exactly make clear what protection this would be. Staff also propose to “[r]equire Big Tech companies to report on what kind of advertising they are conducting on this age group, how they have authenticated the material on the platform, including how user’s ages have been verified, whether the content is user-generated by this age-group, and what research has been conducted to establish protocols for the delivery of content to this age group.” This other option does not seem to immediately entail expanded enforcement of COPPA but may set the predicate for future legislation.
Then staff turn to a recent policy focus of Republicans on the committee: children’s mental health. This line of interest debuted at the 25 March hearing with the CEOs of Facebook, Alphabet/Google, and Twitter. However, given the extent of the alleged harm to children and teens, staff do not seem alarmed enough to propose any changes that would limit “Big Tech:”
- Require Big Tech companies to track trends on their product’s impact on children’s mental health, including degraded self-worth, self-harm, targeted harassment of children and cyberbullying and any offline harm resulting from such bullying.
- Direct relevant federal agencies to work with Big Tech and other interested parties to develop an educational campaign about the risks Big Tech poses to children’s mental health and well-being.
This dichotomy was present at the 25 March hearing. In her opening statement, McMorris Rodgers made the case “Big Tech” is harming children:
Your platforms are my biggest fear as a parent. I’m a mom of three school-aged kids. My husband and I are fighting the Big Tech battles in our household every day. It’s a battle for their development, a battle for their mental health and — ultimately — a battle for their safety. I’ve monitored where your algorithms lead them. It’s frightening. I know I’m not alone. After multiple teenage suicides in my community, I reached out to our schools and we started asking questions.
But, on the other hand, the strongest remedy she offered is giving parents more information about the risks of “Big Tech:”
Big Tech needs to be exposed and completely transparent for what you are doing to our children so parents like me can make informed decisions. We also expect Big Tech to do more to protect children because you haven’t done enough. Big Tech has failed to be good stewards of your platforms.
The final group of policy areas in which “Big Tech” may face policy changes regard law enforcement:
- Require Big Tech companies to collaborate with law enforcement to educate the public on what recourses the public has when their safety and security has been violated.
- Require such companies to have protocols in place to assist law enforcement to protect individuals from harm in a timely fashion.
- Require such companies to cooperate with law enforcement to target perpetrators that promote, post, or otherwise engage in the distribution of illegal drug sales; child exploitation, including child pornography and trafficking; targeted harassment or bullying of users under the age of 18; terrorism; counterfeit products and materials sales; and all other illegal content.
- Require such companies to report on their efforts to cooperate with law enforcement.
- FTC will consult with local, state, and other federal authorities on the level of cooperation that the companies have provided law enforcement.
I wonder if the second bullet point’s “protocols” to “assist law enforcement” to protect people from “harm” is not an entry into the encryption war, suggesting that Members may think about drafting legislation requiring “Big Tech” to maintain the ability to decrypt devices and services.
It bears note that staff did not mention privacy or data protection even though McMorris Rodgers, then the ranking member on the Consumer Protection and Commerce Subcommittee, issued a privacy discussion draft with Chair Jan Schakowsky (D-IL) in late 2019 (see here for more analysis.) This may be a sign that McMorris Rodgers wants to negotiate this issue separate from the other “Big Tech” issues.
© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.
Photo by Giu Vicente on Unsplash