Republicans continue to spotlight Section 230 and supposed bias against conservatives on social media platforms.
On 1 October, against a backdrop of coordinated and increasing Republican focus on 47 U.S.C. § 230 (aka Section 230), the Senate Commerce, Science, and Transportation Committee voted unanimously to subpoena three technology CEOs:
- Jack Dorsey, Chief Executive Officer of Twitter;
- Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and
- Mark Zuckerberg, Chief Executive Officer of Facebook.
Ahead of the markup, it appeared that Democrats on the committee would oppose the effort. The committee’s top Democrat claimed:
Taking the extraordinary step of issuing subpoenas is an attempt to chill the efforts of these companies to remove lies, harassment, and intimidation from their platforms. I will not participate in an attempt to use the committee’s serious subpoena power for a partisan effort 40 days before an election.
However, the chair and ranking member worked out an agreement expanding the scope of the subpoenas beyond Section 230 to include privacy and “media domination.” With this broader language, Democrats voted yes, making the vote unanimous.
This hearing, and the markup held the same day by the Senate Judiciary Committee, are part of the larger Republican narrative casting technology companies as biased against conservative viewpoints. This was also on display during today’s House Judiciary Committee hearing on possible antitrust behavior by large technology companies, where Ranking Member Jim Jordan’s (R-OH) statement had little to say about antitrust law or anticompetitive behavior and focused solely on Section 230. Republicans seem intent on shining a light on what they call a bias against conservatives among technology companies, under which viewpoints from the right are allegedly taken down, censored, and edited at a much higher rate than those on the left. These claims persist even though studies and Facebook’s own data have shown that conservative content is often the most popular on online platforms.
In his opening statement, Chair Roger Wicker (R-MS) said Dorsey, Zuckerberg, and Pichai declined to attend a hearing, necessitating subpoenas. Wicker stated:
This Congress, the Commerce Committee and other committees in this body have examined the growing and unprecedented power and influence that Facebook, Google, and Twitter have in the United States. We have questioned how they are protecting and securing the data of millions of Americans. We have explored how they are combatting disinformation, fraud, and other online scams. We have examined whether they are providing a safe and secure internet experience for children and teens. We have discussed how they are removing content from their sites that encourages extremism and mass violence. We have examined their use of secret algorithms that may manipulate users and drive compulsive usage of the internet among our youth. And most recently, we have reviewed how they are moderating content across their platforms and applying their terms of service and community standards to their users.
Wicker added that “[w]ith over 4.5 billion internet users today, we recognize the challenge of addressing many of the issues I mentioned and policing obscene and other indecent material online…[and] Section 230 of the Communications Decency Act, however, was enacted almost 25 years ago to address this very challenge.” Wicker asserted:
- Over time, this law has undeniably allowed the modern internet to flourish. It has helped usher in more speech and more expression, and limited the proliferation of truly reprehensible content. However, following repeated and consistent reports of political bias and the suppression of certain viewpoints, I fear that Section 230’s sweeping liability protections for Big Tech are stifling a true diversity of political discourse on the internet. According to a 2018 Pew study, seven out of 10 Americans agree.
- On the eve of a momentous and highly-charged election, it is imperative that this committee of jurisdiction and the American people receive a full accounting from the heads of these companies about their content moderation practices.
Wicker claimed “[t]his is not a partisan issue…[because] [b]oth candidates for President today agree.” Wicker asserted:
In May, the President of the United States rightly questioned whether Section 230 has outlived its usefulness. The Democrat[ic] nominee for President has done the same, calling for Big Tech’s liability shield to be “revoked immediately.” One Democrat[ic] member of this committee stated in July that “there is no reason for these platforms to have blanket immunity, a shield against any accountability that is not enjoyed by any other industry in the same way.” This member also acknowledged that “there is a broad consensus that Section 230 as it currently exists no longer affords sufficient protection to the public.” And just last week, another Democrat[ic] member of this committee joined a letter to Facebook demanding answers regarding the company’s inconsistent enforcement of its content moderation policies.
Ranking Member Maria Cantwell (D-WA) remarked:
I actually can’t wait to ask Mr. Zuckerberg further questions. I’m so proud that when we had a hearing before with Mr. Zuckerberg, I asked him an infamous question that’s now part of a movie: “What was their interference in the last elections?” At which point, the woman who is a whistleblower inside the organization says to the camera, “He’s lying.” So, can’t wait to have Mr. Zuckerberg here again.
Cantwell stated:
- I think the issues that we are discussing of how we function in an information age are of extreme importance. I think the issue of privacy and also media domination by the platforms when they put their foot on the throats of local news media is also an issue. So I appreciate [Wicker’s] offer today of adding to the subpoena language both privacy and media as a discussion point we can bring up in the subpoenas.
- What I don’t want to see is a chilling effect on individuals who are in a process of trying to crack down on hate speech or misinformation about COVID during a pandemic. Part of this discussion will end up being about the fact that some of these social platforms have tried to move forward in a positive way and take down information that is incorrect.
- I welcome the debate about 230. I think it should be a long and thoughtful process, not sure that a long and thoughtful process will happen before the election, but I understand my colleagues’ desires here today. So, happy to move forward on these subpoenas with the additions that [Wicker] so graciously added.
At the 1 October markup of Section 230 legislation, the Senate Judiciary Committee opted to hold over the “Online Content Policy Modernization Act” (S.4632) to try to reconcile the fifteen amendments submitted for consideration. The committee could soon meet again to formally mark up and report out this legislation. Even if the Senate passes Section 230 legislation, it is not clear there will be sufficient agreement with Democrats in the House to get a final bill to the President before the end of this Congress. The primary reason is that Democrats are focused on hate speech on online platforms aimed at women, minorities, and other groups, some of which comes from the far right. In their public remarks, Republicans have not called this feature of platforms a problem; they seem more focused on alleged bias and actions against conservative viewpoints.
Chair Lindsey Graham (R-SC) submitted an amendment revising the bill’s reforms to Section 230 that incorporates some of the amendments below but also includes new language. For example, the amended bill would include a definition of “good faith,” a term not currently defined in Section 230. The term would be construed as a platform taking down or restricting content only according to its publicly available terms of service, not as a pretext, and equally for all similarly situated content. Moreover, good faith would require alerting the user and giving him or her an opportunity to respond, subject to certain exceptions. The amendment also makes clear that certain existing means of suing remain available to users (e.g., suing for breach of contract).
Senator Mike Lee (R-UT) offered a host of amendments:
- EHF20913 would remove “user[s]” from the reduced liability shield that online platforms would receive under the bill. Consequently, users would still not be legally liable for the content posted by another user.
- EHF20914 would revise the language regarding the type of content platforms could take down with legal protection to make clear it would cover not just “unlawful” content but rather content “in violation of a duly enacted law of the United States,” possibly meaning federal laws and not state laws. More likely, the intent is to foreclose the possibility that a platform could claim it is acting in concert with a foreign law and still assert immunity.
- EHF20920 would add language making clear that taking down material that violates terms of service or use according to an objectively reasonable belief would be shielded from liability.
- OLL20928 would expand legal protection to platforms for removing or restricting spam.
- OLL20929 would bar the Federal Communications Commission (FCC) from a rulemaking on Section 230.
- OLL20930 adds language making clear if part of the revised Section 230 is found unconstitutional, the rest of the law would still be applicable.
- OLL20938 revises the definition of an “information content provider,” the term of art in Section 230 that identifies a platform, to expand when platforms may be responsible for the creation or development of information and consequently liable for a lawsuit.
Senator Josh Hawley (R-MO) offered an amendment that would create a new right of action allowing people to sue large platforms for taking down their content if not done in “good faith.” The amendment limits this right to “edge providers,” platforms with more than 30 million users in the U.S., 300 million users worldwide, and revenues of more than $1.5 billion. This would likely exclude all platforms except Twitter, Facebook, Instagram, TikTok, Snapchat, and a few others.
Senator John Kennedy (R-LA) offered an amendment that would remove all Section 230 legal immunity from platforms that collect personal data and then use an “automated function” to deliver targeted or tailored content to a user, unless the user “knowingly and intentionally elect[s]” to receive such content.
Also this week, Senators Joe Manchin (D-WV) and John Cornyn (R-TX) introduced a bill that would change Section 230 ostensibly “to stop the illicit sale of opioids and other drugs online,” per their press release, through a new requirement that online platforms report content indicating such activity is occurring. However, this bill would sweep much wider than controlled substances. The customary explanatory preamble of legislation in Congress gives the game away: “[t]o require reporting of suspicious transmissions in order to assist in criminal investigations and counterintelligence activities relating to international terrorism, and for other purposes.”
Under the “See Something, Say Something Online Act of 2020” (S.4758), online platforms would need to report “known suspicious transmissions” of “major crimes,” a term defined to include “crimes of violence,” “domestic or international terrorism,” and “serious drug offense[s].” Online platforms would need to report all such transmissions they should reasonably know about in the form of a “suspicious transmission activity report” (STAR) to the Department of Justice (DOJ) within 30 days, unless they have evidence of an active sale or solicitation of drugs or terrorism. These STARs would be exempt from Freedom of Information Act (FOIA) requests. Platforms would also have to establish a mechanism by which people can report suspicious activity. Failing to report such activity would strip a platform of its Section 230 liability shield and expose it to civil or criminal suits, because a failure to report would make the platform itself the publisher of the content, opening it to legal jeopardy.
Consequently, platforms would need to create systems to vet material posted online for anything objectionable that could then be reported. It is safe to assume we would see overreporting, which raises the question of how the DOJ or state and local law enforcement agencies would choose to manage those possible crimes. Do the DOJ and other law enforcement agencies even have the capacity to handle what could be a considerable number of reports, triaging those serious enough to require immediate action? And would such a statute create a greater incentive for users to move to encrypted platforms, and for developers to build more of them?
It is interesting that the Manchin/Cornyn bill seems to steer clear of child pornography and other exploitative sexual material. In contrast, the “Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020” (EARN IT Act of 2020) (S.3398) would change Section 230 by narrowing the liability shield and potentially making online platforms liable to criminal and civil actions for having child sexual materials on their platforms. Perhaps Manchin and Cornyn are interested in trying to add their bill to the EARN IT Act during Senate consideration, or to the “Online Content Policy Modernization Act” during the Senate Judiciary Committee markup. Or it may be that Manchin and Cornyn simply want an approach to fighting the opioid epidemic they can show to voters in their states. In any event, their intentions are unclear at this point. However, it bears noting that the provision requiring the reporting of domestic terrorism may appeal to many Democratic stakeholders, who have repeatedly expressed concerns about the online activity of white supremacists and the offline effects of this content.
© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.
Image by Gerd Altmann from Pixabay