The House Energy and Commerce Committee’s Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) and Representative Kathy Castor (D-FL) introduced their “Online Consumer Protection Act” (OCPA) (H.R.3067) in mid-May 2021. However, the House Energy and Commerce Committee has not yet acted upon the bill, and there may not be any action on Section 230 soon given the apparent stalemate between Republicans and Democrats on the types of changes they want. Moreover, other legislative priorities may occupy this committee for the immediate future.
Nonetheless, this bill may represent the approach some House Democrats take in addressing some of the problems they see with how 47 U.S.C. 230 (aka Section 230) is used to shield social media platforms and online marketplaces from oversight, regulation of content moderation, and consumer protection laws.
As such, OCPA aims to address two sets of problems: the lack of transparency in how social media platforms remove content and suspend or block users, and the difficulty consumers face in holding anyone responsible for counterfeit and unsafe goods sold online. Both classes of entities have been using Section 230 to fend off litigation and calls for what Schakowsky, Castor, and other Democrats would consider proper and safe content moderation and online selling and distribution of goods.
OCPA would change U.S. regulation of both “social media platforms” such as Facebook, Twitter, TikTok, and Instagram and “online marketplaces” such as Amazon and eBay. Interestingly, this second category of entities would seem to encompass Best Buy, Walmart, and other such marketplaces with third-party sellers. And so, a Macy’s or CVS that sells items online only for itself would not qualify. The Federal Trade Commission (FTC) would receive additional authority to regulate both classes of entities, and unlike a number of bills, there is no exception for businesses and entities below a certain size. All entities that are either social media platforms or online marketplaces would experience new oversight and scrutiny from the FTC to the extent the already stretched agency can take on new tasks and missions.
Unlike a number of other Section 230 bills, H.R.3067 features some of the process-oriented reforms for how online platforms moderate and remove content like those found in the “Platform Accountability and Consumer Transparency Act” (PACT Act) (S.797). The text of the liability shield in Section 230(c) would not be altered, as some have proposed in other legislation; instead, OCPA enshrines in law a functional curtailing of Section 230 by specifying that the shield does not protect online marketplaces and social media platforms from liability for violating the new statute.
Online marketplaces and social media platforms would need to publish new terms of service explaining how they deal with a range of issues and the rights users and sellers would have, new consumer protection policies, and new consumer protection programs (which are different from the aforementioned policies) to establish the procedures by which they would meet the requirements of the new law. The FTC would regulate both classes of entities and could seek civil penalties and injunctive relief, as could state governments. Moreover, people could sue social media platforms and online marketplaces (the so-called private right of action).
In terms of what the bill would require, OCPA requires each social media platform and online marketplace to “establish, maintain, and make publicly available at all times and in a machine-readable format, terms of service in a manner that is clear, easily understood, and written in plain and concise language.” These terms of service must include:
- any terms or conditions of use of any service provided by such person to a consumer;
- any policies of such person with regard to such service or use of such service by a consumer; and
- a consumer protection policy (more on this below).
Additionally, all terms of service must at least detail the following:
- payment methods;
- content ownership, including content generated by a user;
- policies related to sharing user content with third parties;
- any disclaimers, limitations, notices of nonliability, or the consequences of not agreeing to or complying with the terms of service; and
- any other topic the [FTC] deems appropriate.
Of course, per the last item, the FTC may add other required items for disclosure in these new terms of service. Some of this information may cause heartburn among some of the larger companies. For example, Facebook would need to detail all the third parties with whom they share user content, and unlike many of the data privacy bills, actual entities would need to be named as opposed to categories of entities.
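The bill requires these terms of service to be “machine-readable” but does not name a format. As a purely hypothetical sketch, a platform might publish the required topics as a JSON document keyed by each disclosure item; all field names below are illustrative assumptions, not statutory language:

```python
# Hypothetical sketch of a "machine-readable" terms-of-service
# disclosure. OCPA names no format; JSON keyed by the bill's required
# topics is one plausible approach. Field names are assumptions.
import json

REQUIRED_TOPICS = {
    "payment_methods",
    "content_ownership",
    "third_party_sharing",
    "disclaimers_and_limitations",
    "consumer_protection_policy",
}

def missing_topics(terms_json: str) -> set:
    """Return any required topics absent from a machine-readable ToS."""
    terms = json.loads(terms_json)
    return REQUIRED_TOPICS - set(terms)

example = json.dumps({
    "payment_methods": "Credit card, ACH",
    "content_ownership": "Users retain ownership of posted content.",
    "third_party_sharing": ["ExampleAdsCo", "ExampleAnalyticsCo"],
    "disclaimers_and_limitations": "Service provided as-is.",
    "consumer_protection_policy": {"content_rules": "...", "appeals": "..."},
})

print(missing_topics(example))  # prints set() — all topics present
```

Note that, consistent with the author’s reading above, the hypothetical `third_party_sharing` field lists named entities rather than categories of entities.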
As mentioned, social media platforms and online marketplaces would need to draft and issue consumer protection policies, which have different elements for each class of entities. One must keep in mind these are subsections of the new legal requirement for terms of service under OCPA. Social media platforms’ consumer protection policies must include:
- a description of the content and behavior permitted or prohibited on its service both by the platform and by users;
- whether content may be blocked, removed, or modified, or if service to users may be terminated and the grounds upon which such actions will be taken;
- whether a person can request that content be blocked, removed, or modified, or that a user’s service be terminated, and how to make such a request;
- a description of how a user will be notified of and can respond to a request that his or her content be blocked, removed, or modified, or service be terminated, if such actions are taken;
- how a person can appeal a decision to block, remove, or modify content, allow content to remain, or terminate or not terminate service to a user, if such actions are taken; and
- any other topic the Commission deems appropriate.
A number of these new requirements would address claims that Facebook, Twitter, and others act opaquely and arbitrarily in punishing some users and removing some content while other, similar conduct and content goes unaddressed. Now social media platforms would need to explain what is allowed and what is not, the conditions under which content may be moderated, whether and under what circumstances users can be banned, and how users can ask that content be moderated; they would also need an appeals process for those whose content is indeed moderated and must address any other topic the FTC deems necessary.
For online marketplaces, the new consumer protection policies must include:
- a description of the products, product descriptions, and marketing material allowed or disallowed on the marketplace;
- whether a product, product descriptions, and marketing material may be blocked, removed, or modified, or if service to a user may be terminated and the grounds upon which such actions will be taken;
- whether users will be notified of products that have been recalled or are dangerous, and how they will be notified.
This information would address the problem of online marketplaces (predominantly Amazon) operating as black boxes, leaving consumers unable to determine the company’s policies about what can and cannot be sold, if or when products can be removed, and whether the marketplace will tell them about recalled or dangerous products.
There are additional policies that must be divulged for both sellers and users of online marketplaces. For sellers, these companies must detail in their consumer protection policies:
- how sellers are notified of a report by a user or a violation of the terms of service or consumer protection policy;
- how to contest a report by a user;
- how a seller who is the subject of a report will be notified of what action will be or must be taken as a result of the report and the justification for such action;
- how to appeal a decision of the online marketplace to take an action in response to a user report or for a violation of the terms of service or consumer protection policy; and
- the policy regarding refunds, repairs, replacements, or other remedies as a result of a user report or a violation of the terms of service or consumer protection policy.
These disclosures seem designed to clarify when a seller may have her products removed for violating terms of service or consumer protection policies, how she would be notified, how she may appeal, and the marketplace’s policies about redress for users regarding products that fail to meet standards.
The consumer protection policies of online marketplaces would also address how users can report violations, including dangerous, fraudulent, and deceptive products; whether a person who complains will be notified of subsequent action; how to file an appeal of an adverse decision; and when and if refunds, repairs, or other remedies are available. Specifically, the user section of consumer protection policies should explain:
- whether a user can report suspected fraud, deception, dangerous products, or violations of the online marketplace’s terms of service, and how to make such report;
- whether a user who submitted a report will be notified of whether action was taken as a result of the report, the action that was taken and the reason why action was taken or not taken, and how the user will be notified;
- how to appeal the result of a report; and
- under what circumstances a user is entitled to refund, repair, or other remedy and the remedy to which the user may be entitled, how the user will be notified of such entitlement, and how the user may claim such remedy.
The FTC would have six months after enactment to “conduct a study to determine the most effective method of communicating common consumer protection practices in short-form consumer disclosure statements or graphic icons that disclose the consumer protection and content moderation practices of social media platforms and online marketplaces.” The FTC would then need to draft regulations to “require social media platforms and online marketplaces to communicate their consumer protection and content moderation practices, and any other information as the Commission may determine, in a clear and conspicuous manner.” The FTC would then need to determine if “such rules would advance consumer understanding of consumer protection and content moderation practices of social media platforms and online marketplaces,” and if so, then it would finalize them. If not, the agency would scrap the regulations.
Covered entities would also need to “establish and implement a consumer protection program that includes policies, practices, and procedures regarding consumer protection and content moderation.” Such consumer protection programs would:
- ensure compliance with applicable Federal, State, and local consumer protection laws;
- develop, implement, and ensure compliance with the terms of service…;
- develop and implement policies regarding the content and behavior permitted on its service both by the platform and users, and ensure compliance with such policies, practices and procedures;
- mitigate risks that could be harmful to the safety, well-being, and reasonable expectations of users of the social media platform or online marketplace;
- implement reasonable safeguards within, and training and education of employees and contractors of, the social media platform or online marketplace to promote compliance with all consumer protection laws and the consumer protection program; and
- disclose any other requirement the Commission deems appropriate.
These requirements would mandate that social media platforms and online marketplaces take action to meet all applicable consumer protection laws, comply with their terms of service, spell out what content and behavior are allowed, address and reduce risks to the safety, well-being, and reasonable expectations of users, and meet any other requirement the FTC deems necessary. All of this would need to be done on a sliding scale considering the size and complexity of the company, the costs, and the types of activities in which users are engaging.
In executing the consumer protection programs, OCPA gets specific about the steps online marketplaces and social media platforms will need to take to achieve the above-listed goals:
- establish processes to monitor, manage, and enforce the social media platform’s or online marketplace’s consumer protection program, and demonstrate the covered entity’s compliance with Federal, State, and local consumer protection laws;
- establish processes to assess and mitigate the risks to individuals resulting from the social media platform’s or online marketplace’s amplification of content or products not in compliance with its terms of service;
- establish a process to periodically review and update the consumer protection program;
- appoint a consumer protection officer, who reports directly to the chief executive officer; and
- establish and implement controls to monitor and mitigate known or reasonably foreseeable risks to consumers resulting from hosting content or products.
And while OCPA would not exclude smaller social media platforms and online marketplaces, there would be filing requirements incumbent only upon entities with more than $250,000 in revenue or more than 10,000 monthly users. These annual filings would need to detail:
- a detailed and granular description of [terms of service and consumer protection policies] and [consumer protection programs];
- the name and contact information of the consumer protection officer…;and
- a description of any material changes in the consumer protection program or the terms of service since the most recent prior disclosure to the Commission.
Each company required to make such filings must have their “principal executive officer” and “consumer protection officer” sign the filings and be responsible for any and all material misrepresentations or omissions. Moreover, these filings must be based on these officers’ actual knowledge of “the consumer protection practices of the social media platform or online marketplace.” The FTC must make public all such filings and may withhold information that should not be public in the agency’s view.
The FTC could treat all violations of OCPA as violations of a trade regulation, allowing the agency to seek over $43,000 per violation even for first offenses, along with any injunctive relief. The FTC would receive power to carry out normal notice-and-comment rulemakings to implement OCPA instead of the onerous, almost impossible to use Magnuson-Moss rulemaking procedures.
State attorneys general or other designated state officials may enforce OCPA through filing suit in federal court in order to:
- enjoin further such violation by such person;
- enforce compliance with this Act;
- obtain civil penalties; and
- obtain damages, restitution, or other compensation on behalf of residents of the State.
State governments are not barred from suing in their courts under state statutes for conduct that also violates OCPA.
People would be allowed to sue under a private right of action in either federal or state court. Moreover, any “pre-dispute arbitration agreement or pre-dispute joint action waiver” that could block a person’s right to sue under OCPA would be deemed invalid. People could sue for actual damages, reasonable attorney’s fees and litigation costs, and “any other relief, including equitable or declaratory relief, that the court determines appropriate.”
OCPA clearly states that Section 230 has no relevance to violations of the act and that the act does not preempt state statutes. Section 230 would also be amended to clarify that it does not impinge on the FTC’s ability to enforce any law in its purview.
© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.