Spotlight: A Privacy Bill A Week: “Consumer Data Protection Act”

Last week, we delved into Senator Catherine Cortez Masto’s (D-NV) “Digital Accountability and Transparency to Advance Privacy Act” (DATA Privacy Act) (S. 583). Of course, Cortez Masto served as the attorney general of Nevada for eight years before succeeding former Senator Harry Reid (D-NV), and that bill reflects her background as her state’s top prosecutor. This week, we will analyze the most stringent, most pro-consumer privacy bill I have seen introduced in this Congress or the last.

In November, Senate Finance Committee Ranking Member Ron Wyden (D-OR) released the “Consumer Data Protection Act” discussion draft, section-by-section, and one-pager, legislation not to be confused with Senator Bob Menendez’s (D-NJ) “Consumer Data Protection Act” (S. 2188), a data security and breach notification bill. In short, Wyden’s bill would vastly expand the power of the Federal Trade Commission (FTC) to police both the security and privacy practices of many U.S. and multinational companies. The FTC would receive the authority to levy fines in the first instance, potentially as high as the 4% of annual gross revenue permitted under the European Union’s General Data Protection Regulation (GDPR). Moreover, the operative definition of the “personal information” that must be protected or subject to the privacy wishes of a consumer is very broad. The bill would also sweep artificial intelligence (AI) and algorithms (i.e., so-called big data) into the FTC’s jurisdiction.

The “Consumer Data Protection Act” would dramatically expand the types of harms the FTC may punish, explicitly including privacy violations and noneconomic injuries. Currently, the FTC must use its Section 5 powers against unfair and deceptive practices, or another statutory basis such as the Children’s Online Privacy Protection Act (COPPA), to target the privacy practices it considers unacceptable. Wyden’s bill would allow the FTC to enforce the FTC Act, as amended by his bill, against “noneconomic impacts and those creating a significant risk of unjustified exposure of personal information” as among those “substantial injur[ies]” made illegal. It is worth seeing the proposed language in the context of the relevant section of the FTC’s organic statute (i.e., 15 U.S.C. § 45(n)):

(n) Standard of proof; public policy considerations

The Commission shall have no authority…to declare unlawful an act or practice on the grounds that such act or practice is unfair unless the act or practice causes or is likely to cause substantial injury [including those involving noneconomic impacts and those creating a significant risk of unjustified exposure of personal information] to consumers which is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition (the bracketed language is what the bill would add).

The FTC’s new authority would likely be defined through court actions testing the outer limits of what constitutes “noneconomic impacts” and which substantial injuries create a significant risk of unjustified exposure of personal information. If this language were enacted, industry groups and conservative advocacy organizations would undoubtedly search zealously for test cases to circumscribe this authority as narrowly as possible. Finally, it bears noting that this sort of language harkens back to the FTC’s construction of its statutory powers in the 1960s and 1970s, which was considered so expansive that a Democratic Congress reined in the agency and limited its purview.

The FTC’s authority to levy civil fines through an administrative proceeding would be dramatically expanded along the lines of the EU’s power to levy massive fines under the GDPR. Notably, without securing a court order, the agency could impose civil fines as part of a cease and desist order of up to the higher of $50,000 per violation or 4% of the annual gross revenue of the offender in the previous fiscal year. The upper limits of such a fine structure get very high, very quickly. For example, a violation affecting 100,000 people yields an upper boundary of $5 billion, assuming one violation per person. The privacy violations associated with Facebook’s conduct with Cambridge Analytica affected 87 million people worldwide, and again assuming one violation per person, the upper boundary of the fine the FTC could levy would be $4,350,000,000,000. However, the FTC would likely not exercise this power to the utmost possible fine but rather dial it back to a more reasonable but still punitive amount. Nonetheless, the FTC would have the ability to recover up to $50,000 per violation or 4% of gross annual revenue for any violations of cease and desist orders by filing an action in federal court.
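
To make the arithmetic above concrete, here is a minimal sketch in Python of the fine ceiling as described; the $50,000-per-violation and 4%-of-revenue figures come from the bill, while the revenue inputs and the one-violation-per-person assumption are purely illustrative:

```python
def fine_ceiling(violations: int, annual_gross_revenue: float) -> float:
    """Upper bound of a civil fine under the bill: the higher of
    $50,000 per violation or 4% of the prior year's gross revenue."""
    return max(50_000 * violations, 0.04 * annual_gross_revenue)

# Assuming one violation per affected person; revenue figures are made up.
print(f"${fine_ceiling(100_000, 1_000_000_000):,.0f}")      # $5,000,000,000
print(f"${fine_ceiling(87_000_000, 55_000_000_000):,.0f}")  # $4,350,000,000,000
```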

Despite expanding the FTC’s powers dramatically, the entities subject to the agency’s new enforcement powers would not include many small and medium-sized businesses. Covered entities are described as those with more “than $50,000,000 in average annual gross receipts for the 3-taxable-year period preceding the fiscal year” and the “personal information” of more than 1,000,000 consumers and 1,000,000 consumer devices. Additionally, a covered entity may be an affiliate or subsidiary of an entity that meets the aforementioned qualifications. Finally, the term “covered entity” covers all data brokers or commercial entities “that, as a substantial part of their business, collects, assembles, or maintains personal information concerning an individual who is not a customer or an employee of that entity in order to sell or trade the information or provide third-party access to the information.”

Additionally, a subset of these covered entities must submit annual data protection reports to the FTC: those with more than $1 billion in annual revenues that “stores, shares, or uses personal information on more than 1,000,000 consumers or consumer devices” and those “that stores, shares, or uses personal information on more than 50,000,000 consumers or consumer devices.” Those entities must report “in detail whether, during the reporting period, the covered entity complied with the regulations” the FTC will promulgate to effectuate the “Consumer Data Protection Act” and, to the extent they did not comply, detail which regulations were violated and the number of consumers affected.
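
As a rough sketch of how the coverage and reporting thresholds described above might interact, under a simplified reading of the bill (the field and function names are mine, and affiliate/subsidiary coverage is ignored):

```python
from dataclasses import dataclass

@dataclass
class Entity:
    avg_annual_gross_receipts: float  # 3-taxable-year average
    consumers_with_data: int
    consumer_devices_with_data: int
    annual_revenue: float
    is_data_broker: bool = False

def is_covered(e: Entity) -> bool:
    """Covered entity: >$50M in average receipts plus personal information
    on >1,000,000 consumers and consumer devices, or any data broker."""
    size_test = (e.avg_annual_gross_receipts > 50_000_000
                 and e.consumers_with_data > 1_000_000
                 and e.consumer_devices_with_data > 1_000_000)
    return size_test or e.is_data_broker

def must_file_annual_report(e: Entity) -> bool:
    """Reporting duty: >$1B in revenue with data on >1M consumers or
    devices, or data on >50M consumers or devices at any revenue."""
    n = max(e.consumers_with_data, e.consumer_devices_with_data)
    return (e.annual_revenue > 1_000_000_000 and n > 1_000_000) or n > 50_000_000
```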

Each report must “be accompanied by a written statement by the chief executive officer, chief privacy officer (or equivalent thereof), and chief information security officer (or equivalent thereof) of the company” that certifies the report fully complies with the requirements of the new statute. Any such person who certifies an annual data protection report knowing it does not meet those requirements, or who does so intentionally, faces jail time and/or a personal fine tied to compensation, with the severity depending on the state of knowledge with which the report was falsely certified. A CEO, chief privacy officer, or chief information security officer who knowingly certifies a false report faces a fine of the greater of $1 million or 5% of the highest annual compensation for the previous three years and up to ten years in prison. Intentional violations expose these corporate officials to the greater of a $5 million fine or 25% of the highest annual compensation for the previous three years and up to 20 years in prison.
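
Expressed as a quick sketch (the function name is mine; the figures are from the bill as described above):

```python
def certification_penalty(highest_annual_comp_3yr: float, intentional: bool):
    """Return (maximum personal fine, maximum prison term in years) for
    falsely certifying an annual data protection report."""
    if intentional:
        return max(5_000_000, 0.25 * highest_annual_comp_3yr), 20
    return max(1_000_000, 0.05 * highest_annual_comp_3yr), 10

# A knowing false certification by an executive whose peak pay was $30M:
fine, years = certification_penalty(30_000_000, intentional=False)
# fine == $1,500,000 (5% of $30M exceeds the $1M floor), up to 10 years
```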

Of course, falsely certifying while knowing that a report fails to meet all the requirements exposes a person to less criminal liability than intentionally certifying. However, the substantive difference between a knowing certification and an intentional one is not immediately clear. Perhaps the bill intends “knowing” to mean constructive knowledge (i.e., knew or should have known) while intentionality in this context means actual knowledge.

With respect to the information covered entities would need to safeguard, the bill broadly defines “personal information” as “any information, regardless of how the information is collected, inferred, or obtained that is reasonably linkable to a specific consumer or consumer device.” Wyden’s bill also defines “use,” “share,” and “store” in the context of personal information:

  • “share’’—
    • means the actions of a person, partnership, or corporation transferring information to another person, partnership, or corporation; and
    • includes actions to knowingly—
      • share, exchange, transfer, sell, lease, rent, provide, disclose, or otherwise permit access to information; or
      • enable or facilitate the collection of personal information by a third party.
  • ‘‘store’’—
    • means the actions of a person, partnership, or corporation to retain information; and
    • includes actions to store, collect, assemble, possess, control, or maintain information.
  • ‘‘use’’ means the actions of a person, partnership, or corporation in using information, including actions to use, process, or access information.

The FTC would be required to promulgate the detailed regulations discussed below within two years of enactment. This timeline may be more realistic than those in many of the other bills, which task the agency with detailed, extensive rulemakings within a year, a deadline the FTC may have trouble meeting. Nonetheless, the agency could take the first year or even 15 months to draft proposed regulations for comment.

The bill would task the FTC with establishing and running a “Do Not Track” data sharing opt-out website that would stop covered entities from sharing a consumer’s personal information, subject to certain exceptions. One exception covers personal information acquired before a consumer opts out, when a covered entity needs to share the information to achieve the primary purpose for which the information was initially acquired. The bar on sharing would, however, apply to personal information a covered entity acquires from non-covered entities.

The FTC would also need to determine technological means by which a consumer’s opt-out on its website can be effectuated through web browsers or operating systems. The agency would also need to devise a method by which covered entities could determine which consumers have opted out, possibly through the development of an FTC Application Programming Interface (API). Thereafter, covered entities would have a duty to check the FTC’s opt-out database at regular intervals to ensure they are honoring consumers’ decisions to opt out. Covered entities would not need to respect a consumer’s desire to opt out in the event of legal disclosures they are required to make to the government, such as under warrants or subpoenas. The FTC would also need to “establish standards and procedures, including through an API, for a covered entity to request and obtain consent from a consumer who has opted-out…for the covered entity to not be bound by the opt-out,” including providing a list of third parties with whom personal information might be shared and a description of such information. And, if the covered entity requires consumers to consent to usage of their personal information before its products or services can be used, then the covered entity must “notify the consumer that he or she can obtain a substantially similar product or service in exchange for monetary payment or other compensation rather than by permitting the covered entity to share the consumer’s personal information.”
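
Because the bill leaves the mechanics of any such API to the FTC, the following is only a speculative sketch of how a covered entity might check opt-out status before sharing data; the endpoint URL, parameters, and response shape are all invented:

```python
import requests  # third-party HTTP client (pip install requests)

# Placeholder URL for a registry the FTC has not, of course, built.
FTC_OPTOUT_URL = "https://optout.ftc.example/api/v1/status"

def consumer_has_opted_out(consumer_id: str) -> bool:
    """Query the hypothetical FTC opt-out registry for one consumer."""
    resp = requests.get(FTC_OPTOUT_URL, params={"consumer": consumer_id},
                        timeout=10)
    resp.raise_for_status()
    return resp.json()["opted_out"]

def may_share(consumer_id: str, has_specific_consent: bool,
              legally_required: bool) -> bool:
    """Sharing is permissible if the consumer has not opted out, has
    consented despite opting out, or the disclosure is legally required
    (e.g., a warrant or subpoena)."""
    if legally_required or has_specific_consent:
        return True
    return not consumer_has_opted_out(consumer_id)
```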

The FTC must also “establish standards and procedures requiring that when a non-covered entity that is not the consumer shares personal information about that consumer with a covered-entity, the covered entity shall make reasonable efforts to verify the opt-out status of the consumer whose personal information has been shared with the covered entity.” Thereafter, covered entities may only use or store this personal information if the consumer has not opted out on the FTC’s website or if the covered entity has received the consumer’s consent for non-covered entities to collect and share their information.

Additionally, the FTC must draft regulations detailing the “standards and procedures” covered entities and non-covered entities must follow “to request and obtain consent from a consumer…that clearly identifies the covered entity that will be storing or using the personal information and provides the consumer” with certain disclosures at the time consent is sought. Consumers must be informed, “in a form that is understandable to a reasonable consumer,” of the entity from which personal information is to be obtained, the type of personal information to be collected, and the purposes for which such information shall be used.

Certain acts would be prohibited. Covered entities could not require consumers to change their opt-out election on the FTC’s website in order to access products and services “unless the consumer is also given an option to pay a fee to use a substantially similar service that is not conditioned upon a requirement that the consumer give the covered entity consent to not be bound by the consumer’s opt-out status.” Moreover, this fee “shall not be greater than the amount of monetary gain the covered entity would have earned had the average consumer not opted-out.”
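
This fee cap reduces to a simple comparison; here is a tiny illustrative check (the dollar figures are made up):

```python
def fee_is_lawful(proposed_fee: float, avg_gain_per_consumer: float) -> bool:
    """The fee for the paid, tracking-free alternative may not exceed the
    monetary gain the entity would have earned had the average consumer
    not opted out."""
    return proposed_fee <= avg_gain_per_consumer

# If the average non-opted-out consumer yields $24/year in data-driven
# revenue, a $30/year fee for the tracking-free version would be unlawful.
assert not fee_is_lawful(30.0, 24.0)
```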

Wyden’s bill also marries data security requirements with privacy protections for consumers, a position articulated by a number of prominent Democrats. Notably, the FTC would need to promulgate regulations that

  • require each covered entity to establish and implement reasonable cyber security and privacy policies, practices, and procedures to protect personal information used, stored, or shared by the covered entity from improper access, disclosure, exposure, or use;
  • require each covered entity to implement reasonable physical, technical, and organizational measures to ensure that technologies or products used, produced, sold, offered, or leased by the covered entity that the covered entity knows or has reason to believe store, process, or otherwise interact with personal information are built and function consistently with reasonable data protection practices;

The FTC would also need to draft regulations requiring “each covered entity to provide, at no cost, not later than 30 business days after receiving a written request from a verified consumer about whom the covered entity stores personal information” a way to review any personal information stored, including how and when such information was acquired, as well as a process for challenging the accuracy of any stored information. Additionally, these regulations would “require each covered entity to correct the stored personal information of the verified consumer if, after investigating a challenge by a verified consumer…the covered entity determines that the personal information is inaccurate.” Covered entities would also need to furnish a list of the entities with whom the consumer’s personal information was shared and other detailed information, including any personal information of the consumer the covered entity acquired not from the consumer but from a third party.
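
A bare-bones sketch of what honoring such an access request might involve follows; `store` stands in for a hypothetical data-access layer, and all of its methods are invented for illustration:

```python
def handle_access_request(store, consumer_id: str) -> dict:
    """Assemble the disclosures the bill would require for a verified
    consumer: stored data, provenance, recipients, and third-party-sourced
    records. Must be furnished at no cost within 30 business days."""
    return {
        "personal_information": store.records_for(consumer_id),
        "how_and_when_acquired": store.provenance_for(consumer_id),
        "shared_with": store.recipients_for(consumer_id),
        "third_party_sourced": store.third_party_records_for(consumer_id),
    }

def handle_accuracy_challenge(store, consumer_id: str, field_name: str,
                              found_inaccurate: bool) -> None:
    # Correct stored personal information only if, after investigating the
    # consumer's challenge, the entity determines it is inaccurate.
    if found_inaccurate:
        store.correct(consumer_id, field_name)
```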

The “Consumer Data Protection Act” would also institute regulations and requirements related to the increasing use of so-called “big data,” algorithms, machine learning, and artificial intelligence. The FTC would need to promulgate regulations mandating that each covered entity must “conduct automated decision system impact assessments of existing high-risk automated decision systems, as frequently as the Commission determines is necessary; and…new high-risk automated decision systems, prior to implementation.” It is helpful to examine the bill’s definitions of “automated decision system,” “automated decision system impact assessment,” “high-risk automated decision system,” and “high-risk information system” (a rough sketch of how these criteria might be operationalized follows the definitions):

  • ‘‘automated decision system’’ means “a computational process, including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that makes a decision or facilitates human decision making, that impacts consumers.”
  • ‘‘automated decision system impact assessment’’ means a study evaluating an automated decision system and the automated decision system’s development process, including the design and training data of the automated decision system, for impacts on accuracy, fairness, bias, discrimination, privacy, and security.
  • ‘‘high-risk automated decision system’’ means an automated decision system that—
    • taking into account the novelty of the technology used and the nature, scope, context, and purpose of the automated decision system, poses a significant risk—
      • to the privacy or security of personal information of consumers; or
      • of resulting in or contributing to inaccurate, unfair, biased, or discriminatory decisions impacting consumers;
    • makes decisions, or facilitates human decision making, based on systematic and extensive evaluations of consumers, including attempts to analyze or predict sensitive aspects of their lives, such as their work performance, economic situation, health, personal preferences, interests, behavior, location, or movements, that—
      • alter legal rights of consumers; or
      • otherwise significantly impact consumers;
    • involves the personal information of a significant number of consumers regarding race, color, national origin, political opinions, religion, trade union membership, genetic data, biometric data, health, gender, gender identity, sexuality, sexual orientation, criminal convictions, or arrests;
    • systematically monitors a large, publicly accessible physical place; or
    • meets any other criteria established by the Commission in regulations…
  • ‘‘high-risk information system’’ means an information system that—
    • taking into account the novelty of the technology used and the nature, scope, context, and purpose of the information system, poses a significant risk to the privacy or security of personal information of consumers;
    • involves the personal information of a significant number of consumers regarding race, color, national origin, political opinions, religion, trade union membership, genetic data, biometric data, health, gender, gender identity, sexuality, sexual orientation, criminal convictions, or arrests;
    • systematically monitors a large, publicly accessible physical place; or
    • meets any other criteria established by the Commission in regulations…
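
As promised above, here is one plausible way to operationalize the “high-risk automated decision system” prongs in code, assuming (as the parallel “high-risk information system” definition suggests) that the prongs are disjunctive; all the flag names are mine:

```python
from dataclasses import dataclass

@dataclass
class AutomatedDecisionSystem:
    # Boolean flags paraphrasing the statutory criteria above.
    risks_privacy_or_security: bool = False
    risks_unfair_or_biased_decisions: bool = False
    extensively_evaluates_consumers: bool = False
    alters_rights_or_significantly_impacts: bool = False
    uses_sensitive_categories_at_scale: bool = False  # race, health, etc.
    monitors_public_place: bool = False
    meets_ftc_designated_criteria: bool = False

def is_high_risk(s: AutomatedDecisionSystem) -> bool:
    """Treat each prong as independently sufficient, except that the
    'systematic and extensive evaluations' prong also requires a legal
    or otherwise significant impact on consumers."""
    return (s.risks_privacy_or_security
            or s.risks_unfair_or_biased_decisions
            or (s.extensively_evaluates_consumers
                and s.alters_rights_or_significantly_impacts)
            or s.uses_sensitive_categories_at_scale
            or s.monitors_public_place
            or s.meets_ftc_designated_criteria)
```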

Consequently, algorithmic decision-making would be swept into the FTC’s new regime governing privacy and data security. Politically, however, this issue is not close to being on most Members’ radar as related to privacy and data security, which marks the “Consumer Data Protection Act” as among the most forward-looking of the bills introduced over the last year. It is likely that any privacy or data security bill Congress passes will not include such provisions; however, a state like California could decide to wade into this area and, as with privacy, force policymakers in Washington to consider an issue percolating up to the federal level from one of the state laboratories of democracy.

In terms of enforcement, the bill explicitly bars the use of any contracts contrary to the rights and requirements in the “Consumer Data Protection Act.” Like virtually all the other privacy bills, it would allow the FTC to ask a federal court for civil fines for a first offense as high as slightly more than $40,000 per violation, in addition to all of the FTC’s other powers.

This bill likely represents the outer bounds desired by the most ardent privacy and civil liberties advocates and is therefore highly unlikely to be enacted in its current form. Other Democratic bills are far more modest in scope, and few of them address both security and privacy. The chances of enactment are very low, but Congressional interest in privacy legislation will continue because of the GDPR and the California Consumer Privacy Act.
