Brown’s Bill Is The Most Privacy Friendly So Far

The top Democrat on the Senate Banking Committee has released the most pro-privacy bill yet.

Senate Banking, Housing and Urban Affairs Committee Ranking Member Sherrod Brown (D-OH) introduced his draft privacy bill, the “Data Accountability and Transparency Act of 2020,” back in June, but too much has been happening to take a proper look at it until now. Having done so, I can say this is the most privacy and consumer rights friendly bill introduced in this Congress and quite possibly any of the recent Congresses. I wonder whether Democrats could pass such a strong, restrictive bill even with supermajorities in both chambers and a Democratic President, for industry resistance would be very fierce.

In terms of what this bill would do, most notably, a new agency would be created, the Data Accountability and Transparency Agency (DATA), that would sit outside the appropriations process like the Consumer Financial Protection Bureau (CFPB), which limits Congress’ power over the agency. It would be headed by a Director appointed by the President and confirmed by the Senate for a five-year term, and the agency would also have a Deputy Director. Again, this uses the CFPB as the template and not the Federal Trade Commission (FTC) or Federal Communications Commission (FCC), independent agencies with five Commissioners each. Also, like the CFPB, and unlike the FTC, the agency would be charged with policing unfair, deceptive, and abusive privacy practices in violation of this new law. It appears the DATA (incidentally, a terrible acronym for an agency) would work alongside existing federal agencies, and so the FTC could still police privacy and data security.

Moreover, the Brown bill uses the preemption model from the “Financial Services Modernization Act of 1999” (P.L. 106-102) (aka Gramm–Leach–Bliley), under which states would be allowed to regulate privacy above the federal standard so long as a state statute is not inconsistent with it. State statutes would be preempted only to the degree they are contrary to the new federal law.

And, of course, Brown’s bill allows people to sue for violations, and on the most generous terms I’ve seen among the privacy bills.

Not surprisingly, the definitions are drafted in ways that are uber pro-privacy. For example, ‘‘personal data’’ is defined as “electronic data that, alone or in combination with other data—

  • could be linked or reasonably linkable to an individual, household, or device; or
  • could be used to determine that an individual or household is part of a protected class.”

This is a very broad definition of the personal information a U.S. resident would have protected under the bill because it covers more than just data like names, addresses, Social Security numbers, etc., and instead covers all data that could be linked to a person, household, or device. It is broader than the definitions in most bills, which actually specify the sorts of data covered. For example, some bills treat specific geolocation data as deserving more protection than other data, but this is often defined as data that pinpoints a person’s location to within 1,750 feet, meaning that data locating a person within, say, 2,000 feet (still less than half a mile) would not be protected. Brown’s definition is simpler, broader, and quite possibly much easier to implement.

Likewise, what is considered a violation under the bill is also very broadly written. A ‘‘privacy harm’’ is “an adverse consequence, or a potential adverse consequence, to an individual, a group of individuals, or society caused, or potentially caused, in whole or in part, by the collection, use, or sharing of personal data, including:

(A) direct or indirect financial loss or economic harm, including financial loss or economic harm arising from fraudulent activities or data security breaches;

(B) physical harm, harassment, or a threat to an individual or property;

(C) psychological harm, including anxiety, embarrassment, fear, other trauma, stigmatization, reputational harm, or the revealing or exposing of an individual, or a characteristic of an individual, in an unexpected way;

(D) an adverse outcome or decision, including relating to the eligibility of an individual for the rights, benefits, or privileges in credit and insurance (including the denial of an application or obtaining less favorable terms), housing, education, professional certification, employment (including hiring, firing, promotion, demotion, and compensation), or the provision of health care and related services;

(E) discrimination or the otherwise unfair or unethical differential treatment with respect to an individual, including in a manner that is prohibited under section 104;

(F) the interference with, or the surveillance of, activities that are protected by the First Amendment to the Constitution of the United States;

(G) the chilling of free expression or action of an individual, or society generally, due to perceived or actual pervasive and excessive collection, use, or sharing of personal data;

(H) the impairment of the autonomy of an individual or society generally; and

(I) any harm fairly traceable to an invasion of privacy tort; and

(J) any other adverse consequence, or potential adverse consequence, consistent with the provisions of this Act, as determined by the Director.

I’ve quoted the entire definition of “privacy harm” because I think it helps one understand the full range of harms the new privacy agency would be policing. First, it goes beyond actual financial or economic harms to reach “psychological harm,” which may present courts with problems as they try to determine which anguish meets this standard and which does not. Second, it covers activities that are protected under the First Amendment, the chilling of free expression, the impairment of a person’s autonomy, and “any harm fairly traceable to an invasion of privacy tort.” This may be the widest definition of harm in any of the privacy bills introduced in this or any other recent Congress. Finally, the DATA could determine that any other consequence, real or potential, qualifies as a privacy harm.

The bill defines “protected class” to mean “the actual or perceived race, color, ethnicity, national origin, religion, sex, gender, gender identity, sexual orientation, familial status, biometric information, lawful source of income, or disability of an individual or a group of individuals.”

The bill would outright ban data collection, use, or sharing unless for a permissible purpose, which include:

  • To provide a good, service, or specific feature requested by an individual in an intentional interaction.
  • To engage in journalism, provided that the data aggregator has reasonable safeguards and processes that prevent the collection, use, or sharing of personal data for commercial purposes other than journalism.
  • To employ an individual, including for administration of wages and benefits, except that a data aggregator may not invasively collect, use, or share the employee’s personal data in carrying out this paragraph.
  • Where mandated to comply with Federal, State, or local law.
  • Consistent with due process, direct compliance with a civil, criminal, or regulatory inquiry, investigation, subpoena, or summons.
  • To bring or defend legal claims, provided that the parties or potential parties take all necessary measures, including, as applicable, obtaining a protective order, to protect against unnecessary public disclosure of personal data.
  • To detect or respond to security incidents, protect against malicious, deceptive, fraudulent, or illegal activity, or prosecute those responsible for that activity.
  • Free expression by individuals on a social network or media platform.
  • In exigent circumstances, if first responders or medical personnel, in good faith, believe danger of death or serious physical injury to an individual, or danger of serious and unlawful injury to property, requires collection, use, or sharing of personal data relating to the exigent circumstances.
  • The development and delivery of advertisements—
    • based on the content of the website, online service, or application to which the individual or device is connected; and
    • excludes advertising based on the use of any personal data collected or stored from previous interactions with the individual or device, a profile of the individual or device, or the previous online or offline behavior of the individual or device.
  • To offer discounted or free goods or services to an individual if—
    • the offering is in connection with the voluntary participation by the individual in a program that rewards individuals for patronage; and
    • personal data is only collected to track purchases for loyalty rewards under the program.

Again, I’ve quoted at length to show how restrictive the bill is. This is the list of permissible purposes, and one will not find a list of exemptions that pares back the privacy rights ostensibly granted by the bill. For the private sector, the first purpose will be the most relevant, as entities (aka data aggregators under the bill) would be allowed to provide goods, services, or products requested by a person who has intentionally interacted with them. Use of the word intentional would seem to rule out accidental or questionable interactions. There is also no language making product or service development an exception, as there is in many other bills.

Moreover, with respect to the online advertising industry, behavioral advertising would seem to not be a permissible purpose, at least the variety under which a company aggregates data from different sources to form a profile on a person. Moreover, “[c]ollecting, using, or sharing personal data to generate advertising revenue to support or carry out a permissible purpose is not a permissible purpose.”

The “Data Accountability and Transparency Act of 2020” would permit loyalty or reward programs and even allow a business to offer tiered pricing. And, entities could not charge higher or different prices if a person exercises her rights under the bill.

Brown’s bill would place very strict limits on what entities could do with personal data. To wit, it provides that “[e]xcept where strictly necessary to carry out a permissible purpose, a data aggregator shall not—

  • share personal data with affiliated entities, service providers, or third parties;
  • use personal data for any purpose other than to carry out a permissible purpose;
  • retain personal data for any time longer than strictly necessary to carry out a permissible purpose; or
  • derive or infer data from any element or set of personal data.”

There is a list of prohibited practices, including, as mentioned, a bar on charging higher prices or providing lesser service or products if one chooses to exercise his rights under the bill. Also, businesses would be prohibited from re-identifying anonymized data or from commingling personal data from different sources. Violating these prohibitions could lead to treble damages.

It also seems like the bill bans most differential pricing:

It is unlawful for a data aggregator to collect, use, or share personal data for advertising, marketing, soliciting, offering, selling, leasing, licensing, renting, or otherwise commercially contracting for housing, employment, credit, or insurance in a manner that discriminates against or otherwise makes the opportunity unavailable or offered on different terms on the basis of a protected class or otherwise materially contributes to unlawful discrimination.

I suppose if there is differential pricing not based on a protected class, then it might be acceptable. However, I’m struggling to think of what that might look like.

This section also makes illegal the use of personal data for vote suppression. This language is an obvious non-starter with Republicans like Senate Majority Leader Mitch McConnell (R-KY) and would find few fans in the White House given recent and persistent allegations of vote suppression efforts by the Trump Campaign in 2016.

Brown’s use of the disparate impact standard in proving discrimination is anathema to most conservatives who have long made the case that disparate treatment should be the measuring stick for determining if discrimination has occurred.

Moreover, if a data aggregator uses automated decision-making systems, it must continually assess whether those systems are biased against, or have a disparate impact on, a protected class.

People would be able to access and port their personal information, and this right is much broader than those provided in other bills. They would be able to access the specific pieces of information collected, used, or shared about them, the permissible purposes for which the data was collected, and the service providers and third parties with whom the information was shared. On this last point, other privacy bills normally provide a person with access, upon request, to the categories of such entities and not the actual entities themselves.

Brown’s privacy bill provides a right of transparency that mandates that each data aggregator make the following information available in its online privacy policy:

  • A description of the personal data that the data aggregator collects, uses, or shares.
  • The specific sources from which personal data is collected.
  • A description of the sources from which personal data is collected.
  • The permissible purposes for which personal data is collected, used, or shared.
  • The affiliates, service providers, or third parties with which the data aggregator shares personal data, and the permissible purpose for such sharing.
  • A description of the length of time for which personal data is retained.
  • If personal data is collected and retained as anonymized data, a description of the techniques and methods used to create the anonymized data.

Again, this right provides more specific information than comparable rights in other privacy bills.

Data aggregators would have the affirmative duty to ensure the information they collect is correct, and people would have the “right to require that a data aggregator that retains the individual’s personal data correct any inaccurate or incomplete personal data.” Moreover, data aggregators must correct any inaccurate or incomplete information as directed by a person. Other bills have language requiring businesses to make best or reasonable efforts, but nothing like a guarantee for people or a duty for businesses.

People would be able to ask data aggregators to delete personal information, and the aggregator must do so unless the data are needed to carry out a permissible purpose.

Brown’s bill has novel language stipulating “[a]n individual has the right to object to the claimed permissible purpose for any personal data that a data aggregator has collected, used, or shared of such individual.” Consequently, a data aggregator must “produce evidence supporting the data aggregator’s claim that the collection, use, or sharing of such individual’s personal data—

  • was strictly necessary to carry out a permissible purpose;
  • was not used or shared for any other purpose; and
  • has not been retained for any time longer than strictly necessary to carry out a permissible purpose.”

Presumably, failing to produce evidence at all or sufficient evidence constitutes a violation punishable by the new agency.

People would also be allowed to request human review of material decisions made via automated processes.

Brown puts an interesting twist on the customary language in almost all privacy bills requiring security commensurate with the type of information being collected, used, and shared. The bill creates a duty of care, which, as I recall against my will from law school, makes any violation of such a duty a tort, permitting people to sue under tort law. The bill provides that

A data aggregator shall implement and maintain reasonable security procedures and practices, including administrative, physical, and technical safeguards, appropriate to the nature of the personal data and the purposes for which the personal data will be collected, used, or shared…

Moreover, this duty extends to service providers, and data aggregators are made explicitly liable for their service providers’ violations.

If a data aggregator receives a verified request to exercise these rights, it must comply at no cost to the requester. This would not apply to frivolous or irrelevant requests, however.

This new agency would be housed within the Federal Reserve System and would be able to keep and use the proceeds from its actions to fund operations. Just like the CFPB, this would ensure independence from Congress and the Executive Branch, and just like the CFPB, this is likely a non-starter with Republicans.

The new Data Accountability and Transparency Agency, as noted, would be empowered to “take any action authorized under this Act to prevent a data aggregator or service provider from committing or engaging in any unfair, deceptive, or abusive act or practice in connection with the collection, use, or sharing of personal data.” Moreover,

The Agency may prescribe rules applicable to a data aggregator identifying unlawful, unfair, deceptive, or abusive acts or practices in connection with the collection, use, or sharing of personal data, which may include requirements for the purpose of preventing such acts or practices. Rules under this section shall not limit, or be interpreted to limit, the scope of unlawful, deceptive, or abusive acts or practices in connection with the collection, use, or sharing of personal data.

The agency’s power to punish unfair acts is drafted similarly to the FTC’s, with the added caveat that any such act must be unavoidable and not outweighed by countervailing benefits to people or competition. It bears noting that the agency would be able to punish unfair practices “likely” to cause privacy harms or “other substantial harm” to people, in addition to practices that actually do.

An abusive practice is one that:

  • materially interferes with the ability of an individual to understand a term or condition of a good or service; or
  • takes unreasonable advantage of—
    • a lack of understanding on the part of the individual of the material risks, costs, or conditions of the product or service;
    • the inability of the individual to protect their interests in selecting or using a product or service; or
    • the reasonable reliance by the individual on a data aggregator or service provider to act in the interests of the individual.

Deceptive practices are not defined, and so it is likely the new agency’s powers would be the same as the FTC’s with respect to this type of illegal conduct. Also, the new agency would be able to punish violations of any privacy law, which would bring all the disparate privacy regimes under the roof of one entity in the U.S.

The new agency would receive the authority to punish bad actors in the same bifurcated fashion as the FTC and some other agencies: either through an administrative proceeding or by going to federal court. However, regarding the latter route, the agency would not need to ask the Department of Justice (DOJ) to file suit on its behalf. This detail is salient because this is increasingly the de facto Democratic position on the issue.

Whatever the case, the agency would be able to seek any appropriate legal or equitable relief, the latter term encompassing injunctions, disgorgement, restitution, and other such relief for violations. And, of course, the new agency would be able to punish violations of this new law or any federal privacy law with civil fines laid out in tiers:

  • For any violation of a law, rule, or final order or condition imposed in writing by the Agency, a civil penalty may not exceed $5,000 for each day during which such violation or failure to pay continues.
  • [F]or any person that recklessly engages in a violation of this Act or any Federal privacy law, a civil penalty may not exceed $25,000 for each day during which such violation continues.
  • [F]or any person that knowingly violates this Act or any Federal privacy law, a civil penalty may not exceed $1,000,000 for each day during which such violation continues.”

It seems like these tiers would only yield a per-day violation total and would not be multiplied by the number of affected people. If so, $5,000 a day is a sum most large companies would barely notice, and even $25,000 a day is bearable for enormous companies like Facebook or Amazon.

Nonetheless, violations arising from re-identifying personal data are punished under the last tier (i.e. $1 million per day), and any of these violations might result in criminal prosecution, for the agency may refer such violations to DOJ. CEOs and Boards of Directors could be prosecuted for knowing and intentional violations, which is a fairly high bar, and face up to ten years in prison and a $10 million fine if convicted.

Brown’s bill provides people with a right to sue entities, including government agencies under some circumstances, that violate this act. Also, people may sue the new agency for failing to promulgate required regulations or for adopting rules that violate the act. Plaintiffs would be able to win between $100 and $1,000 per violation per day, punitive damages, attorney’s fees and litigation costs, and any other relief the court sees fit to grant. Many Republicans and industry stakeholders are, of course, opposed to a private right of action, but Brown’s bill goes beyond what the other bills offer because it would allow for the award of punitive damages and fees related to bringing the litigation. They would likely argue, with justification, that there would be a wave of class action lawsuits. Another non-starter with Republicans is that the act circumvents a threshold consideration that weeds out lawsuits in federal court by stating that a mere violation of the act is an injury for purposes of the lawsuit. This language sidesteps the obstacle on which many suits are dashed, for one whose privacy has been violated often cannot show an injury in the form of monetary or economic losses.

Like other bills, pre-dispute arbitration agreements and pre-dispute joint action waivers signed by any person would not be valid or enforceable in court, meaning companies cannot limit their legal liability by requiring that people waive their rights as part of the terms of service, as is now customary.

As noted previously, the bill would not preempt all state privacy laws. Rather only those portions of state laws that conflict with this act would be preempted, and states would be free to legislate requirements more stringent than the new federal privacy regulatory structure. Moreover, the bill makes clear that common law actions would still be available.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Jan Alexander from Pixabay
