CPRA Analyzed

The follow-on bill to the CCPA on California’s November ballot would significantly change how the state regulates privacy, likely setting the de facto standard for the U.S. in the absence of federal legislation.

With the “California Privacy Rights Act” (CPRA) having qualified for the ballot on which Californians will vote in November, it is worth taking a deeper look at the bill. It would replace the “California Consumer Privacy Act” (CCPA) (AB 375), which just came into full effect with the publication of final regulations on 14 August. Even as the Office of the Attorney General was drafting those regulations, the organization that pushed for passage of the CCPA, Californians for Consumer Privacy (CCP), completed the drafting of a follow-on bill. CCP Chair and Founder Alastair Mactaggart explained his reasoning for a second ballot initiative: “[f]irst, some of the world’s largest companies have actively and explicitly prioritized weakening the CCPA…[and] [s]econd, technological tools have evolved in ways that exploit a consumer’s data with potentially dangerous consequences.” Moreover, if polling CCP released earlier this month is close to accurate, then an overwhelming majority of Californians support enactment of the CPRA, meaning a significantly new privacy scheme will come into effect in California in the next few years.

Of course, it may be fair to say this bill looks to solve a number of problems created by the rush in June 2018 to draft a bill all parties could accept in order to get the initiative version of the CCPA removed from the ballot. Consequently, the CCPA package that was enacted was sloppily drafted in places, with inconsistent provisions that necessitated two rounds of follow-on legislation to fix or clarify the statute.

As under the CCPA, the CPRA would still not allow people to deny businesses the right to collect and process their personal information, unlike some of the bills pending in Congress. Californians could stop the sale or sharing of personal information, but not its collection and processing, short of forgoing or limiting online interactions and many in-person interactions. A person could request the deletion of personal information collected and processed, subject to certain limitations and exceptions businesses are sure to read as broadly as possible. Additionally, a new agency would be created to police and enforce privacy rights, though legitimate questions may be posed about its level of resources. Nonetheless, the new statute would come into effect on 1 January 2023, leaving the CCPA as the law of California in the short term and then requiring businesses and people to adjust to the new regime.

In the findings section, CCP explicitly cites the bills introduced to weaken or roll back the CCPA as part of the reason the CPRA should be enacted. Changes to the California Code made by ballot initiative are much harder to amend or modify than statutes enacted through the legislative process. Notably, the CPRA would limit future amendments to only those in furtherance of the act, which would rule out attempts to weaken or dilute the new regime. Moreover, the bill looks at privacy rights through the prism of an imbalance in information and is founded on the notion that if people in California have more information and real choice in how and when their personal data is collected, processed, and shared, then the most egregious data practices would stop. Of course, this conceptual framework differs from one that views data collection and processing as being more like pollution or air quality, situations over which any one individual has limited impact, thus necessitating collective government action to address deleterious effects. In the view of the CCP, Californians will be on better footing to negotiate their privacy with companies like Facebook and Google. Notably, the CCP asserted:

  • In the same way that ingredient labels on foods help consumers shop more effectively, disclosure around data management practices will help consumers become more informed counterparties in the data economy, and promote competition. Additionally, if a consumer can tell a business not to sell his or her data, then that consumer will not have to scour a privacy policy to see whether the business is, in fact, selling that data, and the resulting savings in time is worth, in the aggregate, a tremendous amount of money.
  • Consumers should have the information and tools necessary to limit the use of their information to non-invasive, pro-privacy advertising, where their personal information is not sold to or shared with hundreds of businesses they’ve never heard of, if they choose to do so. Absent these tools, it will be virtually impossible for consumers to fully understand these contracts they are essentially entering into when they interact with various businesses.

The CPRA would change the notification requirements for businesses interested in collecting, processing, and sharing personal data in Section 1798.100 of the Civil Code (i.e. language added by the CCPA and some of the follow-on bills the legislature passed). This requirement would be binding on the companies that control collection and not just the entities doing the actual collecting, which suggests concern that the ultimate user of personal data could otherwise shield its identity from people. Indeed, the CCPA language may create an incentive to use front companies or third parties to collect personal data. The CPRA makes clear that if a company is using another company to collect personal data it will ultimately control, it may meet its notice requirements by posting all the enumerated information prominently on its website. This may be a loophole large companies use to avoid informing people directly about who is controlling data collection.

New language tightens the information people must be given as part of this notice, namely the purposes for which personal data is collected or used and whether the entity proposes to sell or share this information. Moreover, the CPRA would mandate that the notice also include any additional purposes for which personal data are collected and used “that are incompatible with the disclosed purpose for which the personal information was collected.”

The changes to Section 1798.100 and the underlying CCPA language that would remain will apply to a new category of information created by the CPRA: “sensitive personal information.” This term is defined to mean:

  • personal information that reveals
    • a consumer’s social security, driver’s license, state identification card, or passport number;
    • a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account;
    • a consumer’s precise geolocation;
    • a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership;
    • the contents of a consumer’s mail, email, and text messages, unless the business is the intended recipient of the communication;
    • a consumer’s genetic data; and
  • the processing of biometric information for the purpose of uniquely identifying a consumer;
  • personal information collected and analyzed concerning a consumer’s health; or
  • personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.

However, should any of these data be “publicly available” as defined by the CPRA, they would no longer be subject to the heightened requirements normally owed this new class of information. For example, the new notice people must be given will list the categories of sensitive personal information collected and the purposes for which such information is collected or used. Additionally, people must be told whether this subset of personal data will be shared or sold.

The CPRA would limit collection, use, processing, and sharing of personal data to purposes “reasonably necessary and proportionate” to achieve the purpose of the information collection. Quite clearly, much will hang on what turns out to be “reasonable,” which may be construed by the new data protection agency in regulation and ultimately by courts in litigation. However, this provision also allows the “collection, use, retention, and sharing of a consumer’s personal information…for another disclosed purpose that is compatible with the context in which the personal information was collected.” This will also need fleshing out by regulation or litigation, or both. It seems to allow companies to specify another purpose for their data activities so long as it is compatible with the context of collection. And yet, it is not clear what would determine compatibility. If a person agrees to a grocery store chain’s data activities, might the company legally try to collect information regarding the person’s health?

This section also requires businesses to enter into contracts with third parties, service providers, and contractors to ensure they follow the CPRA and to specify that information sold or shared by the business is for limited and specific purposes.

Businesses are obligated to use “reasonable security procedures and practices appropriate to the nature of the personal information to protect the personal information from unauthorized or illegal access, destruction, use, modification, or disclosure.” This is a familiar construct that contemplates a sliding scale of security measures, with lesser steps being acceptable for less valuable information, say deidentified data, and higher standards for more sensitive personal data. The challenge in such a regime is that reasonable minds may disagree about what measures are reasonable, but the caselaw construing the CPRA will likely point the way to how businesses should secure information.

Section 1798.105 spells out a person’s right to delete personal information and expands the obligation of businesses to direct their service providers and contractors to delete information upon receipt of a valid request. Third parties would be notified of deletion requests and expected to delete as well unless doing so would be impossible or “involves disproportionate effort,” a term likely to be given as expansive a reading as possible by many businesses. There are still numerous exceptions to deletion requests, many of which will also likely be read expansively by businesses reluctant to honor them, including allowing a business to:

  • Complete the transaction for which the personal information was collected, fulfill the terms of a written warranty or product recall conducted in accordance with federal law, provide a good or service requested by the consumer, or reasonably anticipated by the consumer within the context of a business’s ongoing business relationship with the consumer, or otherwise perform a contract between the business and the consumer.
  • Help to ensure security and integrity to the extent the use of the consumer’s personal information is reasonably necessary and proportionate for those purposes.
  • Debug to identify and repair errors that impair existing intended functionality.
  • Exercise free speech, ensure the right of another consumer to exercise that consumer’s right of free speech, or exercise another right provided for by law.
  • Enable solely internal uses that are reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business and compatible with the context in which the consumer provided the information.
  • Comply with a legal obligation.

However, the CPRA eliminates one CCPA exception that could be used to deny deletion requests: the one allowing a business to “use the consumer’s personal information, internally, in a lawful manner that is compatible with the context in which the consumer provided the information.”

The CPRA creates a new section, 1798.106, titled “Consumers’ Right to Correct Inaccurate Personal Information,” that requires businesses to correct inaccurate personal information in light of the type of information and why it is being processed. Businesses must disclose that people have this right if a person submits a verifiable request to correct inaccurate personal information. However, companies are only required to make commercially reasonable efforts to correct inaccurate personal information. It appears a rulemaking will be necessary to flesh out what constitutes commercially reasonable efforts.

Section 1798.110 is amended by the CPRA but more or less keeps the CCPA’s right to know about and access the information being collected about oneself, with some significant changes. For example, one of the categories businesses must provide to people who utilize this right is expanded. Currently under the CCPA, requesters must be told the commercial or business purpose for which personal information is collected or sold. Under the CPRA, businesses would also need to inform people of the other entities with whom they share personal information, thus closing a significant loophole, for companies like Facebook share people’s information but do not sell it. Under the CCPA, a Facebook would not need to divulge to a person the companies with which it is sharing that person’s information.

Also, the CPRA would deem in compliance those companies that post on their websites the categories of personal information collected, the sources of this information, the business or commercial purposes for collecting it, and the categories of third parties to whom personal information is disclosed. It seems likely many companies will go this route, meaning the only personal information they would need to furnish upon a request would be the specific pieces of information on the person making the request. And yet, the CPRA strikes the CCPA requirement that businesses keep personal information for one-time transactions or to reidentify or link to these data.

Section 1798.115 of the CCPA would also be changed by expanding the universe of data a person may request and receive regarding how their personal information is shared and sold. The CPRA keeps the basic structure of the CCPA in this regard and merely expands it to cover sharing as well as selling for the following:

  • The categories of personal information sold or shared and the categories of third parties to whom such information was sold or shared
  • The categories of personal information disclosed about a person for business purposes and the categories of persons to whom such information was disclosed

Third parties would be barred from selling or sharing personal information that has been sold to or shared with them unless they provide explicit notice and people have the opportunity to opt out.

The CPRA similarly changes Section 1798.120 (aka the right to opt out of the sharing or selling of one’s personal information). However, it keeps the CCPA’s right to opt out of sales or sharing at any time. Likewise, teenagers between 13 and 16 would need to affirmatively agree to selling or sharing, and for any child under 13, his or her parents must affirmatively agree.

A new Section 1798.121, a “Consumers’ Right to Limit Use and Disclosure of Sensitive Personal Information,” would allow people to stop businesses from collecting and using sensitive personal information in some cases. As a general matter, if a person limits collection or use of this class of information, then the business would be limited to “that use which is necessary to perform the services or provide the goods reasonably expected by an average consumer who requests such goods or services,” subject to some of the exceptions embodied in the definition of “business purpose.” Businesses may, however, provide notice of additional uses of sensitive personal information, which a person must further limit if these new uses are objectionable or unwanted.

The CPRA changes the provision barring retaliation against people who opt out of certain practices or use their CCPA rights. The general prohibition on punishing people who use their rights under the bill with different prices, services, or products would be maintained. The CPRA would expand this protection to employees, contractors, and applicants for employment. However, the CPRA keeps the CCPA exemption allowing so-called loyalty programs to offer different prices or services, but only if the difference is reasonably related to the value the person’s data provides to the business. The CCPA requires the linkage to be directly related, so this change may be seen as a subtle weakening of the connection between the value of a person’s data and the rewards or prices offered through membership in a loyalty program. This will almost certainly result in businesses in California using current programs or establishing new ones to press people to share personal information in exchange for better prices or services. After all, all they would need to do is show the value of the person’s data is reasonably related to the advantages of membership. Like other similar provisions in the bill, regulation and litigation will define the parameters of what is reasonably related. Like the CCPA, the new bill would require people to opt into such programs, and should a person refuse, the business would need to wait 12 months before making the offer again.

Many of the previously discussed changes to the CCPA necessitate alterations to a key section of the statute, Section 1798.130, which details notice, disclosure, correction, and deletion requests. Businesses with physical locations must still offer two means for people to make such requests, but the CPRA would allow online businesses to merely make available an email address. Anyone who has ever tried to resolve disputes via email knows how frustrating the process can be, yet this is all the new statute would require of companies like Facebook or Google.

The new 1798.130 also makes clear the 45-day window for businesses to deliver required information to people after receiving a verified request also includes making requested corrections and deletions. A potential hurdle is established for requests, however. In light of the type of information in question, a business may seek to authenticate a person’s identity before granting the request but may not force a person to create an account with the business if they do not have one. To be fair, this provision may be aimed at the mischief that could be created if a person decides to impersonate someone else and ask businesses to delete that person’s personal information. There are likely other situations in which a malicious person could wreak havoc.

In any event, the disclosure of information would need to cover the previous 12 months under the CPRA, and after new regulations are put in place, people would be able to ask for and receive information stretching back before the preceding 12 months. Such a request could be denied, however, on the grounds of impossibility or disproportionate effort, and presumably the new regulations would address when such denials are justified. Another limitation on this right is that businesses would not need to provide information collected before 1 January 2022.

If a person submits a request to a business’s contractor or service provider to learn what personal information has been collected, sold, or shared, those entities have no obligation to respond. And yet, they must assist a business that receives such requests.

The CPRA stipulates that businesses are required to provide the following types of information if a person asks for the data the entity has:

the categories of sources from which the consumer’s personal information was collected; the business or commercial purpose for collecting, or selling or sharing the consumer’s personal information; and the categories of third parties to whom the business discloses the consumer’s personal information.

A business is also obligated to provide the “specific pieces of personal information obtained from the consumer in a format that is easily understandable to the average consumer, and to the extent technically feasible, in a structured, commonly used, machine-readable format, which also may be transmitted to another entity at the consumer’s request without hindrance.”
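To make the “structured, commonly used, machine-readable format” language concrete, here is a minimal sketch of what such an export might look like. The schema, field names, and values are hypothetical; neither the CPRA nor any regulation prescribes a particular format.

```python
import json

# Hypothetical export of the "specific pieces of personal information"
# a business might return for a verified request. Field names are
# illustrative only; the CPRA does not prescribe a schema.
consumer_record = {
    "identifiers": {"name": "Jane Doe", "email": "jane@example.com"},
    "geolocation": [{"lat": 34.05, "lon": -118.24, "ts": "2020-08-01T12:00:00Z"}],
    "commercial_information": ["grocery purchase history 2019-2020"],
    "inferences": ["likely pet owner"],
}

# JSON is one "structured, commonly used, machine-readable format" that
# could be transmitted to another entity at the consumer's request.
print(json.dumps(consumer_record, indent=2))
```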

Regarding the type of information a business must give to people who ask to know what, if any, information was sold or shared about them, a business must furnish two lists:

  • A list of the categories of personal information it has sold or shared about consumers in the preceding 12 months by reference to the enumerated category or categories in [the revised definition of personal information and new definition of sensitive personal information] that most closely describe the personal information sold or shared, or if the business has not sold or shared consumers’ personal information in the preceding 12 months, the business shall prominently disclose that fact in its privacy policy.
  • A list of the categories of personal information it has disclosed about consumers for a business purpose in the preceding 12 months by reference to the enumerated category in subdivision (c) that most closely describes the personal information disclosed, or if the business has not disclosed consumers’ personal information for a business purpose in the preceding 12 months, the business shall disclose that fact.

The categories of personal information a business must provide are “information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following if it identifies, relates to, describes, is reasonably capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household:

(A) Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier, Internet Protocol address, email address, account name, social security number, driver’s license number, passport number, or other similar identifiers.

(B) Any personal information described in subdivision (e) of Section 1798.80.

(C) Characteristics of protected classifications under California or federal law.

(D) Commercial information, including records of personal property, products or services purchased, obtained, or considered, or other purchasing or consuming histories or tendencies.

(E) Biometric information.

(F) Internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a consumer’s interaction with an Internet website, application, or advertisement.

(G) Geolocation data.

(H) Audio, electronic, visual, thermal, olfactory, or similar information.

(I) Professional or employment-related information.

(J) Education information, defined as information that is not publicly available personally identifiable information as defined in the Family Educational Rights and Privacy Act (20 U.S.C. section 1232g, 34 C.F.R. Part 99).

(K) Inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.”

The CPRA modifies the CCPA standards on the links a business must place on its website allowing people to opt out of the sale or sharing of personal information. It also adds a requirement for a link allowing people to opt out of the use or disclosure of sensitive personal information. A business would now be allowed to have one link for both if it wants, and it would also be allowed to remind people of the advantages of membership in its loyalty program and any charges or fees associated with not joining. This provision would seem to allow some businesses, at least those that can make the case of a reasonable relation between the discounts provided and the value of personal information, to pose a possibly uncomfortable dilemma to people: your privacy or your money. Put another way, the CPRA may well result in a price being put on one’s privacy, with those of means or those intensely dedicated to privacy being able or willing to limit these practices while everyone else acquiesces in the face of higher prices or worse services or products. Additionally, companies would not need to have links on their websites if they allow for opting out through their platform, technology, or app.

If a person opts out, companies would have to wait 12 months before asking again whether they would allow the business to sell or share their personal information or use or disclose their sensitive personal information. But one should bear in mind that even if a person opts out of the sale or sharing of personal information, a business may still collect and process it subject to other requirements in the CPRA. This right is limited to the dissemination of personal information through sales or a sharing arrangement.

The CPRA revises some key definitions and introduces new ones, the most significant of which was discussed earlier: sensitive personal information. Another key change is to the criteria determining which businesses are subject to the CPRA. Each of the three thresholds for becoming a regulated business is changed (a sketch of the combined test follows the list):

  • First, language is changed to make clear a company must have earned in excess of $25 million in gross revenues in the preceding year to qualify on the basis of income.
  • Second, the threshold for the number of people is changed. It is raised from 50,000 to 100,000, and instead of counting people and devices, devices are stricken and households may now be counted. Obviously, a household will likely include multiple devices, so counting by household effectively raises the threshold even further. Also, the counting is limited to buying, selling, or sharing personal information; mere collection and processing do not count, meaning a business that does none of the three enumerated activities would not qualify under this prong even if it collects and processes the personal information of, say, 1 million Californians.
  • Third, the threshold for businesses deriving 50% or more of their income from selling consumers’ personal information is broadened to include sharing, meaning more entities might qualify under this prong.
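Read together, the three prongs amount to a simple disjunctive test. Below is a minimal sketch of that logic in Python; the function and parameter names are mine, and this is an illustration of the statutory criteria, not a compliance tool.

```python
def is_covered_business(gross_revenue_usd: float,
                        consumers_or_households_bought_sold_shared: int,
                        share_of_revenue_from_selling_or_sharing: float) -> bool:
    """Rough sketch of the CPRA's three alternative thresholds.

    A business qualifies if it meets ANY one prong:
      1. Gross revenues in excess of $25 million in the preceding year.
      2. Buys, sells, or shares the personal information of 100,000 or
         more consumers or households (mere collection and processing
         do not count toward this prong).
      3. Derives 50% or more of annual revenue from selling or sharing
         consumers' personal information.
    """
    return (gross_revenue_usd > 25_000_000
            or consumers_or_households_bought_sold_shared >= 100_000
            or share_of_revenue_from_selling_or_sharing >= 0.50)


# A firm that collects and processes data on 1 million Californians but
# never buys, sells, or shares it contributes nothing to the second prong:
print(is_covered_business(10_000_000, 0, 0.0))  # False
```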

Also of note, the definition of “business purpose” was altered, and new definitions are provided for “consent,” “contractor,” “cross-context behavioral advertising,” “dark pattern,” “non-personalized advertising,” and others.

The section on exemptions to the bars in the CCPA is rewritten and expanded by the CPRA. Businesses may disregard the obligations placed on them by this privacy statute under a number of circumstances. For example, added circumstances include complying with a subpoena or court order or responding to direction by law enforcement agencies. Moreover, government agencies would be able to make emergency requests for personal information to businesses if the agency is acting in good faith, asserts a legal right to the information, and follows up with a court order within three days. There is also language adding contractors to the CCPA’s provisions on the liability of a business for violations by its service providers, which requires actual knowledge of such violations.

The CPRA keeps the CCPA’s grant of authority allowing people to sue for violations but tightens the circumstances under which this may happen to those in which one’s personal information is neither encrypted nor redacted. Consequently, if a business uses either method of securing information, it cannot be sued.

As noted, the bill would establish a California Privacy Protection Agency that would take over enforcement of the revised CCPA from the Office of the Attorney General. It would consist of a five-member board including a chair. At the earlier of 1 July 2021 or six months after the new agency informs the Attorney General it is ready to begin drafting rules, it would assume rulemaking authority. However, before this date, the Attorney General may have the authority or opportunity to begin some of the CPRA rulemakings, an interregnum that may serve to complicate implementation. Nonetheless, among other powers, the new agency would be able to investigate and punish violations with fines of up to $2,500 per violation, except for intentional violations and those involving the personal information of minors, which could be fined at $7,500 per violation. Like the Federal Trade Commission, the California Privacy Protection Agency would be able to bring administrative actions inside the agency or go to court to sue. However, this new entity would only be provided $5 million during its first year and $10 million a year thereafter, which raises the question of whether it will be able to police privacy in California in a muscular way.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading and Other Developments (6 June)

Other Developments

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

  • A number of tech trade groups are asking the House Appropriations Committee’s Commerce-Justice-Science Subcommittee “to direct the National Institute of Standards and Technology (NIST) to create guidelines that help companies navigate the technical and ethical hurdles of developing artificial intelligence.” They argued:
    • A NIST voluntary framework-based consensus set of best practices would be pro-innovation, support U.S. leadership, be consistent with NIST’s ongoing engagement on AI industry consensus standards development, and align with U.S. support for the OECD AI principles as well as the draft Memorandum to Heads of Executive Departments and Agencies, “Guidance for Regulation of Artificial Intelligence Applications.”
  • The Department of Defense (DOD) “named seven U.S. military installations as the latest sites where it will conduct fifth-generation (5G) communications technology experimentation and testing. They are Naval Base Norfolk, Virginia; Joint Base Pearl Harbor-Hickam, Hawaii; Joint Base San Antonio, Texas; the National Training Center (NTC) at Fort Irwin, California; Fort Hood, Texas; Camp Pendleton, California; and Tinker Air Force Base, Oklahoma.”  The DOD explained “[t]his second round, referred to as Tranche 2, brings the total number of installations selected to host 5G testing to 12…[and] builds on DOD’s previously-announced 5G communications technology prototyping and experimentation and is part of a 5G development roadmap guided by the Department of Defense 5G Strategy.”
  • The Federal Trade Commission announced a $150,000 settlement with “HyperBeard, Inc. [which] violated the Children’s Online Privacy Protection Act Rule (COPPA Rule) by allowing third-party ad networks to collect personal information in the form of persistent identifiers to track users of the company’s child-directed apps, without notifying parents or obtaining verifiable parental consent.”
  • The National Institute of Standards and Technology (NIST) released Special Publication 800-133 Rev. 2, Recommendation for Cryptographic Key Generation, that “discusses the generation of the keys to be used with the approved cryptographic algorithms…[which] are either 1) generated using mathematical processing on the output of approved Random Bit Generators (RBGs) and possibly other parameters or 2) generated based on keys that are generated in this fashion.”
  • United States Trade Representative (USTR) announced “investigations into digital services taxes that have been adopted or are being considered by a number of our trading partners.” These investigations are “with respect to Digital Services Taxes (DSTs) adopted or under consideration by Austria, Brazil, the Czech Republic, the European Union, India, Indonesia, Italy, Spain, Turkey, and the United Kingdom.” The USTR is accepting comments until 15 July.
  • NATO’s North Atlantic Council released a statement “concerning malicious cyber activities” that have targeted medical facilities stating “Allies are committed to protecting their critical infrastructure, building resilience and bolstering cyber defences, including through full implementation of NATO’s Cyber Defence Pledge.” NATO further pledged “to employ the full range of capabilities, including cyber, to deter, defend against and counter the full spectrum of cyber threats.”
  • The Public Interest Declassification Board (PIDB) released “A Vision for the Digital Age: Modernization of the U.S. National Security Classification and Declassification System” that “provides recommendations that can serve as a blueprint for modernizing the classification and declassification system…[for] there is a critical need to modernize this system to move from the analog to the digital age by deploying advanced technology and by upgrading outdated paper-based policies and practices.”
  • In a Department of State press release, a Declaration on COVID-19, the G7 Science and Technology Ministers stated their intentions “to work collaboratively, with other relevant Ministers to:
    • Enhance cooperation on shared COVID-19 research priority areas, such as basic and applied research, public health, and clinical studies. Build on existing mechanisms to further priorities, including identifying COVID-19 cases and understanding virus spread while protecting privacy and personal data; developing rapid and accurate diagnostics to speed new testing technologies; discovering, manufacturing, and deploying safe and effective therapies and vaccines; and implementing innovative modeling, adequate and inclusive health system management, and predictive analytics to assist with preventing future pandemics.
    • Make government-sponsored COVID-19 epidemiological and related research results, data, and information accessible to the public in machine-readable formats, to the greatest extent possible, in accordance with relevant laws and regulations, including privacy and intellectual property laws.
    • Strengthen the use of high-performance computing for COVID-19 response. Make national high-performance computing resources available, as appropriate, to domestic research communities for COVID-19 and pandemic research, while safeguarding intellectual property.
    • Launch the Global Partnership on AI, envisioned under the 2018 and 2019 G7 Presidencies of Canada and France, to enhance multi-stakeholder cooperation in the advancement of AI that reflects our shared democratic values and addresses shared global challenges, with an initial focus that includes responding to and recovering from COVID-19. Commit to the responsible and human-centric development and use of AI in a manner consistent with human rights, fundamental freedoms, and our shared democratic values.
    • Exchange best practices to advance broadband connectivity; minimize workforce disruptions, support distance learning and working; enable access to smart health systems, virtual care, and telehealth services; promote job upskilling and reskilling programs to prepare the workforce of the future; and support global social and economic recovery, in an inclusive manner while promoting data protection, privacy, and security.
  • The Digital, Culture, Media and Sport Committee’s Online Harms and Disinformation Subcommittee held a virtual meeting, which “is the second time that representatives of the social media companies have been called in by the DCMS Sub-committee in its ongoing inquiry into online harms and disinformation following criticism by Chair Julian Knight about a lack of clarity of evidence and further failures to provide adequate answers to follow-up correspondence.” Before the meeting, the Subcommittee sent a letter to Twitter, Facebook, and Google and received responses. The Subcommittee heard testimony from:
    • Facebook Head of Product Policy and Counterterrorism Monika Bickert
    • YouTube Vice-President of Government Affairs and Public Policy Leslie Miller
    • Google Global Director of Information Policy Derek Slater
    • Twitter Director of Public Policy Strategy Nick Pickles
  • Senators Ed Markey (D-MA), Ron Wyden (D-OR) and Richard Blumenthal (D-CT) sent a letter to AT&T CEO Randall Stephenson “regarding your company’s policy of not counting use of HBO Max, a streaming service that you own, against your customers’ data caps.” They noted “[a]lthough your company has repeatedly stated publicly that it supports legally binding net neutrality rules, this policy appears to run contrary to the essential principle that in a free and open internet, service providers may not favor content in which they have a financial interest over competitors’ content.”
  • The Brookings Institution released what it considers a path forward on privacy legislation and held a webinar on the report with Federal Trade Commissioner (FTC) Christine Wilson and former FTC Commissioner and now Microsoft Vice President and Deputy General Counsel Julie Brill.

Further Reading

  • “Google: Overseas hackers targeting Trump, Biden campaigns” – Politico. In what is the latest in a series of attempted attacks, Google’s Threat Analysis Group announced this week that hackers affiliated with the People’s Republic of China tried to gain access to the campaign of former Vice President Joe Biden, and Iranian hackers tried the same with President Donald Trump’s reelection campaign. The group referred the matter to the federal government but said the attacks were not successful. An official from the Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) remarked “[i]t’s not surprising that a number of state actors are targeting our elections…[and] [w]e’ve been warning about this for years.” It is likely the usual suspects will continue to try to hack into both presidential campaigns.
  • “Huawei builds up 2-year reserve of ‘most important’ US chips” – Nikkei Asian Review. The Chinese tech giant has been spending billions of dollars stockpiling United States’ (U.S.) chips, particularly Intel chips for servers and programmable chips from Xilinx, the type that is hard to find elsewhere. This latter chip maker is seen as particularly crucial to both the U.S. and the People’s Republic of China (PRC) because it partners with the Taiwan Semiconductor Manufacturing Company, the entity persuaded by the Trump Administration to announce plans for a plant in Arizona. Shortly after the arrest of Huawei CFO Meng Wanzhou in 2018, the company began these efforts and spent almost $24 billion USD last year stockpiling crucial U.S. chips and other components.
  • “GBI investigation shows Kemp misrepresented election security” – Atlanta Journal-Constitution. Through freedom of information requests, the newspaper obtained records from the Georgia Bureau of Investigation (GBI) on its investigation at the behest of then Secretary of State Brian Kemp, requested days before the gubernatorial election he narrowly won. At the time, Kemp claimed hackers connected to the Democratic Party were trying to get into the state’s voter database, when it was actually Department of Homeland Security personnel running a routine scan for vulnerabilities that Kemp’s office had agreed to months earlier. The GBI ultimately determined Kemp’s claims did not merit a prosecution. Moreover, even though Kemp’s staff at the time continue to deny these findings, the site did have vulnerabilities, including one turned up by a software company employee.
  • “Trump, Biden both want to repeal tech legal protections — for opposite reasons” – Politico. Former Vice President Joe Biden (D) wants to revisit Section 230 because, in his view, online platforms are not doing enough to combat misinformation. Biden laid out his views on this and other technology matters for the editorial board of The New York Times in January, at which point he said Facebook should have to face civil liability for publishing misinformation. Given Republican and Democratic discontent with Section 230 and the social media platforms, there may be a possibility legislation is enacted to limit this shield from litigation.
  • “Wearables like Fitbit and Oura can detect coronavirus symptoms, new research shows” – The Washington Post. Perhaps wearable health technology is a better approach to determining when a person has contracted COVID-19 than contact tracing apps. A handful of studies are producing positive results, but these studies have not yet undergone the peer review process. Still, these devices may be able to detect disequilibrium in one’s system as compared to a baseline, suggesting an infection and a need for a test. This article, however, did not explore possible privacy implications of sharing one’s personal health data with private companies.
  • “Singapore plans wearable virus-tracing device for all” – Reuters. In what may be a sign that the city-state has given up on its contact tracing app, TraceTogether, Singapore will soon introduce wearable devices, at an estimated cost of less than $10 USD per unit, to better trace contacts and fight COVID-19. It is not clear whether everyone will be mandated to wear one and what privacy and data protections will be in place.
  • “Exclusive: Zoom plans to roll out strong encryption for paying customers” – Reuters. In the same vein as Zoom allowing paying customers to choose the regions through which their calls are routed (e.g. paying customers in the United States could choose a region with lesser surveillance capabilities), Zoom will soon offer stronger security for paying customers. Of course, should Zoom’s popularity during the pandemic solidify into a dominant competitive position, this new policy of offering end-to-end encryption the company cannot crack would likely rouse the ire of the governments of the Five Eyes nations. These plans breathe further life into the views of those who see a future in which privacy and security are commodities to be bought, with those unable or unwilling to afford them enjoying neither. Nonetheless, the company may still face a Federal Trade Commission (FTC) investigation into its apparently inaccurate claims that calls were end-to-end encrypted, which may have violated Section 5 of the FTC Act, along with similar investigations by other nations.
  • “Russia and China target U.S. protests on social media” – Politico. Largely eschewing doctored material, the Russian Federation and the People’s Republic of China (PRC) are using social media platforms to further drive dissension and division in the United States (U.S.) during the protests by amplifying the messages and points of view of Americans, according to one think tank’s analysis. For example, some PRC officials have been tweeting out “Black Lives Matter” and claims that videos purporting to show police violence are, in fact, police violence. The goal is to fan the flames and further weaken Washington. Thus far, the American government and the platforms themselves have not had much of a public response. Additionally, this represents a continued trend of the PRC seeking to sow discord in the U.S., whereas before this year its use of social media and disinformation tended to be confined to issues of immediate concern to Beijing.
  • “The DEA Has Been Given Permission To Investigate People Protesting George Floyd’s Death” – BuzzFeed News. The Department of Justice (DOJ) used a little-known section of the powers delegated to the agency to task the Drug Enforcement Administration (DEA) with conducting “covert surveillance” to help police maintain order during the protests following the killing of George Floyd, among other duties. BuzzFeed News was given the two-page memorandum effectuating this expansion of the DEA’s responsibilities beyond drug crimes, most likely by agency insiders who oppose the memorandum. These efforts could include use of authority granted to the agency to engage in “bulk collection” of some information, a practice with which the DOJ Office of the Inspector General (OIG) found significant issues, including the lack of legal analysis on the scope of the sprawling collection practices.
  • “Cops Don’t Need GPS Data to Track Your Phone at Protests” – Gizmodo. Underlying this extensive rundown of the types of data one’s phone leaks, which a constellation of entities vacuums up, is the fact that more law enforcement agencies are buying or accessing these data because the Fourth Amendment’s protections do not apply to private parties giving the government information.
  • “Zuckerberg Defends Approach to Trump’s Facebook Posts” – The New York Times. Unlike Twitter, Facebook opted not to flag President Donald Trump’s posts about the protests arising from George Floyd’s killing last week, the same statements Twitter found to be glorifying violence. CEO Mark Zuckerberg reportedly deliberated at length with senior leadership before deciding the posts did not violate the platform’s terms of service, a decision roundly criticized by Facebook employees, some of whom staged a virtual walkout on 1 June. In a conference call, Zuckerberg faced numerous questions about why the company does not respond more forcefully to posts that are inflammatory or untrue. His answer that Facebook does not act as an arbiter of truth was not well received among many employees.
  • “Google’s European Search Menu Draws Interest of U.S. Antitrust Investigators” – The New York Times. Allegedly, Department of Justice (DOJ) antitrust investigators are keenly interested in the system Google lives under in the European Union (EU), where Android users are now prompted to select a default search engine instead of Google simply being the default. This system was put in place in response to the EU’s €4.34 billion fine in 2018 for imposing “illegal restrictions on Android device manufacturers and mobile network operators to cement its dominant position in general internet search.” This may be seen as a way to address competition issues without breaking up Google as some have called for. However, Google conducts monthly auctions among the other search engines to be one of the three choices given to EU consumers, which allows Google to reap additional revenue.


Dueling COVID-19 Privacy Bills Released

Democratic stakeholders answer a Republican proposal on how to regulate privacy issues raised by COVID-19 contact tracing. The proposals have little chance of enactment and are mostly about positioning.  


Late last week, a group of Democratic stakeholders on privacy and data security issues released the “Public Health Emergency Privacy Act” (S.3749), a bill that serves as a counterpoint to the “COVID-19 Consumer Data Protection Act” (S.3663), legislation introduced a few weeks ago as a proposed solution to the privacy issues raised by COVID-19 contact tracing. However, the Democratic bill contains a number of provisions many Republicans consider non-starters, such as a private right of action for individuals and no preemption of state laws. It is not likely this bill will advance in the Senate, even though it may possibly be moved in the House. S. 3749 was introduced by Senators Richard Blumenthal (D-CT) and Mark Warner (D-VA) and Representatives Anna Eshoo (D-CA), Jan Schakowsky (D-IL), and Suzan DelBene (D-WA).

In a way, the “Public Health Emergency Privacy Act” makes the case that “Health Insurance Portability and Accountability Act” (HIPAA)/“Health Information Technology for Economic and Clinical Health Act” (HITECH Act) regulations are inadequate to protect the privacy and data of people in the United States because those regulations apply only to healthcare providers, their business associates, and other named entities in the healthcare system. Entities collecting healthcare information outside that system, even very sensitive information, are not subject to HIPAA/HITECH regulations and would likely be regulated, to the extent they are at all, by the Federal Trade Commission (FTC) or state agencies and attorneys general.

The “Public Health Emergency Privacy Act” would cover virtually all entities, including government agencies, with exceptions for public health authorities (e.g. a state department of health or the Centers for Disease Control and Prevention), healthcare providers, service providers, and people acting in a household capacity. This remarkably broad definition likely reflects plans announced by governments around the world to eschew Google and Apple’s contact tracing framework and develop their own. It would also touch some efforts apart from contact tracing apps. Moreover, this is the first bill introduced recently that proposes to treat public and private entities the same way with respect to how and when they may collect personal data.

The types of data protected under the act are “emergency health data,” defined as “data linked or reasonably linkable to an individual or device, including data inferred or derived about the individual or device from other collected data provided such data is still linked or reasonably linkable to the individual or device, that concerns the public COVID–19 health emergency.” The bill then lists a sweeping and comprehensive set of examples of emergency health data. This term includes “information that reveals the past, present, or future physical or behavioral health or condition of, or provision of healthcare to, an individual” related to testing for COVID-19, as well as related genetic and biometric information. Geolocation and proximity information would also be covered by this definition. Finally, emergency health data encompasses “any other data collected from a personal device,” which seems to cover all data collection from a person’s phone, the primary means of contact tracing proposed thus far. However, it would appear that data on people collected at the household or higher level may not be subject to this definition and hence much of the bill’s protections.
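As a rough way to visualize the sweep of this definition, the hypothetical model below groups the bill’s enumerated examples. The class and field names are mine, not the bill’s.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class EmergencyHealthData:
    """Hypothetical grouping of the bill's enumerated examples; all field
    names are illustrative. The common thread is data linked or reasonably
    linkable to an individual or device concerning COVID-19."""
    covid_test_result: Optional[str] = None   # health or condition of an individual
    genetic_data: Optional[str] = None
    biometric_data: Optional[str] = None
    geolocation: List[str] = field(default_factory=list)
    proximity_contacts: List[str] = field(default_factory=list)
    # catch-all: "any other data collected from a personal device"
    other_device_data: Dict[str, str] = field(default_factory=dict)
```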

The authority provided to covered entities is limited. First, collection, use, and disclosure of emergency health data are restricted to good faith public health purposes. And yet, this term is not defined in the act, and the FTC may need to define it during the expedited rulemaking the bill requires. However, it seems a fairly safe assumption the agency and courts would not construe using emergency health data for advertising as a good faith public health purpose. Next, covered entities must allow people to correct inaccurate information, and the entity itself has a duty to take reasonable efforts on its own to correct this information. Covered entities must also implement reasonable safeguards to prevent discrimination on the basis of these data, a new obligation for a privacy or data security bill. This provision may have been included to bar government agencies from collecting and using covered data for discriminatory purposes. However, critics of more expansive bills may see this as the camel’s nose under the tent for future privacy and data security bills. Finally, the only government agencies that can legally be provided emergency health data are public health authorities, and then only “for good faith public health purposes and in direct response to exigent circumstances.” Again, the limits of good faith public health purposes remain unclear, and such disclosure can only occur in exigent circumstances, which likely means when a person has contracted or been exposed to COVID-19.

Covered entities and service providers must implement appropriate measures to protect the security and confidentiality of emergency health data, but the third member of the usual triumvirate, availability, is not identified as requiring protection.

The “Public Health Emergency Privacy Act” outright bars a number of uses for emergency health data:

  • Commercial advertising, recommendations for e-commerce, or machine learning for these purposes
  • Any discriminatory practices related to employment, finance, credit, insurance, housing, or education opportunities; and
  • Discriminating with respect to goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation

It bears noting that covered data cannot be collected or disclosed for these purposes either.

The act contains very strong consent language. Covered entities may not collect, use, or disclose emergency health data unless a person has provided express affirmative consent, which requires a knowing, informed choice that can neither be obtained through deceptive practices nor inferred from a person’s inaction. However, there are significant exceptions to this consent requirement that would allow covered entities to collect, use, or disclose these data, including:

  • protecting against malicious, deceptive, fraudulent, or illegal activity;
  • detecting, responding to, or preventing information security incidents or threats; or
  • when the covered organization is compelled to do so by a legal obligation.

But these purposes are valid only when necessary and solely for the fulfillment of the named purpose.
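The interplay of the consent requirement and its exceptions can be pictured as a simple gate. A minimal sketch, assuming hypothetical purpose labels and glossing over the bill’s precise wording:

```python
# Hypothetical names throughout; a sketch of the consent gate, not bill text.
PERMITTED_EXCEPTIONS = {
    "protect_against_fraud_or_illegal_activity",
    "address_information_security_incidents",
    "comply_with_legal_obligation",
}

def may_process(purpose: str,
                express_affirmative_consent: bool,
                necessary_and_solely_for_purpose: bool) -> bool:
    """Emergency health data may be collected, used, or disclosed with
    express affirmative consent (knowing and informed, never inferred
    from inaction), or under an enumerated exception; an exception
    applies only when processing is necessary for, and solely for,
    the named purpose."""
    if express_affirmative_consent:
        return True
    return purpose in PERMITTED_EXCEPTIONS and necessary_and_solely_for_purpose
```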

In a related vein, covered entities must allow people to revoke their consent, and the revocation must take effect as soon as practicable but no later than 15 days afterward. Moreover, the person’s emergency health data must be destroyed or rendered not linkable within 30 days of revocation of consent.

Covered entities must provide clear and conspicuous notice before or at the point of data collection that explains how and why the emergency health data is collected, used, and disclosed. Moreover, it must also disclose the categories of recipients to whom a covered entity discloses covered data. This notice must also disclose the covered entity’s data retention policies and data security policies and practices but only with respect to emergency health data. The notice must also inform consumers on how they may exercise rights under the act and how to file a complaint with the FTC.

There would also be a public reporting requirement for covered entities collecting, using, or disclosing the emergency health data of 100,000 or more people. Every three months, these entities would need to report aggregate figures on the number of people whose data were collected, used, and disclosed. These reports must also detail the categories of emergency health data collected, used, and disclosed and the categories of third parties with whom such data are shared.

The bill requires covered entities to destroy or render not linkable emergency health data 60 days after the Secretary of Health and Human Services declares the end of the COVID-19 public health emergency (or a state does so), or 60 days after collection.
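In code, the destruction requirement reduces to a deadline computation. A minimal sketch, assuming the relevant trigger dates are known; which trigger controls in a given case is left to the bill’s text:

```python
from datetime import date, timedelta
from typing import Optional

def destruction_deadline(collected: date,
                         emergency_ended: Optional[date] = None) -> date:
    """Sketch of the retention rule: destroy (or render not linkable)
    emergency health data 60 days after the declared end of the COVID-19
    public health emergency, or 60 days after collection."""
    trigger = emergency_ended if emergency_ended is not None else collected
    return trigger + timedelta(days=60)

print(destruction_deadline(date(2020, 7, 1)))  # 2020-08-30
```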

The FTC would be given the daunting task of beginning a rulemaking 7 days after enactment “to ensure a covered organization that has collected, used, or disclosed emergency health data before the date of enactment of this Act is in compliance with this Act, to the degree practicable” that must be completed within 45 days. There is also a provision requiring the Department of Health and Human Services (HHS) to issue guidance so that HIPAA/HITECH Act regulated entities do not need to meet duplicative requirements, and, moreover, the bill exempts these entities when operating under the HIPAA/HITECH Act regulations.

The FTC, state attorneys general, and people would be able to enforce this new act through a variety of means. The FTC would treat violations as if they were violations of a regulation barring a deceptive or unfair practice, meaning the agency could levy fines in the first instance of more than $43,000 per violation. The FTC could seek a range of other relief, including injunctions, restitution, disgorgement, and remediation. Notably, the FTC could go to federal court without having to consult with the Department of Justice, which is a departure from current law and almost all the privacy bills introduced in this Congress. The FTC’s jurisdiction would be broadened to include common carriers and non-profits for purposes of this bill, too. The FTC would receive normal notice and comment rulemaking authority that is very broad, as the agency would be free to promulgate any regulations it sees necessary to effectuate this act.

State attorneys general would be able to seek all the same relief the FTC can so long as the FTC is not already bringing an action. Moreover, any other state officials empowered by state statute to bring similar actions may do so for violations of this act.

People would be allowed to sue in either federal or state court to vindicate violations. The bill states that any violation is to be considered an injury in fact, forestalling any court from finding that a violation does not injure the person and that her suit therefore cannot proceed. The act would allow people to recover between $100 and $1,000 for any negligent violation and between $500 and $5,000 for any reckless, willful, or intentional violation, with no cap on total damages. People may also ask for and receive reasonable attorney’s fees and costs and any equitable or declaratory relief a court sees fit to grant. Moreover, the bill would disallow all pre-lawsuit arbitration agreements or waivers of rights. Much of the right to sue granted to people by the “Public Health Emergency Privacy Act” will be opposed by many Republicans and industry stakeholders.
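
Because there is no cap on total damages, exposure scales linearly with the number of violations. A minimal Python sketch makes the point; the function name and inputs are mine, and a court would of course set the actual per-violation award somewhere within the statutory range.

```python
def statutory_damages_range(violations: int, culpability: str) -> tuple:
    """Illustrative sketch of the bill's per-violation damages ranges:
    $100-$1,000 for negligent violations; $500-$5,000 for reckless,
    willful, or intentional violations; no cap on total damages."""
    per_violation = {
        "negligent": (100, 1_000),
        "reckless": (500, 5_000),  # also covers willful or intentional
    }
    low, high = per_violation[culpability]
    return (violations * low, violations * high)

# 1,000 negligent violations expose a covered entity to between
# $100,000 and $1,000,000 in statutory damages.
print(statutory_damages_range(1_000, "negligent"))  # (100000, 1000000)
```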

The enforcement section raises a few questions given that entities covered by the bill include government agencies. Presumably, the FTC cannot enforce this act against government agencies for the very good reason that it does not have jurisdiction over them. However, does the private right of action waive the federal and state governments’ sovereign immunity? This is not clear and may need clarification if the bill is acted upon in either chamber of Congress.

This bill would not preempt state laws, which, if enacted, could subject covered entities to meeting more than one regime for collecting, using, and disclosing emergency health data.

Apart from contact tracing, the “Public Health Emergency Privacy Act” also bars the use of emergency health data to abridge a person’s right to vote and it requires HHS, the United States Commission on Civil Rights, and the FTC to “prepare and submit to Congress reports that examines the civil rights impact of the collection, use, and disclosure of health information in response to the COVID–19 public health emergency.”

Given the continued impasse over privacy legislation, it is little wonder that the bill unveiled a few weeks ago by four key Senate Republicans takes a broadly similar approach that differs in key respects. Of course, there is no private right of action, and it expressly preempts state laws to the contrary.

Generally speaking, the structure of the “COVID–19 Consumer Data Protection Act of 2020” (S.3663) tracks with the bills that have been released thus far by the four sponsors: Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS) (See here for analysis of the “Consumer Data Privacy Act of 2019”) and Senators John Thune (R-SD), Jerry Moran (R-KS) (See here for analysis of the “Consumer Data Privacy and Security Act of 2020” (S.3456)), and Marsha Blackburn (R-TN) (See here for analysis of the “Balancing the Rights Of Web Surfers Equally and Responsibly Act of 2019” (BROWSER Act) (S. 1116)). In short, people would be provided with notice about what information the app collects, how it is processed, and with whom and under what circumstances this information will be shared. Then a person would be free to make an informed choice about whether or not she wants to consent and allow the app or technology to operate on her smartphone. The FTC and state attorneys general would enforce the new protections.

The scope of the information and entities covered is narrower than the Democratic bill. “Covered data” is “precise geolocation data, proximity data, a persistent identifier, and personal health information” but not aggregated data, business contact information, de-identified data, employee screening data, and publicly available information. Those entities covered by the bill are those already subject to FTC jurisdiction along with common carriers and non-profits, but government agencies are not, again unlike the bill put forth by Democrats. Entities must abide by this bill to the extent they “collect[], process[], or transfer[] such covered data, or determine[] the means and purposes for the collection, processing, or transfer of covered data.”

Another key definition is how an “individual” is determined, for it excludes any person acting in her role as an employee and almost all work-related capacities, making clear employers will not need to comply with respect to those working for them.

“Personal health information” is “information relating to an individual that is

  • genetic information of the individual; or
  • information relating to the diagnosis or treatment of past, present, or future physical, mental health, or disability of the individual; and
  • identifies, or is reasonably linkable to, the individual.”

And yet, this term excludes educational information subject to the “Family Educational Rights and Privacy Act of 1974” or information subject to regulations promulgated pursuant to the HIPAA/HITECH Acts.

The “COVID–19 Consumer Data Protection Act of 2020” bars the collection, processing, or transferring of covered data for a covered purpose unless prior notice is provided, a person has provided affirmative consent, and the covered entity has agreed not to collect, process, or transfer such covered data for any other purpose than those detailed in the notice. However, leaving aside the bill’s enumerated allowable purposes for which covered data may be collected with consent, the bill provides a number of exemptions from this general bar. For example, collection, processing, or transfers necessary for a covered entity to comply with another law are permissible, apparently in the absence of a person’s consent. Moreover, a covered entity need not obtain consent for operational or administrative tasks not disclosed in the notice provided to people.

The act does spell out “covered purposes” for which covered entities may collect, process, or transfer covered data with consent after notice has been given:

  • Collecting, processing, or transferring the covered data of an individual to track the spread, signs, or symptoms of COVID–19.
  • Collecting, processing, or transferring the covered data of an individual to measure compliance with social distancing guidelines or other requirements related to COVID–19 that are imposed on individuals under a Federal, State, or local government order.
  • Collecting, processing, or transferring the covered data of an individual to conduct contact tracing for COVID–19 cases.

Covered entities would be required to publish a privacy policy detailing for which of the above covered purposes a person’s covered data would be collected, processed, or transferred. This policy must also detail the categories of entities that receive covered data and the covered entity’s data retention and data security policies.

There would be reporting requirements that would affect more covered entities than under the Democratic bill. Accordingly, any covered entity collecting, processing, or transferring covered data for one of the enumerated covered purposes must issue a public report 30 days after enactment and then every 60 days thereafter.

Among other provisions in the bill, people would be able to revoke consent, a request that covered entities must honor within 14 days. Covered entities must also ensure the covered data are accurate, but this requirement falls a bit short of granting people the right to correct inaccurate data; instead, they would merely be able to report inaccuracies, and there is no recourse if a covered entity chooses not to heed these reports. Covered entities would need to “delete or de-identify all covered data collected, processed, or transferred for a [covered purpose] when it is no longer being used for such purpose and is no longer necessary to comply with a Federal, State, or local legal obligation, or the establishment, exercise, or defense of a legal claim.” Even though there is the commandment to delete or de-identify, the timing as to when that happens seems somewhat open-ended, as some covered entities could seek out legal obligations to meet in order to keep the data on hand.

Covered entities must also minimize covered data by limiting collection, processing, and transferring to “what is reasonably necessary, proportionate, and limited to carry out [a covered purpose.]” The FTC must draft and release guidelines “recommending best practices for covered entities to minimize the collection, processing, and transfer of covered data.” However, these guidelines would most likely be advisory in nature and would not carry the force of law or regulation, leaving covered entities to disregard some of the document if they choose. Covered entities must “establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity of such data,” a mandate broader than the duty imposed by the Democratic bill.

The FTC and state attorneys general would enforce the new regime. The FTC would be able to seek civil fines for first violations in the same way as the Democrats’ privacy bill. However, unlike the other bill, the Republican bill would nullify any Federal Communications Commission (FCC) jurisdiction to the extent it conflicted with the FTC’s in enforcement of the new act. Presumably, this would address jurisdictional issues raised by placing common carriers under FTC jurisdiction when they are usually overseen by the FCC. Even though state laws are preempted, state attorneys general would be able to bring actions to enforce this act at the state level. And, as noted earlier, there is no private right of action.


Moran Releases Long Awaited Privacy Bill Without Blumenthal

Senator Jerry Moran (R-KS) has released his long-awaited privacy and data security bill, the “Consumer Data Privacy and Security Act of 2020” (S.3456), which is not cosponsored by Senator Richard Blumenthal (D-CT) even though the two Senators have been in talks since late 2018, along with other Senators, to draft a bipartisan bill. Of course, Moran chairs the Senate Commerce, Science, and Transportation Committee’s Manufacturing, Trade, and Consumer Protection Subcommittee and so is a key stakeholder with input on any privacy and data security legislation coming from that committee. However, Moran’s bill is likely a nonstarter with Senate and House Democrats because it does not provide people with a private right of action and it would preempt state laws like the “California Consumer Privacy Act” (CCPA) (AB 375). Moreover, the Federal Trade Commission’s (FTC) ability to obtain civil fines would be limited only to situations where the entity in question had actual knowledge of the violations, as opposed to the standard many agencies use to enforce: constructive knowledge (i.e., knew or should have known). This, too, is contrary to not only the Democratic privacy bills but also some of the Republican bills, which would allow the FTC to levy fines on the basis of constructive knowledge.

However, like almost all the other bills, the “Consumer Data Privacy and Security Act of 2020” would require covered entities to obtain express affirmative consent to collect from and process the personal data of people after providing extensive disclosure and notice about what personal information would be collected and with whom it would be shared. Likewise, this bill would give people certain rights, such as a right to access, correct, delete, and port their personal data. People would also be granted the right of erasure, under which a covered entity must delete or de-identify the personal data of any person who submits a verified request. However, small businesses would be exempted from granting requests to access and correct. There are, again like many other privacy bills, circumstances under which a covered entity may decline to grant a request to exercise these rights. For example, if doing so would violate a law or legal process, then the covered entity could say no to a person. Likewise, if a person’s life is in imminent danger, then a request could also be denied. There are other such circumstances, some of which privacy and civil liberties advocates will assert are loopholes so wide that the rights will cease to be meaningful, as they have argued with some of the other bills.

In terms of who would be subject to the Act, entities covered by the bill would be those currently subject to FTC jurisdiction and non-profits and common carriers. Moreover, the bill has fairly expansive definitions of “personal data” and “sensitive personal data,” like many of the other bills.

Like some of the privacy bills, large covered entities would have additional privacy obligations and responsibilities. Those entities that collect and process the personal data of 20 million or more people per year, or the sensitive personal data of 1 million or more per year, must have a privacy officer to advise the entity on compliance and monitoring. These large entities must also take extra steps when making material changes to their privacy policies, including privacy impact assessments and the development and implementation of a comprehensive privacy policy.

The “Consumer Data Privacy and Security Act of 2020” tracks with other privacy bills in requiring that covered entities implement data security safeguards to protect the integrity, confidentiality, and security of personal data. There would be a sliding scale of sorts: less sensitive data would require less rigorous protection, and conversely, the more sensitive the data, the more stringent the safeguards that must be used. Covered entities must also conduct periodic, regular risk assessments and then remediate any risks turned up. Covered entities must also ensure their service providers and any third parties with whom they are sharing personal data are instituting data security standards, but at a lower defined standard than the covered entity itself. For example, the latter entities must only protect the security and confidentiality of the information they hold, collect, or process for a covered entity and are not responsible for the integrity of the information.

When a covered entity uses a service provider to collect or process personal data, it must use a binding contract and perform due diligence to ensure the service provider has the appropriate procedures and controls to ensure the privacy and security of personal data. The covered entity also has the responsibility to investigate the service provider’s compliance with the act if a reasonable person would determine there is a high probability of future non-compliance.

As noted, the FTC would be the federal enforcer of the Act under the rubric of its current Section 5 powers to seek a range of injunctive and equitable remedies to punish unfair and deceptive practices. The FTC would also be able to seek civil fines of up to $43,530 per violation but only for knowing violations, and there is no language for adjusting the per violation fine amount for inflation, a power the FTC otherwise has. State attorneys general could enforce the Act just as the FTC could.

The bill expressly preempts state laws on privacy and data security and makes clear that state laws may not interfere with HIPAA, Gramm-Leach-Bliley, FERPA, and others. Moreover, the “Consumer Data Privacy and Security Act of 2020” would not affect federal privacy laws like Gramm-Leach-Bliley, COPPA, FCRA, and others, and if entities currently subject to those federal laws are in compliance with the privacy and data security requirements, then they will be deemed in compliance with the Act.

Revised Data Care Act Released

Senator Brian Schatz (D-HI) and his cosponsors have reintroduced a slightly changed version of the “Data Care Act” (S. 2961), a privacy bill that would impose upon many entities that collect and use the personal data of people a fiduciary duty of care. In December 2018, Schatz and his cosponsors introduced the “Data Care Act” (S. 3744) at a time when the Senate Commerce, Science, and Transportation Committee and other committees of jurisdiction had just begun examining the issues related to privacy in light of the recent passage of the “California Consumer Privacy Act” (CCPA) (A.B. 375). Fourteen other Democratic Senators joined Schatz, including presidential candidates Senators Michael Bennet (D-CO), Amy Klobuchar (D-MN), and Cory Booker (D-NJ). This bill took a novel approach to the issues presented by the mass collection and processing of personal data by extending the concept of fiduciary responsibility currently binding on health care professionals and attorneys with respect to their patients’ and clients’ information to “online service providers.” Most of the original cosponsors are again sponsoring this bill; however, no Republicans cosponsored the first or current iteration of the bill, suggesting the fiduciary framework is not appealing to Senate Republicans.

Of course, Schatz and Klobuchar are also sponsoring the “Consumer Online Privacy Rights Act” (COPRA) (S. 2968) (see here for more analysis) along with Senate Commerce, Science, and Transportation Committee Ranking Member Maria Cantwell (D-WA). COPRA would empower the Federal Trade Commission (FTC) to police privacy and data security violations through augmented authority, not preempt state laws to the extent they provide greater protection, largely leave in place existing federal privacy statutes such as the “Financial Services Modernization Act of 1999” (aka Gramm-Leach-Bliley) and the “Health Insurance Portability and Accountability Act of 1996” (HIPAA), and allow individuals to sue.

Incidentally, Senator Ed Markey (D-MA) is also sponsoring both bills, and he has his own bill, the “Privacy Bill of Rights Act” (S. 1214), which was one of the only bills to get an A in the Electronic Privacy Information Center’s report on privacy bills. (See here for more analysis.) Finally, Klobuchar had also released a narrower bill with a Republican cosponsor, the “Social Media Privacy Protection and Consumer Rights Act of 2019” (S. 189), that would require major tech companies to give consumers an opportunity to opt in or opt out of the company’s data usage practices after offering enhanced notice of the practices for which the personal data may be used. (See here for more analysis.)

And, Schatz has been in negotiations with other members of the Senate Commerce, Science, and Transportation Committee with the goal of developing a bipartisan bill to regulate privacy at a federal level. As discussed in past issues of the Technology Policy Update, stakeholders in both the House and Senate continue to negotiate privacy bills, but significant disagreements have been reported regarding whether such a bill should have a private right of action, whether it should preempt the CCPA and other state laws, and whether a new regime would primarily rest on enhanced notice and consent or would bar certain conduct outright, among other issues.

Turning to the Data Care Act, this legislation was built on a concept fleshed out by law professor Jack Balkin in his article “Information Fiduciaries and the First Amendment” that would place duties on companies collecting and using consumer data similar to those that lawyers and doctors must meet in how they handle client and patient information. Balkin explained that these so-called “information fiduciaries” should “have special duties to act in ways that do not harm the interests of the people whose information they collect, analyze, use, sell, and distribute.”

In short, under the “Data Care Act,” “online service providers” would be severely limited in how they collect, share, and sell personally identifiable information (PII) (known as “individual identifying data” in the bill), for these companies would need to treat their customers’ PII as privileged and deserving of a greater level of protection, much like the HIPAA regulations impose this standard on health care providers or bar associations’ rules do on attorneys. What’s more, the scope of who is an online service provider would seem to encompass most consumer-facing companies doing business on the internet.

An “online service provider” is defined as an entity “engaged in interstate commerce over the internet or any other digital network; and in the course of business, collects individual identifying data about end users, including in a manner that is incidental to the business conducted.” This very sweeping definition would cover almost any business or entity doing business in the U.S., even one not operating across state lines, given how broadly the Supreme Court has often construed the Commerce Clause. However, unlike other bills, the FTC would have the discretionary authority to exclude categories of online service providers from the fiduciary duties the bill would otherwise impose. Normally, the other privacy bills create a threshold below which limited obligations attach for smaller and mid-sized businesses, except for data brokers. The FTC is directed to consider the privacy risks posed by the category of online service provider.

The bill requires that “[a]n online service provider shall fulfill the duties of care, loyalty, and confidentiality” towards consumers’ personal information, which is also broadly defined in the bill.  The duty of care requires online service providers to “reasonably” safeguard “individual identifying data” from unauthorized access and notify consumers of any breach of this duty, subject to FTC regulations that would be promulgated. The duty of loyalty would require online service providers to not use the information in a way that benefits them to the detriment of consumers, including uses that would result in reasonably foreseeable material physical or financial harm to the consumer. Finally, the duty of confidentiality limits the disclosure or sale of consumers’ information to instances where the duties of care and loyalty are observed (i.e. when the information must be safeguarded and not used to the detriment of consumers).

Moreover, the bill would require that should an online service provider wish to share or sell consumers’ information with a third party, they would need to enter into a contract with the other party that requires them to meet the same duties of care, loyalty, and confidentiality. The revised bill further tightens this requirement by stipulating that “If an online service provider transfers or otherwise provides access to individual identifying data to another person, the requirements of [the duties of loyalty, care, and confidentiality] shall apply to such person with respect to such data in the same manner that such requirements apply to the online service provider.” Note that this additional requirement pertains to the transfer of PII to any person and not just other online service providers, meaning virtually any transfer would be captured by this standard and thus a potential loophole in the bill was closed.

The FTC would enforce the act and would have the authority to levy fines in the first instance for violations, but state attorneys general would also be able to bring actions for violations in the event the FTC does not act or after FTC action. This latter power has long been a Democratic priority in the realm of data security and may be a non-starter with Republicans. Moreover, the bill does not preempt state laws, meaning the FTC could investigate a violation under this act and states could investigate under their laws. The FTC would be given authority under the Administrative Procedure Act (APA) to promulgate regulations regarding data breach notification instead of the much more onerous Moss-Magnuson rulemaking procedures the FTC must otherwise use. These regulations include the aforementioned regulations on breach notification and some possible exemptions to the duties that would otherwise apply to online service providers (e.g., small companies), and potentially more. The bill expands the FTC’s jurisdiction over non-profit entities and common carriers that may also be online service providers.

Unlike many of the Democratic bills, there is no private right of action, which would disappoint many stakeholders on the left but would conversely please many industry and Republican stakeholders. Nor would people have the explicit right to access, correct, delete, or port their information as they would in other bills; and yet, the fiduciary concept would necessarily entail some of these rights. There are no provisions on obtaining a person’s consent, for the onus is entirely on how the covered entity handles the information. In short, this seems to be a framework that would sidestep issues related to notice and consent regimes. Additionally, unlike almost all the other bills, there are no detailed exceptions under which a person’s consent would not be needed to collect and process information (e.g., for security processes, to protect against fraud, or to develop new products).

Privacy Bill A Week: United States Consumer Data Privacy Act of 2019

The majority staff of the Senate Commerce Committee circulated the “United States Consumer Data Privacy Act of 2019” (CDPA), a draft data privacy bill days after Ranking Member Maria Cantwell (D-WA) and her cosponsors released the “Consumer Online Privacy Rights Act“ (COPRA) (S.2968) (See here for more analysis). Of course, these competing proposals came before the Senate Commerce, Science, and Transportation Committee’s hearing on legislative proposals on privacy.

In the main, this bill shares the same framework with COPRA with some key, significant differences, including:

  • COPRA expands the FTC’s jurisdiction in policing privacy harms whereas CDPA would not
  • COPRA places a duty of loyalty on covered entities to people whose covered data they process or transfer; CDPA does not have any such duty
  • CDPA does not allow people to sue if covered entities violate the new federal privacy and security regime; COPRA would allow such suits to move forward
  • CDPA preempts state privacy and data security laws; COPRA would establish a federal floor that states like California would be able to legislate on top of
  • CDPA would take effect two years after enactment, and COPRA would take effect six months after enactment.
  • The bar against a person waiving her privacy rights under COPRA is much broader than CDPA
  • COPRA would empower the FTC to punish discriminatory data processing and transfers; CDPA would require the FTC to refer these offenses to the appropriate federal and state agencies
  • CDPA revives a concept from the Obama Administration’s 2015 data privacy bill by allowing organizations and entities to create standards or codes of conduct and allowing those entities in compliance with these standards or codes to be deemed in compliance with CDPA subject to FTC oversight; COPRA does not have any such provision
  • COPRA would require covered entities to conduct algorithmic decision-making impact assessments to determine if these processes are fair, accurate, and unbiased; no such requirement is found in CDPA

However, as noted, the basic framework both bills create in establishing a federal privacy and data security regime is similar. Broadly, people would receive new rights, largely premised on being accurately informed of how their personal data would be used by covered entities, and people would need to affirmatively consent before such data processing and transfers could occur.

The bills have similar definitions of what data is covered, what constitutes sensitive covered data, and the entities covered by the bill. Among the key similarities are:

  • Both bills would require affirmative express consent for a range of data processing and transferring with COPRA requiring this sort of consent under more circumstances
  • Like COPRA, CDPA marries data security requirements to privacy requirements; however, both COPRA and CDPA would deem entities already in compliance with a number of existing federal laws (e.g. Gramm-Leach-Bliley and HIPAA) to be in compliance with their data security requirements, and yet language in both bills suggests that to the extent that these federal standards fall short of the new data security standards, these entities would need to meet additional requirements
  • Both bills would allow people to request a copy of their covered data being held by a covered entity, delete or de-identify covered data, to correct or complete such data, and to port their data to another covered entity; however, COPRA would provide additional rights such as the aforementioned duty of loyalty and a right to opt-out of transfers
  • COPRA and CDPA would provide additional authority for the FTC to police data security, with COPRA giving the agency broad authority to promulgate regulations and providing more descriptive guidance on how to do so, while CDPA provides very targeted rulemaking authority that would likely continue the current case-by-case enforcement regime at the FTC
  • The FTC could seek civil fines in the first instance of $42,530 per violation along with the current range of equitable and injunctive relief it can seek under both COPRA and CDPA
  • Both bills would allow state attorneys general to seek the same relief in the event of alleged violations

Separately from the release of this draft, Chair Roger Wicker (R-MS) said he was willing to allow a limited right for people to sue under a federal privacy bill but only to obtain injunctive relief and not monetary damages. This is a significant concession, for Republicans, including Wicker, have long characterized a private right of action as being out of the question. Of course, Wicker does not speak for other Republicans on the committee nor those in the Senate, so it is not exactly clear how much support he has for such a proposal. In the same vein, Wicker remarked to the media that the other main sticking points with Cantwell are on preemption and on a duty of loyalty. However, he may have been making this statement with some optimism for there are other, significant differences between these two bills, suggesting more negotiating is in order.

Also, it has been reported that Senators Richard Blumenthal (D-CT) and Jerry Moran (R-KS) are still working on their privacy bill but are not yet ready to release bill text. It is possible the release of these two bills will speed them to completing their draft so they can lay down their marker.

However, turning to the substance of the bill, let’s start, as always, with definitions. Covered entities are “any person who operates in or affects interstate or foreign commerce,” which is a very broad definition that would sweep almost every entity in the U.S. and some overseas into it.

Covered data is defined as “information that identifies or is linked or reasonably linkable to an individual or a device that is linked or reasonably linkable to an individual.” The bill further provides “information held by a covered entity is linked or reasonably linkable to an individual if, as a practical matter, it can be used on its own or in combination with other information held by, or readily accessible to, the covered entity to identify the individual or a device associated with that individual.” However, covered data does not include: aggregated data; de-identified data; employee data; or publicly available information. Aggregated data is a new term among the privacy bills we’ve looked at thus far and is “information that relates to a group or category of individuals or devices that does not identify and is not linked or reasonably linkable to any individual.”

“Sensitive covered data” “means any of the following forms of covered data of an individual” including but not limited to:

  • A unique, government-issued identifier, such as a Social Security number, passport number, or driver’s license number.
  • Any covered data that describes or reveals the diagnosis or treatment of past, present, or future physical health, mental health, or disability of an individual.
  • A financial account number, debit card number, credit card number, or any required security or access code, password, or credentials allowing access to any such account.
  • Covered data that is biometric information.
  • Precise geolocation information capable of determining with reasonable specificity the past or present actual physical location of an individual or device at a specific point in time.
  • The contents of an individual’s private communications or the identity of the parties subject to such communications, unless the covered entity is the intended recipient of the communication;
  • Covered data revealing an individual’s racial or ethnic origin, or religion in a manner inconsistent with the individual’s reasonable expectation regarding the processing or transfer of such information.
  • Covered data revealing the sexual orientation or sexual behavior of an individual in a manner inconsistent with the individual’s reasonable expectation regarding the processing or transfer of such information.
  • Covered data about the online activities of an individual that relate to a category of covered data described in another subparagraph of this paragraph.
  • Covered data that is calendar information, address book information, phone or text logs, photos, or videos maintained on an individual’s device.
  • Any other category of covered data designated by the Commission pursuant to a rulemaking under [Administrative Procedure Act] if the Commission determines that the processing or transfer of covered data in such category in a manner that is inconsistent with the reasonable expectations of an individual would be likely to be highly offensive to a reasonable individual.

This is a fairly comprehensive list of covered data that would be considered sensitive.

Additionally, the FTC would be allowed to add other types of data if the agency goes through a rulemaking, providing flexibility and allowing the agency to address any future, unforeseen uses of personal data.

De-identified data is “information held by a covered entity that…does not identify, and is not linked or reasonably linkable to an individual or device” only if the covered entity publicly commits not to re-identify the person or device. The covered entity would also need to put in place technical and organizational procedures to stop any possible linkage. Additionally, covered entities may not disclose de-identified data to any other entities without a contract or legal instrument barring the re-identification of the data.

CDPA defines affirmative express consent as “upon being presented with a clear and conspicuous description of an act or practice for which consent is sought, an affirmative act by the individual clearly communicating the individual’s authorization for the act or practice.”

Covered entities will not be able to “deny goods or services to an individual because the individual exercised any of the rights established under” the CDPA. Additionally, for each service or product, a covered entity must publish a privacy policy that is “clear and conspicuous” to both the public at large and a person before or at the point at which collection of covered data begins. The CDPA spells out the elements a privacy policy must contain, among other features: the categories of covered data collected, the processing purposes for each category, the categories of third parties to whom the data is transferred and the purposes of such transfers, and a detailed description of data retention practices and data security practices. Any material changes to a covered entity’s privacy policy would require obtaining affirmative express consent anew from people before any processing or transferring of covered data may occur.

The CDPA requires covered entities to fulfill the requests of people to access, correct, complete, delete, or port their covered data within 45 days after receiving a verified request. However, a person may not request to access their covered data more than two times in a 12-month period, and for any additional requests, covered entities may charge a fee. Of course, if a covered entity cannot verify the identity of the requester, then it does not need to meet the request. A covered entity may also deny a request if it would require the maintenance of information solely to fulfill the request, if it is impossible or demonstrably impracticable to comply, or if it necessitates the re-identification of de-identified data. The CDPA stipulates that none of these rights or obligations may be waived in an agreement between a covered entity and a person. The FTC must promulgate regulations under the APA to implement this section.
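
The two-free-requests rule is the sort of provision a covered entity would have to operationalize. Below is a minimal Python sketch of one plausible reading; the rolling 12-month window is my assumption (the bill could also be read to use fixed 12-month periods), and the function name is hypothetical.

```python
from datetime import datetime, timedelta

def access_fee_may_apply(prior_requests: list, now: datetime) -> bool:
    """Illustrative sketch: a person gets two free access requests per
    12-month period; a covered entity may charge a fee for additional
    requests. Assumes a rolling 12-month window."""
    window_start = now - timedelta(days=365)
    recent = [r for r in prior_requests if r >= window_start]
    return len(recent) >= 2  # a third request in the window may incur a fee

# Two requests already made this year, so a fee may apply to the next.
history = [datetime(2020, 1, 15), datetime(2020, 4, 2)]
print(access_fee_may_apply(history, datetime(2020, 9, 1)))  # True
```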

Regarding the right to access one’s covered data, a covered entity must provide either the covered data itself or “an accurate representation” of it, the purposes for which such covered data is transferred, and a list of any third parties or service providers who have received covered data. A person has the right to request that a covered entity “correct inaccuracies or incomplete information with respect to the covered data of the individual that is processed by the covered entity; and notify any service provider or third party to which the covered entity transferred such covered data of the corrected information.” A person may also ask that a covered entity delete or de-identify any covered data the covered entity is processing and alert any third parties or service providers to which the covered entity has transferred the person’s covered data. Finally, subject to technical feasibility, covered entities must generally provide covered data “in a portable, structured, standards-based, interoperable, and machine-readable format that is not subject to licensing restrictions.”

In regard to sensitive covered data, a covered entity must obtain affirmative express consent before it can process this subset of covered data or transfer it to a third party. This section also details how covered entities are to obtain affirmative express consent. People must be provided with notice that

  • includes a description of the processing purpose for which consent is sought;
  • clearly identifies and distinguishes between a processing purpose that is necessary to fulfill a request made by the individual and a processing purpose that is not necessary to fulfill a request made by the individual;
  • includes a prominent heading that would enable a reasonable individual to easily identify the processing purpose for which consent is sought; and
  • clearly explains the individual’s right to provide or withhold consent.

Covered entities will not be able to infer consent from a person’s inaction or continued use of the covered entity’s services or products. Moreover, a person must be presented “with a clear and conspicuous means to withdraw affirmative express consent.”

The language on consent related to the sensitive covered data of minors is a bit confusing. Parents will be able to consent on behalf of their minor children in the same manner as they may consent for themselves (i.e., affirmative express consent). And yet, covered entities may not transfer the sensitive covered data of those 16 and younger to a third party when there is actual knowledge of the person’s age unless the individual or her parent consents.

Generally, covered entities must minimize their collection, processing, and transfer of covered data to what is necessary for the purpose at hand. Specifically, covered entities “shall not collect, process, or transfer covered data beyond

  • what is reasonably necessary, proportionate, and limited to provide or improve a product, service, or a communication about a product or service, including what is reasonably necessary, proportionate, and limited to provide a product or service specifically requested by an individual or reasonably anticipated within the context of the covered entity’s ongoing relationship with an individual;
  • what is reasonably necessary, proportionate, or limited to otherwise process or transfer covered data in a manner that is described in the privacy policy that the covered entity is required to publish…or
  • what is expressly permitted by this Act or any other applicable Federal law.

There are exceptions to the rights granted to people, just like in all the other data privacy bills, which we will turn to momentarily. However, this section requires a bit of elaboration. The FTC will undoubtedly need to determine the broad strokes of what is “necessary, proportionate, and limited” in the different contexts where that clause is used. And yet, the FTC is not broadly granted rulemaking authority under the APA to implement the CDPA, and so the agency would probably need to hash out these terms through the “common law” it is currently using to forge the federal data security and privacy regime. And this may be the case even though the agency is required to issue guidelines “recommending best practices for covered entities to minimize the collection, processing, and transfer of covered data in accordance with this section” within one year of enactment. Such guidelines will, of course, inform covered entities of the agency’s thinking, but the “necessary, proportionate, and limited” formulation may present a number of close cases that may be adjudicated by courts and/or the FTC.

CDPA lays out the rights, responsibilities, and roles of service providers and third parties under the new federal privacy regime. However, as always, let’s look at who would qualify as either. First, a service provider would be, “with respect to a set of covered data, a covered entity that processes or transfers such covered data for the purpose of performing 1 or more services or functions on behalf of, and at the direction of, another covered entity” that is not a part of that covered entity. Third parties are those entities that are not service providers that receive covered data and, again, are not owned by or affiliated with the covered entity. There are also definitions of “service provider data” and “third party data.” Regarding the former, it shall be those data that service providers are given by covered entities, or those covered data the service provider collects on behalf of the covered entity and then processes or transfers per the covered entity’s instructions or direction. This could be firms that have dedicated services for processing covered data, possibly even data brokers. Third party data shall be those covered data that are not service provider data that are received from a covered entity. For example, BestBuy transferring covered data with the proper consent to Walmart would make the latter a third party, and those covered data third party data.

The Act stipulates that service providers may process “service provider data” only at the direction of the covered entity that provided the data and may not undertake any additional processing sua sponte. Likewise, the service provider may not transfer service provider data to third parties without the covered entity having obtained affirmative express consent in the first instance. What’s more, service providers must delete or de-identify these data as soon as possible after the agreed-upon processing has occurred or as soon after the completion of processing as is practicable.

Service providers do not need to respond to a person’s request to access, correct, complete, delete, or port covered data, but they must help covered entities fulfill these requests to the degree possible, and upon being notified, they must comply with the request a person has made of a covered entity. However, service providers do not need to get affirmative express consent from consumers to transfer their sensitive covered data to third parties. Nor must service providers minimize covered data. So, it would appear that once a person provides a covered entity the necessary consent to process or transfer their sensitive covered data, then this subset of covered data may be transferred onward or processed by a third party. Additionally, it appears covered entities could transfer sensitive covered data to service providers without the affirmative express consent of a person, and then service providers appear free to process such data and to transfer it onward. However, the definition of “process” may weigh against such a reading, for it covers retention and handling of covered data, so perhaps this scenario is contrary to the constraints placed on covered entities.

Third parties “shall not process third party data for a processing purpose inconsistent with the reasonable expectation of the individual to whom such data relates.” Additionally, third parties “may reasonably rely on representations made by the covered entity that transferred third party data regarding the reasonable expectations of individuals to whom such data relates, provided that the third party conducts reasonable due diligence on the representations of the covered entity and finds those representations to be credible.” And, like service providers third parties do not need to respond to a person’s request to access, correct, complete, delete, or port covered data nor minimize data retention.

Nonetheless, covered entities must exercise reasonable due diligence in selecting a service provider or transferring covered data to a third party in order to ensure compliance with the CDPA.

A subset of covered entities would need to meet other requirements. “Large data holders” “shall conduct a privacy impact assessment that weighs the benefits of the covered entity’s covered data collection, processing, and transfer practices against the potential adverse consequences to individual privacy of such practices.” Those covered entities that are large data holders are those that “processed or transferred the covered data of more than 5,000,000 individuals or devices that are linked or reasonably linkable to such individuals” or “processed or transferred the sensitive covered data of more than 100,000 individuals or devices that are linked or reasonably linkable to such individuals (excluding any instance where the covered entity processes the log-in information of an individual or device to allow the individual or device to log in to an account administered by the covered entity).” Covered entities would need to determine annually whether they have passed either threshold and become a large data holder that needs to conduct a privacy impact assessment. Thereafter, these assessments would need to be conducted every two years and would need to be approved by the entity’s privacy officer.
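
The annual determination is essentially a two-prong threshold test, which a short sketch makes concrete. The function below is my illustration only, and it does not model the carve-out for log-in credential processing noted above.

```python
def is_large_data_holder(covered_count: int, sensitive_count: int) -> bool:
    """Illustrative sketch of the CDPA's "large data holder" test:
    more than 5,000,000 individuals or devices for covered data, or
    more than 100,000 for sensitive covered data (the bill's exclusion
    for log-in credential processing is not modeled here)."""
    return covered_count > 5_000_000 or sensitive_count > 100_000

# An entity with 2 million covered individuals would still qualify if
# it processed the sensitive covered data of 150,000 individuals.
print(is_large_data_holder(2_000_000, 150_000))  # True
```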

Like the other privacy bills, there are circumstances under which covered entities may disregard some of the responsibilities to people. In terms of exceptions to the general rights laid out for people, “a covered entity may collect, process or transfer covered data for any of the following purposes, provided that the collection, processing, or transfer is reasonably necessary, proportionate, and limited to such purpose:

  • To complete a transaction or fulfilling an order or service specifically requested by an individual, including associated routine administrative activities such as billing, shipping, and accounting.
  • To perform internal system maintenance and network management.
  • Subject to [language governing biometrics], to detect or respond to a security incident, provide a secure environment, or maintain the safety of a product or service.
  • Subject to [language governing biometrics], to protect against malicious, deceptive, fraudulent, or illegal activity.
  • To comply with a legal obligation or the establishment, exercise, or defense of legal claims.
  • To prevent an individual from suffering serious harm where the covered entity believes in good faith that the individual is at risk of death or serious physical injury.
  • To effectuate a product recall pursuant to Federal or State law.
  • To conduct internal research to improve, repair, or develop products, services, or technology.
  • To engage in an act or practice that is fair use under copyright law.
  • To conduct a public or peer-reviewed scientific, historical, or statistical research that—
    • is in the public interest;
    • adheres to all applicable ethics and privacy laws; and
    • is approved, monitored, and governed by an institutional review board or other oversight entity that meets standards promulgated by the Commission pursuant to [the Administrative Procedure Act]

However, in availing themselves of these exceptions to many of the rights detailed in Title I of the bill, covered entities would not be allowed to breach the ban on denying goods or services because a person exercised their rights under the CDPA nor would they be able to disregard the rights of access, correction, completion, deletion, or portability. Similarly, the covered entity must still adhere to its privacy policy.

As noted earlier, covered entities may “not process or transfer covered data of an individual that is biometric information” in order “to detect or respond to a security incident, provide a secure environment, or maintain the safety of a product or service” or “to protect against malicious, deceptive, fraudulent, or illegal activity” unless these activities are “limited to real-time or short-term processing” and comply with to-be-promulgated FTC regulations. There is the further stipulation that “the covered entity does not transfer such information to a third party other than to comply with a legal obligation or to establish, exercise, or defend a legal claim.”

Small businesses would be provided with a limited carve-out under the CDPA from heeding requests to access, correct, complete, delete, or port covered data and from the data minimization requirements binding on other covered entities. Such exempted small businesses would be those whose gross annual revenues for each of the preceding three years are $25 million or less, whose processing of covered data did not exceed 100,000 people or devices, and whose revenue from transferring covered data was less than 50% of its annual revenue.
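
Read literally, all three conditions must hold for the carve-out to apply, which the following Python sketch illustrates. The names are mine, and whether the revenue tests use the same measurement year is not settled by the text.

```python
def small_business_carveout(annual_revenues: list, people_or_devices: int,
                            transfer_revenue: float, total_revenue: float) -> bool:
    """Illustrative sketch of the CDPA small business exemption:
    gross annual revenues of $25 million or less for each of the
    preceding three years, covered data of no more than 100,000 people
    or devices processed, and less than 50% of annual revenue derived
    from transferring covered data."""
    return (all(r <= 25_000_000 for r in annual_revenues)
            and people_or_devices <= 100_000
            and transfer_revenue < 0.5 * total_revenue)

# A firm with ~$10M/year in revenue, 80,000 users, and $1M in revenue
# from data transfers would fall within the carve-out.
print(small_business_carveout([10e6, 9e6, 11e6], 80_000, 1e6, 10e6))  # True
```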

Senate Commerce Republican staff have apparently acceded to Democratic insistence that data security be made part of a privacy bill as the CDPA contains such language. The bill provides generally that “[a] covered entity shall establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity of sensitive covered data.” These data security standards should be “appropriate to the size and complexity of the covered entity, the nature and scope of the covered entity’s collection or processing of sensitive covered data, the volume and nature of the sensitive covered data at issue, and the cost of available tools to improve security and reduce vulnerabilities.” These standards should be designed to

  • identify and assess anticipated human and technical vulnerabilities to sensitive covered data;
  • take preventative and corrective action to address anticipated and known vulnerabilities to sensitive covered data; and
  • delete sensitive covered data after it is no longer needed for the purpose for which it was collected unless such retention is necessary to comply with a law.”

Theoretically, those covered entities processing and transferring sensitive covered data would need to implement more robust data security standards than covered entities just handling covered data.

The FTC may, but is not required to, promulgate regulations under the APA and must consult with the National Institute of Standards and Technology (NIST). However, the FTC must “issue guidance to covered entities on how to—

  • identify and assess vulnerabilities to sensitive covered data, including—
    • the potential for unauthorized access to sensitive covered data;
    • human and technical vulnerabilities in the covered entity’s collection or processing of sensitive covered data;
    • the management of access rights; and
    • the use of service providers to process sensitive covered data; and
  • take preventative and corrective action to address vulnerabilities to sensitive covered data.”

If the FTC chooses to skip regulations and instead issue guidance, covered entities might be wise to heed the FTC’s views in the latter document, but they would not be required to meet any articulated standards.

And yet, those covered entities in compliance with the “Financial Modernization Act of 1999” (P.L. 106-102) (aka Gramm-Leach-Bliley) and the “Health Insurance Portability and Accountability Act of 1996” (P.L. 104-191) (HIPAA), mainly financial services and healthcare entities respectively, would be deemed to be in compliance with the CDPA. However, this compliance would be only with respect to “information security requirements.”

Covered entities must also designate privacy officers and data security officers that “shall be responsible for, at a minimum…coordinating the covered entity’s policies and practices regarding the processing of covered data; and…facilitating the covered entity’s compliance with this Act.” Furthermore, “[a] covered entity shall maintain internal controls and reporting structures to ensure that appropriate senior management officials of the covered entity are involved in assessing risks and making decisions that implicate compliance with this Act.” Those entities in compliance with a range of federal privacy regimes regarding “data collection, processing, or transfer activities” under those statutes would be deemed to be in compliance but only with respect to “the data collection, processing, or transfer activities governed by such laws.”

In terms of enforcing the CDPA, the FTC would be able to seek civil penalties in the first instance, and common carriers and non-profits would be added to the universe of entities the FTC can police. Like COPRA, this bill would establish a “Data Privacy and Security Victims Relief Fund” in which the FTC shall deposit “any civil penalty obtained against any covered entity in any judicial or administrative action the Commission commences to enforce this Act or a regulation promulgated under this Act.” The FTC may use these funds “to provide redress, payments or compensation, or other monetary relief to individuals affected by an act or practice for which civil penalties have been imposed under this Act.”

State attorneys general may also bring actions to seek a range of remedies, including to enjoin conduct in violation of the CDPA and to “obtain damages, civil penalties, restitution, or other compensation on behalf of the residents of the State.” If two or more attorneys general file suit against the same covered entity for the same conduct, the cases would be combined in federal court in the District of Columbia. Moreover, the FTC may intervene in an action brought by a state attorney general, and if the FTC brings an action first, state attorneys general may not bring actions until the FTC’s action finishes.

The CDPA uses a concept from the Obama Administration’s “Consumer Privacy Bill of Rights Act of 2015:” the creation of voluntary codes that private entities may adhere to after the FTC has signed off on them. Accordingly, the FTC “may approve certification programs developed by 1 or more covered entities or associations representing categories of covered entities to create standards or codes of conduct regarding compliance with 1 or more provisions in this Act.” Consequently, “[a] covered entity that complies with a certification program approved by the Commission shall be deemed to be in compliance with the provisions of this Act addressed by such program.” However, “[a] covered entity that has certified compliance with an approved certification program and is found not to be in compliance with such program by the Commission shall be considered to be in violation of the section 5 of the Federal Trade Commission Act…prohibition on unfair or deceptive acts or practices.”

The CDPA would preempt state laws on privacy but not any such laws or provisions regarding data breach notification. The CDPA would take effect two years after enactment, allowing covered entities, the FTC, and others time to prepare for the new privacy standards.

The FTC would receive limited responsibility to address discriminatory data processing or transferring. Notably, if the agency receives credible evidence of possible violations of federal laws barring discrimination (e.g. the 1964 Civil Rights Act), it would not itself investigate and possibly bring an action. Rather, the FTC would transfer this information to federal or state regulators with explicit authority to regulate discrimination.

The FTC would need to use its current Section 6(b) authority to obtain information from entities to examine “the use of algorithms to process covered data in a manner that may violate Federal anti-discrimination laws.” The FTC would send out demands for information and entities must answer upon pain of potential penalties. The agency would need to publish a report on its findings within three years and then publish guidance “to assist covered entities in avoiding discriminatory use of algorithms.”

Additionally, within six months of enactment of the CDPA, the National Institute of Standards and Technology (NIST) “shall develop and publish a definition of ‘digital content forgery’ and accompanying explanatory materials” and no later than one year after NIST’s report, the FTC must “publish a report regarding the impact of digital content forgeries on individuals and competition.” The FTC must update the report at least every two years or more frequently if necessary.

The CDPA lifts a structure from the “California Consumer Privacy Act” (CCPA) (AB 375) in setting up a regime for data brokers to annually register with the FTC. The data broker would need to provide contact information and pay a $100 fee. Failure to register could result in a fine of $50 per day, capped at $10,000 per year. The FTC would then publish the registration information on its website.
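To make the registration arithmetic concrete, here is a minimal sketch; the bill specifies only the $100 fee, the $50-per-day fine, and the $10,000 annual cap, so the function name and the assumption that the fine accrues per day of non-registration are mine:

```python
# Sketch of the CDPA data broker registration arithmetic; assumes the $50
# fine accrues for each day a broker fails to register, capped per year.
ANNUAL_FEE = 100          # registration fee specified in the bill
DAILY_FINE = 50           # fine per day of non-registration
ANNUAL_FINE_CAP = 10_000  # statutory ceiling per year

def late_registration_fine(days_late: int) -> int:
    """Return the fine owed after `days_late` days of non-registration."""
    return min(days_late * DAILY_FINE, ANNUAL_FINE_CAP)

# A broker 30 days late would owe $1,500; at 200 days the cap kicks in.
assert late_registration_fine(30) == 1_500
assert late_registration_fine(200) == 10_000
```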

Privacy Bill A Week: Consumer Online Privacy Rights Act

Yesterday, we posted the political backdrop for the introduction of the “Consumer Online Privacy Rights Act” (COPRA). Today, let’s turn to the substance of the bill.

Under COPRA, the entities covered by the new requirements form a broad class, simply defined as those already subject to the FTC Act that “process[] or transfer[] covered data.” The bill also carves out sub-classes of entities that might otherwise be covered, some of which may not fall within the definition of covered entity at all.

Service providers are defined as covered entities that process or transfer covered data while performing a service on behalf of another covered entity. However, the definition is written to include only those activities undertaken at the behest of another covered entity and is explicit that the “term does not include a covered entity that processes or transfers the covered data outside of the direct relationship between the service provider and the covered entity.” Consequently, entities such as Verizon and Amazon would be deemed service providers only to the extent they are providing services like broadband internet and cloud services. Otherwise, they would be covered entities and subject to all the responsibilities the bill would place on them. Third parties are those that receive covered data from covered entities for processing or transfer and that are not service providers, affiliates, subsidiaries, or otherwise controlled by the covered entity.

Additionally, small businesses would be carved out of much of the bill; these are defined as entities that had $25 million or less in annual revenue for each of the preceding three years, processed the covered data of fewer than 100,000 individuals, and earned 50% or less of gross revenue from processing covered data. Non-profits and other discrete classes of entities would likewise be outside the confines of this bill (e.g. some of the activities in the privacy and data security spheres of telecommunications companies would still be regulated by the Federal Communications Commission).
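A minimal sketch of that three-part carve-out follows; the function name is mine, and treating each of the preceding three years as individually needing to come in at or under $25 million is an interpretation (the bill could also be read as an average):

```python
def is_copra_small_business(annual_revenues: list[float],
                            individuals_processed: int,
                            covered_data_revenue_share: float) -> bool:
    """Sketch of COPRA's small business carve-out; all three tests must pass.

    annual_revenues: revenue for each of the preceding three years (assumed
    each year must be <= $25 million); covered_data_revenue_share: fraction
    of gross revenue earned from processing covered data.
    """
    return (all(r <= 25_000_000 for r in annual_revenues)
            and individuals_processed < 100_000
            and covered_data_revenue_share <= 0.50)

# A firm with ~$10M in yearly revenue, 50,000 individuals' data, and 20% of
# revenue from covered data would escape much of the bill.
assert is_copra_small_business([10e6, 12e6, 9e6], 50_000, 0.20)
```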

“Covered data” is “information that identifies, or is linked or reasonably linkable to an individual or a consumer device, including derived data.” But this term excludes “de-identified data,” “employee data,” and “public records.” Turning to those terms, de-identified data is generally “information that cannot reasonably be used to infer information about, or otherwise be linked to, an individual, a household, or a device used by an individual or household.” However, before any such information may be deemed de-identified, in addition to ensuring the information cannot be linked to a person, device, or household and that inferences cannot reasonably be drawn, the entity must put in place reasonable measures to block re-identification, publicly commit not to re-identify the data, and process or transfer it only in a de-identified state. Moreover, any entity seeking to de-identify data must also obligate any other entities that receive the information to meet all of the aforementioned requirements.

Employee data are the information employers collect, process, and transfer solely related to a person’s employment, application for employment, emergency contacts, and administration of benefits. Public records are “information that is lawfully made available from Federal, State, or local government records provided that the covered entity processes and transfers such information in accordance with any restrictions or terms of use placed on the information by the relevant government entity.” This last definition may receive some scrutiny, for a number of Departments of Motor Vehicles are selling the personal information of people who hold driver’s licenses, so this could prove a significant loophole that may be exploited.

COPRA creates a subset of covered data, ‘‘sensitive covered data,’’ which includes the following categories (the list has been shortened):

  • A government-issued identifier, such as a Social Security number, passport number, or driver’s license number.
  • Any information that describes or reveals the past, present, or future physical health, mental health, disability, or diagnosis of an individual.
  • Biometric information.
  • Precise geolocation information that reveals the past or present actual physical location of an individual or device.
  • The content or metadata of an individual’s private communications or the identity of the parties to such communications unless the covered entity is an intended recipient of the communication.
  • Information revealing an individual’s race, ethnicity, national origin, religion, or union membership in a manner inconsistent with the individual’s reasonable expectation regarding disclosure of such information.
  • Information revealing the sexual orientation or sexual behavior of an individual in a manner inconsistent with the individual’s reasonable expectation regarding disclosure of such information.
  • Information revealing online activities over time and across third-party websites or online services.
  • Calendar information, address book information, phone or text logs, photos, or videos maintained on an individual’s device.
  • A photograph, film, video recording, or other similar medium that shows the naked or undergarment-clad private area of an individual.
  • Any other covered data processed or transferred for the purpose of identifying the above data types.
  • Any other covered data that the Commission determines to be sensitive covered data through a rulemaking pursuant to [the Administrative Procedure Act]

While we will not dive into all the categories of information considered sensitive covered data, one bears mention, for it sets COPRA apart from the only major privacy bill introduced in the House this year, the “Online Privacy Act of 2019” (H.R. 4978). In COPRA, both the content and metadata of private communications are given privileged status. The same is not true in the other bill, which protects only the contents of communications, with metadata subject to lesser standards.

A final definition to note: “affirmative express consent.” Since so many of a person’s rights under COPRA are linked to the provision of “affirmative express consent,” it bears a bit of investigation. First, the bill makes clear that consent cannot be inferred from a person’s actions or inaction, or even her continued use of a covered entity’s products and services. Consequently, only affirmative actions that clearly communicate agreement in response to a specific request meeting defined criteria will qualify. Namely, this request must stand alone, describe each act or practice for which consent is being requested, be expressed in easily understood language, and explain applicable rights. Any consent short of this would violate the Act, for any subsequent processing or transfer of covered data would then be contrary to a number of requirements.
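As a hypothetical sketch of how a covered entity might gate processing on this definition (the field names and the `explicit_agreement` signal are invented; the four criteria track the definition above):

```python
from dataclasses import dataclass

@dataclass
class ConsentRequest:
    # Hypothetical fields tracking COPRA's criteria for a valid request
    standalone: bool            # presented by itself, not bundled
    practices_described: bool   # each act or practice is described
    plain_language: bool        # expressed in easily understood language
    rights_explained: bool      # applicable rights are explained

def affirmative_express_consent(request: ConsentRequest,
                                user_action: str | None) -> bool:
    """Consent counts only if the request meets every criterion AND the user
    took an affirmative action; inaction or continued use never qualifies."""
    valid_request = (request.standalone and request.practices_described
                     and request.plain_language and request.rights_explained)
    return valid_request and user_action == "explicit_agreement"

# Continued use of the service (or silence) is not consent under the bill.
req = ConsentRequest(True, True, True, True)
assert not affirmative_express_consent(req, "continued_use")
assert not affirmative_express_consent(req, None)
assert affirmative_express_consent(req, "explicit_agreement")
```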

Covered entities would have a duty of loyalty. However, the bill is not explicit as to whom this duty is owed, but the context makes fairly clear that it is owed to the people whose covered data is collected, processed, and transferred. This duty has two parts: 1) a prohibition against engaging in deceptive or harmful data practices; and 2) a prohibition against processing or transferring covered data in any way that violates COPRA. The definition of what is deceptive is the same as those practices currently barred as deceptive under the FTC Act, but COPRA would institute a new definition of harmful that would considerably widen the scope of the FTC’s powers to punish illegal privacy or data security practices. Specifically, harmful data practices are “the processing or transfer of covered data in a manner that causes or is likely to cause any of the following:

  • Financial, physical, or reputational injury to an individual.
  • Physical or other offensive intrusion upon the solitude or seclusion of an individual or the individual’s private affairs or concerns, where such intrusion would be offensive to a reasonable person.
  • Other substantial injury to an individual.”

Obviously, the FTC will have views on how to construe some potentially harmful data practices that will ultimately be adjudicated upon by federal courts. For example, how would one define “reputational harm”? Likewise, what constitutes “[o]ther substantial injury” given that financial, physical, reputational, and broad privacy harms are already enumerated? Quite possibly, this language was included to provide the agency and courts with the flexibility to include new harms yet to be seen. As for the other component of the duty of loyalty, it is simply not to violate the myriad requirements of the Act, which provides a very broad means for the FTC and state attorneys general to pursue and prosecute violations.

People would be able to request and receive a human-readable version of their covered data a covered entity holds along with the names of all third parties with whom such information has been shared and why. Covered entities must make publicly available “a privacy policy that provides a detailed and accurate representation of the entity’s data processing and data transfer activities.” This policy must include

  • each category of data the covered entity collects and the processing purposes for which such data is collected
  • whether the covered entity transfers covered data and, if so—
    • each category of service provider and third party to which the covered entity transfers covered data and the purposes for which such data is transferred to such categories; and
    • the identity of each third party to which the covered entity transfers covered data and the purposes for which such data is transferred to such third party, except for transfers to governmental entities pursuant to a court order or law that prohibits the covered entity from disclosing such transfer;
  • how long covered data processed by the covered entity will be retained by the covered entity and a description of the covered entity’s data minimization policies;
  • how individuals can exercise their individual rights; and
  • a description of the covered entity’s data security policies

This is a fairly comprehensive list of information a consumer must be provided. Unless the FTC issues regulations or guidance directing covered entities to use a uniform format or keep this disclosure to a certain length, it is possible covered entities will favor longer, denser privacy policies in order to either obfuscate or discourage reading.

And, of course, any material changes to a covered entity’s privacy policy will require obtaining affirmative express consent from users.
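A sketch of the required disclosures as a data structure, with a naive check for when a change would trigger re-consent; the field names are mine, and treating any field change as “material” is an assumption, since the bill leaves that term to interpretation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyPolicy:
    """Sketch of COPRA's required privacy policy contents; names are invented."""
    categories_and_purposes: tuple   # each category collected and why
    transfer_disclosures: tuple      # recipient categories/identities and why
    retention_and_minimization: str  # how long data is kept; minimization policy
    rights_instructions: str         # how individuals exercise their rights
    security_description: str        # description of data security policies

def requires_reconsent(old: PrivacyPolicy, new: PrivacyPolicy) -> bool:
    """Naive rule: treat any change as material, which under the bill would
    require fresh affirmative express consent from users."""
    return old != new
```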

Another right granted by COPRA is that of deletion. Upon receiving a verified request from a person, a covered entity must delete the requested information and then also inform third parties and service providers of the deletion request. However, it is not clear that the latter two entities would be bound to honor the request and actually carry out the deletion. It may be necessary for the FTC’s regulations to require such language be inserted into contracts between covered entities and their service providers and third parties.

Likewise, an individual may ask that a covered entity correct any inaccuracies in the covered data they hold and process. Again, any such request would need to be verified and again the covered entity would need to inform third parties and service providers.

The bill creates a right of data portability in that covered entities must honor verified requests from people and provide them with both human-readable and machine-readable copies of their covered data. COPRA also establishes a right to object to and opt out of transfers of covered data to third parties, and the FTC would need to conduct a rulemaking to establish the procedures one may use to effect this right. The bill lists the features this final rule must have, including requirements for clear and conspicuous opt-out notices, easy-to-use mechanisms, and a centralized process so a person will not need to repeatedly opt out of a covered entity’s transfers.

Furthermore, covered entities may neither process nor transfer a person’s sensitive covered data without “prior, affirmative express consent” and must “provide an individual with a consumer-friendly means to withdraw affirmative express consent to process the sensitive covered data of the individual.” However, covered entities do not need prior, affirmative express consent to process or transfer publicly available information. Considering that these passages are in the same section of the bill, the drafters are clearly contemplating that sensitive covered data may be available from public sources. For example, as mentioned earlier, some DMVs are selling the personal information of drivers, making available some information that would likely be considered sensitive covered data, which could then be processed and transferred without the consent of the person to whom it pertains.

Covered entities must limit their data processing and transfers to what is necessary, proportionate, and limited. This right to data minimization would task covered entities with doing only the bare minimum “to carry out the specific processing purposes and transfers described in the privacy policy made available by the covered entity as required” unless they have affirmative express consent for other processing or transfers. This right would be abridged by the exceptions discussed below.

Cantwell has long expressed her view that privacy legislation should include data security requirements, and so COPRA does. Covered entities must “establish, implement, and maintain reasonable data security practices to protect the confidentiality, integrity, and accessibility of covered data…appropriate to the volume and nature of the covered data at issue.” This provision spells out further requirements, including the need to conduct vulnerability assessments to turn up reasonably foreseeable threats, develop and implement a process to address any such vulnerabilities, destroy or delete any covered data that is no longer needed or for which affirmative express consent to hold has not been provided, and train the covered entity’s employees to properly handle and safeguard covered data. The FTC would need to issue training guidelines to assist covered entities, and even though this provision does not specifically task the agency with promulgating regulations, COPRA provides the FTC with a broad grant of authority to promulgate regulations under the Administrative Procedure Act.

Next, the bill turns to the civil rights granted to individuals residing in the U.S. regarding data privacy, many of which address practices the Obama Administration called digital redlining. Covered entities are barred from processing or transferring covered data on the basis of real, or perceived, classes, including but not limited to race, national origin, ethnicity, gender, and sexual orientation, for a variety of defined purposes. Broadly speaking, the purposes for processing and transferring covered data using protected classes pertain to differentiating opportunities for employment, education, housing, and credit on the basis of different classes. An example of a practice that would be barred is the conduct alleged by the Department of Housing and Urban Development: that Facebook allowed people placing ads on the social platform to target certain racial groups and exclude others. This bar on discriminatory treatment would also be applied to public accommodations writ large, meaning any services or products offered generally to the public. Consequently, covered data could not be used by covered entities to discriminate against women, for example, by offering men a lower price for the same service. Additionally, “[a] covered entity may request advice from the Commission concerning the covered entity’s potential compliance with this subsection, in accordance with the Commission’s rules of practice on advisory opinions.”

These civil rights are extended to algorithmic decision making. Covered entities using algorithmic decision making in processing or transferring covered data in the same contexts must perform impact assessments annually, keep them on file, and make them available to the FTC upon request. Presumably, the FTC could use these impact assessments as evidence, if warranted, in finding that a covered entity has violated the Act through discriminatory actions flowing from such decision making. In any event, the FTC would be required to publish a report “examining the use of algorithms” for decision making in this context within 3 years of enactment and then every 3 years thereafter.

COPRA would bar people from waiving certain of their rights under any circumstances and would allow other rights to be waived only under circumscribed circumstances. Those rights that cannot be waived are the duty of loyalty covered entities owe to people, data portability, data minimization, data security, and the various civil rights. And yet, the rights of access, transparency, deletion, and correction of inaccuracies may be waived if three conditions are present:

  • “there exists a direct relationship between the individual and the covered entity initiated by the individual;
  • the provision of the service or product requested by the individual requires the processing or transferring of the specific covered data of the individual and the covered data is strictly necessary to provide the service or product; and
  • an individual provides affirmative express consent to such specific limitations.”

Of course, in the latter category, covered entities that believe all three conditions are at work will prompt or perhaps even require people to waive those rights. And, it is all but certain that covered entities will seek to expand as much as possible the concept of what “is strictly necessary to provide the service or product.” Consequently, should the provision of a service such as FaceTime require the processing and/or transfer of covered data, then Apple would need to obtain affirmative, express consent and only after an individual initiates the relationship. However, would covered entities be able to advertise or spam people with offers for their services and products in exchange for waivers? Also, it will undoubtedly be a point of contention as to what processing and transferring of covered data is necessary for certain services and products to be provided. Presumably, a company like Google could make the case that its provision of free email through Gmail is financed through the harvesting and sharing of data and without this, it is not viable. It seems to me the FTC will need to weigh in on the contours of what constitutes “strictly necessary” in terms of seeking waivers from these rights.

Of course, the exercise of a number of these rights hinges on verifying that the person making the request is who he claims to be (i.e. the rights to access, transparency, deletion, correction, and portability). Covered entities would be able to deny people the exercise of these rights if they cannot reasonably verify the identity of the requester, which seems on its face a reasonable step to avoid allowing people to make mischief with others’ data and accounts. Covered entities must request additional information to verify a person’s identity in cases of uncertainty. In any event, covered entities must minimize burdens and cannot charge for these requests.

And yet, there are circumstances that would allow covered entities to deny these requests:

  • if complying with the request would be demonstrably impossible;
  • complying with the request would prevent the covered entity from carrying out internal audits, performing accounting functions, processing refunds, or fulfilling warranty claims, provided that the covered data that is the subject of the request is not processed or transferred for any purpose other than such specific activities;
  • the request is made to correct or delete publicly available information, and then only to the extent the data is publicly available information;
  • complying with the request would impair the publication of newsworthy information of legitimate public concern to the public by a covered entity, or the processing or transfer of information by a covered entity for such purpose;
  • complying with the request would impair the privacy of another individual or the rights of another to exercise free speech; or
  • the covered entity processes or will process the data subject to the request for a specific purpose described in [provisions detailing when express affirmative consent is not needed], and complying with the request would prevent the covered entity from using such data for such specific purpose

However, covered entities may also deny these requests if they reasonably believe they would interfere with a contract between the covered entity and another individual.

COPRA also stipulates that “[t]he rights and remedies provided for in this section shall not be waived by any policy form or condition of employment, including by a predispute arbitration agreement.” Moreover, “[n]o predispute arbitration agreement shall be valid or enforceable if the agreement requires arbitration of a dispute.”

As noted earlier, covered entities may process or transfer covered data without the affirmative express consent of a person “provided that the processing or transfer is reasonably necessary, proportionate, and limited to such purpose:

  • To complete a transaction or fulfill an order or service specifically requested by an individual, such as billing, shipping, or accounting.
  • To perform system maintenance, debug systems, or repair errors to ensure the functionality of a product or service provided by the covered entity.
  • To detect or respond to a security incident, provide a secure environment, or maintain the safety of a product or service.
  • To protect against malicious, deceptive, fraudulent or illegal activity.
  • To comply with a legal obligation or the establishment, exercise, or defense of legal claims.
  • To prevent an individual from suffering harm where the covered entity believes in good faith that the individual is in danger of suffering death or serious physical injury.
  • To effectuate a product recall pursuant to Federal or State law.
  • To conduct scientific, historical, or statistical research in the public interest that adheres to all other applicable ethics and privacy laws and is approved, monitored, and governed by an institutional review board or a similar oversight entity that meets standards promulgated by [the FTC in an APA rulemaking.]

The FTC and state attorneys general will need to closely monitor the use of these exceptions by covered entities, for the inclination of regulated entities is to push the limits of legal or excepted behavior. Consequently, regulators will need to review the use of these exceptions lest one or more become the exception that ate the federal privacy statute.

The FTC will need to promulgate regulations “identifying privacy protective requirements for the processing of biometric information” for two of the above exceptions to the requirement for affirmative express consent: to detect or respond to a security incident, provide a secure environment, or maintain the safety of a product or service, or to protect against malicious, deceptive, fraudulent or illegal activity. This section further details the requirements of such a rulemaking.

The bill carves out “the publication of newsworthy information of legitimate public concern to the public by a covered entity, or to the processing or transfer of information by a covered entity for that purpose.”

COPRA would exempt those covered entities subject to other federal privacy and data security statutes such as the “Financial Services Modernization Act of 1999” (aka Gramm-Leach-Bliley) and “Health Insurance Portability and Accountability Act of 1996” (HIPAA) to a certain degree. There are provisions making clear that entities in compliance with the named federal regimes shall be deemed to be in compliance with the privacy and data security requirements of COPRA “with respect to data subject to the requirements of such regulations, part, title, or Act.” This would suggest that for data that falls outside those regimes (e.g. biometric data and geolocation data are not subject to Gramm-Leach-Bliley), any covered entities would need to meet the privacy and data security requirements of COPRA in addition to their existing responsibilities. The FTC must issue guidance describing the implementation of this section within one year.

COPRA would add compliance responsibilities for “large data holders,” those covered entities that process or transfer the covered data of 5 million or more individuals per year or process or transfer the sensitive covered data of 100,000 or more individuals in a year. These entities would need to annually certify compliance with the Act after a review of their internal procedures and processes for compliance. The CEO, chief privacy officer, and chief data security officer must sign this certification. This language is obviously aimed at the largest data collectors and processors and is intended to make CEOs aware of and responsible for privacy and data security practices, so they could not claim they were ignorant of problems that turn up.
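Expressed as a simple predicate (a sketch; the function name is mine, and I assume either threshold alone suffices, which is how the disjunctive “or” reads):

```python
def is_large_data_holder(covered_individuals_per_year: int,
                         sensitive_individuals_per_year: int) -> bool:
    """Sketch of COPRA's 'large data holder' trigger: either threshold suffices."""
    return (covered_individuals_per_year >= 5_000_000
            or sensitive_individuals_per_year >= 100_000)

# A firm holding sensitive covered data on 200,000 people would owe the
# annual certification even if its overall covered data footprint is small.
assert is_large_data_holder(1_000_000, 200_000)
assert not is_large_data_holder(1_000_000, 50_000)
```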

However, all covered entities must designate both chief privacy and chief data security officers who “shall be responsible for, at a minimum—

  • implementing a comprehensive written data privacy program and data security program to safeguard the privacy and security of covered data throughout the life cycle of development and operational practices of the covered entity’s products or services;
  • annually conducting privacy and data security risk assessments, data hygiene, and other quality control practices; and
  • facilitating the covered entity’s ongoing compliance with this Act.”

COPRA spells out the responsibilities of service providers and third parties. Service providers may only process covered data in accordance with the wishes of the covered entity from whom it received the information or to comply with a legal obligation. Service providers may not transfer covered data “without the affirmative express consent… of the individual to whom the service provider data is linked or reasonably linkable.” Additionally, service providers must delete or de-identify covered data once they have completed their services for a covered entity. Third parties may not “process third party data for a purpose that is inconsistent with the expectations of a reasonable individual” and “may reasonably rely on representations made by the covered entity that transferred third party data regarding the expectation of a reasonable individual, provided the third party conducts reasonable due diligence on the representations of the covered entity and finds those representations to be credible.” Service providers and third parties would be exempted from some of the rights people would be given under COPRA (e.g. the right of access.)

Covered entities must exercise reasonable due diligence regarding service providers and third parties:

  • in selecting a service provider and conduct reasonable oversight of its service providers to ensure compliance with the applicable requirements of this section; and
  • in deciding to transfer covered data to a third party, and conduct oversight of third parties to which it transfers data to ensure compliance with the applicable requirements of this subsection.

The bill has provisions to protect and encourage whistleblowers in coming forward to uncover illegal privacy and data security practices. Additionally, the National Institute of Standards and Technology “shall publish a report regarding digital content forgeries,” an area of increasing concern for policymakers as deep fakes become more and more prevalent and lifelike.

With respect to enforcement, the FTC would receive broad authority to draft regulations and guidance to effectuate COPRA. The FTC and state attorneys general could bring actions under this bill. They could seek civil penalties of $42,530 per violation in the first instance and all the other relief that can currently be sought, such as equitable remedies including rescission, disgorgement, and injunctions. All of this is fairly anodyne, and even Republicans have come to accept what they long resisted earlier in the decade when data security legislation was debated: state attorneys general getting on the field and the FTC having authority to seek fines for first offenses. However, what many stakeholders may be relying on is that the FTC and state attorneys general are only capable of bringing so many actions, and there may well be conduct that goes unpunished that is quite possibly at odds with COPRA.

Additionally, the FTC must “establish a new Bureau within the Commission comparable in structure, size, organization, and authority to the existing Bureaus with the Commission related to consumer protection and competition” within two years of enactment. However, this bill does not specifically authorize extra appropriations for this purpose, instead including language authorizing those sums necessary to implement the Act. Without additional funds to set up and resource this new Bureau, this may be a hollow grant of authority that the FTC could honor only by cannibalizing its current operations. However, an account titled the “Data Privacy and Security Relief Fund” would be established to collect all civil awards won by the FTC, primarily to make whole those consumers harmed by covered entities.

As noted, individuals could sue for violations in any competent federal or state court and could win the greater of actual damages and $100 to $1,000 per violation, plus punitive damages and attorney’s fees. This is the most expansive such right in a major privacy bill released this year and may be seen as the linchpin of enforcement efforts: if state attorneys general and the FTC are only able to police a small set of violations, then people and their attorneys, through class actions, may be able to enforce the statute, and many companies may emphasize compliance in order to avoid a huge settlement. And yet, giving plaintiffs’ attorneys another means by which they can sue corporations is anathema to Republicans. Therefore, it will be an uphill battle for any private right of action to survive in a final privacy and data security bill passed by the Senate and sent to the White House.
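The arithmetic behind that settlement pressure is straightforward; here is a minimal sketch (the function name is mine, and where a court would set the per-violation amount within the $100–$1,000 band is an open question):

```python
def copra_statutory_recovery(actual_damages: float, violations: int,
                             per_violation: float = 1_000.0) -> float:
    """Sketch of COPRA's private right of action: the greater of actual
    damages and $100-$1,000 per violation; punitive damages and attorney's
    fees would come on top of this figure."""
    assert 100.0 <= per_violation <= 1_000.0
    return max(actual_damages, violations * per_violation)

# A class of 1,000,000 users with modest actual damages could still seek up
# to $1 billion in statutory damages, which is why this provision reads as
# the potential linchpin of enforcement.
print(copra_statutory_recovery(50_000.0, 1_000_000))  # 1000000000.0
```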

Senate Commerce Democrats Unveil Privacy Bill

The last bill we examined on privacy and data security was Representatives Anna Eshoo (D-CA) and Zoe Lofgren’s (D-CA) “Online Privacy Act of 2019” (H.R. 4978), a long, comprehensive bill that has little chance of being enacted as is. Another such bill has been introduced by Senate Democratic stakeholders that takes a comprehensive approach by marrying privacy and data security requirements. Senate Commerce Committee Ranking Member Maria Cantwell (D-WA) and three other Democrats on the committee, Brian Schatz (D-HI), Ed Markey (D-MA), and Amy Klobuchar (D-MN), have released the “Consumer Online Privacy Rights Act” (COPRA). This bill would empower the Federal Trade Commission (FTC) to police privacy and data security violations through augmented authority, not preempt state laws to the extent they provide greater protection, largely leave in place existing federal privacy statutes such as the “Financial Services Modernization Act of 1999” (aka Gramm-Leach-Bliley) and “Health Insurance Portability and Accountability Act of 1996” (HIPAA), and allow individuals to sue. Of course, many of these approaches are contrary to the publicly espoused positions of numerous Republican and industry stakeholders. The sponsors released a one-page summary and a short report titled “The State of Online Privacy and Data Security.”

COPRA was released ahead of the Senate Commerce, Science, and Transportation Committee’s December 4 hearing “Examining Legislative Proposals to Protect Consumer Data Privacy,” suggesting that Democrats wanted to define their positions on privacy and data security issues while also highlighting that the majority party in the Senate has failed to release a bill. It is unclear, however, if this bill signals that Cantwell’s ongoing talks with Chair Roger Wicker (R-MS) have stalled. Cantwell and Wicker have been in discussions since the summer on a privacy bill after it appeared the efforts undertaken by an ad hoc committee working group had not borne fruit. Nonetheless, Wicker stated that “[t]he legislation released today reflects where the Democrats want to go…[b]ut any privacy bill will need bipartisan support to become law.” He added that “I am committed to continuing to work with the ranking member and my colleagues on both sides of the aisle to get a bill that can get across the finish line…[and] I expect that we will have a bill to discuss at next week’s hearing.”

It merits mention that Senator Richard Blumenthal (D-CT), the ranking member of the Manufacturing, Trade, and Consumer Protection Subcommittee, is not a cosponsor. Blumenthal has long called for both privacy and data security legislation and has often pressed federal agencies to better protect consumers. He has been working with the chair of the subcommittee, Senator Jerry Moran (R-KS), on a privacy bill, yet despite more than a year of work, no text has been released.

It also bears mention that the sponsorship of COPRA suggests that Senate Democrats are coalescing around a single position, whereas its members had previously taken a number of different approaches. The bill came shortly after Cantwell and the top Democrats on three other committees released their principles for privacy legislation (See here for more analysis), signaling agreement on the broad outlines of such legislation. The other three ranking members were Patty Murray (D-WA) (Senate HELP), Dianne Feinstein (D-CA) (Senate Judiciary), and Sherrod Brown (D-OH) (Senate Banking). This agreement on principles brokered by Senate Minority Leader Chuck Schumer (D-NY) may smooth some of the jurisdictional battles that have traditionally dogged attempts to address data security or cybersecurity.

Schatz, the ranking member on the Communications, Technology, Innovation and the Internet Subcommittee, led the drafting and introduction of the “Data Care Act” (S. 3744) in the last Congress. This bill would extend the concept of fiduciary responsibility currently binding on health care professionals and attorneys with respect to patients’ and clients’ information to “online service providers” such as Facebook, Google, Apple, etc. (See here for more extensive analysis.) Likewise, Senator Ed Markey (D-MA) introduced the “Privacy Bill of Rights Act” (S. 1214), which was the only bill to get an A in the first draft of the Electronic Privacy Information Center’s report on privacy bills. (See here for more analysis.) Finally, Klobuchar had cosponsored the “Data Care Act” and had also released a narrower bill with a Republican cosponsor, the “Social Media Privacy Protection and Consumer Rights Act of 2019” (S. 189), which would require major tech companies to give consumers an opportunity to opt in or opt out of the company’s data usage practices after offering enhanced notice of the practices for which the personal data may be used. (See here for more analysis.)

Senate Democrats Release Privacy Principles

The ranking members of four Senate Committees have released their principles for any privacy legislation, many of which are likely to be rejected by Republicans and many industry stakeholders (e.g. no preemption of the “California Consumer Privacy Act” (AB 375) and a private right of action for consumers).

Nonetheless, Senators Maria Cantwell (D-WA), Dianne Feinstein (D-CA), Patty Murray (D-WA), and Sherrod Brown (D-OH) agreed to these principles, and reportedly Senate Minority Leader Chuck Schumer (D-NY) convened and facilitated the effort, which has come ahead of the release of any of the privacy bills that have been under development this year in the Senate.

Of course, the Senate Commerce, Science, and Transportation Committee had convened an informal working group late last year consisting of Cantwell, Chair Roger Wicker (R-MS) and Senators John Thune (R-SD), Jerry Moran (R-KS), Brian Schatz (D-HI), and Richard Blumenthal (D-CT) to hash out a privacy bill. However, like most other such efforts, the timeline for releasing bill text has been repeatedly pushed back even after Wicker and Cantwell tried working by themselves on a bill late in the summer. Additionally, Moran and Blumenthal, the chair and ranking member of the Manufacturing, Trade, and Consumer Protection Subcommittee, have been working on a bill for some time as well but without a timeline for releasing text.

And, the efforts at this committee are in parallel to those in other committees. Senate Judiciary Chair Lindsey Graham (R-SC) has gotten his committee onto the field with hearings on the subject and has articulated his aim to play a role in crafting a bill. Likewise, the Senate Banking Committee has held hearings and is looking to participate in the process as well. But, like Senate Commerce, no bills have been released.

Of course, it is easier to write out one’s principles than to draft legislation. And yet, the release of these desired policies elegantly puts down a marker for Senate Democrats at a time when the majority in the chamber is struggling to coalesce and release a privacy bill. The move also demonstrates cohesion among the top Democrats on four of the committees with a slice of jurisdiction over privacy and data security issues: Commerce, Banking, HELP, and Judiciary.

A Privacy Bill A Week: Online Privacy Act of 2019

Last week, we dived into the last White House effort on privacy, the discussion draft of the “Consumer Privacy Bill of Rights Act of 2015” released by the Obama Administration. This bill was released in conjunction with a report on privacy issues and then proceeded to go nowhere, as there was scant appetite on Capitol Hill to legislate on privacy. Let us flash forward to the present, where privacy has moved to the fore, and the first of the long-anticipated privacy bills has been released.

Representatives Anna Eshoo (D-CA) and Zoe Lofgren (D-CA) unveiled the “Online Privacy Act of 2019” (H.R. 4978), which they started working on earlier this year when it seemed clear that the House Energy and Commerce Committee’s effort to craft a bill had stalled as Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky’s (D-CA) timeline for when a bill might be unveiled continued to slip. It must be said that this bill is going to be a non-starter with Republicans in the Senate and White House, not least because it gives consumers a private right of action, creates a new federal agency to police data security, and does not preempt state statutes. Moreover, given Eshoo’s close political relationship with Speaker Nancy Pelosi (D-CA), this bill may be viewed in two contexts: 1) Pelosi may approve of the substance of the bill; and 2) in not trying to dissuade Eshoo and Lofgren, the Speaker may have intended to prod House Energy and Commerce to produce a bill. Finally, it bears note that Eshoo challenged current House Energy and Commerce Chair Frank Pallone Jr. when former Representative Henry Waxman (D-CA) stepped down as the top Democrat, and relations between the two reportedly remain affected by the bruising contest. In any event, it is a comprehensive bill that takes a number of new approaches to some aspects of privacy and data security, and Eshoo and Lofgren also released a one-page summary and a section-by-section summary.

Big picture, the bill would create a new agency to oversee a new privacy and security regime, the United States Digital Privacy Agency (DPA), meaning that, unlike virtually every other privacy bill, the Federal Trade Commission (FTC) would not be the primary enforcer. However, there may still be a role for the FTC to play, as discussed below. The bill unites privacy with data security, which has been a policy preference of a number of high-profile Democrats, including Schakowsky and Senate Commerce, Science, and Transportation Committee Ranking Member Maria Cantwell (D-WA). Republicans have been lukewarm on this notion, however. Moreover, express, affirmative consent would generally be needed before most businesses could collect, process, maintain, or disclose a person’s personal information, subject to a number of exceptions. Businesses would need to state clearly and concisely their privacy and security policies, be responsive to people exercising their rights vis-à-vis their data, and closely supervise the service providers and third parties with whom personal information is disclosed.

As always, it is crucial to digest the key definitions, for these will inform the scope of the Act. The entities covered by the “Online Privacy Act of 2019” are a broad group, spanning most businesses in the U.S.: “a person who…intentionally collects, processes, or…maintains personal information; and…sends or receives such personal information over the internet or a similar communications network.” There are two crucial exemptions: 1) people not engaged in commercial activities and those engaging in commercial activities considered “de minimis;” and 2) small businesses, which are defined as entities not selling personal information, earning less than 50% of revenue from processing personal information for targeted or behavioral advertising, not having held the personal data of 250,000 or more people in the last six months, having 200 or fewer employees, and earning $25 million or less in gross revenue in the preceding year. If a small business’s status changes and it crosses the threshold into being a covered entity, then there is a nine-month grace period before it must begin complying with the Act.

Besides covered entities, two other classes of entities figure prominently in the bill: “service providers” and “third-parties.” A “service provider” is a “covered entity” that generally “processes, discloses, or maintains personal information, where such person does not process, disclose, or maintain the personal information other than in accordance with the directions and on behalf of another covered entity.” A third-party is “a person…to whom such covered entity disclosed personal information; and…is not…such covered entity…a subsidiary or corporate affiliate of such covered entity…or…a service provider of such covered entity.” Consequently, almost all disclosures of personal information made by a covered entity would likely be to either a service provider or a third party, the latter of which can be a covered entity itself.

The bill defines “data breach” in fairly standard terms as “unauthorized access to or acquisition of personal information or contents of communications maintained by such covered entity.” This term has evolved over the last decade to include mere access as opposed to exfiltration or acquisition. The Act coins a new term to cover some possible privacy violations: “data sharing abuse.” This means “processing, by a third party, of personal information or contents of communications disclosed by a covered entity to the third party, for any purpose other than—

  • a purpose specified by the covered entity to the third party at the time of disclosure; or
  • a purpose to which the individual to whom the information relates has consented.”

Personal information is simply and very comprehensively defined as “any information maintained by a covered entity that is linked or reasonably linkable to a specific individual or a specific device, including de-identified personal information and the means to behavioral personalization created for or linked to a specific individual.” Moreover, “personal information” does not include “publicly available information related to an individual” or “information derived or inferred from personal information, if the derived or inferred information is not linked or reasonably linkable to a specific individual.” Under this definition, is there an inadvertent loophole created whereby information not maintained by a covered entity is not personal information for purposes of this Act, and therefore, such information would be beyond many of the requirements of the Act?

The bill uses the definition of “contents” of a communication from the “Electronic Communications Privacy Act” (ECPA) (P.L. 99-508) that is used for wiretapping and electronic surveillance, among other purposes, which shows the intent to allow the legal structure for government surveillance to coexist frictionlessly alongside the new privacy regime. However, most metadata, which includes the call detail records currently being debated regarding reauthorization of National Security Agency authority, would be covered at a lesser level by this Act, meaning private sector entities could collect, process, maintain, and disclose metadata.

De-identified information is generally data “that cannot reasonably identify, relate to, describe, reference, be capable of being associated with, or be linked, directly or indirectly, to a particular individual or device.” However, this definition stipulates further conditions that must be met: “provided that a business that uses de-identified information—

  • has de-identified the personal information using best practices for the types of data the information contains;
  • has implemented technical safeguards that prohibit re-identification of the individual with whom the information was linked;
  • has implemented business processes that specifically prohibit re-identification of the information;
  • has implemented business processes to prevent inadvertent release of de-identified information; and
  • makes no attempt to re-identify the information.”

This language is going in the right direction, for de-identification of personal information will likely need to be permanent or as close to permanent as possible to forestall the temptation some entities will invariably face to re-identify old personal information and derive value from it.
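Because every condition must hold, the test is a simple conjunction; a minimal sketch (the parameter names are mine) follows:

```python
def qualifies_as_deidentified(best_practices_used: bool,
                              technical_safeguards: bool,
                              reidentification_banned_by_policy: bool,
                              release_controls_in_place: bool,
                              no_reidentification_attempts: bool) -> bool:
    """Sketch of the Online Privacy Act's de-identification test: assuming
    the data itself is not reasonably linkable, every listed business
    practice must also be in place; failing any one means the data remains
    'personal information' under the Act."""
    return all([best_practices_used, technical_safeguards,
                reidentification_banned_by_policy, release_controls_in_place,
                no_reidentification_attempts])
```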

The “Online Privacy Act of 2019” spells out what constitutes a “privacy harm” and a “significant privacy harm,” two key definitions in helping covered entities gauge the sensitivity of certain information and their legal obligations in handling such information. “Privacy harm” is “adverse consequences or potential adverse consequences to an individual or society arising from the collection, processing, maintenance, or disclosure of personal information.” Such harms are identified in the definition and worth quoting in full:

  • direct or indirect financial loss or economic harm;
  • physical harm;
  • psychological harm, including anxiety, embarrassment, fear, and other demonstrable mental trauma;
  • adverse outcomes or decisions with respect to the eligibility of an individual for rights, benefits, or privileges in employment (including hiring, firing, promotion, demotion, and compensation), credit and insurance (including denial of an application or obtaining less favorable terms), housing, education, professional certification, or the provision of health care and related services;
  • stigmatization or reputational harm;
  • price discrimination;
  • other adverse consequences that affect the private life of an individual, including private family matters and actions and communications within the home of such individual or a similar physical, online, or digital location where such individual has a reasonable expectation that personal information will not be collected, processed, or retained;
  • chilling of free expression or action of an individual, group of individuals, or society generally, due to perceived or actual pervasive and excessive collection, processing, disclosure, or maintenance of personal information by a covered entity;
  • impairing the autonomy of an individual, group of individuals, or society generally; and
  • other adverse consequences or potential adverse consequences, consistent with the provisions of this Act, as determined by the Director

This list of privacy harms is as expansive as, and perhaps even more expansive than, that in almost any other bill analyzed. Additionally, this list is not comprehensive, as the DPA may add other harms.

A related, crucial definition is that of “significant privacy harm” which is “adverse consequences to an individual arising from the collection, processing, maintenance, or disclosure of personal information, limited” to three specific privacy harms:

  • direct or indirect financial loss or economic harm;
  • physical harm; and
  • adverse outcomes or decisions with respect to the eligibility of an individual for rights, benefits, or privileges in employment (including hiring, firing, promotion, demotion, and compensation), credit and insurance (including denial of an application or obtaining less favorable terms), housing, education, professional certification, or the provision of health care and related services.”

A term related to these two is “protected class,” which is “the actual or perceived race, color, ethnicity, national origin, religion, sex (including sexual orientation and gender identity), familial status, or disability of an individual or group of individuals.”

The Act intends to create a safe harbor for many of the notice and consent requirements that will encourage greater use of encryption or similar methods that would make personal information and the contents of communications very hard to access. To this end, the bill defines a term “privacy preserving computing” as “the collecting, processing, disclosing, or maintaining of personal information that has been encrypted or otherwise rendered unintelligible using a means that cannot be reversed by a covered entity, or a covered entity’s service provider,” subject to further requirements. Additionally, the DPA “may determine that a methodology of privacy preserving computing is insufficient for the purposes of this definition,” so covered entities and service providers would not be free to deem any security measure “privacy preserving computing.”

The Act also makes clear that a covered entity’s sharing of personal information with a third party for any sort of remuneration will be a sale or selling, and hence the definition is “the disclosure of personal information for monetary consideration by a covered entity to a third party for the purposes of processing, maintaining or disclosing such personal information at the third party’s discretion.”

Now, let’s turn to the substance of the Act. The bill makes clear that no one may waive its requirements, and any contracts or instruments to do so are null and void. Additionally, no one may agree to any pre-dispute arbitration under this bill, meaning that no person will be forced to accept mandatory arbitration as is often the case when one agrees to the terms of service for an application or to use a device.

The “Online Privacy Act of 2019” would take effect one year after enactment, but there is curious language making clear the effective date does not “affect[] the authority to take an action expressly required by a provision of this Act to be taken before the effective date.”

The Act carves out journalism from the privacy and security requirements of the bill to the extent an organization like The New York Times is engaged in bona fide journalism as opposed to other commercial activities such as selling photographs, the latter of which may qualify such an entity as a covered entity. There is a definition of journalism, and one wonders if companies like Facebook or Google will try to get some of their activities exempted on the basis that they qualify as journalism.

The Act adds a section to the federal criminal code on extortion and threats, titled “Disclosure of personal information with the intent to cause harm,” that makes a criminal offense of the actual or attempted disclosure of personal information to threaten, intimidate, or harass another person in order to commit or incite an act of violence. It is also a criminal offense to do so if a person is placed in reasonable fear of death or serious bodily injury. Violators would face fines and prison sentences of up to five years in addition to any state liability. This would seem to cover a number of crimes ushered in by the digital age: doxing, revenge porn, and making public a person’s private information in order to intimidate.

Title I of the “Online Privacy Act of 2019” would provide individuals with a number of rights regarding their personal information and how it may and may not be used by covered entities. First, people would receive a right of access, which entails each covered entity making available a reasonable mechanism by which a person may find out the categories of personal information and contents of communications being held and those obtained from third parties. Moreover, this information must also identify all the third parties, subsidiaries, and affiliates to whom personal information has been disclosed. Also, individuals must be able to easily access a clear and concise description of the commercial and business purposes for which the covered entity collects, maintains, processes, and discloses personal information. Finally, covered entities must provide a list of all automated decision-making processes they employ and those a person may ask that a human being perform instead. Covered entities may sidestep a number of these requirements by making this information publicly available in a conspicuous location on their websites, obviating the need for people to make requests.

Individuals would also get a right of correction, allowing for the use of a reasonable mechanism to dispute the accuracy and completeness of personal information being held by a covered entity, but only if this personal information is processed in such fashion as to “increase reasonably foreseeable significant privacy harms.” This language suggests that data processing that would result in mere privacy harms, say those that would impact one’s personal, familial communications, would not need to be corrected or completed. In any event, covered entities have the option to correct or complete as requested, tell the requester the information is complete or correct, respond that insufficient information does not allow for the correction or completion, or deny the request on the basis of exemptions discussed below. Small businesses are exempted from this responsibility. Of course, what ultimately is determined to be a significant privacy harm will be the result of case-by-case adjudication by the new DPA, likely in court.

People could ask that their personal information be deleted, including those data acquired from third parties or inferred by the covered entity. Again, on the basis of Section 109 exemptions, this request could be denied.

Individuals would receive a right of portability, and in order to effectuate this right, the DPA must annually publish in the Federal Register the categories of online services and products that are determined to be portable. However, before a final list is published, the DPA must release an initial list of portable services and products and accept comments. Once it has been established which services and products are portable, covered entities must allow individuals to request and receive their personal information and/or contents of communications for purposes of taking their business to a competitor. There is also language that contemplates one covered entity directly transmitting this information to another at a person’s request.

Upon request, covered entities must have humans make decisions instead of an “automated processing of personal information of an individual, if such processing increases reasonably foreseeable significant privacy harms for such individual.”

Before a covered entity may engage in behavior personalization, it must obtain express, affirmative consent from a person to collect, process, maintain, or disclose personal information for this purpose. Behavior personalization is a term defined in the Act and “means the processing of an individual’s personal information, using an algorithm, model, or other means built using that individual’s personal information collected over a period of time, or an aggregate of the personal information of one or more similarly situated individuals and designed to—

  • alter, influence, guide, or predict an individual’s behavior;
  • tailor or personalize a product or service; or
  • filter, sort, limit, promote, display or otherwise differentiate between specific content or categories of content that would otherwise be accessible to the individual.”

This right seems squarely aimed at the use of a person’s data to show them advertising based on their browsing history, searches, location, occupation, and the huge volumes of other data collected daily. Moreover, if a person denies such consent, then the product or service must be provided without the behavior personalization unless this is infeasible, at which point only the core service or product need be provided. And, if it is infeasible to provide core services or products, then a covered entity may deny the product or service altogether. It is likely covered entities will seek to define “infeasible” as broadly as possible in order to leverage consent for their products and services and continue the lucrative practice of personalized advertising.

A person would also get the right to be informed, which requires any covered entity that begins collecting personal information about a person with whom it has no direct relationship to inform that person in writing within 30 days.

There would be established a right to impermanence that would limit the holding of a person’s personal information to no more time than she consented to. Covered entities must obtain affirmative, express consent from people to maintain categories of personal information either until the original purpose for collection is completed or until a certain date. And yet, there is an exemption for implied consent when long-term maintenance of personal information is an obvious, core feature of a product or service and these data are maintained only to provide the product or service.

As mentioned, Section 109 details the exemptions that may allow a covered entity to disregard the rights bestowed on people under the “Online Privacy Act of 2019,” which include:

  • Detecting, responding to, or preventing security incidents or threats.
  • Protecting against malicious, deceptive, fraudulent, or illegal activity.
  • Complying with specific law enforcement requests or court orders.
  • Protecting a legally recognized privilege or other legal right.
  • Protecting public safety.
  • Collection, processing, or maintenance by an employer pursuant to an employer-employee relationship of records about employees or employment status, except—
    • where the information would not be reasonably expected to be collected in the context of an employee’s regular duties; or
    • where the information was disclosed to the employer by a third party.
  • Preventing prospective abuses of a service by an individual whose account has been previously terminated.
  • Routing a communication through a communications network or resolving the location of a host or client on a communications network.
  • Providing transparency in advertising or origination of user generated content.

However, the covered entity will need to have “technical safeguards and business processes that limit the collection, processing, maintaining, or disclosure of such personal information” to the aforementioned purposes.

This section also details the reasons why a covered entity may decline a request made pursuant to one of the rights listed in Title I:

  • A requester’s identity cannot be confirmed;
  • The request would create a legitimate risk to the privacy, security, safety, or other rights of another person;
  • The request would create a legitimate risk to free expression; or
  • In regard to requests to complete or delete information, fulfilling the request would stop a transaction or process set into motion at a person’s request but not yet completed, or would undermine the integrity of a legally significant transaction.

Service providers are altogether exempted from Title I, and covered entities employing privacy preserving computing are exempted from certain rights of people: the right of access, the right to human review of automated decisions, and the right to individual autonomy. However, this exemption applies only to the data processing performed with privacy preserving computing.

Covered entities must reply to requests within 30 days and may not normally charge a fee for fulfilling requests; however, if it is determined that the requests are excessive or unfounded, a covered entity may charge a fee subject to DPA approval.

Title II of the “Online Privacy Act of 2019” details the requirements placed on covered entities, service providers, and third parties.

Covered entities must have a reasonable, articulable basis for collecting, processing, maintaining, and disclosing personal information related to the reasonable business needs of the entity. Additionally, a covered entity must keep no more personal information than is necessary to effectuate its business or commercial purposes, and these needs are to be balanced against privacy intrusions, possible privacy harms, and the reasonable expectations of the people whose information is in question. Likewise, covered entities should not collect more personal information than is necessary to carry out their business purposes, nor should they hold these data longer than necessary. However, covered entities may engage in “ancillary” collection, processing, maintenance, and disclosure of personal information in certain circumstances subject to certain requirements. For example, if these activities are substantially similar to the original ones, the same type of personal information is being collected, and no privacy harms would result, then notice and consent are not required. However, notice is required for ancillary activities if:

  • The ancillary activities the covered entity is engaged in are similar to the original activities and there is a risk of privacy harm;
  • The ancillary activities are not substantially similar and there is no risk of privacy harms; or
  • The activities are substantially similar and would result in privacy harm, but privacy preserving computing is used.

Consequently, notice and consent would be required for any other ancillary activities that do not fall into those categories.

Covered entities would also need to limit the access of employees and contractors to personal information and the contents of communications on the basis of an articulable rationale that balances reasonable business needs, the potential for privacy harm, and the reasonable expectations of individuals. Moreover, covered entities must maintain records on all access.

There is a requirement that covered entities cannot collect any personal information unless they are in compliance with the Act; notably, this requirement does not extend to processing or maintaining personal information.

The disclosure of personal information by covered entities to third parties is limited to situations in which a person consents. And any such consent is valid only after a person has been notified of all the categories of third parties to whom the personal information may be disclosed, the personal information to be shared, and the business purposes for doing so. Sales of personal information would be more severely constrained: each sale to a third party by a covered entity must be agreed to by the person. What’s more, a covered entity must disclose the parameters of the original purpose for the collection of the information when it sells the information to a third party. Regarding privacy preserving computing and de-identified personal information, disclosure does not require consent for either designation, but consent is always required for sales of personal information.

There are provisions designed to sweep into U.S. jurisdiction players in the data ecosystem that are outside the country. The bill bars covered entities from disclosing personal information to entities not subject to U.S. jurisdiction or not in compliance with the Act. However, a safe harbor is created under which covered entities and non-U.S. entities could do business, largely premised on the latter being willing to comply with the Act, having the cash available to pay fines for violations, and evincing a willingness to be subject to DPA enforcement. The non-U.S. entity also needs to sign an agreement with the DPA. This section, however, makes clear it is not seeking to create a data localization requirement in the U.S. or to restrict a covered entity’s internal disclosures, so that Microsoft, say, could continue shuttling personal data around the globe to its servers without running afoul of this section.

Covered entities are barred from re-identifying de-identified information unless allowed by one of the Section 109 exemptions, and this prohibition attaches to third parties that may have the de-identified information. However, “qualified research entities” are not covered by this restriction, and it would be up to the DPA to determine who may be considered one.

A covered entity’s ability to collect, process, maintain, or disclose the contents of communications would be limited to those situations where there is a security incident or threat, where the processing is expressly requested by one of the parties to the communication, or for other specified purposes. There is an exception for publicly available communications, and covered entities cannot stop people using their services or products from encrypting their communications. There is a safe harbor for service providers acting at the direction of a covered entity with a reasonable belief that the directions comply with the Act.

Covered entities could not process personal information in a way that impinges on a person’s opportunities on the basis of a protected class in education, employment, housing, credit, healthcare, finance, and a range of other areas. The same is true of public accommodations. Moreover, the DPA is required to promulgate regulations to effectuate this section.

The use of genetic information would be severely limited; more or less, these types of data would be available only for medical testing, and even then subject to restrictions.

The DPA will establish a minimum threshold for the percentage of people who must be able to read and understand a notice used for purposes of consent or a privacy policy, which covered entities would need to meet or exceed before their notices or privacy policies would be allowed to be used. The DPA will establish a procedure to vet the data submitted by covered entities to show compliance with this requirement. Moreover, the DPA will make available the notices and privacy policies of all covered entities. All covered entities must make available reasonable mechanisms for people to revoke consent. And, not surprisingly, deceptive notices and privacy policies are barred.

Pursuant to these DPA-approved notices, covered entities must provide clear and concise notice of the personal information being collected, maintained, processed, or disclosed. Additionally, covered entities may not collect, process, maintain, or disclose personal information without consent if doing so creates or increases the risk of foreseeable privacy harms. However, consent will be implied if the personal information activities of an entity are obvious on their face and notice is provided. Privacy preserving computing, meanwhile, would be exempt from the notice and consent requirements.

Covered entities shall, of course, have privacy policies regarding their personal information activities, including a general description of their practices, an explanation as to how individuals may exercise their Title I rights, the categories of personal information collected, the business or commercial purposes for which such data will be used, and other required disclosures.

Information security would be a part of the new regime covered entities must comply with. Consequently, covered entities must design and establish an information security system to protect personal information based on the sensitivity of the data and the types of activities in which the covered entity is engaged. The information security system must include

  • A written security policy
  • A means of identifying, assessing, and mitigating security vulnerabilities
  • A process for disposing of personal information securely
  • A process for overseeing those with access to personal information; and
  • A plan or protocol to respond to data breaches or data sharing abuses

In the event a data breach or data sharing abuse occurs, covered entities must report it to the DPA within 72 hours of discovery unless the event is unlikely to create or increase foreseeable privacy harms. Any notification made after this 72-hour window must be accompanied by the reasons for the delay. Additionally, a covered entity must alert other covered entities from whom it received personal information, and people must be notified if there is a risk of increased privacy harms.

Title III details the DPA’s structure and powers. The DPA would be headed by a Director appointed by the President and confirmed by the Senate, and the Director could appoint a Deputy Director. The Director would serve a five-year term, and the bill is silent on how many terms a Director may serve. The agency would receive broad powers to set itself up and to promulgate regulations for its operations or to regulate entities under its jurisdiction. The DPA must consult with other federal agencies and state agencies in policing privacy and security. Finally, the agency would have appropriations of $550 million per year for the next five years authorized, but the Appropriations Committees would have to actually make these funds available annually in an appropriations bill.

Title IV lays out the enforcement of the Act. The DPA could enforce the Act in two separate ways, much like the FTC’s current means of enforcement. It could initiate an internal, administrative process that would result in a cease and desist order, allowing any defendant in this action the opportunity to challenge the agency at an agency hearing, appeal from what would presumably be an administrative law judge’s decision to the full agency, and then appeal to a U.S. Circuit Court of Appeals. Or, the DPA could file a complaint in a U.S. District Court and litigate against a defendant. In either case, the agency could seek civil penalties of up to $42,530 per person, and this number could get high depending on the number of people involved. For example, in the Facebook/Cambridge Analytica case where more than 87 million people were affected, if the DPA sought the maximum civil fine for each person, the potential liability would be more than $3.7 trillion. It is important to note that civil penalties are calculated per person and not per violation, for the latter method could yield even larger numbers, as it is easy to contemplate multiple violations per person. However, the factors a court must consider under the Act when meting out a fine would weigh against such a gigantic, company-crushing penalty.
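For a rough sense of where that ceiling comes from, here is a back-of-the-envelope calculation, assuming (hypothetically) that the DPA counted exactly one maximum penalty per affected person:

$$87{,}000{,}000 \times \$42{,}530 = \$3{,}700{,}110{,}000{,}000 \approx \$3.7\ \text{trillion}$$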

In enforcing the Act, the DPA must coordinate with other federal regulators, which means multiple, overlapping jurisdictions are the likely future landscape if this bill is enacted. These agencies may refer cases to the DPA for prosecution, and yet, should one federal agency initiate an action for a privacy or security violation, the DPA may not also bring an action. Moreover, the Act requires the DPA to execute an agreement with the FTC to coordinate enforcement. State attorneys general may bring actions under this Act, but only if the DPA is not doing so, and a state needs to provide notice to the DPA before proceeding.

As noted, a private right of action is available for people to allege violations of the Act. However, a class action seeking civil damages could be brought only by a non-profit and not by plaintiffs’ attorneys, suggesting a class action for injunctive relief may be brought by a plaintiffs’ attorney. There is also a provision allowing a whistleblower to bring an action after first allowing the DPA the option to litigate. If the DPA accepts and prevails, the whistleblower would be entitled to 15% of the award, but if the whistleblower litigates the case, she may be entitled to between 25% and 50% of the award.

Aside from civil penalties, the DPA or a state attorney general may seek a range of equitable relief:

  • Rescission or reformation of contracts;
  • Refund of moneys;
  • Restitution;
  • Disgorgement or compensation for unjust enrichment;
  • Payment of damages or other monetary relief;
  • Public notification regarding the violation, including the costs of notification; and
  • Limits on the activities or functions of the person.

Additionally, the DPA or state attorneys general may also seek to recover all the costs of prosecuting the case.