Another Federal Privacy Bill

Senate Commerce Republicans revise and release a privacy bill that does not budge on the main issues separating them from their Democratic colleagues.

Last week, in advance of tomorrow’s hearing on privacy legislation, the chair and key Republicans released a revised version of the draft legislation they put out last year to mark their position on what United States (U.S.) federal privacy regulation should be. Notably, last year’s draft and the updated version would still preempt state laws like the “California Consumer Privacy Act” (CCPA) (AB 375), and people in the U.S. would not be given the right to sue entities that violate the privacy law. These two issues continue to be the main battle lines between Democratic and Republican bills to establish a U.S. privacy law. Given the rapidly dwindling days left in the 116th Congress and the possibility of a Democratic White House and Senate next year, this may be both a last-gasp effort to get a bill out of the Senate and a way to lay down a marker for next year.

The “Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act” (S.4626) was introduced by Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS), Senate Majority Whip and Communications, Technology, Innovation, and the Internet Subcommittee Chair John Thune (R-SD), Transportation and Safety Subcommittee Chair Deb Fischer (R-NE), and Senator Marsha Blackburn (R-TN). However, a notable Republican stakeholder is not a cosponsor: Consumer Protection Subcommittee Chair Jerry Moran (R-KS), who introduced his own bill, the “Consumer Data Privacy and Security Act of 2020” (S.3456) (See here for analysis).

As mentioned, Wicker had put out for comment a discussion draft, the “Consumer Data Privacy Act of 2019” (CDPA) (See here for analysis), in November 2019, shortly after the committee’s Ranking Member, Senator Maria Cantwell (D-WA), and other Democrats had introduced their privacy bill, the “Consumer Online Privacy Rights Act“ (COPRA) (S.2968) (See here for more analysis). Here is how I summarized the differences at the time: in the main, CDPA shares the same framework as COPRA with some key, significant differences, including:

  • COPRA expands the FTC’s jurisdiction in policing privacy harms whereas CDPA would not
  • COPRA places a duty of loyalty on covered entities to people whose covered data they process or transfer; CDPA does not have any such duty
  • CDPA does not allow people to sue if covered entities violate the new federal privacy and security regime; COPRA would allow such suits to move forward
  • CDPA preempts state privacy and data security laws; COPRA would establish a federal floor that states like California would be able to legislate on top of
  • CDPA would take effect two years after enactment, and COPRA would take effect six months after enactment
  • The bar against a person waiving her privacy rights under COPRA is much broader than under CDPA
  • COPRA would empower the FTC to punish discriminatory data processing and transfers; CDPA would require the FTC to refer these offenses to the appropriate federal and state agencies
  • CDPA revives a concept from the Obama Administration’s 2015 data privacy bill by allowing organizations and entities to create standards or codes of conduct and allowing those entities in compliance with these standards or codes to be deemed in compliance with CDPA subject to FTC oversight; COPRA does not have any such provision
  • COPRA would require covered entities to conduct algorithmic decision-making impact assessments to determine if these processes are fair, accurate, and unbiased; no such requirement is found in CDPA

As a threshold matter, the SAFE DATA Act is the latest in a line of enhanced notice and consent bills founded on the logic that if people were informed and able to make choices about how and when their data are used, then the U.S. would have an ideal data and privacy ecosystem. This view, perhaps coincidentally, dovetails with Republican views on other issues where people should merely be given information and the power to choose, with any bad outcomes being the responsibility of those who made poor choices. This view runs counter to those who see privacy and data security as akin to environmental or pollution problems, that is, beyond the ability of any one person to manage or realistically change.

Turning to the bill before us, we see that while covered entities may not outright deny services and products to people who choose to exercise the rights granted under the bill vis-à-vis their covered data, a covered entity may charge different prices. This structure would predictably mean that only those who can afford it, or who are passionately committed to their privacy, would be able to pay for more privacy. And yet, the rights established by the bill for people to exercise some control over their private information cannot be waived, forestalling the possibility that some covered entities would make such a waiver a term of service, as many companies do with a person’s right to sue.

Covered entities must publish privacy policies before or at the point of data collection, including:

  • The identity of the entity in charge of processing and using the covered data
  • The categories of covered data collected and the processing purposes of each category
  • Whether transfers of covered data occur, the categories of those receiving such data, and the purposes for which transfers occur
  • The entity’s data retention and data security policies generally; and
  • How individuals may exercise their rights.

Any material change means a new privacy policy must be provided to people, and consent must again be obtained before collection and processing may resume.

There is, however, language not seen in other privacy bills: “[w]here the ownership of an individual’s device is transferred directly from one individual to another individual, a covered entity may satisfy its obligation to disclose a privacy policy prior to or at the point of collection of covered data by making the privacy policy available under (a)(2)” (i.e., by posting on the entity’s website). So, if I give an old phone to a friend, a covered entity may merrily continue collecting and processing data because I consented, and my friend’s consent is immaterial. Admittedly, transferred devices would seem to be a subset of all devices used in the U.S., but it does not seem a stretch to require covered entities to obtain consent once they determine a different person has taken over a device. After all, they will almost certainly be able to discern that the device is being used by someone new, and often the identity of the new user.
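What might such detection look like? A minimal, hypothetical sketch follows; the signals, thresholds, and function names are all my own invention, not anything in the bill, and a real covered entity would tune this to its own telemetry.

```python
# Hypothetical sketch: re-prompting for consent when a device likely has a new
# user. The signals and thresholds below are illustrative, not from the bill.

from dataclasses import dataclass

@dataclass
class DeviceSignals:
    factory_reset_detected: bool   # the OS reported a reset since the last session
    new_account_logged_in: bool    # credentials differ from the prior user's
    contact_graph_overlap: float   # 0.0-1.0 similarity with the prior user's contacts

def likely_new_user(signals: DeviceSignals) -> bool:
    """Heuristic guess that the device has changed hands."""
    if signals.factory_reset_detected or signals.new_account_logged_in:
        return True
    return signals.contact_graph_overlap < 0.2  # arbitrary illustrative cutoff

def on_session_start(signals: DeviceSignals, prompt_consent) -> None:
    # Under the bill as drafted, posting the policy online would suffice; this
    # shows the stricter behavior the paragraph above argues is feasible.
    if likely_new_user(signals):
        prompt_consent()  # re-display the privacy policy and obtain fresh consent
```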

Section 103 of the SAFE DATA Act establishes a U.S. resident’s rights to access, correct, delete, and port covered data. People would be able to access their covered data and correct “material” inaccuracies or incomplete information at least twice a year at no cost, provided the covered entity can verify their identity. Included with the right to access would be provision of the categories of third parties to whom covered data has been transferred and a list of the categories of purposes. There is a long list of reasons why covered entities would not need to comply, including but not limited to:

  • If the covered entity must “retain any covered data for the sole purpose of fulfilling the request;”
  • If it would “be impossible or demonstrably impracticable to comply with;”
  • If a request would “require the covered entity to combine, relink, or otherwise reidentify covered data that has been deidentified;”
  • If it would “result in the release of trade secrets, or other proprietary or confidential data or business practices;”
  • If it would “interfere with law enforcement, judicial proceedings, investigations, or reasonable efforts to guard against, detect, or investigate malicious or unlawful activity, or enforce contracts;”
  • If it would “require disproportionate effort, taking into consideration available technology, or would not be reasonably feasible on technical grounds;”
  • If it would “compromise the privacy, security, or other rights of the covered data of another individual;”
  • If it would “be excessive or abusive to another individual;” or
  • If it would “violate Federal or State law or the rights and freedoms of another individual, including under the Constitution of the United States.”

This extensive list will give companies not interested in complying plenty of reasons to proffer as to why they will not provide access or correction. Nonetheless, the FTC would need to draft and implement regulations “establishing requirements for covered entities with respect to the verification of requests to exercise rights” to access and correct. Perhaps the agency will be able to address some foreseeable problems with the statute as written.

Explicit consent is needed before a covered entity may transfer or process the “sensitive covered data” of a person. The first gloss on this right is that a person’s consent is not needed to collect, process, and transfer the “covered data” of a person. Elsewhere in the section, it is clear that one has a limited opt-out right: “a covered entity shall provide an individual with the ability to opt out of the collection, processing, or transfer of such individual’s covered data before such collection, processing, or transfer occurs.”

Nonetheless, a bit of a detour back into the definitions section of the bill is in order to understand which types of data lie on which side of the consent line. “Covered data” are “information that identifies or is linked or reasonably linkable to an individual or a device that is linked or reasonably linkable to an individual” except for publicly available data, employment data, aggregated data, and de-identified data. Parenthetically, I would note the latter two exceptions would seem to be incentives for companies to hold personal information in aggregated or de-identified form as much as possible so as to avoid triggering the requirements of the SAFE DATA Act.
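To make that incentive concrete, here is a toy illustration, with made-up field names, of how individual-level records (covered data) can be reduced to aggregates that would seem to fall outside the definition. Real de-identification requires far more care, since small counts can still re-identify people.

```python
# Toy illustration of the aggregation incentive: individual-level records are
# "covered data," but the same information held only as aggregate counts would
# seem to fall outside the Act's definition. Field names are hypothetical.

from collections import Counter

individual_records = [
    {"user_id": "u1", "zip": "20001", "purchased": "laptop battery"},
    {"user_id": "u2", "zip": "20001", "purchased": "laptop battery"},
    {"user_id": "u3", "zip": "20002", "purchased": "phone case"},
]

def aggregate(records):
    """Drop identifiers and keep only counts per (zip, product) pair."""
    return Counter((r["zip"], r["purchased"]) for r in records)

# The aggregate keeps commercial value but no longer identifies anyone, though
# genuinely small counts can still re-identify and need suppression in practice.
print(aggregate(individual_records))
# Counter({('20001', 'laptop battery'): 2, ('20002', 'phone case'): 1})
```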

“Sensitive covered data” would be any of the following (and, my apologies, the list is long):

  • A unique, government-issued identifier, such as a Social Security number, passport number, or driver’s license number, that is not required to be displayed to the public.
  • Any covered data that describes or reveals the diagnosis or treatment of the past, present, or future physical health, mental health, or disability of an individual.
  • A financial account number, debit card number, credit card number, or any required security or access code, password, or credentials allowing access to any such account.
  • Covered data that is biometric information.
  • A persistent identifier.
  • Precise geolocation information (defined elsewhere as anything within 1,750 feet; a sketch after this list shows what coarsening data past that threshold might look like).
  • The contents of an individual’s private communications, such as emails, texts, direct messages, or mail, or the identity of the parties subject to such communications, unless the covered entity is the intended recipient of the communication (meaning metadata is fair game, and it can be incredibly valuable; just ask the National Security Agency)
  • Account log-in credentials such as a user name or email address, in combination with a password or security question and answer that would permit access to an online account.
  • Covered data revealing an individual’s racial or ethnic origin, or religion in a manner inconsistent with the individual’s reasonable expectation regarding the processing or transfer of such information. (of course, this sort of qualifying language always makes me ask: according to whose definition of “reasonable expectation”?)
  • Covered data revealing the sexual orientation or sexual behavior of an individual in a manner inconsistent with the individual’s reasonable expectation regarding the processing or transfer of such information. (See the previous clause)
  • Covered data about the online activities of an individual that addresses or reveals a category of covered data described in another subparagraph of this paragraph. (I suppose this is intended as a backstop against covered entities trying to backdoor their way into using sensitive covered data by claiming it is merely covered data from online activities.)
  • Covered data that is calendar information, address book information, phone or text logs, photos, or videos maintained for private use on an individual’s device.
  • Any covered data collected or processed by a covered entity for the purpose of identifying covered data described in another paragraph of this paragraph. (again, this seems aimed at plugging a possible loophole in that ordinary covered data can probably be processed or combined with other covered data to arrive at some categories of “sensitive covered data.”)
  • Any other category of covered data designated by the Commission pursuant to a rulemaking under section 553 of title 5, United States Code (meaning the FTC can use normal rulemaking authority, and not the shackles of the Magnuson-Moss rulemaking procedures, to expand this definition as needed).
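On the geolocation threshold, here is a rough, hypothetical sketch of how an entity might coarsen coordinates so they arguably no longer qualify as “precise geolocation information.” The grid-snapping approach and the safety margin are my assumptions, not anything prescribed by the bill.

```python
# Hypothetical sketch: snapping coordinates to a grid coarse enough that the
# reported point arguably cannot locate a person within 1,750 feet. The method
# and margin are my assumptions; the bill only supplies the threshold.

import math

FEET_PER_DEGREE_LAT = 364_320  # roughly 69 miles per degree of latitude

def coarsen(lat: float, lon: float, min_uncertainty_ft: float = 1750.0):
    """Return grid-snapped coordinates; a cell is twice the required radius wide."""
    cell_lat = (2 * min_uncertainty_ft) / FEET_PER_DEGREE_LAT
    # Degrees of longitude shrink with latitude, so widen that cell accordingly.
    cell_lon = cell_lat / math.cos(math.radians(lat))
    snap = lambda value, cell: round(value / cell) * cell
    return snap(lat, cell_lat), snap(lon, cell_lon)

# e.g. a downtown Washington, D.C. fix blurred onto a ~3,500-foot grid:
print(coarsen(38.8977, -77.0365))
```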

So, we have a subset of covered data that would be subject to consent requirements, including notice with a “clear description of the processing purpose for which the sensitive covered data will be processed;” that “clearly identif[ies] any processing purpose that is necessary to fulfill a request made by the individual;” that “include[s] a prominent heading that would enable a reasonable individual to easily identify the processing purpose for which consent is sought;” and that “clearly explain[s] the individual’s right to provide or withhold consent.”

Finally, the FTC may, but does not have to, “establish requirements for covered entities regarding clear and conspicuous procedures for allowing individuals to provide or withdraw affirmative express consent for the collection of sensitive covered data.” If the agency chooses to do so, it may use the normal notice and comment procedures available to virtually every other U.S. agency.

Covered entities must minimize collection, processing, and retention of covered data to “what is reasonably necessary, proportionate, and limited” except if permitted elsewhere in the SAFE DATA Act or another federal statute. Interestingly, the FTC would not be tasked with conducting a rulemaking but would instead need to issue guidelines with best practices on how covered entities would undertake such minimization.

Service providers must follow the direction of the covered entity with whom they are working and delete or deidentify data after they have finished work upon it. Third parties are limited in processing covered data to only those purposes consistent with the reasonable expectations of the individual to whom the data belong. However, third parties do not need to obtain consent to process sensitive covered data or covered data. Covered entities must perform due diligence to ensure that service providers and third parties will comply with the requirements particular to these two classes of entities, but there is no obligation beyond due diligence and no suggestion of liability for the misdeeds and violations of service providers and third parties.

Large data holders would need to conduct periodic privacy impact analyses with an eye toward helping these entities improve their privacy policies. This class of covered entities comprises those that have processed or transferred the covered data of 8 million or more people in a given year or the sensitive covered data of 300,000 people.

The SAFE DATA Act would generally allow covered entities to collect, process, and transfer the covered data of people without their consent so long as these activities are reasonably necessary, proportionate, and limited to the following purposes:

  • To initiate or complete a transaction or to fulfill an order or provide a service specifically requested by an individual, including associated routine administrative activities such as billing, shipping, financial reporting, and accounting.
  • To perform internal system maintenance, diagnostics, product or service management, inventory management, and network management.
  • To prevent, detect, or respond to a security incident or trespassing, provide a secure environment, or maintain the safety and security of a product, service, or individual.
  • To protect against malicious, deceptive, fraudulent, or illegal activity.
  • To comply with a legal obligation or the establishment, exercise, analysis, or defense of legal claims or rights, or as required or specifically authorized by law.
  • To comply with a civil, criminal, or regulatory inquiry, investigation, subpoena, or summons by an Executive agency.
  • To cooperate with an Executive agency or a law enforcement official acting under the authority of an Executive or State agency concerning conduct or activity that the Executive agency or law enforcement official reasonably and in good faith believes may violate Federal, State, or local law, or pose a threat to public safety or national security.
  • To address risks to the safety of an individual or group of individuals, or to ensure customer safety, including by authenticating individuals in order to provide access to large venues open to the public.
  • To effectuate a product recall pursuant to Federal or State law.

People would not be able to opt out of the collection, processing, and transfer of covered data for these purposes. As mentioned earlier, U.S. residents would receive a limited right to opt out, and it is in Section 108 that one learns the things a person cannot opt out of. I suppose it should go without saying that covered entities will interpret these terms as broadly as they can in order to forestall people from opting out. The performance of “internal system maintenance, diagnostics, product or service management, inventory management, and network management” seems like a potentially elastic category that some covered entities could assert to give themselves cover.

Speaking of exceptions, small businesses would not need to heed the rights of individuals regarding their covered data, would not need to minimize their collection, processing, and transfer of covered data, and would not need to have data privacy and security officers. These are defined as entities that have gross annual revenues below $50 million, have processed the covered data of fewer than 1 million people, have fewer than 500 employees, and earn less than 50% of their revenue from transferring covered data. On its face, this seems like a very generous definition of a small business. A rough sketch of both size classifications appears below.
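For concreteness, here is a minimal sketch of the two size classifications as I read them: the small business carve-out and the large data holder thresholds from a few paragraphs above. The structure and field names are mine; I assume all four small business criteria must be met and that the 300,000 figure is a floor.

```python
# Minimal sketch of the bill's two size classifications as described above.
# The dataclass and field names are mine; the thresholds come from the post.

from dataclasses import dataclass

@dataclass
class Entity:
    gross_annual_revenue: float          # dollars per year
    covered_data_individuals: int        # people whose covered data was processed in a year
    sensitive_data_individuals: int      # people whose sensitive covered data was processed
    employees: int
    revenue_share_from_transfers: float  # 0.0-1.0

def is_small_business(e: Entity) -> bool:
    # Assumes all four criteria must be met, as the definition reads.
    return (e.gross_annual_revenue < 50_000_000
            and e.covered_data_individuals < 1_000_000
            and e.employees < 500
            and e.revenue_share_from_transfers < 0.5)

def is_large_data_holder(e: Entity) -> bool:
    # Either trigger suffices; treating 300,000 as a floor is my assumption.
    return (e.covered_data_individuals >= 8_000_000
            or e.sensitive_data_individuals >= 300_000)
```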

The FTC would not be able to police processing and transferring of covered data that violates discrimination laws. Instead, the agency would need to refer these matters to the agencies of jurisdiction. The FTC would be required to use its 6(b) authority to “examin[e] the use of algorithms to process covered data in a manner that may violate Federal anti-discrimination laws” and then publish a report on its findings along with guidance on how covered entities can avoid violating discrimination laws.

Moreover, the National Institute of Standards and Technology (NIST) must “develop and publish a definition of ‘digital content forgery’ and accompanying explanatory materials.” One year afterwards, the FTC must “publish a report regarding the impact of digital content forgeries on individuals and competition.”

Data brokers would need to register with the FTC, which would then publish a registry of data brokers on its website.

There would be additional duties placed on covered entities. For example, these entities must “establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity of covered data.” However, financial services companies subject to and in compliance with Gramm-Leach-Bliley regulations would be deemed to be in compliance with these data security obligations. The same would be true of entities subject to and in compliance with the “Health Insurance Portability and Accountability Act” and “Health Information Technology for Economic and Clinical Health Act.” Additionally, the FTC may “issue regulations to identify processes for receiving and assessing information regarding vulnerabilities to covered data that are reported to the covered entity.”

The SAFE DATA Act has language new to federal privacy bills on “opaque algorithms.” Specifically, covered internet platforms would not be able to use opaque algorithms unless notice is provided to users and an input-transparent algorithm version is available to users. The term ‘‘covered internet platform’’ is broad and encompasses “any public-facing website, internet application, or mobile application, including a social network site, video sharing service, search engine, or content aggregation service.” An “opaque algorithm” is “an algorithmic ranking system that determines the order or manner that information is furnished to a user on a covered internet platform based, in whole or part, on user-specific data that was not expressly provided by the user to the platform for such purpose.”
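To illustrate the distinction the bill draws, here is a hedged, hypothetical sketch of an “input-transparent” ranking next to an “opaque” one. The bill defines the categories, not any implementation, so everything below is my own toy example.

```python
# Hypothetical toy rankers contrasting the bill's two categories. Nothing here
# is from the bill itself, which defines the categories but no implementation.

def input_transparent_rank(posts, followed_accounts):
    """Ordering based only on data the user expressly provided for the purpose:
    reverse-chronological posts from accounts the user chose to follow."""
    return sorted((p for p in posts if p["author"] in followed_accounts),
                  key=lambda p: p["timestamp"], reverse=True)

def opaque_rank(posts, inferred_interest_scores):
    """An "opaque algorithm" in the bill's sense: ordering driven by
    user-specific data (inferred interests) never supplied for ranking."""
    return sorted(posts,
                  key=lambda p: inferred_interest_scores.get(p["topic"], 0.0),
                  reverse=True)

# As described, a platform relying on opaque_rank would have to give notice and
# also make something like input_transparent_rank available to users.
```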

The bill makes it an unfair and deceptive practice for “large online operator[s]” “to design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.”

A covered entity must have

  • 1 or more qualified employees or contractors as data privacy officers; and
  • 1 or more qualified employees or contractors…as data security officers.

Moreover, “[a] covered entity shall maintain internal controls and reporting structures to ensure that appropriate senior management officials of the covered entity are involved in assessing risks and making decisions that implicate compliance with this Act.”

There are also provisions protecting whistleblowers inside covered entities that “voluntarily provide[] [“original information”] to the [FTC]…relating to non-compliance with, or any violation or alleged violation of, this Act or any regulation promulgated under this Act.”

As under virtually all the other bills, the FTC would be able to levy civil fines of more than $42,000 per violation, and state attorneys general would also be able to enforce the new privacy regime. However, the FTC would be able to intervene and take over an action if it chose, and if two or more state attorneys general bring cases regarding the same violations, the cases would be consolidated and heard in the federal court in the District of Columbia. The FTC would also get jurisdiction over common carriers and non-profits for purposes of enforcing the SAFE DATA Act.

And then there is new language in the SAFE DATA Act that seems aimed at addressing a pair of cases before the Supreme Court on the extent of the FTC’s power to seek and obtain certain monetary damages and equitable relief. The FTC has appealed an adverse ruling from the U.S. Court of Appeals for the Seventh Circuit while the other case is coming from the U.S. Court of Appeals for the Ninth Circuit.

As in the forerunner bill released last November, the FTC would be empowered to “approve voluntary consensus standards or certification programs that covered entities may use to comply with 1 or more provisions in this Act.” These provisions revive an Obama Administration privacy bill’s approach of allowing the development and use of voluntary consensus-based standards that covered entities could follow in lieu of the provisions of the bill itself.

The SAFE DATA Act would not impinge on existing federal privacy laws but would preempt all privacy laws at the state level. Ironically, the bill would not preempt state data breach notification laws. One would think that if uniformity across the U.S. were a driving motivation, preempting those as well would be desirable.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (19 August)

Coming Events

  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” By 21 August, the FTC “is seeking comment on a range of issues including:
    • How are companies currently implementing data portability? What are the different contexts in which data portability has been implemented?
    • What have been the benefits and costs of data portability? What are the benefits and costs of achieving data portability through regulation?
    • To what extent has data portability increased or decreased competition?
    • Are there research studies, surveys, or other information on the impact of data portability on consumer autonomy and trust?
    • Does data portability work better in some contexts than others (e.g., banking, health, social media)? Does it work better for particular types of information over others (e.g., information the consumer provides to the business vs. all information the business has about the consumer, information about the consumer alone vs. information that implicates others such as photos of multiple people, comment threads)?
    • Who should be responsible for the security of personal data in transit between businesses? Should there be data security standards for transmitting personal data between businesses? Who should develop these standards?
    • How do companies verify the identity of the requesting consumer before transmitting their information to another company?
    • How can interoperability among services best be achieved? What are the costs of interoperability? Who should be responsible for achieving interoperability?
    • What lessons and best practices can be learned from the implementation of the data portability requirements in the GDPR and CCPA? Has the implementation of these requirements affected competition and, if so, in what ways?”
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The United States (U.S.) Department of Commerce tightened its chokehold on Huawei’s access to United States’ semiconductors and chipsets vital to its equipment and services. This rule follows a May rule that significantly closed off Huawei’s access to the point that many analysts are projecting the People’s Republic of China company will run out of these crucial technologies sometime next year without a suitable substitute, meaning the company may not be able to sell its smartphone and other leading products. In its press release, the department asserted the new rule “further restricts Huawei from obtaining foreign made chips developed or produced from U.S. software or technology to the same degree as comparable U.S. chips.”
    • Secretary of Commerce Wilbur Ross argued “Huawei and its foreign affiliates have extended their efforts to obtain advanced semiconductors developed or produced from U.S. software and technology in order to fulfill the policy objectives of the Chinese Communist Party.” He contended “[a]s we have restricted its access to U.S. technology, Huawei and its affiliates have worked through third parties to harness U.S. technology in a manner that undermines U.S. national security and foreign policy interests…[and] [t]his multi-pronged action demonstrates our continuing commitment to impede Huawei’s ability to do so.”
    • The Department of Commerce’s Bureau of Industry and Security (BIS) stated in the final rule that it is “making three sets of changes to controls for Huawei and its listed non-U.S. affiliates under the Export Administration Regulations (EAR):
      • First, BIS is adding additional non-U.S. affiliates of Huawei to the Entity List because they also pose a significant risk of involvement in activities contrary to the national security or foreign policy interests of the United States.
      • Second, this rule removes a temporary general license for Huawei and its non-U.S. affiliates and replaces those provisions with a more limited authorization that will better protect U.S. national security and foreign policy interests.
      • Third, in response to public comments, this final rule amends General Prohibition Three, also known as the foreign-produced direct product rule, to revise the control over certain foreign-produced items recently implemented by BIS.”
    • BIS claimed “[t]hese revisions promote U.S. national security by limiting access to, and use of, U.S. technology to design and produce items outside the United States by entities that pose a significant risk of involvement in activities contrary to the national security or foreign policy interests of the United States.”
    • One technology analyst claimed “[t]he U.S. moves represent a significant tightening of restrictions over Huawei’s ability to procure semiconductors…[and] [t]hat puts into significant jeopardy its ability to continue manufacturing smartphones and base stations, which are its core products.”
  • The Office of Management and Budget (OMB) and the Office of Science and Technology Policy (OSTP) have released their annual guidance to United States departments and agencies to direct their budget requests for FY2022 with respect to research and development (R&D). OMB explained:
  • For FY2022, the five R&D budgetary priorities in this memorandum ensure that America remains at the global forefront of science and technology (S&T) discovery and innovation. The Industries of the Future (IotF) – artificial intelligence (AI), quantum information sciences (QIS), advanced communication networks/5G, advanced manufacturing, and biotechnology – remain the Administration’s top R&D priority. This includes fulfilling President Trump’s commitment to double non-defense AI and QIS funding by FY2022:
    • American Public Health Security and Innovation
    • American Leadership in the Industries of the Future and Related Technologies
    • American Security
    • American Energy and Environmental Leadership
    • American Space Leadership
  • In light of the significant health and economic disruption caused by the COVID-19 pandemic, the FY2022 memorandum includes a new R&D priority aimed at American Public Health Security and Innovation. This priority brings under a single, comprehensive umbrella biomedical and biotechnology R&D aimed at responding to the pandemic and ensuring the U.S. S&T enterprise is maximally prepared for any health-related threats.
  • Lastly, this memorandum also describes four high-priority crosscutting actions. These actions include research and related strategies that underpin the five R&D priorities and ensure departments and agencies deliver maximum return on investment to the American people:
    • Build the S&T Workforce of the Future
    • Optimize Research Environments and Results
    • Facilitate Multisector Partnerships and Technology Transfer
    • Leverage the Power of Data
  • Despite the Trump Administration touting its R&D priorities and achievements, the non-partisan Congressional Research Service noted
    • President Trump’s budget request for FY2021 includes approximately $142.2 billion for research and development (R&D) for FY 2021, $13.8 billion (8.8%) below the FY2020 enacted level of $156.0 billion. In constant FY 2020 dollars, the President’s FY 2021 R&D request would result in a decrease of $16.6 billion (10.6%) from the FY 2020 level.
  • Two key chairs of subcommittees of the Senate Commerce, Science, and Transportation Committee are pressing the Federal Trade Commission (FTC) to investigate TikTok’s data collection and processing practices. This Committee has primary jurisdiction over the FTC in the Senate and is a key stakeholder on data and privacy issues.
    • In their letter, Consumer Protection Subcommittee Chair Jerry Moran (R-KS) and Communications, Technology, Innovation, and the Internet Subcommittee Chair John Thune (R-SD) explained they “are seeking specific answers from the FTC related to allegations from a Wall Street Journal article that described TikTok’s undisclosed collection and transmission of unique persistent identifiers from millions of U.S. consumers until November 2019…[that] also described questionable activity by the company as it relates to the transparency of these data collection activities, and the letter seeks clarity on these practices.”
    • Moran and Thune asserted “there are allegations that TikTok discretely collected media access control (MAC) addresses, commonly used for advertisement targeting purposes, through Google Android’s operating system under an ‘unusual layer of encryption’ through November 2019.” They said “[g]iven these reports and their potential relevancy to the ‘Executive Order on Addressing the Threat Posed by TikTok,’ we urge the Federal Trade Commission (FTC) to investigate the company’s consumer data collection and processing practices as they relate to these accusations and other possible harmful activities posed to consumers.”
    • If the FTC were to investigate, find wrongdoing, and seek civil fines against TikTok, the next owner may be left to pay, as the White House’s order that ByteDance sell the company within three months will almost certainly be consummated before any FTC action is completed.
  • Massachusetts Attorney General Maura Healey (D) has established a “Data Privacy and Security Division within her office to protect consumers from the surge of threats to the privacy and security of their data in an ever-changing digital economy.” Healey has been one of the United States’ more active attorneys general on data privacy and technology issues, including her suit and settlement with Equifax for its massive data breach.
    • Her office explained:
      • The Data Privacy and Security Division investigates online threats and the unfair or deceptive collection, use, and disclosure of consumers’ personal data through digital technologies. The Division aims to empower consumers in the digital economy, ensure that companies are protecting consumers’ personal data from breach, protect equal and open access to the internet, and protect consumers from data-driven technologies that unlawfully deny them fair access to socioeconomic opportunities. The Division embodies AG Healey’s commitment to continue and grow on this critical work and ensure that data-driven technologies operate lawfully for the benefit of all consumers.
  • A California appeals court ruled that Amazon can be held liable for defective products third parties sell on its website. The appellate court reversed the trial court, which had held Amazon could not be liable.
    • The appeals court recited the facts of the case:
      • Plaintiff Angela Bolger bought a replacement laptop computer battery on Amazon, the popular online shopping website operated by defendant Amazon.com, LLC. The Amazon listing for the battery identified the seller as “E-Life,” a fictitious name used on Amazon by Lenoge Technology (HK) Ltd. (Lenoge). Amazon charged Bolger for the purchase, retrieved the laptop battery from its location in an Amazon warehouse, prepared the battery for shipment in Amazon-branded packaging, and sent it to Bolger. Bolger alleges the battery exploded several months later, and she suffered severe burns as a result.
      • Bolger sued Amazon and several other defendants, including Lenoge. She alleged causes of action for strict products liability, negligent products liability, breach of implied warranty, breach of express warranty, and “negligence/negligent undertaking.”
    • The appeals court continued:
      • Amazon moved for summary judgment. It primarily argued that the doctrine of strict products liability, as well as any similar tort theory, did not apply to it because it did not distribute, manufacture, or sell the product in question. It claimed its website was an “online marketplace” and E-Life (Lenoge) was the product seller, not Amazon. The trial court agreed, granted Amazon’s motion, and entered judgment accordingly.
      • Bolger appeals. She argues that Amazon is strictly liable for defective products offered on its website by third-party sellers like Lenoge. In the circumstances of this case, we agree.
  • The National Institute of Standards and Technology (NIST) issued Special Publication 800-207, “Zero Trust Architecture,” which posits a different conceptual model for an organization’s cybersecurity than perimeter security (a toy sketch of the per-request access decision it describes appears after this list of developments). NIST claimed:
    • Zero trust security models assume that an attacker is present in the environment and that an enterprise-owned environment is no different—or no more trustworthy—than any nonenterprise-owned environment. In this new paradigm, an enterprise must assume no implicit trust and continually analyze and evaluate the risks to its assets and business functions and then enact protections to mitigate these risks. In zero trust, these protections usually involve minimizing access to resources (such as data and compute resources and applications/services) to only those subjects and assets identified as needing access as well as continually authenticating and authorizing the identity and security posture of each access request.
    • A zero trust architecture (ZTA) is an enterprise cybersecurity architecture that is based on zero trust principles and designed to prevent data breaches and limit internal lateral movement. This publication discusses ZTA, its logical components, possible deployment scenarios, and threats. It also presents a general road map for organizations wishing to migrate to a zero trust design approach and discusses relevant federal policies that may impact or influence a zero trust architecture.
    • ZT is not a single architecture but a set of guiding principles for workflow, system design and operations that can be used to improve the security posture of any classification or sensitivity level [FIPS199]. Transitioning to ZTA is a journey concerning how an organization evaluates risk in its mission and cannot simply be accomplished with a wholesale replacement of technology. That said, many organizations already have elements of a ZTA in their enterprise infrastructure today. Organizations should seek to incrementally implement zero trust principles, process changes, and technology solutions that protect their data assets and business functions by use case. Most enterprise infrastructures will operate in a hybrid zero trust/perimeter-based mode while continuing to invest in IT modernization initiatives and improve organization business processes.
  • The United Kingdom’s Government Communications Headquarters’ (GCHQ) National Cyber Security Centre (NCSC) released “Cyber insurance guidance” “for organisations of all sizes who are considering purchasing cyber insurance…not intended to be a comprehensive cyber insurance buyers guide, but instead focuses on the cyber security aspects of cyber insurance.” The NCSC stated “[i]f you are considering cyber insurance, these questions can be used to frame your discussions…[and] [t]his guidance focuses on standalone cyber insurance policies, but many of these questions may be relevant to cyber insurance where it is included in other policies.”
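Relating to the NIST zero trust item above: a minimal, hypothetical sketch of the per-request access decision SP 800-207 describes, with a policy engine weighing identity, device posture, and risk rather than network location. All names, signals, and thresholds here are mine, not NIST’s.

```python
# Hypothetical sketch of SP 800-207's per-request access decision: a policy
# engine weighs identity, device posture, and risk, never network location.
# All names, signals, and thresholds here are mine, not NIST's.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool   # strong (e.g., multi-factor) authentication passed
    device_compliant: bool     # patch level / attestation checks passed
    resource_sensitivity: int  # 1 (low) through 3 (high)
    risk_score: float          # 0.0 (benign) to 1.0 (hostile), from analytics

def policy_engine(req: AccessRequest) -> bool:
    """Grant least-privilege access per request; no implicit trust anywhere."""
    if not (req.user_authenticated and req.device_compliant):
        return False
    # More sensitive resources tolerate less risk.
    return req.risk_score < {1: 0.8, 2: 0.5, 3: 0.2}[req.resource_sensitivity]

# Every request is evaluated fresh; one grant implies nothing about the next.
print(policy_engine(AccessRequest(True, True, 3, 0.1)))   # True
print(policy_engine(AccessRequest(True, False, 1, 0.0)))  # False: posture check fails
```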

Further Reading

  • “I downloaded Covidwise, America’s first Bluetooth exposure-notification app. You should, too.” By Geoffrey Fowler – The Washington Post. The paper’s technology columnist blesses the Apple/Google Bluetooth exposure app and claims it protects privacy. One person on Twitter pointed out the Android version will not work unless location services are turned on, which is contrary to the claims made by Google and Apple, an issue the New York Times investigated last month. A number of European nations have pressed Google to remove this feature, and a Google spokesperson claimed the Android Bluetooth tracing capability did not use location services, raising the question of why the prompt appears. Moreover, one of the apps Fowler names has had its own privacy issues, as detailed by The Washington Post in May. As it turns out, Care19, a contact tracing app developed when the governor of North Dakota asked a friend who had designed an app for football fans to meet up, is violating its own privacy policy, according to Jumbo, the maker of privacy software. Apparently, Care19 shares location and personal data with Foursquare when used on iPhones. Both Apple and state officials are at a loss to explain how this went unnoticed when the app was scrubbed for technical and privacy problems before being rolled out.
  • “Truss leads China hawks trying to derail TikTok’s London HQ plan” By Dan Sabbagh – The Guardian. ByteDance’s plan to establish a headquarters in London is now under attack by members of the ruling Conservative party for the company’s alleged role in persecuting the Uighur minority in Xinjiang. ByteDance has been eager to move to London and also eager to avoid the treatment that another tech company from the People’s Republic of China has gotten in the United Kingdom (UK): Huawei. Nonetheless, this decision may turn political, as the government’s reversal on Huawei and 5G did. Incidentally, if Microsoft does buy part of TikTok, it would be buying operations in four of the five Five Eyes nations but not the UK.
  • “Human Rights Commission warns government over ‘dangerous’ use of AI” By Fergus Hunter – The Sydney Morning Herald. A cautionary tale regarding the use of artificial intelligence and algorithms in government decision-making. While this article nominally pertains to the Australian Human Rights Commission’s advice to the country’s government, it is based, in large part, on a scandal in which an automated process illegally collected $721 million AUD from welfare beneficiaries. In the view of the Human Rights Commission, decision-making by humans is still preferable to, and more accurate than, automated means.
  • “The Attack That Broke Twitter Is Hitting Dozens of Companies” By Andy Greenberg – WIRED. In the never-ending permutations of hacking, the past has become the present, because the Twitter hackers used phone calls to talk their way into gaining access to a number of high-profile accounts (aka phone spear phishing). Other companies are suffering the same onslaught, proving the axiom that people may be the weakest link in cybersecurity. However, the phone calls are based on exacting research and preparation, as hackers scour the internet for information on their targets and the companies themselves. A similar hack was reportedly executed by the Democratic People’s Republic of Korea (DPRK) against Israeli defense firms.
  • “Miami Police Used Facial Recognition Technology in Protester’s Arrest” By Connie Fossi and Phil Prazan – NBC Miami. The Miami Police Department used Clearview AI to identify a protester who allegedly injured an officer but did not divulge this fact to the accused or her attorney. The department’s policy on facial recognition technology bars officers from making arrests solely on the basis of identification through such a system. Given the error rates many facial recognition systems have experienced in identifying minorities, and the use of masks during the pandemic, which further decreases accuracy, it is quite likely people will be wrongfully accused and convicted using this technology.
  • “Big Tech’s Domination of Business Reaches New Heights” By Peter Eavis and Steve Lohr – The New York Times. Big tech has gotten larger, more powerful, and more indispensable in the United States (U.S.) during the pandemic, and one needs to go back to the railroads in the late 19th Century to find comparable companies. It is an open question whether their size and influence will change much no matter who is president of the U.S. next year.
  • “License plate tracking for police set to go nationwide” By Alfred Ng – c/net. A de facto national license plate reader network may soon be activated in the United States (U.S.). Flock Safety unveiled the “Total Analytics Law Officers Network” (TALON), which will link its systems of cameras in more than 700 cities, allowing police departments to track cars across multiple jurisdictions. As the U.S. has no national laws regulating the use of this and other similar technologies, private companies may set policy for the country in the short term.


Dueling COVID-19 Privacy Bills Released

Democratic stakeholders answer a Republican proposal on how to regulate privacy issues raised by COVID-19 contact tracing. The proposals have little chance of enactment and are mostly about positioning.  

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

Late last week, a group of Democratic stakeholders on privacy and data security issues released the “Public Health Emergency Privacy Act” (S.3749), a bill that serves as a counterpoint to the “COVID-19 Consumer Data Protection Act” (S.3663), legislation introduced a few weeks ago as a proposed solution to the privacy issues raised by COVID-19 contact tracing. However, the Democratic bill contains a number of provisions that many Republicans consider non-starters, such as a private right of action for individuals and no preemption of state laws. It is not likely this bill will advance in the Senate, even though it may possibly be moved in the House. S.3749 was introduced by Senators Richard Blumenthal (D-CT) and Mark Warner (D-VA) and Representatives Anna Eshoo (D-CA), Jan Schakowsky (D-IL), and Suzan DelBene (D-WA).

In a way, the “Public Health Emergency Privacy Act” makes the case that “Health Insurance Portability and Accountability Act” (HIPAA)/“Health Information Technology for Economic and Clinical Health Act” (HITECH Act) regulations are inadequate to protect the privacy and data of people in the United States because these regulations apply only to healthcare providers, their business associates, and other named entities in the healthcare system. Entities collecting healthcare information, even very sensitive information, are not subject to HIPAA/HITECH regulations and would likely be regulated, to the extent they are, by the Federal Trade Commission (FTC) or state agencies and attorneys general.

The “Public Health Emergency Privacy Act” would cover virtually all entities, including government agencies, with exceptions for public health authorities (e.g., a state department of health or the Centers for Disease Control and Prevention), healthcare providers, service providers, and people acting in a household capacity. This remarkable scope likely reflects plans announced around the world by governments to eschew Google and Apple’s contact tracing framework and develop their own. It would also touch some efforts aside and apart from contact tracing apps. Moreover, this is the first bill introduced recently that proposes to treat public and private entities the same way with respect to how and when they may collect personal data.

The types of data protected under the act are “emergency health data,” which are defined as “data linked or reasonably linkable to an individual or device, including data inferred or derived about the individual or device from other collected data provided such data is still linked or reasonably linkable to the individual or device, that concerns the public COVID–19 health emergency.” The bill then lists a number of examples of emergency health data, and the list is sweeping and comprehensive. This term includes “information that reveals the past, present, or future physical or behavioral health or condition of, or provision of healthcare to, an individual” related to testing for COVID-19 and related genetic and biometric information. Geolocation and proximity information would also be covered by this definition. Finally, emergency health data encompasses “any other data collected from a personal device,” which seems to cover all data collection from a person’s phone, the primary means of contact tracing proposed thus far. However, it would appear that data on people collected at the household or higher level may not be subject to this definition and hence much of the bill’s protections.

The authority provided to covered entities is limited. First, collection, use, and disclosure of emergency health data are restricted to good faith public health purposes. And yet, this term is not defined in the act, and the FTC may need to define it during the expedited rulemaking the bill requires. However, it seems a fairly safe assumption the agency and courts would not construe using emergency health data for advertising as a good faith public health purpose. Next, covered entities must allow people to correct inaccurate information, and the entity itself has a duty to take reasonable efforts on its own to correct this information. Covered entities must implement reasonable safeguards to prevent discrimination on the basis of these data, which is a new obligation in a privacy or data security bill. This provision may have been included to bar government agencies from collecting and using covered data for discriminatory purposes. However, critics of more expansive bills may see this as the camel’s nose under the tent for future privacy and data security bills. Finally, the only government agencies that can legally be provided emergency health data are public health authorities, and then only “for good faith public health purposes and in direct response to exigent circumstances.” Again, the limits of good faith public health purposes remain unclear, and such disclosure can only occur in exigent circumstances, which likely means when a person has contracted or been exposed to COVID-19.

Covered entities and service providers must implement appropriate measures to protect the security and confidentiality of emergency health data, but the third member of the usual triumvirate, availability, is not identified as requiring protection.

The “Public Health Emergency Privacy Act” outright bars a number of uses for emergency health data:

  • Commercial advertising, recommendations for e-commerce, or machine learning for these purposes
  • Any discriminatory practices related to employment, finance, credit, insurance, housing, or education opportunities; and
  • Discriminating with respect to goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation

It bears noting that covered data cannot be collected or disclosed for these purposes either.

The act contains very strong consent language. Covered entities may not collect, use, or disclose emergency health data unless a person has provided express affirmative consent, which requires a knowing, informed choice that cannot be obtained through the use of deceptive practices nor inferred through a person’s inaction. However, there are significant exceptions to this consent requirement that would allow covered entities to collect, use, or disclose these data, including:

  • protecting against malicious, deceptive, fraudulent, or illegal activity;
  • detecting, responding to, or preventing information security incidents or threats; or
  • when the covered organization is compelled to do so by a legal obligation.

But these purposes are valid only when necessary and solely for the fulfillment of the named purpose.

In a related vein, covered entities must allow people to revoke their consent, and the revocation must be effective as soon as practicable, but no later than 15 days afterward. Moreover, the person’s emergency health data must be destroyed or rendered not linkable within 30 days of revocation of consent.

Covered entities must provide clear and conspicuous notice before or at the point of data collection that explains how and why the emergency health data is collected, used, and disclosed, as well as the categories of recipients to whom a covered entity discloses covered data. This notice must also disclose the covered entity’s data retention and data security policies and practices, but only with respect to emergency health data, and it must inform consumers how they may exercise rights under the act and how to file a complaint with the FTC.

There would also be a public reporting requirement for those covered entities collecting, using, or disclosing the emergency health data of 100,000 or more people. Every three months, these entities would need to report aggregate figures on the number of people whose data was collected, used, and disclosed. These reports must also detail the categories of emergency health data collected, used, and disclosed and the categories of third parties with whom such data are shared.

The bill requires covered entities to destroy or render not linkable emergency health data 60 days after the Secretary of Health and Human Services declares the end of the COVID-19 public health emergency, 60 days after a state does so, or 60 days after collection. A sketch of one reading of these deadlines follows.
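Reading those triggers as alternative ceilings, the binding deadline would be the earliest applicable date; here is a minimal sketch under that assumption (the min() interpretation is mine, not the bill’s express text).

```python
# One reading of the destruction deadlines: the earliest applicable 60-day
# trigger governs. The min() interpretation is mine, not the bill's text.

from datetime import date, timedelta
from typing import Optional

def destruction_deadline(collected: date,
                         hhs_emergency_end: Optional[date],
                         state_emergency_end: Optional[date]) -> date:
    """Earliest of 60 days after collection, after the federal emergency ends,
    or after the relevant state emergency ends (where those dates exist)."""
    candidates = [collected + timedelta(days=60)]
    for end in (hhs_emergency_end, state_emergency_end):
        if end is not None:
            candidates.append(end + timedelta(days=60))
    return min(candidates)

print(destruction_deadline(date(2020, 6, 1), None, None))  # 2020-07-31
```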

The FTC would be given the daunting task of beginning a rulemaking 7 days after enactment “to ensure a covered organization that has collected, used, or disclosed emergency health data before the date of enactment of this Act is in compliance with this Act, to the degree practicable” that must be completed within 45 days. There is also a provision requiring the Department of Health and Human Services (HHS) to issue guidance so that HIPAA/HITECH Act regulated entities do not need to meet duplicative requirements, and, moreover, the bill exempts these entities when operating under the HIPAA/HITECH Act regulations.

The FTC, state attorneys general, and people would be able to enforce this new act through a variety of means. The FTC would treat violations as if they were violations of a regulation barring a deceptive or unfair practice, meaning the agency could levy fines in the first instance of more than $43,000 a violation. The FTC could seek a range of other relief, including injunctions, restitution, disgorgement, and remediation. However, the FTC could go to federal court without having to consult with the Department of Justice, which is a departure from current law and almost all the privacy bills introduced in this Congress. The FTC’s jurisdiction would be broadened to include common carriers and non-profits for purposes of this bill, too. The FTC would receive normal notice and comment rulemaking authority that is very broad, as the agency would be free to promulgate any regulations it sees necessary to effectuate this act.

State attorneys general would be able to seek all the same relief the FTC can, so long as the FTC is not already bringing an action. Moreover, any other state officials empowered by state statute to bring similar actions may do so for violations of this act.

People would be allowed to sue in either federal or state court to vindicate violations. The bill states that any violation is to be considered an injury in fact, forestalling courts from finding that a violation alone does not injure the person and that her suit therefore cannot proceed. The act would allow people to recover between $100 and $1,000 for any negligent violation and between $500 and $5,000 for any reckless, willful, or intentional violation, with no cap on total damages. People may also ask for and receive reasonable attorney’s fees and costs and any equitable or declaratory relief a court sees fit to grant. Moreover, the bill would disallow all pre-lawsuit arbitration agreements or waivers of rights. Much of the right to sue granted to people by the “Public Health Emergency Privacy Act” will be opposed by many Republicans and industry stakeholders.
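Because there is no aggregate cap, exposure under the private right of action scales linearly with the number of people affected. A back-of-the-envelope sketch, using a hypothetical class size and assuming, for illustration only, that each affected person counts as one violation:

```python
def damages_range(people: int, negligent: bool) -> tuple:
    """Statutory damages per violation: $100-$1,000 if negligent,
    $500-$5,000 if reckless, willful, or intentional; no total cap.
    Assumes (my simplification) one violation per affected person."""
    low, high = (100, 1_000) if negligent else (500, 5_000)
    return people * low, people * high

# A negligent violation touching 1 million people implies exposure
# of $100 million to $1 billion, before attorney's fees and costs.
print(damages_range(1_000_000, negligent=True))
```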

The enforcement section raises a few questions given that entities covered by the bill include government agencies. Presumably, the FTC cannot enforce this act against government agencies for the very good reason that it lacks jurisdiction over them. However, does the private right of action waive the federal and state governments’ sovereign immunity? This is not clear and may need clarification if the bill is acted upon in either chamber of Congress.

This bill would not preempt state laws, which means that, if it is enacted, covered entities could be subject to more than one regime for collecting, using, and disclosing emergency health data.

Apart from contact tracing, the “Public Health Emergency Privacy Act” also bars the use of emergency health data to abridge a person’s right to vote, and it requires HHS, the United States Commission on Civil Rights, and the FTC to “prepare and submit to Congress reports that examines the civil rights impact of the collection, use, and disclosure of health information in response to the COVID–19 public health emergency.”

Given the continued impasse over privacy legislation, it is little wonder that the bill unveiled a few weeks ago by four key Senate Republicans takes a similar approach while differing in key aspects. Of course, there is no private right of action, and it expressly preempts state laws to the contrary.

Generally speaking, the structure of the “COVID–19 Consumer Data Protection Act of 2020” (S.3663) tracks with the bills that have been released thus far by the four sponsors: Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS) (See here for analysis of the “Consumer Data Privacy Act of 2019”) and Senators John Thune (R-SD), Jerry Moran (R-KS) (See here for analysis of “Consumer Data Privacy and Security Act of 2020” (S.3456)), and Marsha Blackburn (R-TN) (See here for analysis of the “Balancing the Rights Of Web Surfers Equally and Responsibly Act of 2019” (BROWSER Act) (S. 1116)). In short, people would be provided with notice about what information the app collects, how it is processed, and with whom and under what circumstances this information will be shared. Then a person would be free to make an informed choice about whether or not she wants to consent and allow the app or technology to operate on her smartphone. The FTC and state attorneys general would enforce the new protections.

The scope of the information and entities covered is narrower than in the Democratic bill. “Covered data” is “precise geolocation data, proximity data, a persistent identifier, and personal health information” but not aggregated data, business contact information, de-identified data, employee screening data, and publicly available information. The entities covered by the bill are those already subject to FTC jurisdiction along with common carriers and non-profits, but government agencies are not, again unlike the bill put forth by Democrats. Entities must abide by this bill to the extent they “collect[], process[], or transfer[] such covered data, or determine[] the means and purposes for the collection, processing, or transfer of covered data.”

Another key definition is that of an “individual,” which excludes any person acting in her role as an employee and in almost all other work-related capacities, making clear employers will not need to comply with respect to those working for them.

“Personal health information” is “information relating to an individual that is

  • genetic information of the individual; or
  • information relating to the diagnosis or treatment of past, present, or future physical, mental health, or disability of the individual; and
  • identifies, or is reasonably linkable to, the individual.”

And yet, this term excludes educational information subject to the “Family Educational Rights and Privacy Act of 1974” or information subject to regulations promulgated pursuant to the HIPAA/HITECH Acts.

The “COVID–19 Consumer Data Protection Act of 2020” bars the collection, processing, or transferring of covered data for a covered purpose unless prior notice is provided, a person has provided affirmative consent, and the covered entity has agreed not to collect, process, or transfer such covered data for any purpose other than those detailed in the notice. However, leaving aside the bill’s enumerated allowable purposes for which covered data may be collected with consent, the bill provides a number of exemptions from this general bar. For example, collection, processing, or transfers necessary for a covered entity to comply with another law are permissible, apparently even in the absence of a person’s consent. Moreover, a covered entity need not obtain consent for operational or administrative tasks not disclosed in the notice provided to people.

The act does spell out “covered purposes” for which covered entities may collect, process, or transfer covered data with consent after notice has been given (a sketch of this gating logic follows the list):

  • Collecting, processing, or transferring the covered data of an individual to track the spread, signs, or symptoms of COVID–19.
  • Collecting, processing, or transferring the covered data of an individual to measure compliance with social distancing guidelines or other requirements related to COVID–19 that are imposed on individuals under a Federal, State, or local government order.
  • Collecting, processing, or transferring the covered data of an individual to conduct contact tracing for COVID–19 cases.
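Reduced to its logic, the general rule works like a gate: notice must precede collection, affirmative consent must be obtained, and the use must stay within the noticed covered purposes. Here is a minimal sketch of that gate, with enum and parameter names of my own invention and the bill’s exemptions omitted:

```python
from enum import Enum, auto

class CoveredPurpose(Enum):
    TRACK_SPREAD = auto()        # track the spread, signs, or symptoms of COVID-19
    MEASURE_DISTANCING = auto()  # measure compliance with social distancing orders
    CONTACT_TRACING = auto()     # conduct contact tracing for COVID-19 cases

def may_process(notice_given: bool,
                consent_given: bool,
                purpose: CoveredPurpose,
                noticed_purposes: set) -> bool:
    # The general rule: prior notice, affirmative consent, and a purpose
    # among those detailed in the notice. Exemptions (e.g., compliance
    # with another law) are left out of this sketch.
    return notice_given and consent_given and purpose in noticed_purposes

# Consent that covered only contact tracing does not permit tracking spread.
print(may_process(True, True, CoveredPurpose.TRACK_SPREAD,
                  {CoveredPurpose.CONTACT_TRACING}))  # False
```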

Covered entities would be required to publish a privacy policy detailing for which of the above covered purposes a person’s covered data would be collected, processed, or transferred. This policy must also detail the categories of entities that receive covered data as well as the covered entity’s data retention and data security policies.

There would be reporting requirements that would reach more covered entities than those of the Democratic bill, which apply only to entities handling the data of 100,000 or more people. Accordingly, any covered entity collecting, processing, or transferring covered data for one of the enumerated covered purposes must issue a public report 30 days after enactment and then every 60 days thereafter.

Among other provisions in the bill, people would be able to revoke consent, a request that covered entities must honor within 14 days. Covered entities must also ensure the covered data are accurate, but this requirement falls a bit short of granting people the right to correct inaccurate data; instead, they would merely be able to report inaccuracies, and there is no recourse if a covered entity chooses not to heed these reports. Covered entities would need to “delete or de-identify all covered data collected, processed, or transferred for a [covered purpose] when it is no longer being used for such purpose and is no longer necessary to comply with a Federal, State, or local legal obligation, or the establishment, exercise, or defense of a legal claim.” Even though there is the commandment to delete or de-identify, the timing is somewhat open-ended, as some covered entities could point to ongoing legal obligations in order to keep the data on hand.
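Expressed as a predicate (my own simplification, not the bill’s text), the conjunctive retention grounds show why the timing is open-ended; all three must lapse before deletion is required:

```python
def must_delete_or_deidentify(used_for_covered_purpose: bool,
                              needed_for_legal_obligation: bool,
                              needed_for_legal_claim: bool) -> bool:
    # Deletion or de-identification is triggered only when every
    # retention ground has lapsed, so any ongoing legal obligation
    # or claim lets a covered entity keep the data on hand.
    return not (used_for_covered_purpose
                or needed_for_legal_obligation
                or needed_for_legal_claim)

# A covered entity citing an open legal obligation need not delete yet.
print(must_delete_or_deidentify(False, True, False))  # False
```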

Covered entities must also minimize covered data by limiting collection, processing, and transferring to “what is reasonably necessary, proportionate, and limited to carry out [a covered purpose.]” The FTC must draft and release guidelines “recommending best practices for covered entities to minimize the collection, processing, and transfer of covered data.” However, these guidelines would most likely be advisory in nature and would not carry the force of law or regulation, leaving covered entities free to disregard parts of the guidance if they choose. Covered entities must “establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity of such data,” a mandate broader than the duty imposed by the Democratic bill.

The FTC and state attorneys general would enforce the new regime. The FTC would be able to seek civil fines for first violations in the same way as under the Democrats’ privacy bill. However, unlike the other bill, the Republican bill would nullify any Federal Communications Commission (FCC) jurisdiction to the extent it conflicts with the FTC’s enforcement of the new act. Presumably, this would address jurisdictional issues raised by placing common carriers under FTC jurisdiction when they are usually overseen by the FCC. Even though state laws are preempted, state attorneys general would be able to bring actions to enforce this act at the state level. And, as noted earlier, there is no private right of action.


Senate Commerce Republicans Vow To Introduce Privacy Bill To Govern COVID-19 Apps and Tech

Key Republican stakeholders on privacy legislation float a COVID-19 privacy bill that seems unlikely to garner the necessary Democratic buy-in to advance.

Late last week, key Republicans on the Senate Commerce, Science, and Transportation Committee announced they would introduce the “COVID-19 Consumer Data Protection Act,” which would provide new privacy and data security protections for the use of COVID-19 contact tracing apps and similar technologies. To date, the text of the legislation has not been released, and so any analysis of the bill is derived from a short summary issued by the committee and reports from media outlets that have apparently been provided a copy of the bill.

Based on this information, to no great surprise, the basic structure of the bill tracks privacy and data protection legislation previously introduced by the co-sponsors of the new bill: Chair Roger Wicker (R-MS) (See here for analysis of the “Consumer Data Privacy Act of 2019”) and Senators John Thune (R-SD), Jerry Moran (R-KS) (See here for analysis of “Consumer Data Privacy and Security Act of 2020” (S.3456)), and Marsha Blackburn (R-TN) (See here for analysis of the “Balancing the Rights Of Web Surfers Equally and Responsibly Act of 2019” (BROWSER Act) (S. 1116)). In short, people would be provided with notice about what information the app collects, how it is processed, and with whom and under what circumstances this information will be shared. Then a person would be free to make an informed choice about whether or not she wants to consent and allow the app or technology to operate on her smartphone. The Federal Trade Commission (FTC) and state attorneys general would enforce the new protections, and as there was no mention of a private right of action and given these Members’ opposition to such provisions, it is likely the bill does not provide such redress. Moreover, according to media reports, the bill would preempt state laws contrary to its provisions, which would be another likely non-starter among Democrats.

In their press release, Wicker, Thune, Moran, and Blackburn claimed their bill “would provide all Americans with more transparency, choice, and control over the collection and use of their personal health, geolocation, and proximity data…[and] would also hold businesses accountable to consumers if they use personal data to fight the COVID-19 pandemic.”

Wicker, Thune, Moran, and Blackburn provided this summary of the “COVID-19 Consumer Data Protection Act”:

  • Require companies under the jurisdiction of the Federal Trade Commission to obtain affirmative express consent from individuals to collect, process, or transfer their personal health, geolocation, or proximity information for the purposes of tracking the spread of COVID-19.
  • Direct companies to disclose to consumers at the point of collection how their data will be handled, to whom it will be transferred, and how long it will be retained.
  • Establish clear definitions about what constitutes aggregate and de-identified data to ensure companies adopt certain technical and legal safeguards to protect consumer data from being re-identified.
  • Require companies to allow individuals to opt out of the collection, processing, or transfer of their personal health, geolocation, or proximity information.
  • Direct companies to provide transparency reports to the public describing their data collection activities related to COVID-19.
  • Establish data minimization and data security requirements for any personally identifiable information collected by a covered entity.
  • Require companies to delete or de-identify all personally identifiable information when it is no longer being used for the COVID-19 public health emergency.
  • Authorize state attorneys general to enforce the Act.

If such legislation were to pass, it would add to the patchwork of privacy and data security laws already enacted that are geared to addressing certain sectors or populations (e.g., the “Health Insurance Portability and Accountability Act” (HIPAA), which protects some healthcare information, and the “Children’s Online Privacy Protection Act” (COPPA), which broadly protects children online).


Senate Democrats Release Privacy Principles

The ranking members of four Senate Committees have released their principles for any privacy legislation, many of which are likely to be rejected by Republicans and many industry stakeholders (e.g., no preemption of the “California Consumer Privacy Act” (AB 375) and a private right of action for consumers).

Nonetheless, Senators Maria Cantwell (D-WA), Dianne Feinstein (D-CA), Patty Murray (D-WA), and Sherrod Brown (D-OH) agreed to these principles, and reportedly Senate Minority Leader Chuck Schumer (D-NY) convened and facilitated the effort, which has come ahead of the release of any of the privacy bills that have been under development this year in the Senate.

Of course, the Senate Commerce, Science, and Transportation Committee had convened an informal working group late last year consisting of Cantwell, Chair Roger Wicker (R-MS) and Senators John Thune (R-SD), Jerry Moran (R-KS), Brian Schatz (D-HI), and Richard Blumenthal (D-CT) to hash out a privacy bill. However, like most other such efforts, the timeline for releasing bill text has been repeatedly pushed back even after Wicker and Cantwell tried working by themselves on a bill late in the summer. Additionally, Moran and Blumenthal, the chair and ranking member of the Manufacturing, Trade, and Consumer Protection Subcommittee, have been working on a bill for some time as well but without a timeline for releasing text.

And, the efforts at this committee are proceeding in parallel to those in other committees. Senate Judiciary Chair Lindsey Graham (R-SC) has gotten his committee onto the field with hearings on the subject and has articulated his aim to play a role in crafting a bill. Likewise, the Senate Banking Committee has held hearings and is looking to participate in the process as well. But, like Senate Commerce, these committees have released no bills.

Of course, it is easier to write out one’s principles than to draft legislation. And yet, the release of these desired policies elegantly puts down a marker for Senate Democrats at a time when the majority in the chamber is struggling to coalesce and release a privacy bill. The move also demonstrates cohesion among the top Democrats on four of the committees with a slice of jurisdiction over privacy and data security issues: Commerce, Banking, HELP, and Judiciary.