Further Reading, Other Developments, and Coming Events (23 July)

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

Here are Further Reading, Other Developments, and Coming Events.

Other Developments

  • New Zealand’s Privacy Commissioner has begun the process of implementing the new Privacy Act 2020 and has started asking for input on the codes of practice that will effectuate the rewrite of the nation’s privacy laws. The Commissioner laid out the following schedule:
    • Telecommunications Information Privacy Code and Civil Defence National Emergencies (Information Sharing) Code
      • Open: 29 July 2020 / Close: 26 August 2020
    • The Commissioner noted “[t]he new Privacy Act 2020 is set to come into force on 1 December…[and] makes several key reforms to New Zealand’s privacy law, including amendments to the information privacy principles.” The Commissioner added “[a]s a result, the six codes of practice made under the Privacy Act 1993 require replacement.”
  • Australia’s 2020 Cyber Security Strategy Industry Advisory Panel issued its report and recommendations “to provide strategic advice to support the development of Australia’s 2020 Cyber Security Strategy.” The body was convened by the Minister for Home Affairs. The panel’s “recommendations are structured around a framework of five key pillars:
    • Deterrence: The Government should establish clear consequences for those targeting businesses and Australians. A key priority is increasing transparency on Government investigative activity, more frequent attribution and consequences applied where appropriate, and strengthening the Australian Cyber Security Centre’s (ACSC’s) ability to disrupt cyber criminals by targeting the proceeds of cybercrime.
    • Prevention: Prevention is vital and should include initiatives to help businesses and Australians remain safer online. Industry should increase its cyber security capabilities and be increasingly responsible for ensuring their digital products and services are cyber safe and secure, protecting their customers from foreseeable cyber security harm. While Australians have access to trusted goods and services, they also need to be supported with advice on how to practice safe behaviours at home and work. A clear definition is required for what constitutes critical infrastructure and systems of national significance across the public and private sectors. This should be developed with consistent, principles-based regulatory requirements to implement reasonable protection against cyber threats for both the public and private sectors.
    • Detection: There is clear need for the development of a mechanism between industry and Government for real-time sharing of threat information, beginning with critical infrastructure operators. The Government should also empower industry to automatically detect and block a greater proportion of known cyber security threats in real-time including initiatives such as ‘cleaner pipes’.
    • Resilience: We know malicious cyber activity is hitting Australians hard. The tactics and techniques used by malicious cyber actors are evolving so quickly that individuals, businesses and critical infrastructure operators in Australia are not fully able to protect themselves and their assets against every cyber security threat. As a result, it is recommended that the Government strengthen the incident response and victim support options already in place. This should include conducting cyber security exercises in partnership with the private sector. Speed is key when it comes to recovering from cyber incidents; it is therefore proposed that critical infrastructure operators collaborate more closely to increase preparedness for major cyber incidents.
    • Investment: The Joint Cyber Security Centre (JCSC) program is a highly valuable asset that should form a key delivery mechanism for the initiatives under the 2020 Cyber Security Strategy, and it should be strengthened. This should include increased resources and the establishment of a national board in partnership with industry, states and territories with an integrated governance structure underpinned by a charter outlining scope and deliverables.
  • Six of the world’s data protection authorities issued an open letter to video teleconferencing companies “to set out our concerns, and to clarify our expectations and the steps you should be taking as Video Teleconferencing (VTC) companies to mitigate the identified risks and ultimately ensure that our citizens’ personal information is safeguarded in line with public expectations and protected from any harm.” The DPAs stated that “[t]he principles in this open letter set out some of the key areas to focus on to ensure that your VTC offering is not only compliant with data protection and privacy law around the world, but also helps build the trust and confidence of your userbase.” They added that “[w]e welcome responses to this open letter from VTC companies, by 30 September 2020, to demonstrate how they are taking these principles into account in the design and delivery of their services. Responses will be shared amongst the joint signatories to this letter.” The letter was drafted and signed by:
    • The Privacy Commissioner of Canada
    • The United Kingdom Information Commissioner’s Office
    • The Office of the Australian Information Commissioner
    • The Gibraltar Regulatory Authority
    • The Office of the Privacy Commissioner for Personal Data, Hong Kong, China
    • The Federal Data Protection and Information Commissioner of Switzerland
  • The United States Office of the Comptroller of the Currency (OCC) “is reviewing its regulations on bank digital activities to ensure that its regulations continue to evolve with developments in the industry” and released an “advance notice of proposed rulemaking (ANPR) [that] solicits public input as part of this review” by 8 August 2020. The OCC explained:
    • Over the past two decades, technological advances have transformed the financial industry, including the channels through which products and services are delivered and the nature of the products and services themselves. Fewer than fifteen years ago, smart phones with slide-out keyboards and limited touchscreen capability were newsworthy.[1] Today, 49 percent of Americans bank on their phones,[2] and 85 percent of American millennials use mobile banking.[3]
    • The first person-to-person (P2P) platform for money transfer services was established in 1998.[4] Today, there are countless P2P payment options, and many Americans regularly use P2P to transfer funds.[5] In 2003, Congress authorized digital copies of checks to be made and electronically processed.[6] Today, remote deposit capture is the norm for many consumers.[7] The first cryptocurrency was created in 2009; there are now over 1,000 rival cryptocurrencies,[8] and approximately eight percent of Americans own cryptocurrency.[9] Today, artificial intelligence (AI) and machine learning, biometrics, cloud computing, big data and data analytics, and distributed ledger and blockchain technology are used commonly or are emerging in the banking sector. Even the language used to describe these innovations is evolving, with the term “digital” now commonly used to encompass electronic, mobile, and other online activities.
    • These technological developments have led to a wide range of new banking products and services delivered through innovative and more efficient channels in response to evolving customer preferences. Back-office banking operations have experienced significant changes as well. AI and machine learning play an increasing role, for example, in fraud identification, transaction monitoring, and loan underwriting and monitoring. And technology is fueling advances in payments. In addition, technological innovations are helping banks comply with the complex regulatory framework and enhance cybersecurity to more effectively protect bank and customer data and privacy. More and more banks, of all sizes and types, are entering into relationships with technology companies that enable banks and the technology companies to establish new delivery channels and business practices and develop new products to meet the needs of consumers, businesses, and communities. These relationships facilitate banks’ ability to reach new customers, better serve existing customers, and take advantage of cost efficiencies, which help them to remain competitive in a changing industry.
    • Along with the opportunities presented by these technological changes, there are new challenges and risks. Banks should adjust their business models and practices to a new financial marketplace and changing customer demands. Banks are in an environment where they compete with non-bank entities that offer products and services that historically have only been offered by banks, while ensuring that their activities are consistent with the authority provided by a banking charter and safe and sound banking practices. Banks also must comply with applicable laws and regulations, including those focused on consumer protection and Bank Secrecy Act/anti-money laundering (BSA/AML) compliance. And, importantly, advanced persistent threats require banks to pay constant and close attention to increasing cybersecurity risks.
    • Notwithstanding these challenges, the Federal banking system is well acquainted with and well positioned for change, which has been a hallmark of this system since its inception. The OCC’s support of responsible innovation throughout its history has helped facilitate the successful evolution of the industry. The OCC has long understood that the banking business is not frozen in time and agrees with the statement made over forty years ago by the U.S. Court of Appeals for the Ninth Circuit: “the powers of national banks must be construed so as to permit the use of new ways of conducting the very old business of banking.” [10] Accordingly, the OCC has sought to regulate banking in ways that allow for the responsible creation or adoption of technological advances and to establish a regulatory and supervisory framework that allows banking to evolve, while ensuring that safety and soundness and the fair treatment of customers is preserved.
  • A trio of House of Representatives Members has introduced “legislation to put American consumers in the driver’s seat by giving them clearer knowledge about the technology they are purchasing.” The “Informing Consumers about Smart Devices Act” (H.R.7583) was drafted and released by Representatives John Curtis (R-UT), Seth Moulton (D-MA), and Gus Bilirakis (R-FL). According to their press release:
    • The legislation is in response to reports about household devices listening to individuals’ conversations without their knowledge. While some manufacturers have taken steps to more clearly label their products with listening devices, this legislation would make this information more obvious to consumers without overly burdensome requirements on producers of these devices. 
    • Specifically, the bill requires the Federal Trade Commission (FTC) to work alongside industry leaders to establish guidelines for properly disclosing the potential for their products to contain audio or visual recording capabilities. To ensure this does not become an overly burdensome labeling requirement, the legislation provides manufacturers the option of requesting customized guidance from the FTC that fits within their existing marketing or branding practices, in addition to permitting these disclosures pre- or post-sale of their products.
  • House Oversight and Reform Committee Ranking Member James Comer (R-KY) sent Twitter CEO Jack Dorsey a letter regarding last week’s hack, asking for answers to his questions about the security practices of the platform. Government Operations Subcommittee Ranking Member Jody Hice (R-GA) and 18 other Republicans also wrote Dorsey demanding an explanation of “Twitter’s intent and use of tools labeled ‘SEARCH BLACKLIST’ and ‘TRENDS BLACKLIST’ shown in the leaked screenshots.”
  • The United States Court of Appeals for the District of Columbia Circuit has ruled against United States Agency for Global Media (USAGM) head Michael Pack and enjoined his efforts to fire the board of the Open Technology Fund (OTF). The court stated “it appears likely that the district court correctly concluded that 22 U.S.C. § 6209(d) does not grant the Chief Executive Officer of the United States Agency for Global Media, Michael Pack, with the authority to remove and replace members of OTF’s board.” Four removed members of the OTF Board had filed suit against Pack. Yesterday, District of Columbia Attorney General Karl Racine (D) filed suit against USAGM, arguing that Pack violated District of Columbia law by dissolving the OTF Board and creating a new one.
  • Three advocacy organizations have lodged their opposition to the “California Privacy Rights Act” (aka Proposition 24) that will be on the ballot this fall in California. The American Civil Liberties Union, the California Alliance for Retired Americans, and Color of Change are speaking out against the measure because “it stacks the deck in favor of big tech corporations and reduces your privacy rights.” Industry groups have also started advertising and advocating against the initiative, which would rewrite the “California Consumer Privacy Act” (CCPA) (AB 375).

Further Reading

  • “Facebook adds info label to Trump post about elections” – The Hill. Facebook has followed Twitter in appending information to posts of President Donald Trump that implicitly rebut his false claims about fraud and mail-in voting. Interestingly, it also appended information to posts of former Vice President Joe Biden that merely asked people to vote Trump out in November. If Facebook continues this policy, it is likely to stoke the ire of Republicans, many of whom claim that the platform and others are biased against conservative voices and viewpoints.
  • “Ajit Pai urges states to cap prison phone rates after he helped kill FCC caps” – Ars Technica. The chair of the Federal Communications Commission (FCC) is imploring states to cap the egregious rates charged to incarcerated people for phone calls. The rub here is that Pai fought against Obama-era FCC efforts to regulate these practices, claiming the agency lacked the jurisdiction to police intrastate calls. Pai pulled the plug on the agency’s efforts to fight for these powers in court when he became chair.
  • “Twitter bans 7,000 QAnon accounts, limits 150,000 others as part of broad crackdown” – NBC News. Today, Twitter announced it was suspending thousands of accounts of conspiracy theorists who believe a great number of untrue things, chiefly that a “deep state” in the United States is working to thwart the presidency of Donald Trump. Twitter announced in a tweet: “[w]e will permanently suspend accounts Tweeting about these topics that we know are engaged in violations of our multi-account policy, coordinating abuse around individual victims, or are attempting to evade a previous suspension — something we’ve seen more of in recent weeks.” This practice, alternately called brigading or swarming, has been employed against a number of celebrities who are alleged to be engaging in pedophilia. The group, QAnon, has even been quoted or supported by Members of the Republican Party, some of whom may see Twitter’s actions as ideological.
  • “Russia and China’s vaccine hacks don’t violate rules of road for cyberspace, experts say” – The Washington Post. Contrary to the claims of the British, Canadian, and American governments, attempts by other nations to hack into COVID-19 research are not counter to the cyber norms these and other nations have been pushing as the rules of the road. The experts interviewed for the article are far more concerned about the long-term effects of President Donald Trump allowing the Central Intelligence Agency to launch cyber attacks when and how it wishes.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Senate Democratic Stakeholder Floats Privacy Discussion Draft

The top Democrat on one committee has released a bill that would scrap the notice and consent model and strictly limit what information can be collected, processed, and shared.

On 18 June, Senate Banking, Housing, and Urban Affairs Ranking Member Sherrod Brown (D-OH) released a discussion draft of a federal privacy bill that “rejects the current, ineffective “consent” model for privacy, and instead places strict limits on the collection, use, and sharing of Americans’ personal data.” The “Data Accountability and Transparency Act of 2020” may shift the debate on privacy legislation, as other recent bills and developments have moved the window of what stakeholders believe possible regarding the sufficiency of the notice and consent model. Like a few other bills, Brown’s legislation would establish a new agency to regulate privacy at the federal level, rejecting the idea of expanding the Federal Trade Commission’s jurisdiction. The package also addresses an issue that has grown in visibility over the last month or so: facial recognition technology. Most privacy bills have not sought to fold this technology into their regulatory frameworks. However, at present, election year politics, compounded by the ongoing pandemic and protests in the United States, may further diminish the already flagging chances that federal privacy legislation is enacted this year.

In his press release, Brown claimed his bill “creates a new framework that would give Americans the power to hold corporations, big tech, and the government responsible for how they collect and protect personal data.” He claimed “[t]he bill rejects the current, ineffective “consent” model for privacy, and instead places strict limits on the collection, use, and sharing of Americans’ personal data…[and] contains strong civil rights protections to ensure personal information is not used for discriminatory purposes, as well as a ban on the use of facial recognition technology.” Brown added that the “Data Accountability and Transparency Act of 2020” “also establishes a new independent agency dedicated to protecting Americans’ privacy rights.”

Brown stated that “[s]pecifically, the Data Accountability and Transparency Act of 2020 would:

  • Ban the collection, use or sharing of personal data unless specifically allowed by law
  • Ban the use of facial recognition technology
  • Prohibit the use of personal data to discriminate in housing, employment, credit, insurance, and public accommodations
  • Require anyone using decision-making algorithms to provide new accountability reports
  • Create a new, independent agency that is dedicated to protecting individuals’ privacy and the implementation of DATA 2020. The new agency will have rulemaking, supervisory, and enforcement authority, the ability to issue civil penalties for violations of the Act, and a dedicated Office of Civil Rights to protect individuals from discrimination
  • Empower individuals and state attorneys general to enforce privacy protections, without preempting more protective state laws
  • Require CEO certification of compliance with the Act, backed by potential criminal and civil penalties for the CEO and Board of Directors

Brown had begun working with the chair of the Senate Banking, Housing, and Urban Affairs Committee on possible bipartisan privacy legislation, likely within the jurisdiction of their committee. In February 2019, Brown and Chair Mike Crapo (R-ID) requested “feedback from interested stakeholders on the collection, use and protection of sensitive information by financial regulators and private companies.” Crapo and Brown stated:

The collection, use and protection of personally identifiable information and other sensitive information by financial regulators and private financial companies (including third-parties that share information with financial regulators and private financial companies) is something that deserves close scrutiny.  Americans are rightly concerned about how their data is collected and used, and how such data is secured and protected.  The collection and use of personally identifiable information will be a major focus of the Banking Committee moving forward. 

However, the quotes from Crapo and Brown in the joint press release suggested they may not have been entirely aligned on the scope of potential privacy legislation. Crapo asserted “it is worth examining how the Fair Credit Reporting Act should work in a digital economy, and whether certain data brokers and other firms serve a function similar to the original consumer reporting agencies.” However, Brown remarked that “[i]n the year and a half since the Equifax breach, the country has learned that financial and technology companies are collecting huge stockpiles of sensitive personal data, but fail over and over to protect Americans’ privacy.” Brown added that “Congress should make it easy for consumers to find out who is collecting personal information about them, and give consumers power over how that data is used, stored and distributed.”

Crapo provided further insight into his preferred model by which the federal government would regulate privacy at an October 2019 hearing titled “Data Ownership: Exploring Implications for Data Privacy Rights and Data Valuation.” Crapo noted that “[t]his Committee has held a series of data privacy hearings exploring possible frameworks for facilitating privacy rights to consumers….[and] [n]early all have included references to data as a new currency or commodity.” He stated that “[t]he next question, then, is who owns it?” Crapo stated that “[t]here has been much debate about the concept of data ownership, the monetary value of personal information and its potential role in data privacy.” He asserted that “[s]ome have argued that privacy and control over information could benefit from applying an explicit property right to personal data, similar to owning a home or protecting intellectual property…[and yet] [o]thers contend the very nature of data is different from that of other tangible assets or goods.”

Crapo stated that “[s]till, it is difficult to ignore the concept of data ownership that appears in existing data privacy frameworks.” He said that “[f]or example, the European Union’s General Data Protection Regulation, or GDPR, grants an individual the right to request and access personally identifiable information that has been collected about them.” Crapo contended that “[t]here is an inherent element of ownership in each of these rights, and it is necessary to address some of the difficulties of ownership when certain rights are exercised, such as whether information could pertain to more than one individual, or if individual ownership applies in the concept of derived data.” He stated that “[a]ssociated with concepts about data ownership or control is the value of personal data being used in the marketplace, and the opportunities for individuals to benefit from its use.”

Crapo asserted that “Senators [John] Kennedy (R-LA) and [Mark] Warner (D-VA) have both led on these issues, with Senator Kennedy introducing legislation that would grant an explicit property right over personal data (i.e. the “Own Your Own Data Act” (S. 806)), and Senator Warner introducing legislation that would give consumers more information about the value of their personal data and how it is being used in the economy (i.e. the “Designing Accounting Safeguards To Help Broaden Oversight and Regulations on Data” (S. 1951)).” Crapo contended that “[a]s the Banking Committee continues exploring ways to give individuals real control over their data, it is important to learn more about what relationship exists between true data ownership and individuals’ degree of control over their personal information; how a property right would work for different types of personal information; how data ownership interacts with existing privacy laws, including the Gramm-Leach-Bliley Act, the Fair Credit Reporting Act and GDPR; and different ways that companies use personal data, how personal data could be reliably valued and what that means for privacy.” (See here for more analysis of both bills.)

Exposure Notification Privacy Act Introduced

A third COVID-19 privacy bill is unveiled in the Senate that may be more about messaging and positioning on broader privacy legislation. In any event, the odds of such legislation being enacted in the near term are not high.

This week, a third COVID-19 privacy bill was released that occupies a middle ground between the other two bills. However, despite being bipartisan and between the two other bills, it is still not likely Congress will enact either targeted privacy legislation or broader, national privacy legislation this year. And yet, a number of the bill’s requirements track more closely with the Democratic bill released last month, suggesting the ground may be shifting under some of the outstanding issues. For example, the bill would not preempt state laws, and while it would not create a new federal cause of action under which a person could sue a company for violations, it expressly preserves all existing state and federal avenues a person could use to litigate.

On 3 June, Senate Commerce, Science and Transportation Committee Ranking Member Maria Cantwell (D-WA) and Senator Bill Cassidy (R-LA) introduced the “Exposure Notification Privacy Act” (S.3861) with Senator Amy Klobuchar (D-MN) cosponsoring. The Senators released a section-by-section and a summary of the bill, too. This bill follows the “Public Health Emergency Privacy Act” (S.3749) and the “COVID-19 Consumer Data Protection Act” (S.3663), bills that take approaches aligned with Democratic and Republican thinking on privacy, respectively. (See here for more analysis.)

The key term in the Exposure Notification Privacy Act is “automated exposure notification service” (AENS), for it determines what constitutes “covered data,” and hence what the bill protects, and it seems fairly targeted to reach only those apps or services created to track contacts for purposes of reducing the spread of COVID-19. This term is defined as:

  • a website, online service, online application, mobile application, or mobile operating system
  • offered in interstate commerce in the United States
  • designed, in part or in full, specifically to be used for, or marketed for, the purpose of digitally notifying, in an automated manner, an individual who may have become exposed to an infectious disease

And yet, because covered data is limited to information “collected, processed, or transferred in connection with an AENS,” it is a reasonable reading of this language that an entity obtaining information from a data broker in order to track COVID-19 would be outside the definition. The same would seem to be true of social media platforms that collect and process data from their users incidentally to their main business of monetizing these data. This seems like a fairly large loophole, meaning the “Exposure Notification Privacy Act” would really focus tightly on the apps, programs, and platforms used to track and prevent infectious diseases with the voluntary, knowing consent of users.

An AENS would need to obtain express, affirmative consent, provided after a person receives conspicuous, easy-to-understand notice about data collection, use, processing, and transfer. There must also be a conspicuous means of withdrawing such consent. In any event, a person with an “authorized diagnosis” would control whether this information is processed by the AENS.

AENS and platform operators must publish “a privacy policy that provides a detailed and accurate representation of that person or entity’s covered data collection, processing, and transfer activities in connection with such person or entity’s AENS or the facilitation of such service.” These privacy policies must divulge “each category of covered data the person or entity collects and the limited allowable processing purposes for which such covered data is collected” and

  • “a description of the person or entity’s covered data minimization and retention policies;
  • how an individual can exercise the individual rights described in this title;
  • a description of the person or entity’s covered data security policies.”

As an aside, platform operators are entities “other than a service provider who provides an operating system that includes features supportive of an AENS and facilitates the use or distribution of such AENS to the extent the technology is not used by the platform operator as an AENS.” And so, platform operators might be Google, Apple, Microsoft, or a handful of others to the extent their operating systems support an AENS in its purpose of tracking infectious diseases. Hence, some of the bill’s requirements will be imposed on such entities.

Of course, the bill text does not limit this measure just to COVID-19; it extends to all infectious diseases, which is perhaps a nod to a new normal in which many Americans have apps on their phones or wearables on their bodies designed to help them avoid contracting the flu or other, less dangerous viruses. (See below in Further Reading for an article on FitBit and other apps and platforms that may be poised to do just this, and on a wearable Singapore may debut shortly.)

There are restrictions on who may receive covered data from an AENS. An AENS may only transfer these data to alert individuals who have opted in, or a public health authority, of possible exposure; to service providers to maintain, fix, or improve the system or for security purposes; or to comply in a legal action. The bill also seeks to assuage fears that the sensitive information of people collected for the purposes of combatting infectious diseases could be transferred to and used by law enforcement and surveillance agencies. The legislation explains “[i]t shall be unlawful for any person, entity, or Executive agency to transfer covered data to any Executive agency unless the information is transferred in connection with an investigation or enforcement proceeding under this Act.” Consequently, while it would appear the Centers for Disease Control and Prevention (CDC) could transfer covered data to the FTC for an investigation, it could not do the same with the Federal Bureau of Investigation (FBI). In this vein, Executive agencies can only process or transfer covered data for a health purpose related to infectious diseases or in connection with an FTC or state investigation or enforcement action. However, this limitation does not seem to bar a state public health authority from conducting such a transfer to a state law enforcement agency.

There are data minimization responsibilities AENS would need to meet. AENS may not “collect or process any covered data…beyond the minimum amount necessary to implement an AENS for public health purposes; or…for any commercial purpose.” This would seem to limit AENS to collecting, processing and sharing personal information strictly necessary for the purpose of tracking infectious diseases. Likewise, AENS must delete a person’s covered data upon request and on a rolling basis per public health authority guidance. Service providers working with AENS must comply with the latter’s direction to delete covered data.

AENS must “establish, implement, and maintain data security practices to protect the confidentiality, integrity, availability, and accessibility of covered data…[that] be consistent with standards generally accepted by experts in the information security field.” The bill further specifies that such practices must include identifying and assessing risks, corrective and preventive actions for risks, and notification if an AENS is breached. The bill would also ban discrimination on the basis of covered data collected or processed by an AENS or on the basis of a person’s decision not to use an AENS.

As a means of providing oversight, the Privacy and Civil Liberties Oversight Board (PCLOB) would have its mandate enlarged to include “health-related epidemics,” meaning the Board could investigate and issue reports on how well or poorly the act is being implemented with respect to privacy and civil liberties.  To this end, within one year of enactment, PCLOB “shall issue a report, which shall be publicly available to the greatest extent possible, assessing the impact on privacy and civil liberties of Government activities in response to the public health emergency related to the Coronavirus 2019 (COVID–19), and making recommendations for how the Government should mitigate the threats posed by such emergency.”

AENS must also collaborate with public health authorities, which are federal and state agencies charged with protecting and ensuring public health. AENS could only collect, process, and transfer actual diagnoses of an infectious disease and could not do so with potential or presumptive diagnoses. AENS would be charged with issuing public guidance to help people understand the notifications of the system and any limitations with respect to accuracy and reliability. Moreover, AENS must also publish metrics (i.e. “measures of the effectiveness of the service”), including adoption rates. Presumably these latter two requirements would allow for greater transparency and also greater insight into how widely an app or platform is being adopted.

There are a few unexpected wrinkles, however. For example, the act only bars deceptive acts, and not unfair ones, which is a deviation from Section 5 of the Federal Trade Commission (FTC) Act, necessitating language in the bill to this effect rather than the usual reference to 15 USC 45. The bill also places a positive duty on service providers to report violations of the act by either AENS or public health authorities to these entities. It is possible that if such a report accurately depicted a violation the AENS or public health authority then neglected to remedy, the enforcers of the act would have an easier case to make that a violation occurred.

As mentioned, the FTC would police and enforce the act with an enlarged jurisdiction to include common carriers and non-profits. The agency would treat violations as if they were violations of an FTC regulation barring unfair or deceptive practices, which allows the agency to seek civil fines for first offenses. The FTC would not, however, receive rulemaking authority, and should regulations be needed, the agency would be forced to use the cumbersome Magnuson-Moss process.

However, and like the “Public Health Emergency Privacy Act,” the FTC would receive explicit authority to go to court itself instead of having to work through the Department of Justice (DOJ), which is currently the case. That this new wrinkle has appeared in two recent bills largely sponsored by Democrats suggests this may be a new demand for targeted and national privacy legislation and also may reflect diminished faith in the DOJ to vigorously enforce privacy legislation.

State attorneys general could enforce the act in the same ways as the FTC, meaning civil penalties would be possible in the first instance. State attorneys general may also bring concurrent state claims, alleging violations under state laws. And so, the bill does not preempt state laws, as a section of the bill goes to some length to stress.

Interestingly, while the bill does not create a private right of action, it suggests a possible way of resolving that sticking point in negotiations between Republicans and Democrats. The bill stresses that it does not foreclose any existing federal and state common law rights of action and would therefore allow people to use any existing law to sue covered entities. This would allow tort suits and other suits to move forward. That Cassidy has cosponsored legislation with this language does not necessarily indicate this is now the will of the Senate Republican Conference.

House Action On FISA Fizzles; A Conference Committee Is Requested

Despite House Democratic leadership’s plans to pass the Foreign Intelligence Surveillance Act (FISA) reauthorization the Senate sent back to the House earlier this month, plans for a vote last week were scrapped when the coalition that made possible passage of substantially the same bill in March fell apart. Instead, the House voted for a motion to disagree with the Senate’s amendments, to request a conference, and to appoint conferees. It remains to be seen whether the Senate opts to go to conference with the House, but a statement from a spokesperson for the Senate Majority Leader suggested he would support doing so. In the meantime, intelligence and law enforcement agencies cannot use the authorities the bill would renew and reform, for they expired on 15 March, except for investigations that started before that date.

At the week’s beginning, it appeared the House would bring the amended “USA FREEDOM Reauthorization Act of 2020” (H.R. 6172) to the floor and possibly take a run at adding language, which barely failed during debate in the Senate, that would further pare back the ability of federal law enforcement agencies to use the FISA process for surveillance. However, the Trump Administration more forcefully stated its objections to the amended bill, including a veto threat issued via Twitter, which caused Republican support for the bill to cave, and with it the chances of passage, for Republican votes were needed to pass the bill in the first place. Consequently, House Democratic Leadership explored the possibility of a clean vote on the Senate-amended bill, with the House Rules Committee reporting a rule for debate, but this effort was also scuttled because there were not the votes to pass the bill and send it to the White House. Instead, House Democratic Leadership opted to go to conference committee, a motion that succeeded in a 284-122 proxy vote, one of the first taken under the new procedure. Thereafter, the House named the following conferees: House Judiciary Committee Chair Jerrold Nadler (D-NY) and Ranking Member Jim Jordan (R-OH); House Intelligence Committee Chair Adam Schiff (D-CA) and Ranking Member Devin Nunes (R-CA); and Representative Zoe Lofgren (D-CA).

House Democratic plans on the FISA reauthorization went from amending the Senate-passed bill, to passing it unchanged, to requesting a conference after the Democratic-Republican coalition that got the bill out of the House in March crumbled.

As noted, the Trump Administration’s opposition stiffened this week, with the President getting on the field via Twitter, the Department of Justice (DOJ) publicly stating its opposition, and House Republican leadership urging its Members to vote no on H.R.6172. Moreover, progressive Democrats and allied advocacy groups were pushing House Democratic Leadership to adopt provisions blocking the collection and surveillance of web browsing and search engine history under Section 215. Also, some House Democrats had announced their intention to vote against H.R. 6172 regardless of whether the Section 215 narrowing was added, and so it was not clear the Speaker had the votes to pass a bill the President had vowed to veto anyway.

On 26 May, President Donald Trump tweeted “I hope all Republican House Members vote NO on FISA until such time as our Country is able to determine how and why the greatest political, criminal, and subversive scandal in USA history took place!” On 27 May, Trump tweeted

If the FISA Bill is passed tonight on the House floor, I will quickly VETO it. Our Country has just suffered through the greatest political crime in its history. The massive abuse of FISA was a big part of it!

Also on 27 May, Assistant Attorney General Stephen Boyd released the following statement on H.R.6172:

The Department worked closely with House leaders on both sides of the aisle to draft legislation to reauthorize three national security authorities in the U.S.A. Freedom Act while also imposing reforms to other aspects of FISA designed to address issues identified by the DOJ Inspector General. Although that legislation was approved with a large, bipartisan House majority, the Senate thereafter made significant changes that the Department opposed because they would unacceptably impair our ability to pursue terrorists and spies. We have proposed specific fixes to the most significant problems created by the changes the Senate made. Instead of addressing those issues, the House is now poised to further amend the legislation in a manner that will weaken national security tools while doing nothing to address the abuses identified by the DOJ Inspector General.

Accordingly, the Department opposes the Senate-passed bill in its current form and also opposes the Lofgren amendment in the House. Given the cumulative negative effect of these legislative changes on the Department’s ability to identify and track terrorists and spies, the Department must oppose the legislation now under consideration in the House. If passed, the Attorney General would recommend that the President veto the legislation.

And yet this week, the head of the DOJ’s National Security Division John Demers said there is no pressing need for reauthorization at this time. He remarked in an interview:

We’re going to have to look at where we can fill in the gaps using criminal tools. They’re not perfect. Foreign partners are not crazy when we use their information as the basis of criminal tools, because we don’t have the same protections that we do to protect underlying information as we do on the national security side. We are going to do the best we can to fill those holes and keep those investigations going.

Two weeks ago, following Senate amendment and passage of H.R.6172, a DOJ spokesperson said of the bill, it “would unacceptably degrade our ability to conduct surveillance of terrorists, spies and other national security threats.”

Early in the week, Representatives Zoe Lofgren (D-CA) and Warren Davidson (R-OH) submitted an amendment along the lines of the language offered by Senators Ron Wyden (D-OR) and Steve Daines (R-MT), which the Senate rejected by one vote, to bar the collection of web browsing and internet search history via a FISA order under Section 215. Lofgren and Davidson had negotiated with other House Democratic stakeholders on language acceptable to them.

Regarding their amendment, in their press release, Lofgren and Davidson claimed “[t]he amendment – which is supported by Reps. Adam Schiff, Chair of the House Permanent Select Committee on Intelligence, and Jerrold Nadler, Chair of the House Judiciary Committee – is an outright prohibition: the government will not be able to use Section 215 to collect the websites that a U.S. person visits, the videos that a U.S. person watches, or the search queries that a U.S. person makes…[and] [s]pecifically:

  • If the government is not sure if you’re a U.S. person, but you could be, the government cannot get your internet activity without a Title I FISA warrant.
  • If the government wants to order a service provider to produce a list of everyone who has visited a particular website, watched a particular video, or made a particular search query: the government cannot make that order unless it can guarantee that no U.S. persons’ IP addresses, device identifiers, or other identifiers will be disclosed to the government.
    • This amendment does not allow for the incidental collection of U.S. persons’ web browsing or search information when the target is a specific-selection term that would or could produce such information.
  • This prohibition is a strict liability-type provision. (It isn’t a knowledge standard or a reasonable-belief standard. An order must not result in the production of a U.S. person’s web browsing or search information.)
  • If the order would or could result in the production of a U.S. person’s web browsing or search information, the government cannot order it without a Title I FISA warrant that must be narrowly tailored toward the subject of the warrant.

It appeared this amendment would be made in order during debate, but opposition from both the left and right in the House and among stakeholders made this untenable. Opposition on the left arose because the Lofgren/Davidson amendment was narrower: it would only provide this protection to people in the United States, whereas the Wyden/Daines amendment would have outright barred the practice under FISA. Early on 27 May, Wyden supported this language, but when House Intelligence Committee Chair Adam Schiff (D-CA) suggested that intelligence agencies could continue to collect web browsing and search histories of Americans, Wyden withdrew his support. Thereafter, House Democratic Leadership ultimately decided against allowing this amendment to have a vote.

In December, Lofgren and Davidson were among the Members who introduced the “Safeguarding Americans’ Private Records Act of 2020” (H.R.5675/S.3242) in both chambers. In their press release, the sponsors claimed “[t]he bill includes a host of reforms:

  • It would permanently end the flawed phone surveillance program, which secretly scooped up Americans’ telephone records for years.
  • It would close loopholes and prohibit secret interpretation of the law, like those that led to unconstitutional warrantless surveillance programs.
  • It would prohibit warrantless collection of geolocation information by intelligence agencies.
  • It would respond to issues raised by the Inspector General’s office by ensuring independent attorneys, known as amici, have access to all documents, records and proceedings of Foreign Intelligence Surveillance Court, to provide more oversight and transparency.

Notably, beyond revoking the authority for the NSA to restart the telephone collection program, the bill would also exclude from the definition of “tangible thing” in the Section 215 business records exception: cell site location information, global positioning system information, internet website browsing information, and internet search history information. The bill also contains language that would limit the use of Section 215 to only counterterrorism and foreign intelligence matters and limit the retention of any such material to three years unless it includes foreign intelligence. Moreover, the bill would increase the justification requirements the government must meet before a nondisclosure requirement (aka gag order) can be placed on a company subject to a Section 215 order.

Two weeks ago, the Senate amended and passed H.R. 6172 by an 80-16 vote. Consideration of the bill was stalled in March when some Senators pushed for amendments, a demand to which the Senate Majority Leader finally agreed, provided these amendments would need 60 votes to be adopted. Consequently, once COVID-19 legislation had been considered, the Senate returned to H.R.6172, and debated and voted upon three amendments, one of which was agreed to. The amendment from Senators Pat Leahy (D-VT) and Mike Lee (R-UT) to expand the amicus process during the FISA process prevailed by a 77-19 vote. In an op-ed in The Washington Post, Leahy and Lee argued:

  • The key to our proposal is to substantially strengthen a program that currently allows FISA judges, in very limited circumstances, to appoint outside legal scholars — called “amici”— to independently analyze FBI surveillance requests that are particularly sensitive. Out of thousands of cases, FISA judges have called for such an independent review by a court-appointed “amicus” only 16 times. Yet this protection is critical because, unlike every courtroom you may have stepped into or any court in a TV drama, the FISA court is not adversarial — meaning there is only a government lawyer and a judge, but no one to advocate for Americans under surveillance.
  • We propose measures that would authorize and actively encourage judges in this secret court to seek independent amicus reviews in all sensitive cases — such as those involving significant First Amendment issues — thereby adding a layer of protection for those who will likely never know they have been targeted for secret surveillance.

As mentioned, Wyden and Daines offered an amendment to narrow the Section 215 exception to the Fourth Amendment’s requirement that a search requires a warrant. Section 215 currently allows for FISA court approved searches of business records and all tangible things in the course of a national security investigation, and the underlying text of H.R. 6172 would exclude cell site location and GPS location from Section 215. The Wyden/Daines amendment would also exclude web browsing and search engine histories.

As Wyden explained during debate,

With web browsing and searches, you are talking about some of the most intimate, some of the most personal, some of the most private details of the lives of Americans. Every thought that can come into people’s heads can be revealed in an internet search or in a visit to a website: their health histories, their medical fears, their political views, their romantic lives, their religious beliefs. Collecting this information is as close to reading minds as surveillance can get. It is the digital mining of the personal lives of the American people.

However, the amendment failed to reach the 60-vote threshold necessary for adoption under the rule of debate for H.R. 6172, failing by one vote as four Senators did not vote.

As for the underlying bill the Senate considered, in March the House passed H.R. 6172 by a 278-136 vote to reauthorize three expiring FISA provisions used by the National Security Agency (NSA) primarily to conduct surveillance: the business records exception, roving wiretaps, and the “lone wolf” provision. These authorities had been extended in December 2019 to March 15, 2020. However, the Senate did not act immediately on the bill and opted instead to send a 77-day extension of these now lapsed authorities to the House, which did not take up the bill. The Senate was at an impasse on how to proceed, for some Members did not favor the House reforms while others wanted to implement further changes to the FISA process. Consequently, Senate Majority Leader Mitch McConnell (R-KY) promised amendment votes when the Senate took up H.R.6172.

Moreover, H.R. 6172 ends the NSA’s ability to use the so-called call detail record (CDR) program that had allowed the agency to access data on many billions of calls. Nonetheless, the NSA shut down the program in 2018 due to what it termed technical problems. This closure of the program was included in the bill even though the Trump Administration had explicitly requested it also be reauthorized.

As mentioned, H.R. 6172 would reauthorize the business records exception, which includes “any tangible thing,” first instituted in FISA by the USA PATRIOT Act in 2001, but would reform certain aspects of the program. For example, if the Federal Bureau of Investigation (FBI) or NSA is seeking a business record under FISA for which a law enforcement agency would need to obtain a warrant, then the FBI or NSA will also need to obtain a warrant. Currently, this is not the case. Additionally, under H.R.6172, the FISA application process under Section 215 could not be used to obtain a person’s cell site location or GPS information. However, the FBI or NSA would still be able to use Title I of FISA to seek cell site location or GPS data for purposes of conducting electronic surveillance related to alleged foreign intelligence. The bill would also require prosecutors to inform defendants of evidence derived from electronic surveillance unless doing so would harm national security.

Moreover, records obtained under Section 215 could be retained no longer than five years, subject to a number of exceptions that may serve to make this limitation a dead letter. For example, if such records are deemed to have a “secret meaning” or are certified by the FBI as being vital to national security, then such records may be held longer than five years. Given the tendency of agencies to read their authority as broadly as possible and the past record of IC agencies, it is likely these exceptions will be stretched as far as legally possible. It bears note that all restrictions are prospective, meaning that current, ongoing uses of Section 215 would be exempted. The business records provision would be extended until December 1, 2023, as would the other two expiring authorities, which permit so-called roving wiretaps and allow for surveillance of so-called “lone wolves.”

For FISA applications under Title I (i.e. electronic surveillance), any agency seeking a FISA order to surveil will need to disclose to the FISA court any information that may call into question the accuracy of the application or any doubtful information. Moreover, certain FISA applications to surveil Americans or residents would need to spell out the proposed investigative techniques to the FISA court. In addition, any FISA application targeting U.S. officials or candidates for federal office must be approved by the Attorney General in writing before it can be submitted. H.R.6172 would permit the suspension or removal of any federal official, employee, or contractor for misconduct before the FISA court and would increase the maximum penalty for violating FISA from five to eight years. Most of these reforms seem aimed at those Members, many of whom are Republican, who were alarmed by the defects in the FISA surveillance of Trump Campaign associate Carter Page, as turned up by the Department of Justice’s Office of the Inspector General investigation. Some of these Members were opposed to the House Judiciary Committee’s initial bill, which they thought did not implement sufficient reforms to the larger FISA process.

Dueling COVID-19 Privacy Bills Released

Democratic stakeholders answer a Republican proposal on how to regulate privacy issues raised by COVID-19 contact tracing. The proposals have little chance of enactment and are mostly about positioning.  

Late last week, a group of Democratic stakeholders on privacy and data security issues released the “Public Health Emergency Privacy Act” (S.3749), a bill that serves as a counterpoint to the “COVID-19 Consumer Data Protection Act” (S.3663), legislation introduced a few weeks ago to propose a solution to the privacy issues raised by contact tracing of COVID-19. However, the Democratic bill contains a number of provisions that many Republicans consider non-starters, such as a private right of action for individuals and no preemption of state laws. It is not likely this bill will advance in the Senate even though it may possibly be moved in the House. S. 3749 was introduced by Senators Richard Blumenthal (D-CT) and Mark Warner (D-VA) and Representatives Anna Eshoo (D-CA), Jan Schakowsky (D-IL), and Suzan DelBene (D-WA).

In a way, the “Public Health Emergency Privacy Act” makes the case that the “Health Insurance Portability and Accountability Act” (HIPAA)/“Health Information Technology for Economic and Clinical Health Act” (HITECH Act) regulations are inadequate to protect the privacy and data of people in the United States because those regulations apply only to healthcare providers, their business associates, and other named entities in the healthcare system. Entities collecting healthcare information, even very sensitive information, are not subject to HIPAA/HITECH regulations and would likely be regulated, to the extent they are at all, by the Federal Trade Commission (FTC) or state agencies and attorneys general.

The “Public Health Emergency Privacy Act” would cover virtually all entities, including government agencies, except for public health authorities (e.g. a state department of health or the Centers for Disease Control and Prevention), healthcare providers, service providers, and people acting in a household capacity. This remarkable definition likely reflects plans announced by governments around the world to eschew Google and Apple’s contact tracing app and develop their own. It would also touch some efforts apart from contact tracing apps, and this is the first bill introduced recently that proposes to treat public and private entities the same way with respect to how and when they may collect personal data.

The types of data protected under the act are “emergency health data,” defined as “data linked or reasonably linkable to an individual or device, including data inferred or derived about the individual or device from other collected data provided such data is still linked or reasonably linkable to the individual or device, that concerns the public COVID–19 health emergency.” The bill then lists a sweeping and comprehensive set of examples of emergency health data. The term includes “information that reveals the past, present, or future physical or behavioral health or condition of, or provision of healthcare to, an individual” related to testing for COVID-19, along with related genetic and biometric information. Geolocation and proximity information would also be covered. Finally, emergency health data encompasses “any other data collected from a personal device,” which seems to cover all data collection from a person’s phone, the primary means of contact tracing proposed thus far. However, it would appear that data on people collected at the household or higher level may not be subject to this definition and hence much of the bill’s protections.

The authority provided to covered entities is limited. First, collection, use, and disclosure of emergency health data are restricted to good faith public health purposes. And yet, this term is not defined in the act, and the FTC may need to define it during the expedited rulemaking the bill requires. However, it seems a fairly safe assumption that the agency and courts would not construe using emergency health data for advertising as a good faith public health purpose. Next, covered entities must allow people to correct inaccurate information, and the entity itself has a duty to take reasonable efforts on its own to correct such information. Covered entities must implement reasonable safeguards to prevent discrimination on the basis of these data, a new obligation for a privacy or data security bill. This provision may have been included to bar government agencies from collecting and using covered data for discriminatory purposes. However, critics of more expansive bills may see this as the camel’s nose in the tent for future privacy and data security bills. Finally, the only government agencies that can legally be provided emergency health data are public health authorities, and then only “for good faith public health purposes and in direct response to exigent circumstances.” Again, the limits of good faith public health purposes remain unclear, and such disclosure can occur only in exigent circumstances, which likely means when a person has contracted or been exposed to COVID-19.

Covered entities and service providers must implement appropriate measures to protect the security and confidentiality of emergency health data, but the third member of the usual triumvirate, availability, is not identified as requiring protection.

The “Public Health Emergency Privacy Act” outright bars a number of uses for emergency health data:

  • Commercial advertising, recommendations for e-commerce, or machine learning for these purposes
  • Any discriminatory practices related to employment, finance, credit, insurance, housing, or education opportunities; and
  • Discriminating with respect to goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation

It bears note the covered data cannot be collected or disclosed for these purposes either.

The act contains very strong consent language. Covered entities may not collect, use, or disclose emergency health data unless a person has provided express affirmative consent, which requires knowing, informed choice that cannot be obtained through the use of deceptive practices nor inferred through a person’s inaction. However, there are significant exceptions to this consent requirement that would allow covered entities to collect, use, or disclose these data, including

  • protecting against malicious, deceptive, fraudulent, or illegal activity;
  • detecting, responding to, or preventing information security incidents or threats; or
  • complying when the covered organization is compelled to do so by a legal obligation.

But these purposes are valid only when necessary and solely for the fulfillment of the named purpose.

In a related vein, covered entities must allow people to revoke their consent, and revocation must take effect as soon as practicable, but no later than 15 days afterward. Moreover, the person’s emergency health data must be destroyed or rendered not linkable within 30 days of revocation of consent.

Covered entities must provide clear and conspicuous notice before or at the point of data collection that explains how and why the emergency health data is collected, used, and disclosed, as well as the categories of recipients to whom a covered entity discloses covered data. This notice must also disclose the covered entity’s data retention policies and data security policies and practices, but only with respect to emergency health data, and must inform consumers how they may exercise rights under the act and how to file a complaint with the FTC.

There would also be a public reporting requirement for those covered entities collecting, using, or disclosing the emergency health data of 100,000 or more people. Every three months, these entities would need to report aggregate figures on the number of people whose data was collected, used, and disclosed. These reports must also detail the categories of emergency health data collected, used, and disclosed and the categories of third parties with whom such data are shared.

The bill requires covered entities to destroy or render not linkable emergency health data within 60 days after the Secretary of Health and Human Services declares the end of the COVID-19 public health emergency (or a state does so) or 60 days after collection.
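To make the retention rule concrete, here is a minimal sketch of the deadline logic in Python, assuming the earliest applicable trigger controls (the bill’s text leaves the interaction of the three triggers somewhat open); the function and parameter names are mine, not the bill’s:

```python
from datetime import date, timedelta

RETENTION_WINDOW = timedelta(days=60)

def destruction_deadline(collected_on, federal_emergency_end=None, state_emergency_end=None):
    """Date by which emergency health data must be destroyed or rendered
    not linkable, assuming the earliest applicable trigger controls."""
    candidates = [collected_on + RETENTION_WINDOW]        # 60 days after collection
    if federal_emergency_end is not None:                 # HHS declares the emergency over
        candidates.append(federal_emergency_end + RETENTION_WINDOW)
    if state_emergency_end is not None:                   # a state declaration also starts a clock
        candidates.append(state_emergency_end + RETENTION_WINDOW)
    return min(candidates)

# Data collected 1 June 2020; HHS ends the emergency 1 August 2020.
print(destruction_deadline(date(2020, 6, 1), federal_emergency_end=date(2020, 8, 1)))
# -> 2020-07-31, because 60 days after collection arrives first
```

Under this reading, a covered entity could not extend retention simply by pointing to a later emergency declaration; the shortest clock would win.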

The FTC would be given the daunting task of beginning a rulemaking 7 days after enactment “to ensure a covered organization that has collected, used, or disclosed emergency health data before the date of enactment of this Act is in compliance with this Act, to the degree practicable” that must be completed within 45 days. There is also a provision requiring the Department of Health and Human Services (HHS) to issue guidance so that HIPAA/HITECH Act regulated entities do not need to meet duplicative requirements, and, moreover, the bill exempts these entities when operating under the HIPAA/HITECH Act regulations.

The FTC, state attorneys general, and individuals would be able to enforce this new act through a variety of means. The FTC would treat violations as if they were violations of a regulation barring a deceptive or unfair practice, meaning the agency could levy fines of more than $43,000 per violation in the first instance. The FTC could seek a range of other relief, including injunctions, restitution, disgorgement, and remediation. Moreover, the FTC could go to federal court without having to consult with the Department of Justice, a departure from current law and almost all the privacy bills introduced in this Congress. The FTC’s jurisdiction would be broadened to include common carriers and non-profits for purposes of this bill, too. The FTC would receive normal notice and comment rulemaking authority that is very broad, as the agency would be free to promulgate any regulations it sees as necessary to effectuate this act.

State attorneys general would be able to seek all the same relief the FTC can, so long as the FTC is not already bringing an action. Moreover, any other state officials empowered by state statute to bring similar actions may do so for violations of this act.

People would be allowed to sue in either federal or state court to vindicate violations. The bill states that any violation is to be considered an injury in fact, to forestall any court from finding that a violation does not injure the person, which would mean her suit could not proceed. The act would allow people to recover between $100 and $1,000 for any negligent violation and between $500 and $5,000 for any reckless, willful, or intentional violation, with no cap on total damages. People may also ask for and receive reasonable attorney’s fees and costs and any equitable or declaratory relief a court sees fit to grant. Moreover, the bill would disallow all pre-lawsuit arbitration agreements or waivers of rights. Much of the right to sue granted to people by the “Public Health Emergency Privacy Act” will be opposed by many Republicans and industry stakeholders.

The enforcement section raises a few questions given that entities covered by the bill include government agencies. Presumably, the FTC cannot enforce this act against government agencies for the very good reason that it does not have jurisdiction over them. However, does the private right of action waive the federal and state governments’ sovereign immunity? This is not clear and may need clarification if the bill is acted upon in either chamber of Congress.

This bill would not preempt state laws, which, if enacted, could subject covered entities to meeting more than one regime for collecting, using, and disclosing emergency health data.

Apart from contact tracing, the “Public Health Emergency Privacy Act” also bars the use of emergency health data to abridge a person’s right to vote, and it requires HHS, the United States Commission on Civil Rights, and the FTC to “prepare and submit to Congress reports that examines the civil rights impact of the collection, use, and disclosure of health information in response to the COVID–19 public health emergency.”

Given the continued impasse over privacy legislation, it is little wonder that the bill unveiled a few weeks ago by four key Senate Republicans takes a broadly similar approach while differing in key respects. Of course, there is no private right of action, and the Republican bill expressly preempts contrary state laws.

Generally speaking, the structure of the “COVID–19 Consumer Data Protection Act of 2020” (S.3663) tracks the bills that have been released thus far by its four sponsors: Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS) (See here for analysis of the “Consumer Data Privacy Act of 2019”) and Senators John Thune (R-SD), Jerry Moran (R-KS) (See here for analysis of “Consumer Data Privacy and Security Act of 2020” (S.3456)), and Marsha Blackburn (R-TN) (See here for analysis of the “Balancing the Rights Of Web Surfers Equally and Responsibly Act of 2019” (BROWSER Act) (S. 1116)). In short, people would be provided with notice about what information the app collects, how it is processed, and with whom and under what circumstances this information will be shared. Then a person would be free to make an informed choice about whether or not she wants to consent and allow the app or technology to operate on her smartphone. The FTC and state attorneys general would enforce the new protections.

The scope of the information and entities covered is narrower than the Democratic bill. “Covered data” is “precise geolocation data, proximity data, a persistent identifier, and personal health information” but not aggregated data, business contact information, de-identified data, employee screening data, and publicly available information. Those entities covered by the bill are those already subject to FTC jurisdiction along with common carriers and non-profits, but government agencies are not, again unlike the bill put forth by Democrats. Entities must abide by this bill to the extent they “collect[], process[], or transfer[] such covered data, or determine[] the means and purposes for the collection, processing, or transfer of covered data.”

Another key definition is how an “individual” is determined, for it excludes any person acting in her role as an employee and almost all work-related capacities, making clear employers will not need to comply with respect to those working for them.

“Personal health information” is “information relating to an individual that is

  • genetic information of the individual; or
  • information relating to the diagnosis or treatment of past, present, or future physical, mental health, or disability of the individual; and
  • identifies, or is reasonably linkable to, the individual.

And yet, this term excludes educational information subject to the “Family Educational Rights and Privacy Act of 1974” and information subject to regulations promulgated pursuant to the HIPAA/HITECH Acts.

The “COVID–19 Consumer Data Protection Act of 2020” bars the collection, processing, or transferring of covered data for a covered purpose unless prior notice is provided, a person has provided affirmative consent, and the covered entity has agreed not to collect, process, or transfer such covered data for any purpose other than those detailed in the notice. However, leaving aside the bill’s enumerated allowable purposes for which covered data may be collected with consent, the bill provides a number of exemptions from this general bar. For example, collection, processing, or transfers necessary for a covered entity to comply with another law are permissible, apparently in the absence of a person’s consent. Moreover, a covered entity need not obtain consent for operational or administrative tasks not disclosed in the notice provided to people.

The act does spell out “covered purposes” for which covered entities may collect, process, or transfer covered data with consent after notice has been given:

  • Collecting, processing, or transferring the covered data of an individual to track the spread, signs, or symptoms of COVID–19.
  • Collecting, processing, or transferring the covered data of an individual to measure compliance with social distancing guidelines or other requirements related to COVID–19 that are imposed on individuals under a Federal, State, or local government order.
  • Collecting, processing, or transferring the covered data of an individual to conduct contact tracing for COVID–19 cases.

Covered entities would be required to publish a privacy policy detailing for which of the above covered purposes a person’s covered data would be collected, processed, or transferred. This policy must also detail the categories of entities that receive covered data, as well as the covered entity’s data retention and data security policies.

There would be reporting requirements that would reach more covered entities than the Democratic bill’s. Any covered entity collecting, processing, or transferring covered data for one of the enumerated covered purposes must issue a public report 30 days after enactment and then every 60 days thereafter.

Among other provisions in the bill, people would be able to revoke consent, a request that covered entities must honor within 14 days. Covered entities must also ensure that covered data are accurate, but this requirement falls a bit short of granting people the right to correct inaccurate data; instead, they would merely be able to report inaccuracies, and there is no recourse if a covered entity chooses not to heed these reports. Covered entities would need to “delete or de-identify all covered data collected, processed, or transferred for a [covered purpose] when it is no longer being used for such purpose and is no longer necessary to comply with a Federal, State, or local legal obligation, or the establishment, exercise, or defense of a legal claim.” Even though there is a commandment to delete or de-identify, the timing seems somewhat open-ended, as some covered entities could point to legal obligations in order to keep the data on hand.

Covered entities must also minimize covered data by limiting collection, processing, and transferring to “what is reasonably necessary, proportionate, and limited to carry out [a covered purpose.]” The FTC must draft and release guidelines “recommending best practices for covered entities to minimize the collection, processing, and transfer of covered data.” However, these guidelines would most likely be advisory in nature and would not carry the force of law or regulation, leaving covered entities free to disregard parts of the document if they choose. Covered entities must “establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity of such data,” a mandate broader than the duty imposed by the Democratic bill.

The FTC and state attorneys general would enforce the new regime. The FTC would be able to seek civil fines for first violations in the same way as under the Democrats’ bill. However, unlike the other bill, the Republican bill would nullify any Federal Communications Commission (FCC) jurisdiction to the extent it conflicted with the FTC’s enforcement of the new act. Presumably, this would address jurisdictional issues raised by placing common carriers under FTC jurisdiction when they are usually overseen by the FCC. Even though state laws are preempted, state attorneys general would be able to bring actions to enforce this act at the state level. And, as noted earlier, there is no private right of action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Privacy Legislation in the Time of Pandemics

Now that Apple and Google have released their Exposure Notifications API and numerous nations around the world are adopting or adapting it to trace exposure to COVID-19, concerns and questions about privacy and data security have been raised about this new form of mass surveillance. Even before the development of this API, Members of Congress and civil liberties and privacy advocates were calling for limits on how and to what extent personal data may be used to fight the pandemic. The tension between the exigencies of the current emergency and privacy will likely spill over into the process to enact federal privacy legislation. For example, four Senate Republicans announced plans to introduce the “COVID-19 Consumer Data Protection Act,” and while the prospects for this particular bill do not look good at present, an exploration of other, more broadly gauged privacy bills may inform policy considerations on how personal data would be collected, processed, and disclosed during a public health emergency.
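For readers unfamiliar with how the API works, here is a deliberately simplified sketch of the decentralized pattern it follows; the real protocol uses AES- and HKDF-based key derivations, rotating Bluetooth identifiers, and encrypted metadata, so treat the function names and the HMAC construction below as illustrative assumptions rather than the actual specification:

```python
import hmac, hashlib, os

def daily_key() -> bytes:
    # Each phone generates a random daily key that never leaves the device
    # unless the user tests positive and consents to publishing it.
    return os.urandom(16)

def rolling_id(key: bytes, interval: int) -> bytes:
    # One short-lived identifier per ~10-minute interval; without the daily
    # key, an observer cannot link successive identifiers to one person.
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

def exposed(heard_ids: set, published_keys: list, intervals: range) -> bool:
    # After diagnosis keys are published, each phone re-derives rolling IDs
    # locally and checks them against what it overheard via Bluetooth.
    return any(rolling_id(k, i) in heard_ids
               for k in published_keys for i in intervals)

# Toy run: Alice's phone overhears one of Bob's rolling IDs; Bob later
# tests positive and publishes his daily key.
bob_key = daily_key()
alice_heard = {rolling_id(bob_key, 42)}
print(exposed(alice_heard, [bob_key], range(144)))  # True
```

The privacy-relevant point is architectural: matching happens on the device, so no central authority learns who was near whom unless a user chooses to share more.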

And, as privacy legislation continues to be an issue at the forefront of stakeholders’ minds – to the extent this and other non-COVID-19 issues have purchase during a pandemic – policymakers will likely scrutinize further the legitimate and non-legitimate uses of personal data in a public health emergency. However, it is likely that even if some of the strictest privacy bills pass Congress, regulated entities and government agencies would still possess tremendous latitude to access personal data in the event of public health emergencies. Almost all the comprehensive privacy bills introduced in Congress provide exceptions for the use, sharing, and disclosure of information that may otherwise be considered private, especially if there is imminent risk to life or health. Moreover, given that many experts are saying that de-identified or anonymized data are sufficient for tracking COVID-19, the provisions in those bills that usually carve out these types of data from the personal data subject to regulation are also of interest.

First, a threshold matter bears discussion. For purposes of this article, let’s assume that a pandemic involving a highly contagious respiratory disease with a death rate of 1-3% qualifies as the type of situation in which a person is at risk for purposes of the exception, found in almost all the bills, allowing collection and processing without a person’s explicit consent.

Turning to the bills that have been introduced to regulate privacy at the federal level, let’s look at one of the most restrictive. Senator Ed Markey’s (D-MA) “Privacy Bill of Rights” (S.1214) is one of the few bills on which the Electronic Privacy Information Center (EPIC) bestowed an A, and it is generally seen as far more favorable among privacy and civil liberties advocates than many of the bills introduced this Congress. However, even this bill contains a number of exceptions that would allow tech companies like Facebook to share a person’s location data, quite likely without her consent.

Under S.1214, covered entities must generally obtain the affirmative, express, knowing consent of consumers, through the provision of notice, before they can collect, use, retain, share, or sell personal information. And yet, the Privacy Bill of Rights provides that “a covered entity shall not be required to obtain opt-in approval…if the covered entity, in good faith, believes danger of death or serious physical injury to any individual requires use, access, or disclosure without delay of personal information relating to the emergency.” It would not be a hard case to make that a pandemic like the current one would allow a large collector and processor of personal data to share information with, say, the Centers for Disease Control and Prevention. The more interesting scenarios arise with public health emergencies like a bad year for the seasonal flu, which is not quite an epidemic but still has significant public health effects. For example, during the 2018-2019 flu season in the U.S., there were more than 34,000 deaths and nearly half a million hospitalizations. Using such authorities to fight the flu seems like a closer case and may not pass muster under this standard.

Another means by which data could be shared under S.1214 would be through the de-identification of data. The legislation defines de-identified data as “information that cannot reasonably identify, relate to, describe, or be capable of being associated with or linked to, directly or indirectly, a particular individual.” Any de-identified data is to be considered publicly available and not personal information and therefore largely exempted from regulation. Obviously, Markey intended that this exclusion would create the incentive to move more covered entities to de-identify the personal information they hold, collect, share, and process to protect against breaches but also future repurposing of the information. However, according to a number of experts, aggregated anonymized data (which is not exactly the same as de-identified) would be useful for public health officials in the fight to flatten the curve and control future outbreaks. Consequently, Google could de-identify data and then turn it over to the Department of Homeland Security which could then utilize it. In this vein, there have been articles in the media detailing the Trump Administration’s efforts to obtain aggregated, anonymous data in order to better understand and ideally prevent the transmission of the respiratory virus.
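To illustrate what aggregation of this kind can look like in practice, here is a minimal sketch in Python; the grid precision and the suppression threshold are my own illustrative choices, not anything S.1214 or public health agencies prescribe:

```python
def coarse_cell(lat: float, lon: float, precision: float = 0.01) -> tuple:
    # ~0.01 degrees of latitude is roughly 1.1 km, so one cell covers many
    # households rather than a single address.
    return (round(lat / precision) * precision,
            round(lon / precision) * precision)

def aggregate(pings, k_threshold: int = 10) -> dict:
    """pings: iterable of (user_id, lat, lon) tuples. Returns a mapping of
    grid cell -> count of distinct users, suppressing any cell with fewer
    than k_threshold users, since small counts invite re-identification."""
    users_per_cell = {}
    for user_id, lat, lon in pings:
        users_per_cell.setdefault(coarse_cell(lat, lon), set()).add(user_id)
    return {cell: len(users)
            for cell, users in users_per_cell.items()
            if len(users) >= k_threshold}

# Only cells with at least k_threshold distinct users survive, so the
# output describes crowds, not individuals.
```

Even aggregation like this is not a guarantee: sparse cells, repeated queries, or joins with outside datasets can sometimes re-identify people, which is one reason “de-identified” and “anonymized” are not interchangeable terms.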

Of course, a number of the bills bestow heightened protection for location data. For example, the Energy and Commerce Committee’s discussion draft released in mid-December provides a heightened level of protection for “sensitive information,” a subset of “covered information.” Among the data to be considered “sensitive information” are “precise geolocation information.” Assuming the term covers all location data that can be gleaned from a smartphone, the bill allows for the collection, processing, and sharing of sensitive information only after express, affirmative consent so long as clear and concise notice is given to the individual before consent is provided. Consequently, in order for location data to be processed, a covered entity could merely write into its privacy policy an exception for collecting and sharing sensitive information during public emergencies that a person would be free to assent to or reject. It is likely a significant number of people would accept such a term.

In any event, there is language in the bill under which covered entities may not even need to include such a term in their privacy notices. The discussion draft contains explicit exceptions to the general rule that covered entities may not process certain classes of sensitive information absent notice and express consent. Notably, a carveout is established for processing personal data for “preventing imminent danger to the personal safety of an individual or group of individuals.” Therefore, a covered entity could process the following types of information, most of which are defined as sensitive information:

  • precise geolocation information linkable to an identifiable individual or [consumer device;]
  • covered information to attribute a [consumer device or devices] to a specific individual using probabilistic methods, such as algorithms or usage patterns;
  • covered information obtained through a microphone or camera of a consumer device;
  • the contents of an individual’s communications or the parties to such communications; or
  • health information.

I would think there would be agreement that not all these types of personal data would be needed to fight a pandemic, even if they could legally be used, and that using them would likely provoke a backlash against government efforts to quell outbreaks of a disease.

Finally, a few closing thoughts. The Department of Health and Human Services’ (HHS) Office for Civil Rights (OCR) is availing itself of exceptions written into the HIPAA/HITECH regulations to allow limited sharing and disclosure of protected health information (PHI) with some federal and state health agencies to combat COVID-19. However, this pertains only to entities regulated under those regulations, mostly healthcare providers and their business associates. Nonetheless, it demonstrates precedent for writing exceptions into regulation and statute to address public emergencies, which is not terribly surprising, of course.

Moreover, almost all the bills provide exceptions to most of the requirements to respect and honor the privacy choices of people if it is necessary to obey a federal law, along with other similar situations. Therefore, Congress could always enact a federal privacy statute and later pass another bill requiring private sector entities to provide private data during public emergencies, thus broadening this exception. Covered entities would then need to turn over certain data or face legal liability.

Finally, as with the diminution of privacy and civil liberties for national security purposes after September 11, 2001, policymakers would be wise to consider whether such expansions of how people’s information is collected and used are, in a sense, a one-way ratchet. Governments rarely want to surrender the authority provided them in times of crisis, often on the rationale that the authority will be needed to act quickly against future, unforeseen crises. Consequently, the enactment of a privacy bill may be a Trojan Horse through which increased, legal surveillance occurs, in the name of public health and safety rather than national security.

What’s more, under some of the privacy bills, there would be no fast way to stop illegal collection and processing of personal data. It is not hard to envision a scenario in which the U.S. government and private sector entities agree that the exigencies of another public health crisis justify illegal collection and processing of personal data. Since many Republicans and other stakeholders oppose a private right of action, the only means of challenging such activity would be through the federal political system, which is not typically quick to address civil liberties violations once fear has taken root. Therefore, a private right of action or enforcement by state attorneys general may be the only feasible checks in such a situation, as a court could conceivably enjoin such activities.

Furthermore, some health and climate experts project that the ongoing warming of the planet and its other facets (e.g. vanishing habitats bringing some animals closer to humans, increasing the chances of zoonotic diseases like COVID-19 jumping from animals to humans) will make such outbreaks more common. Consequently, we may be facing a future of more frequent diseases that turn into epidemics and even pandemics if policymakers do not act quickly during the next outbreak. Privacy during a public health emergency may therefore become more than a once-in-100-years question.

Moreover, if privacy legislation is not enacted, private sector companies may see the use of big data by governments during the COVID-19 crisis as an implicit approval of their data processing practices, many of which are objectionable to experts across the political spectrum. Will successes in collecting and processing big data during the crisis let the air out of the movement to enact privacy legislation? Will it inure most people to the risks to and infringements of privacy? It may very well do so.

CCPA 2.0 Backers Submit Ballot Initiative for November Election

A new California ballot initiative has been submitted for approval that would revise the CCPA and impose new requirements starting in 2023, if enacted. Under California’s ballot initiative law, the resulting statute could not be amended in ways that weaken it.

The organization that forced action on the “California Consumer Privacy Act” (CCPA) (AB 375) by getting its proposed measure approved for California’s November 2018 ballot announced that it has sufficient signatures to get its preferred revision of the CCPA on the ballot for this fall’s election. If this effort succeeds and Californians vote for the measure, it would throw the state’s efforts to establish and enforce the CCPA into doubt, as the new regime would commence in 2023 and there would likely again be a rulemaking process to implement the new statute. It is possible that, should this initiative be placed on the November ballot, new life could be breathed into Congressional efforts to pass a national privacy and data protection bill.

The Californians for Consumer Privacy claimed in its press release that “it is submitting well over 900,000 signatures to qualify the “California Privacy Rights Act” (CPRA) for the November 2020 ballot.” The organization has been negotiating extensively with stakeholders on the CCPA’s follow-on bill and actually released a draft last fall. Nonetheless, even though some stakeholders were able to secure desired changes in the base text, others were not. This fact, along with the reality that it is next to impossible to weaken or dilute statutes added to the California Code through ballot initiative, suggests a serious campaign to defeat this measure.

In a summary, the Californians for Consumer Privacy claimed the CPRA would:

1) Make it almost impossible to weaken privacy in California in the future, absent a new initiative allowing such weakening. CPRA would give the California Legislature the power to amend the law via a simple majority, but any amendment would have to be “in furtherance of the purpose and intent” of CPRA, which is to enhance consumer privacy. This would protect privacy in California from a business onslaught to weaken it in Sacramento.

2) Establish a new category of sensitive personal information (SPI), and give consumers the power to restrict the use of it. SPI includes: SSN, DL, Passport, financial account info, precise geolocation, race, ethnicity, religion, union membership, personal communications, genetic data, biometric or health information, information about sex life or sexual orientation.

3) Allow consumers to prohibit businesses from tracking their precise geolocation for most purposes, including advertising, to a location within roughly 250 acres (see the sketch after this summary).

a. This would mean no more tracking consumers in rehab, a cancer clinic, at the gym (for how long) at a fast food restaurant (how often), sleeping in a separate part of the house from their partner (how recently), etc., all with the intention of monetizing that most intimate data that makes up people’s lives.

4) Add email + password to the list of items covered by the ‘negligent data breach’ section to help curb ID theft. Your sensitive information (i.e. your health or financial data) would now include your email and password; and if mishandled, you would be able to sue the business for damages, without having to prove an actual financial loss (and let’s face it—who can ever link the data breach from one company to the ID theft six months later. It’s impossible, and this would change that).

5) Establish the California Privacy Protection Agency to protect privacy for Californians, funded with $10M from the State’s General Fund

a. This funding would equate to roughly the same number of privacy enforcement staff as the FTC has to police the entire country (the FTC has 40 privacy professionals).
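Item 3’s 250-acre standard is easier to grasp with numbers: 250 acres is roughly one square kilometer. Below is a minimal Python sketch of one way a business might coarsen precise coordinates to that granularity; the CPRA does not prescribe any particular method, so everything here is an illustrative assumption:

```python
import math

CELL_AREA_KM2 = 1.01                     # 250 acres is roughly 1.01 square kilometers
CELL_SIDE_KM = math.sqrt(CELL_AREA_KM2)  # side of a square cell, ~1 km
KM_PER_DEG_LAT = 111.0                   # approximate; longitude shrinks with latitude

def coarsen(lat: float, lon: float) -> tuple:
    """Snap a precise coordinate onto a grid of ~250-acre cells."""
    dlat = CELL_SIDE_KM / KM_PER_DEG_LAT
    dlon = CELL_SIDE_KM / (KM_PER_DEG_LAT * math.cos(math.radians(lat)))
    return (round(lat / dlat) * dlat, round(lon / dlon) * dlon)

# Any two points inside the same ~1 km x 1 km cell coarsen to the same
# output, so a gym, a clinic, and a restaurant within it are indistinguishable.
print(coarsen(37.7749, -122.4194))
```

The practical effect, as the sponsors argue, is that an advertiser could still know a consumer is in a given neighborhood but not which building she visited.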

A predecessor bill, “The California Privacy Rights and Enforcement Act of 2020” (CPREA), was released last fall (see the 3 October 2019 Technology Update for a write-up). At the time, Californians for Consumer Privacy Chair and Founder Alistair Mactaggart explained his reasoning for a second ballot initiative: “First, some of the world’s largest companies have actively and explicitly prioritized weakening the CCPA…[and] [s]econd, technological tools have evolved in ways that exploit a consumer’s data with potentially dangerous consequences.”

As noted, changes to the California Code made by ballot initiative are much harder to modify than statutes enacted through the legislative route. Notably, the CPRA would limit future amendments to only those in furtherance of the act, which would rule out any attempts to weaken or dilute the new regime. Consequently, industry and allied stakeholders can be expected to fight this ballot initiative.

As mentioned, stakeholders in Congress may be motivated by this new effort to resolve differences and reach agreement on a bill to govern privacy and protect data at the federal level, sweeping aside state laws like the CPRA. However, a new, stronger law in California may cause key Democrats to dig in and insist on the policy changes Republicans have been reluctant to accept, such as a federal private right of action. In such a scenario, it is conceivable Democrats would use their leverage to extract even more concessions from Republicans. As it stands, Republicans have moved a fair distance from their original positions on privacy and data protection and may be willing to cede more policy ground.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Senate Commerce Republicans Vow To Introduce Privacy Bill To Govern COVID-19 Apps and Tech

Key Republican stakeholders on privacy legislation float a bill on COVID-19 relating to privacy that seems unlikely to garner the necessary Democratic buy-in to advance.  

Late last week, key Republicans on the Senate Commerce, Science, and Transportation Committee announced they would introduce the “COVID-19 Consumer Data Protection Act,” which would provide new privacy and data security protections for the use of COVID-19 contact tracing apps and similar technologies. To date, the text of the legislation has not been released, so any analysis of the bill is derived from a short summary issued by the committee and reports from media outlets that have apparently been provided a copy of the bill.

Based on this information, and to no great surprise, the basic structure of the bill tracks privacy and data protection legislation previously introduced by its co-sponsors: Chair Roger Wicker (R-MS) (See here for analysis of the “Consumer Data Privacy Act of 2019”) and Senators John Thune (R-SD), Jerry Moran (R-KS) (See here for analysis of “Consumer Data Privacy and Security Act of 2020” (S.3456)), and Marsha Blackburn (R-TN) (See here for analysis of the “Balancing the Rights Of Web Surfers Equally and Responsibly Act of 2019” (BROWSER Act) (S. 1116)). In short, people would be provided with notice about what information the app collects, how it is processed, and with whom and under what circumstances this information will be shared. Then a person would be free to make an informed choice about whether or not she wants to consent and allow the app or technology to operate on her smartphone. The Federal Trade Commission (FTC) and state attorneys general would enforce the new protections; as there was no mention of a private right of action, and given these Members’ opposition to such provisions, it is likely the bill does not provide such redress. Moreover, according to media reports, the bill would preempt state laws contrary to its provisions, which would be another likely non-starter among Democrats.

Wicker, Thune, Moran, and Blackburn claimed in their press release that their bill “would provide all Americans with more transparency, choice, and control over the collection and use of their personal health, geolocation, and proximity data…[and] would also hold businesses accountable to consumers if they use personal data to fight the COVID-19 pandemic.”

Wicker, Thune, Moran, and Blackburn provided this summary of the “COVID-19 Consumer Data Protection Act:”

  • Require companies under the jurisdiction of the Federal Trade Commission to obtain affirmative express consent from individuals to collect, process, or transfer their personal health, geolocation, or proximity information for the purposes of tracking the spread of COVID-19.
  • Direct companies to disclose to consumers at the point of collection how their data will be handled, to whom it will be transferred, and how long it will be retained.
  • Establish clear definitions about what constitutes aggregate and de-identified data to ensure companies adopt certain technical and legal safeguards to protect consumer data from being re-identified.
  • Require companies to allow individuals to opt out of the collection, processing, or transfer of their personal health, geolocation, or proximity information.
  • Direct companies to provide transparency reports to the public describing their data collection activities related to COVID-19.
  • Establish data minimization and data security requirements for any personally identifiable information collected by a covered entity.
  • Require companies to delete or de-identify all personally identifiable information when it is no longer being used for the COVID-19 public health emergency.
  • Authorize state attorneys general to enforce the Act.

If such legislation were to pass, it would add to the patchwork of privacy and data security laws already enacted that address certain sectors or populations (e.g. the “Health Insurance Portability and Accountability Act” (HIPAA) protects some healthcare information, and the “Children’s Online Privacy Protection Act” (COPPA) broadly protects children online).

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

The BROWSER Act (S. 1116)


My apologies. I thought I had posted this write-up and others on the various privacy and data protection bills. In any event, I’ll be doing some remedial work of a sort in putting these materials up, which is not to say I see any great movement on Congress passing a U.S. privacy and data protection bill.

In this post, we will examine one of the Senate bills, sponsored by Senators Marsha Blackburn (R-TN), Tammy Duckworth (D-IL), and Martha McSally (R-AZ): the “Balancing the Rights Of Web Surfers Equally and Responsibly Act of 2019” (BROWSER Act) (S. 1116). S. 1116 would set up an enhanced notice and consent regime for consumers, policed by the Federal Trade Commission (FTC), but only for certain classes of private sector entities collecting, sharing, selling, and using consumer information, mainly broadband providers and so-called “edge providers,” that is, entities like Google and Facebook that provide services online. This bill is much closer to the FTC’s current means of regulating privacy and data security, even though the scope of the agency’s jurisdiction to police privacy practices for some types of consumer information would be expanded.

As noted, this bill would cover only “broadband internet access service[s]” and “edge service[s],” which, as these terms are defined in the bill, would mostly be technology and communications companies. Therefore, this bill would sweep much more narrowly than many of the other privacy bills introduced thus far. S. 1116 defines “broadband internet access service” as “a mass-market retail service by wire or radio that provides the capability to transmit data to and receive data from all or substantially all internet endpoints, including any capabilities that are incidental to and enable the operation of the communications service, but excluding dial-up internet access service.” The bill also defines an “edge service” as “a service provided over the internet—

  • for which the provider requires the user to subscribe or establish an account in order to use the service;
  • that the user purchases from the provider of the service without a subscription or account;
  • by which a program searches for and identifies items in a database that correspond to keywords or characters specified by the user, used especially for finding particular sites on the world wide web; or
  • by which the user divulges sensitive user information; and

the term “includes a service described in subparagraph (A) that is provided through a software program, including a mobile application.”

Clearly, big technology companies like Facebook, Google, Instagram, and Amazon would be classified as “edge providers.” Moreover, the definition of broadband internet access service would clearly include internet service providers like Comcast and AT&T and would also seem to include cell phone service providers like Verizon and T-Mobile.

All covered service providers must “provide a user of the service with clear and conspicuous notice of the privacy policies of the provider with respect to the service.” Additionally, covered service providers must also give users “clear and conspicuous advance notice of any material change to the privacy policies of the provider with respect to the service.”

Whether consumers need to opt-in or opt-out on data use will turn on whether the information is “sensitive” or not. Under S. 1116, “sensitive user information” includes any of the following:

  • Financial information.
  • Health information.
  • Information pertaining to children under the age of 13.
  • Social Security number.
  • Precise geolocation information.
  • Content of communications.
  • Web browsing history, history of usage of a software program (including a mobile application), and the functional equivalents of either.

Among the information that would be deemed non-sensitive under the bill are metadata (aka call detail records) from the usage of a phone, such as the addressee and time of a communication; one’s order history from a site like Amazon; matters relating to employment; and other categories of information not enumerated above. Additionally, the bill deems “precise geolocation information” sensitive, suggesting geolocation information that is less than precise might be non-sensitive. So, perhaps a trip to a mall would not be considered “precise,” but the stores a customer visits might be?

Covered service providers would need to “obtain opt-in approval from a user to use, disclose, or permit access to the sensitive user information of the user.” However, what constitutes the “approval” necessary to satisfy this requirement is not spelled out in the bill. Conversely, for non-sensitive personal information, the provider of covered services must only offer consumers the option to opt out of its use, disclosure, and accessing. Again, “approval” is a key word, as the bill likewise does not define the opt-out approval covered service providers must honor.
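The resulting two-track consent regime is simple enough to express directly; in this minimal Python sketch, the category labels are my own shorthand for the bill’s enumerated classes of sensitive user information, not terms the bill itself uses:

```python
# Categories S. 1116 enumerates as "sensitive user information";
# anything outside this set defaults to opt-out treatment.
SENSITIVE_CATEGORIES = {
    "financial", "health", "children_under_13", "social_security_number",
    "precise_geolocation", "communications_content", "browsing_history",
}

def required_consent(category: str) -> str:
    """Opt-in approval for sensitive data; opt-out approval otherwise."""
    return "opt-in" if category in SENSITIVE_CATEGORIES else "opt-out"

assert required_consent("precise_geolocation") == "opt-in"
assert required_consent("order_history") == "opt-out"  # non-sensitive under the bill
```

The hard questions, as the surrounding analysis notes, are definitional: whether a given datum falls in a sensitive category, and what counts as “approval,” are left for the FTC and the courts.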

As is usually the case, there are some exceptions to this seemingly general rule against using, collecting, sharing, or selling sensitive user information. Notably, in the following situations, covered service providers need not obtain opt-in approval from consumers:

(1) In providing the covered service from which the information is derived, or in providing services necessary to, or used in, the provision of the service.

(2) To initiate, render, bill for, and collect for the covered service.

(3) To protect the rights or property of the provider, or to protect users of the covered service and other service providers from fraudulent, abusive, or unlawful use of the service.

(4) To provide location information or non-sensitive user information—

(A) to a public safety answering point, emergency medical service provider or emergency dispatch provider, public safety, fire service, or law enforcement official, or hospital emergency or trauma care facility, in order to respond to the request of the user for emergency services;

(B) to inform the legal guardian of the user, or members of the immediate family of the user, of the location of the user in an emergency situation that involves the risk of death or serious physical harm; or

(C) to providers of information or database management services solely for purposes of assisting in the delivery of emergency services in response to an emergency.

(5) As otherwise required or authorized by law.

Covered service providers would not be able to require consumers to waive their privacy rights in exchange for use of a service. The bill stipulates that “[a] provider of a covered service may not—

(1) condition, or effectively condition, provision of the service on agreement by a user to waive privacy rights guaranteed by law or regulation, including this Act; or

(2) terminate the service or otherwise refuse to provide the service as a direct or indirect consequence of the refusal of a user to waive any privacy rights described in paragraph (1).”

The FTC would enforce this new privacy scheme under its existing Section 5 powers to police unfair and deceptive practices and, crucially, not as if a violation of the act were a violation of an existing FTC regulation against unfair and deceptive practices. When the FTC punishes a violation of such a regulation, it may seek civil fines in the first instance. By contrast, the FTC’s general powers to punish unfair and deceptive practices with respect to data security and privacy violations are limited to monetary remedies in the form of equitable relief, such as disgorgement and restitution. The BROWSER Act is thus at odds with most other privacy bills, which contain language such as “[a] violation of this Act or a regulation promulgated under this Act shall be treated as a violation of a rule under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)) regarding unfair or deceptive acts or practices.”

Again unlike other bills, the BROWSER Act does not provide the FTC with the authority to promulgate regulations under the Administrative Procedure Act (APA) process, and to the extent the agency would be able to write regulations to implement the bill, it would be under the much more lengthy and involved Magnuson-Moss procedures that have effectively halted the FTC’s regulatory activity (see “It’s Time to Remove the ‘Mossified’ Procedures for FTC Rulemaking” for a summary of these procedures). Therefore, the FTC would essentially extend to privacy regulation its current practice of penalizing companies for not maintaining “reasonable” data security standards on a case-by-case basis, without providing any bright lines to assure companies which practices pass muster.

The FTC’s jurisdiction would be expanded, however, to police the privacy practices under the bill of broadband providers that would otherwise be subject to the jurisdiction and enforcement powers of the Federal Communications Commission (FCC).

The bill would preempt state privacy laws. To wit, “[n]o State or political subdivision of a State shall, with respect to a provider of a covered service subject to this Act, adopt, maintain, enforce, or impose or continue in effect any law, rule, regulation, duty, requirement, standard, or other provision having the force and effect of law relating to or with respect to the privacy of user information.” Of course, preemption of state laws is a non-starter for many Democrats but a sine qua non for many Republicans, leaving this as an area of ongoing dispute.

Regarding another issue that has split Democrats and Republicans in past data security legislation, the BROWSER Act would not provide a role for state attorneys general to enforce the new regulatory regime. However, Republicans may be willing to give on this issue provided consumers have no private right of action, and the BROWSER Act would not allow consumers to sue those providing covered services for violating the bill.

© Michael Kans and Michael Kans Blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans and Michael Kans Blog with appropriate and specific direction to the original content.

Technology Policy Update (10 April)


Here are the articles from this edition:

  • “Paper” Hearing on COVID-19 and Big Data
  • DOD Revises Cybersecurity Model For Contractors; Accreditation Body Holds Webinar
  • EC Calls For EU-Wide Approach on Big Data and COVID-19
  • EU’s Data Supervisor Calls For Limits On Using Data In Fighting COVID-19
  • EDPB Fast Tracks Privacy and Processing Guidance For COVID-19
  • Warner Asks OMB For Uniform Guidance On Contractors
  • OCR Announces HIPAA Enforcement Discretion
  • Executive Order Formalizes Review of Foreign Investment in Telecommunications
  • CISA Guides Agencies On Telework Best Practices and Security