Further Reading, Other Developments, and Coming Events (5 October)

Coming Events

  • On 6 October, the House Administration Committee’s Elections Subcommittee will hold a virtual hearing titled “Voting Rights and Election Administration: Combatting Misinformation in the 2020 Election.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7”:
    • October 7: Defending our Democracy
    • One can register for the event here.
  • On October 29, the Federal Trade Commission (FTC) will hold a workshop titled “Green Lights & Red Flags: FTC Rules of the Road for Business” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”

Other Developments

  • The House Intelligence Committee released an unclassified executive summary of “The China Deep Dive: A Report on the Intelligence Community’s Capabilities and Competencies with Respect to the People’s Republic of China.” In a press release, the committee found that “the United States’ (U.S.) Intelligence Community (IC) has not sufficiently adapted to a changing geopolitical and technological environment increasingly shaped by a rising China and the growing importance of interlocking non-military transnational threats, such as global health, economic security, and climate change.” The committee further claimed “[a]bsent a significant realignment of resources, the U.S. government and intelligence community will fail to achieve the outcomes required to enable continued U.S. competition with China on the global stage for decades to come, and to protect the U.S. health and security.”
    • The committee stated that while its “review was scoped to assess the IC’s efforts against the China target, some of its findings address not merely China, but also broader issues foundational to the IC’s structure and continued ability to operate in a 21st century environment—an environment shaped by the ravages of COVID-19.”
    • The committee made the following findings:
      • Intelligence Community [REDACTED] compete with China. Absent a significant realignment of resources, the U.S. government will fail to achieve the outcomes required to enable U.S. competition with China on the global stage.
      • The Intelligence Community places insufficient emphasis and focus on “soft,” often interconnected long-term national security threats, such as infectious diseases of pandemic potential and climate change, and such threats’ macroeconomic impacts on U.S. national security. This could jeopardize the future relevance of the Intelligence Community’s analysis to policymakers on certain long-range challenges, particularly given the growing importance of these policy challenges to decision-makers and the public and the devastating impact of the current pandemic on U.S. national life.
      • The Intelligence Community has failed to fully achieve the integration objectives outlined in the 2004 Intelligence Reform and Terrorism Prevention Act (IRTPA) for targets and topics unrelated to counterterrorism.
      • The Intelligence Community is struggling to adapt to the increasing availability and commodification of data, [REDACTED].
      • The increasing pace of global events, fueled by the rise of social media and mobile communications, will continue to stress the IC’s ability to provide timely and accurate analysis within customers’ decision-making window.
      • The future successful application of artificial intelligence, machine learning, and other advanced analytic techniques will be integral enablers for the U.S. national security enterprise. Conversely, there is a high degree of strategic risk associated with stasis and a failure to modernize.
      • Existing intelligence requirement prioritization mechanisms [REDACTED] particularly with respect to decision-makers outside of the Department of Defense.
    • The committee made the following recommendations broadly about the IC:
      • The Committee recommends the creation of a bipartisan, bicameral congressional study group to evaluate the current organization of and authorities provided to the intelligence community, with the express goal of making necessary reforms to the National Security Act of 1947 and the Intelligence Reform and Terrorism Prevention Act (IRTPA) of 2004.
      • The Executive Branch, in consultation with congressional intelligence and appropriations committees, must undertake a zero-based review of all intelligence program expenditures, assess the programs’ continued relevance to forward-looking mission sets, such as the increased relevance of “soft” transnational threats and continued competition with China, and take immediate corrective action to align taxpayer resources in support of strategic requirements.
      • An external entity should conduct a formal review of the governance of open-source intelligence (OSINT) within the intelligence community, and submit to congressional intelligence and appropriations committees a proposal to streamline and strengthen U.S. government capabilities.
      • The Office of the Director of National Intelligence (ODNI) should identify shared artificial intelligence and machine learning (AI/ML) use cases across the intelligence community and use its coordinating and budgetary authorities to consolidate spending, expertise, and data around shared community-wide AI/ML capabilities.
    • Specific to the People’s Republic of China, the committee stated:
      • ODNI should strengthen its ability to effectively track [REDACTED]
      • The IC should [REDACTED] existing intelligence collection prioritization frameworks, particularly to inform resource allocation decisions.
      • The IC should formalize and broaden programs designed to mentor the next generation of China analysts. Agencies should leverage best practices from across the community, and develop internal Senior Steering Groups to prioritize investments in specific China-focused programs.
      • The IC should conduct a review of security clearance adjudication policies surrounding [REDACTED]
      • If an officer possesses critical skills relevant to China mission-set, such as proficiency in Mandarin Chinese, the Intelligence Community should [REDACTED]
      • The IC should engage in a dialogue with the U.S. Department of Education on the requirements for the future of the U.S. national security workforce.
      • The Intelligence Community should codify and nurture cadres of officers with China-focused expertise [REDACTED]
      • The U.S. should expand its diplomatic, economic, and defense presence in the Indo-Pacific region, to include in the Pacific Island Countries and Southeast Asia.
      • The IC should consider developing a series of reskilling programs to leverage existing talent and expertise previously cultivated in counterterrorism programs.
      • The IC should streamline China-focused reporting across regional areas of responsibility.
      • The IC should leverage lessons learned from providing support to the counterterrorism mission in order to identify ways in which it can embed real-time support to customers, especially those located outside of the Department of Defense, such as the Department of State, the United States Trade Representative, or U.S. health and disaster preparedness agencies.
      • In recognition of the growing importance of economic and policy agencies to the overall success of the U.S. government’s approach to China, the intelligence community should develop plans to increase analytic support to, or otherwise ensure consistent, agile communications and appropriate interactions with, non-traditional agencies, such as the Department of Commerce, the Department of Homeland Security, the National Science Foundation, the Department of Education, and U.S. public health agencies.
  • The United States (U.S.), Australia, India, and Japan convened a virtual session of the Quadrilateral Security Dialogue (aka The Quad) late last month ahead of in-person talks in Tokyo set for tomorrow. The renewal of this diplomatic relationship is being portrayed by the People’s Republic of China (PRC) as “an anti-China frontline,” a “mini-NATO,” and a reflection of the U.S.’ “Cold War mentality,” according to the PRC’s Vice Foreign Minister. Nonetheless, the four nations issued a statement indicating the “four democracies discussed ways to work together to respond to the COVID-19 pandemic, promote transparency and counter disinformation, and protect the rules-based order the region has long enjoyed,” a statement that includes some pokes at the PRC. First, obviously the PRC is not a democracy and is in the process of cracking down on democracy in Hong Kong. Second, the PRC’s government is not renowned for its transparency and is coming to be one of the world’s foremost purveyors of disinformation online. Third, the U.S. has been arguing since the Obama Administration that the PRC is violating the rules and norms that have ensured prosperity and peace in the Pacific and Indian Oceans since World War II. Not surprisingly, the PRC sees this order as having been established by the U.S. and largely for its benefit.
    • The four nations added:
      • Noting the importance of digital connectivity and secure networks, the officials discussed ways to promote the use of trusted vendors, particularly for fifth generation (5G) networks. They explored ways to enhance coordination on counterterrorism, maritime security, cyber security, and regional connectivity, as well as quality infrastructure based upon international best practices, such as the G20 Principles for Quality Infrastructure Investment. Participants also highlighted the need to improve supply chains in sectors including critical minerals, medical supplies, and pharmaceuticals.
      • The officials reaffirmed their countries’ strong support for ASEAN centrality and ASEAN-led regional architecture. They explored ways to work together in the Mekong sub-region, in the South China Sea, and across the Indo-Pacific to support international law, pluralism, regional stability, and post-pandemic recovery efforts.
    • Again, many of these policy goals and problems are arising because of PRC actions, at least according to The Quad. The U.S. and its allies have been fighting the PRC’s 5G push and have accused the PRC of stepping up its cyber activities, including espionage.
    • Moreover, Japan created and advocated what eventually became the G20 Principles for Quality Infrastructure Investment as a policy counterpoint to the PRC’s Belt and Road Initiative, which has resulted in massive aid from and indebtedness to Beijing in the developing world.
    • The Quad’s work, alongside bilateral relationships in the region, could well coalesce into an informal alliance against the PRC, an outcome that would likely help Washington achieve some of its professed policy goals.
  • Representative Jennifer Wexton (D-VA) and Senator Mazie K. Hirono (D-HI) introduced the “COVID-19 Disinformation Research and Reporting Act” (H.R.8395/S.4732) that “would examine the role of disinformation and misinformation on the public response to COVID-19 and the role that social media has in promoting the spread of false information” per their press release. The bill would require the “National Academies of Sciences, Engineering, and Medicine to conduct a study on the current understanding of the spread of COVID–19-related disinformation and misinformation on the internet and social media platforms.”
    • Wexton and Hirono asserted:
      • Disinformation and misinformation can be particularly dangerous during public health emergencies like COVID-19. This kind of false information can erode trust in science, government officials, and medical and public health experts. Disinformation and misinformation can also make it harder to get accurate and important materials to vulnerable communities, particularly once a vaccine becomes available. The internet and social media have made it easier to spread fake medical information such as unproven treatments for COVID-19.
    • The National Academies of Sciences, Engineering, and Medicine would need to submit a report to Congress, including “potential strategies to mitigate the dissemination and negative impacts of COVID–19-related disinformation and misinformation (and specifically the dissemination of disinformation and misinformation on social media),” which would likely have utility in fighting other disinformation and misinformation spread online. In fact, the sponsors may be using the current pandemic as the rationale to pass a bill that may otherwise be opposed. It is not hard to imagine the opposition from many on the right if Wexton, Hirono, and their cosponsors had proposed legislation to study online extremism and hate in the United States, resulting in a report on how the U.S. might mitigate these phenomena, given the role extremists and white supremacists have played in the Republican Party under President Donald Trump.
    • The bill is being sponsored by other Democrats in each chamber but no Republicans.
  • Senate Majority Whip John Thune (R-SD) and 18 Republican colleagues sent President Donald Trump a letter “to express our concerns about a Request For Information (RFI) released by the Department of Defense (DOD) that contradicts the successful free-market strategy you have embraced for 5G.” Late last month, the DOD released an RFI on the possibility of the agency sharing its prized portions of electromagnetic spectrum with commercial providers to speed the development and adoption of 5G in the United States. The Senators argued:
    • Rather than rely on private industry and market forces to foster multiple, facilities-based 5G networks, the RFI seeks information on a government-managed process for 5G networks.
    • Nationalizing 5G and experimenting with untested models for 5G deployment is not the way the United States will win the 5G race.  While we recognize the need for secure communications networks for our military, we are concerned that such a proposal threatens our national security.  When bad actors only need to penetrate one network, they have a greater likelihood of disrupting the United States’ communications services.
  • The Department of Defense (DOD) implemented a new rule designed to drive better cybersecurity among United States (U.S.) defense contractors. This rule brings together two different lines of effort to require the Defense Industrial Base (DIB) to employ better cybersecurity, given the risks it faces by holding and using classified information, Federal Contract Information (FCI), and Controlled Unclassified Information (CUI). The Executive Branch has long wrestled with how best to push contractors to secure their systems, and Congress and the White House have opted to use federal contract requirements under which contractors must certify compliance. However, the most recent initiative, the Cybersecurity Maturity Model Certification (CMMC) Framework, will require contractors to be certified by third-party assessors. And yet, it is not clear the DOD has grappled with the often misaligned incentives present in third-party certification schemes.
  • Nonetheless, the DOD explained this is “an interim rule to amend the Defense Federal Acquisition Regulation Supplement (DFARS) to implement a DOD Assessment Methodology and CMMC framework in order to assess contractor implementation of cybersecurity requirements and enhance the protection of unclassified information within the DOD supply chain.”
  • The DOD added:
    • This rule amends DFARS subpart 204.73, Safeguarding Covered Defense Information and Cyber Incident Reporting, to implement the National Institute of Standards and Technology (NIST) Special Publication 800-171 DOD Assessment Methodology. The new coverage in the subpart directs contracting officers to verify in the Supplier Performance Risk System (SPRS) that an offeror has a current NIST SP 800-171 DOD Assessment on record, prior to contract award, if the offeror is required to implement NIST SP 800-171 pursuant to DFARS clause 252.204-7012. The contracting officer is also directed to include a new DFARS provision 252.204-7019, Notice of NIST SP 800-171 DOD Assessment Requirements, and a new DFARS clause 252.204-7020, NIST SP 800-171 DOD Assessment Requirements, in solicitations and contracts including solicitations using FAR part 12 procedures for the acquisition of commercial items, except for solicitations solely for the acquisition of COTS items.
    • This rule adds a new DFARS subpart, Subpart 204.75, CMMC, to specify the policy and procedures for awarding a contract, or exercising an option on a contract, that includes the requirement for a CMMC certification. Specifically, this subpart directs contracting officers to verify in SPRS that the apparently successful offeror’s or contractor’s CMMC certification is current and meets the required level prior to making the award.
  • The House Republicans’ China Task Force (CTF) released its final report with its recommendations on how the United States (U.S.) should change its policies to counter the People’s Republic of China (PRC), which includes a slew of technology-related recommendations.
    • The CTF asserted:
      • Since the establishment of diplomatic relations with the PRC more than 40 years ago, the United States has sought to draw the PRC into the community of nations as a responsible stakeholder. U.S. leaders pursued a strategy of engagement based on the assumption that expanding the bilateral economic relationship with the PRC would advance the U.S. national interest and lead the Chinese Communist Party (CCP) to change. This engagement strategy often turned a blind eye to the CCP’s human rights violations, economic malfeasance, expansionist aggression, and empty promises, as well as the CCP’s deep commitment to a hostile Communist ideology that drives this malign behavior. This strategy has, simply put, failed.
    • The CTF made these recommendations:
      • Supply Chain Security
        • Better securing our medical and national security supply chains by:
        • Providing aggressive, smart, and targeted tax incentives to accelerate our research and development (R&D) and production of crucial medicines, medical supplies, ingredients, tests, and vaccines;
        • Creating a grant program necessary to catalyze domestic production of important technologies and designing tax incentives to secure U.S. supply of advanced semiconductors; and
        • Overhauling the federal permitting process for mineral development and prioritizing advancements in mineral refining so neither industry nor the Defense Industrial Base are reliant on the CCP.
      • National Security
        • Working with the DoD to modernize force structure, posture, operational concepts, and acquisitions in order to deter CCP aggression in the Indo-Pacific and around the world.
        • Ensuring modernization of all three legs of the nuclear triad as well as development and fielding of conventional capabilities critical to counter the PLA in the Indo-Pacific, including ground-launched cruise and ballistic missiles.
        • Underscoring the need for a minimum three to five percent real growth in the defense budget per year in order to deter and defeat the PLA and other key adversaries.
        • Increasing focus on how the U.S. military protects space capabilities and carrying out space exploration goals by leveraging private sector investments.
        • Cutting off material support of CCP military industrial base companies, including divestment from companies with ties to the CCP’s military.
        • Safeguarding the U.S. electoral process and the integrity of our elections with various measures, including the identification of foreign malign actors and ensuring any individuals who engage in interference are inadmissible for entry to the U.S. or deportable if already present.
        • Providing more resources for investigations, criminal prosecutions, and other actions against CCP sponsored IP theft in addition to closing loopholes the CCP has exploited in our visa system.
        • Enhancing federal counterintelligence capabilities and bolstering Mandarin language capacity.
      • Technology
        • Taking a whole-of-government approach to assess the security risks posed by the PRC in 5G networks and increasing cooperation between the U.S. and its allies and partners in identifying and countering them.
        • Supporting the formation of a new D-10 group of leading democracies to develop and deploy 5G and subsequent generations and establishing a reimbursement program for companies to remove equipment from their communications networks that poses a national security risk.
        • Securing international leadership in the technologies of tomorrow, including AI, quantum, 5G, and autonomous vehicles.
        • Sanctioning PRC telecommunications companies engaged in economic or industrial espionage and any PRC entity that tries to hack COVID-19 researchers working on a vaccine.
      • Economics and Energy
        • Ensuring no U.S. taxpayer dollars support any PRC state-owned enterprises.
        • Harmonizing export control policies with our partners and allies to keep critical technologies, including semiconductor manufacturing equipment and R&D, from our adversaries.
        • Applying heightened scrutiny for investments in U.S. companies or operations from the PRC.
        • Strengthening trade relationships with our allies to establish U.S. standards and counter the PRC’s influence.
        • Pursuing trade policies that deter and protect against the PRC’s theft of IP.
        • Enforcing reciprocal treatment of PRC investment into the U.S. to restore symmetry in bilateral investment rules.
        • Ensuring PRC companies are held to the same financial disclosure standards as American companies when listing on U.S. stock exchanges.
        • Working to deepen our trade ties with Taiwan and resolving specific outstanding trade issues so the Administration can take steps to launch trade agreement negotiations once those issues are addressed.
        • Strengthening the Development Finance Corporation, Export Import Bank, and other government efforts to more robustly counter the CCP’s Belt and Road Initiative and debt trap diplomacy.
        • Continuing to advance U.S. energy security in order to be a global counter against the PRC, particularly on the nuclear energy front.
      • Competitiveness
        • Doubling the funding of basic science and technology research over the next 10 years.
        • Increasing coordination and funding for STEM education to create a more capable, skilled workforce.
        • Strengthening the protection of sensitive research at America’s colleges, universities, and leading research institutions, including restricting all federal employees and contractors from participating in foreign talent programs.
        • Requiring colleges and universities to annually report all donations from the PRC.

Further Reading

  • “In U.S.-China Tech Feud, Taiwan Feels Heat From Both Sides” By Raymond Zhong — The New York Times. Not surprisingly, this island nation (or renegade province, according to the People’s Republic of China (PRC)) is being squeezed in the trade war between the United States (U.S.) and the PRC. The main factor that has led to its central role is the Taiwan Semiconductor Manufacturing Company (TSMC), which produces many of the semiconductors needed by both nations. However, as the U.S. further tightens the PRC’s access to this technology, Taiwan’s place in the technology world becomes ever more important. Many in Taiwan see this technological prowess as a bulwark against a PRC-style takeover as in Hong Kong.
  • “Beautiful, perk-filled and mostly empty: What the future holds for tech’s billion-dollar headquarters” By Heather Kelly — The Washington Post. Understandably, COVID-19 has caused many large companies to rethink their real estate footprint. Tech is no different, as some companies have told workers to stay home until well into next year. Might the pandemic mark a paradigm shift in which companies require much less building and office space? Or will top companies continue their trend of building company towns of sorts?
  • “Ad Tech Could Be the Next Internet Bubble” By Gilad Edelman — WIRED. This deep dive into the online advertising world peels back some of the fictions that have kept this multi-billion-dollar black box running. The question is what would happen to the world economy if it crashes?
  • “What the antitrust proposals would actually mean for tech” By Emily Birnbaum — Protocol. This article surveys the current antitrust proposals before Congress to address large technology companies.
  • “Now You Can Use Instagram to Chat With Friends on Facebook Messenger” By Mike Isaac — The New York Times. In a move sure not to make friends among those convinced Facebook is monopolistic, the platform has crossed a Rubicon of sorts by combining messaging platforms. Facebook is now allowing those using Messenger and Instagram to message users on the other platform. Soon, this will also be the case with WhatsApp. Critics claim Facebook is doing this to make the company harder to break up in an antitrust action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Senate Commerce Hearing On Privacy

Senate stakeholders appear no closer to resolving the two key impasses in privacy legislation: preemption and a private right of action.

A week after the introduction of the “Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act” (S.4626), the Senate Commerce, Science, and Transportation Committee held a hearing titled “Revisiting the Need for Federal Data Privacy Legislation” with four former Federal Trade Commission (FTC) Commissioners and California’s Attorney General. Generally speaking, Members used the hearing to elicit testimony on the aspects of a privacy bill they would like to see. The chair asked the witnesses about the need for preemption and the benefits of one national privacy standard, while the ranking member asked about the need for people to be able to sue as a means of supplementing the limited capacity of the FTC and state attorneys general to police violations of a new law.

The SAFE DATA Act (see here for more analysis) was introduced last week by Committee Chair Roger Wicker (R-MS), Senate Majority Whip and Communications, Technology, Innovation, and the Internet Subcommittee Chair John Thune (R-SD), Transportation and Safety Subcommittee Chair Deb Fischer (R-NE), and Senator Marsha Blackburn (R-TN). Wicker had put out for comment a discussion draft, the “Consumer Data Privacy Act of 2019” (CDPA) (see here for analysis), in November 2019, shortly after the committee’s Ranking Member, Senator Maria Cantwell (D-WA), and other Democrats had introduced their privacy bill, the “Consumer Online Privacy Rights Act“ (COPRA) (S.2968) (see here for more analysis).

Chair Roger Wicker (R-MS) stated “[d]uring this Congress, protecting consumer data privacy has been a primary focus of this Committee…[and] [w]e held one of the first hearings of my chairmanship to examine how Congress should address this issue.” He said “[a]t that time, we heard that individuals needed rigorous privacy protections to ensure that businesses do not misuse their data…[and] [w]e heard that individuals need to be able to access, control, and delete the data that companies have collected on them.” Wicker stated “[w]e heard that businesses need a consistent set of rules applied reasonably and fairly to allow for continued innovation and growth in the digital economy…[a]nd we heard that the FTC needs enhanced authority and resources in order to oversee and enforce privacy protections.”

Wicker stated “[i]n the nearly two years since, members of this Committee have done a great deal of work developing legislation to address data privacy.” He said “[w]hile we worked, the world of data privacy did not stand still…[and] [t]he state of California implemented its California Consumer Privacy Act (CCPA) and began enforcement this past summer.” Wicker contended “[l]ong-held concerns remain that the CCPA is difficult to understand and comply with and could become worse if the law is further expanded and amended through an upcoming ballot measure this fall.” He claimed “[t]he European Union has continued to enforce the General Data Protection Regulation (GDPR)…[and] [t]he EU’s main focus appears to be going after the biggest American companies rather than providing clear guidance for all businesses with European citizens as customers.”

Wicker noted

The picture in Europe is even more complex following the recent court ruling invalidating the EU-U.S. Privacy Shield framework, which governed how U.S. companies treated the data of EU citizens. Though the issues in that case were more related to national security than consumer privacy, the result was yet more uncertainty about the future of trans-Atlantic data flows. I look forward to holding a hearing before the end of the year on the now-invalidated Privacy Shield.

Wicker asserted “[t]he biggest new development that has impacted data privacy – as it has impacted so many facets of our life – is the COVID-19 pandemic, which has resulted in millions of Americans working from home.” He said “[t]he increased use of video conferencing, food delivery apps, and other online services increases the potential for privacy violations…[and] [t]he need to collect a great deal of data for contact tracing and to track the spread of the disease likewise raises privacy concerns if done improperly.”

Wicker declared that “[f]or all of these reasons and more, the need for a uniform, national privacy law is greater than ever…[and] [l]ast week I introduced the SAFE DATA Act.” He argued

The SAFE DATA Act would provide Americans with more choice and control over their data. It would require businesses to be more transparent and hold them to account for their data practices. It would strengthen the FTC’s ability to be an effective enforcer of new data privacy rules. And it would establish a nationwide standard so that businesses know how to comply no matter where their customers live, and so that consumers know their data is safe wherever the company that holds their data is located.

Wicker stated that “[t]he SAFE DATA Act is the result of nearly two years of discussions with advocacy groups, state and local governments, nonprofits, academics, and businesses of every size and from every sector of the economy – my thanks to all of those.” He claimed “[t]he diversity of voices was essential in crafting a law that would work consistently and fairly for all Americans.” Wicker contended “we have a chance to pass a strong national privacy law that achieves the goals of privacy advocates with real consensus among members of both parties and a broad array of industry members.”

Ranking Member Maria Cantwell (D-WA) stated “[p]rotecting Americans’ privacy rights is critical, and that has become even sharper in the focus of the COVID-19 crisis, where so much of our lives have moved online.” She noted “[t]he American people deserve strong privacy protections for their personal data, and Congress must work to act in establishing these protections.” Cantwell said “[l]ast year, along with Senators [Brian] Schatz (D-HI), [Amy] Klobuchar (D-MN), and [Ed] Markey (D-MA), I introduced the “Consumer Online Privacy Rights Act“ (COPRA) (S.2968).” She claimed “[t]he bill is pretty straightforward…[and] provides foundational privacy rights to consumers, creates rules to prevent abuse of consumer data, and holds companies accountable with real enforcement measures.” Cantwell said “[u]nfortunately, other legislation throughout the legislative process, I think has taken different approaches…[and] [t]hese bills allow companies to maintain the status quo, burying important disclosure information in long contracts, hiding where consumer data is sold, and changing the use of consumer data without their consent.” She concluded that “obviously, I believe these loopholes are unacceptable.”

Cantwell argued

Most strikingly, these bills would actually weaken consumer rights around the country by preempting stronger state laws. Attorney General Becerra is with us today and I appreciate him being able to join us, because this would have an impact on a broad preemption, I should say, would have an impact on 40 million Californians who are protected by your privacy law and the privacy protections in your state. So we need to resolve this issue. But we can’t do so at the expense of states who have already taken action to protect the privacy of their citizens.

Cantwell stated that “[f]inally, we also know that individuals must have the right to their day in court, when privacy is violated, even with the resources and expertise and enforcers like the FTC–many of you, I know, know these rules well, and Attorneys General–we will never be able to fully police the thousands and thousands of companies collecting consumer data if you are the only cop on the beat.” She said “I’d like to go further, but there are many issues that we’re going to address here. I want to say that the legislation also needs to cover the complex issues of, you know, health and safety standards and important issues.” Cantwell stated “[t]he Supreme Court discussion that we’re now having, I think will launch us into a very broad discussion of privacy rights and where they exist within the Constitution.” She explained “[j]ust this recent court ruling that put at risk the little known but vitally important provision of the FTC Act, Section 13(b), which allows the FTC to go to court to obtain refunds and other redress for consumers–the 10 billion dollars for example in the Volkswagen case–without this provision, the core mission of the FTC would be crippled.”

Cantwell asserted “I think all of these issues, and the important issues of privacy rights, should and will have a fair discussion, if we can have time to discuss them in this process…[and] I believe the issue of how the government interferes in our own privacy rights, whether the government oversteps our privacy rights, is a major issue to be discussed by this body in the next month.” She added “I don’t believe in some of the tactics that government has used to basically invade the privacy rights of individuals.”

Cantwell stated that “next week the minority will be introducing a report that we’ve been working on about the value of local journalism…[and] I believe the famous economist who said that markets need perfect information.” She argued “[w]e’re talking about the fact that if markets are distorted by information, then that really cripples our economy…[and] I think local journalism in a COVID crisis is proving that it’s valued information with the correct information on our local communities, and I think that this is something we need to take into consideration as we consider privacy laws and we consider these issues moving forward.”

Former FTC Commissioner and Microsoft’s Corporate Vice President, Chief Privacy Officer, and Deputy General Counsel for Global Privacy and Regulatory Affairs Julie Brill explained that “Microsoft believes that comprehensive federal privacy legislation should support four key principles: consumer empowerment, transparency, corporate responsibility, and strong enforcement:

  • Consumer Empowerment. Empower consumers with the tools they need to control their personal information, including the ability to make informed choices about the data they provide to companies, to understand what data companies know about them, to obtain a copy of their data, to make sure the data is accurate and up to date, and to delete their data. Americans care deeply about having meaningful control over their data. In just the past nine months, from January 1, 2020 to September 18, 2020, Microsoft received over 14 and a half million unique global visitors to its privacy dashboard, where they were able to exercise their ability to control their data. This continued engagement with the control tools we provide included over 4 and a half million visitors from the United States, representing the greatest level of engagement from any single country.
  • Transparency. Require companies to be transparent about their data collection and use practices, by providing people with concise and understandable information about what personal information is collected from them, and how that information is used and shared.
  • Corporate Responsibility. Place direct requirements on companies to ensure that they collect and use consumers’ data in ways that are responsible, and demonstrate that they are worthy stewards of that data.
  • Strong Enforcement. Provide for strong enforcement through regulators, and ensure they have sufficient resources to enforce the legal requirements that organizations must uphold, but also to be well-grounded in the data collection and analysis technologies that are used in the modern digital economy. These are the key elements that are required to build a robust and lasting U.S. privacy law.

George Washington University Law School Professor, King’s College Visiting Professor, and United Kingdom Competition and Markets Authority Non-Executive Director and former FTC Chair William E. Kovacic said:

As Congress defines the substantive commands of a new omnibus law, I suggest a close review of the FTC’s experience in implementing the Telemarketing Sales Rule. To my mind, this experience offers several insights into the design of privacy protections:

  • In addition to unfair or deceptive acts and practices, the definition of forbidden behavior should encompass abusive conduct, as the FTC has developed that concept in the elaboration of the Telemarketing Sales Rule (TSR). I single out 2003 TSR amendments, which established the National Do Not Call Registry, popularly known as the Do Not Call Rule (DNC Rule). In applying the concept of abusive conduct, the DNC Rule used a definition of harm that reached beyond quantifiable economic costs of the challenged practice (i.e., the time lost and inconvenience associated with responding to unwanted telephone calls to the home). The DNC Rule’s theory of harm focused on the fact that, to many citizens, telemarketing calls were annoying, irritating intrusions into the privacy of the home. A new privacy regime could build on this experience and allow privacy regulators, by rulemaking and by law enforcement, to address comparable harms and to create standards that map onto common expectations for data protection and security.
  • The coverage of the omnibus statute should be comprehensive. Privacy authorities should have power to apply the law to all commercial actors (i.e., with no exclusions for specific economic sectors) and to not-for-profit institutions such as charitable bodies and universities.
  • The omnibus law should clarify that its restrictions on the accumulation and use of data about individuals apply to their status as consumers and employees. Since the late 1990s, the FTC at times has engaged in debatable interpretations of its authority under Section 5 of the Federal Trade Commission Act to assure foreign jurisdictions that it has authority to enforce promises regarding the collection and transfer by firms of information about their employees.

Kovacic stated “[w]ith this general framework in mind, my testimony proposes that an omnibus privacy law should enhance the institutional arrangements for administering a new substantive privacy framework. This statement

  • Sets out criteria to assess the performance of the entities implementing U.S. privacy policy, and to determine how to allocate tasks to institutions responsible for policy development and law enforcement.
  • Suggests approaches to increase the coherence and effectiveness of the US privacy system and to make the United States a more effective participant in the development of international privacy policy.
  • Considers whether the FTC, with an enhanced mandate, should serve as the national privacy regulator, or whether the FTC’s privacy operations should be spun off to provide the core of a new privacy institution.

Kovacic explained

This statement concludes that the best solution is to take steps that would enhance the FTC’s role by (a) eliminating gaps in its jurisdiction, (b) expanding its capacity to promote cooperation among agencies with privacy portfolios and to encourage convergence upon superior policy norms, and (c) providing resources necessary to fulfill these duties. The proposal for an enlarged FTC role considers two dimensions of privacy regulation. The first is what might be called the “consumer-facing” elements of privacy regulation. My testimony deals mainly with the relationship between consumers and enterprises (for-profit firms and not-for-profit institutions, such as universities) that provide them with goods and services. My testimony does not address the legal mechanisms that protect privacy where the actors are government institutions. Thus, I do not examine the appropriate framework for devising and implementing policies that govern data collection and record-keeping responsibilities of federal agencies, such as bodies that conduct surveillance for national security purposes.

21st Century Privacy Coalition Co-Chair and Former FTC Chair Jon Leibowitz asserted:

  • Congress does not need to reinvent the wheel. Many of the elements I would propose are consistent with recommendations made by my former agency in its 2012 Privacy Report, drafted after years of work and engagement with stakeholders of all kinds. Technology will continue to change, but the basic principles enshrined in the Report remain the most effective way to give consumers the protections they deserve.
  • My view, and that of the Report, is that national privacy legislation must give consumers statutory rights to control how their personal information is used and shared, and provide increased visibility into companies’ practices when it comes to managing consumer data. Such an approach should provide consumers with easy-to-understand privacy choices based upon the nature of the information itself—its sensitivity, the risk of consumer harm if such information is the subject of an unauthorized disclosure—and the context in which it is collected. For example, consumers expect sensitive information—including health and financial data, precise geolocation, Social Security numbers, and children’s information—to receive heightened protection to ensure confidentiality.
  • Therefore, a muscular privacy law should require affirmative express consent for the use and sharing of consumers’ sensitive personally identifiable information, and opt-out rights for non-sensitive information. But consumers do not expect to consistently provide affirmative consent to ensure that companies fulfill their online orders or protect them from fraud; thus, inferred consent for certain types of operational uses of information by companies makes sense. Consumers should also have rights of access and deletion where appropriate, and deserve civil rights protections thoughtfully built for the Internet age.
  • Another key tenet of the FTC Report is that privacy should not be about who collects an individual’s personal information, but rather should be about what information is collected and how it is protected and used. That is why federal privacy legislation should be technology- and industry-neutral. Companies that collect, use, or share the same type of covered personal information should not be subject to different privacy requirements based on how they classify themselves in the marketplace.
  • Rigorous standards should be backed up with tough enforcement. To that end, Congress should provide the FTC with the ability to impose civil penalties on violators for first-time offenses, something all of the current Commissioners—and I believe all the former Commissioners testifying here today—support. Otherwise, malefactors will continue to get two bites at the apple of the unsuspecting consumer. And there is no question in my mind that the FTC should have the primary authority to administer the national privacy law. The FTC has the unparalleled institutional knowledge and experience gained from bringing more than 500 cases to protect the privacy and security of consumer information, including those against large companies like Google, Twitter, Facebook, Uber, Dish Network, and others. Congress should not stop there.
  • The way to achieve enhanced enforcement is by giving the FTC, an agency that already punches above its weight, the resources and authority to carry out its mandate effectively. As of 2019, there were fewer employees (“FTEs”) at the agency than there were in 1980, and the American population has grown by more than 100 million people since then. The number of FTEs has actually decreased since I left the agency in 2013 until this year.
  • Moreover, the FTC clearly has a role to play in developing rules to address details that Congress may not want to tackle in the legislation itself as well as new developments in technology that could overwhelm (or circumvent) enforcement. For that reason, you should give the agency some APA rulemaking authority to effectively implement your law. Having said that, Congress should not overwhelm the FTC with mandated rulemaking after rulemaking, which would only bog the agency down instead of permitting it to focus on enforcing the new law.

California Attorney General Xavier Becerra argued:

  • In the data privacy space, the optimal federal legal framework recognizes that privacy protections must keep pace with innovation, the hallmark of our data-driven economy. State law is the backbone of consumer privacy in the United States. Federal law serves as the glue that ties our communities together. To keep pace, we must all work from the same baseline playbook, but be nimble enough to adapt to real-world circumstances on the field where we meet them. I urge this committee to proceed in your work in a manner that respects—and does not preempt—more rigorous state laws, including those we have in California.
  • Like any law, the CCPA is not perfect, but it is an excellent first step. Consumers deserve more privacy and easier tools. For example, in the regulations implementing the CCPA, the California Department of Justice tried to address the frustration of consumers who must proceed website-by-website, browser-by-browser in order to opt out of the sale of their personal information. One provision of our regulations intended to facilitate the submission of a request to opt-out of sale by requiring businesses to comply when a consumer has enabled a global privacy control at the device or browser level, which should be less time-consuming and burdensome. I urge the technology community to develop consumer-friendly controls to make exercise of the right to opt out of the sale of information meaningful and frictionless. Making technology work for consumers is just as important as the benefits businesses receive in innovating.
  • There are also ways in which CCPA could go further and require refinement of its compliance measures. For example, the CCPA currently only requires disclosure of “categories of sources” from which personal information is collected and “categories of third parties” to whom personal information is sold. More specific disclosures, including the names of businesses that were the source or recipient of the information, should be required so that consumers can know the extent to which their information has been shared, bartered, and sold. If I receive junk mail from a company, I should be able to find out how it got my address and to whom it shared the information so I can stop the downstream purchase of my personal data. For now, businesses are not legally required to share that granularity of information. Consumers should also have the ability to correct the personal information collected about them, so as to prevent the spreading of misinformation.
  • On a broader level, if businesses want to use consumers’ data, they should have a duty to protect and secure it, and wherever feasible, minimize data collection. Businesses should no longer approach consumer data with the mindset, “collect now, monetize later.” There should be a duty imposed to use a consumer’s personal information in accordance with the purposes for which the consumer allowed its collection, and in the consumer’s interest, especially with the collection and storage of sensitive information, like precise geolocation. Although CCPA requires transparent notice at collection, moving beyond a notice-and-consent framework to contemplate use limitations would make our privacy rights more robust and balanced.
  • We need clear lines on what is illegal data use from the context of civil rights protections. Indirect inferences based on personal information should not be used against us in healthcare decisions, insurance coverage or employment determinations. We need greater transparency on how algorithms impact people’s fundamental rights of healthcare, housing and employment, and how they may be perpetuating systemic racism and bias. Predatory online practices, such as increased cross-site tracking after a user browses healthcare websites, must be addressed.
  • Finally, new laws should include a private right of action to complement and fortify the work of state enforcers. While my office is working hard to protect consumer privacy rights in California, and our sister states do the same in their jurisdictions, we cannot do this work alone. While we endeavor to hold companies accountable for violations of privacy laws, trying to defend the privacy rights of 40 million people in California alone is a massive undertaking. Violators know this. They know our scope and reach are limited to remedying larger and more consequential breaches of privacy. Consumers need the authority to pursue remedies themselves for violations of their rights. Private rights of action provide a critical adjunct to government enforcement, and enable consumers to assert their rights and seek appropriate remedies. Consumer privacy must be real, it deserves its day in court.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by KaraSuva from Pixabay

Another Federal Privacy Bill

Senate Commerce Republicans revise and release privacy bill that does not budge on main issues setting them apart from their Democratic colleagues.

Last week, in advance of tomorrow’s hearing on privacy legislation, the chair and key Republicans released a revised version of draft legislation released last year to mark their position on what United States (U.S.) federal privacy regulation should be. Notably, last year’s draft and the updated version would still preempt state laws like the “California Consumer Privacy Act” (CCPA) (AB 375), and people in the U.S. would not be given the right to sue entities that violate the privacy law. These two issues continue to be the main battle lines between Democratic and Republican bills to establish a U.S. privacy law. Given the rapidly dwindling days left in the 116th Congress and the possibility of a Democratic White House and Senate next year, this may be both a last gasp effort to get a bill out of the Senate and to lay down a marker for next year.

The “Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act” (S.4626) was introduced by Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS), Senate Majority Whip and Communications, Technology, Innovation, and the Internet Subcommittee Chair John Thune (R-SD), Transportation and Safety Subcommittee Chair Deb Fischer (R-NE), and Senator Marsha Blackburn (R-TN). However, a notable Republican stakeholder is not a cosponsor: Consumer Protection Subcommittee Chair Jerry Moran (R-KS), who introduced his own bill, the “Consumer Data Privacy and Security Act of 2020” (S.3456) (See here for analysis).

As mentioned, Wicker had put out for comment a discussion draft, the “Consumer Data Privacy Act of 2019” (CDPA) (See here for analysis) in November 2019 shortly after the Ranking Member on the committee, Senator Maria Cantwell (D-WA) and other Democrats had introduced their privacy bill, the “Consumer Online Privacy Rights Act“ (COPRA) (S.2968) (See here for more analysis). Here’s how I summarized the differences at the time: in the main, CDPA shares the same framework with COPRA with some key, significant differences, including:

  • COPRA expands the FTC’s jurisdiction in policing privacy harms whereas CDPA would not
  • COPRA places a duty of loyalty on covered entities to people whose covered data they process or transfer; CDPA does not have any such duty
  • CDPA does not allow people to sue if covered entities violate the new federal privacy and security regime; COPRA would allow such suits to move forward
  • CDPA preempts state privacy and data security laws; COPRA would establish a federal floor that states like California would be able to legislate on top of
  • CDPA would take effect two years after enactment, and COPRA would take effect six months after enactment.
  • The bar against a person waiving her privacy rights under COPRA is much broader than under CDPA
  • COPRA would empower the FTC to punish discriminatory data processing and transfers; CDPA would require the FTC to refer these offenses to the appropriate federal and state agencies
  • CDPA revives a concept from the Obama Administration’s 2015 data privacy bill by allowing organizations and entities to create standards or codes of conduct and allowing those entities in compliance with these standards or codes to be deemed in compliance with CDPA subject to FTC oversight; COPRA does not have any such provision
  • COPRA would require covered entities to conduct algorithmic decision-making impact assessments to determine if these processes are fair, accurate, and unbiased; no such requirement is found in CDPA

As a threshold matter, the SAFE DATA Act is the latest in a line of enhanced notice and consent bills founded on the logic that if people were informed and able to make choices about how and when their data are used, then the U.S. would have an ideal data and privacy ecosystem. This view, perhaps coincidentally, dovetails with Republican views on other issues where people should merely be given information and the power to choose, with any bad outcomes being the responsibility of those who made poor choices. This view runs counter to those who see privacy and data security as being akin to environmental or pollution problems, that is, problems beyond the ability of any one person to manage or realistically change.

Turning to the bill before us, we see that while covered entities may not outright deny services and products to people if they choose to exercise the rights granted under the bill vis-à-vis their covered data, a covered entity may charge different prices. This structure would predictably lead to only those who can afford it or are passionately committed to their privacy being able to pay for more privacy. And yet, the rights established by the bill for people to exercise some control over their private information cannot be waived, forestalling the possibility that some covered entities would make such a waiver a term of service like many companies do with a person’s right to sue.

Covered entities must publish privacy policies before or at the point of data collection, including:

  • The identity of the entity in charge of processing and using the covered data
  • The categories of covered data collected and the processing purposes of each category
  • Whether transfers of covered data occur, the categories of those receiving such data, and the purposes for which transfers occur
  • The entity’s data retention and data security policies generally; and
  • How individuals may exercise their rights.

Any material change requires that a new privacy policy be provided to people and that consent be obtained again before collection and processing may resume.

There is, however, language not seen in other privacy bills: “[w]here the ownership of an individual’s device is transferred directly from one individual to another individual, a covered entity may satisfy its obligation to disclose a privacy policy prior to or at the point of collection of covered data by making the privacy policy available under (a)(2)” (i.e. by posting on the entity’s website.) So, if I give an old phone to a friend, a covered entity may merrily continue collecting and processing data because I consented and my friend’s consent is immaterial. Admittedly, this would seem to be a subset of all devices used in the U.S., but it does not seem to be a stretch for covered entities to need to obtain consent if they determine a different person has taken over a device. After all, they will almost certainly be able to discern the identity of the new user and that the device is now being used by someone new.

Section 103 of the SAFE DATA Act establishes a U.S. resident’s rights to access, correct, delete, and port covered data. People would be able to access their covered data and correct “material” inaccuracies or incomplete information at least twice a year at no cost provided the covered entity can verify their identity. Included with the right to access would be provision of the categories of third parties to whom covered data has been transferred and a list of the categories of purposes. There is a long list of reasons why covered entities would not need to comply, including but not limited to:

  • If the covered entity must “retain any covered data for the sole purpose of fulfilling the request;”
  • If it would “be impossible or demonstrably impracticable to comply with;”
  • If a request would “require the covered entity to combine, relink, or otherwise reidentify covered data that has been deidentified;”
  • If it would “result in the release of trade secrets, or other proprietary or confidential data or business practices;”
  • If it would “interfere with law enforcement, judicial proceedings, investigations, or reasonable efforts to guard against, detect, or investigate malicious or unlawful activity, or enforce contracts;”
  • If it would “require disproportionate effort, taking into consideration available technology, or would not be reasonably feasible on technical grounds;”
  • If it would “compromise the privacy, security, or other rights of the covered data of another individual;”
  • If it would “be excessive or abusive to another individual;” or
  • If it would “violate Federal or State law or the rights and freedoms of another individual, including under the Constitution of the United States.”

This extensive list will give companies not interested in complying plenty of reasons to proffer as to why they will not provide access or corrections. Nonetheless, the FTC would need to draft and implement regulations “establishing requirements for covered entities with respect to the verification of requests to exercise rights” to access and correct. Perhaps the agency will be able to address some foreseeable problems with the statute as written.

Explicit consent is needed before a covered entity may transfer or process the “sensitive covered data” of a person. The first gloss on this right is that a person’s consent is not needed to collect, process, and transfer the “covered data” of a person. Elsewhere in the section, it is clear that one has a limited opt out right: “a covered entity shall provide an individual with the ability to opt out of the collection, processing, or transfer of such individual’s covered data before such collection, processing, or transfer occurs.”

Nonetheless, a bit of a detour back into the definitions section of the bill is in order to understand which types of data lie on which side of the consent line. “Covered data” are “information that identifies or is linked or reasonably linkable to an individual or a device that is linked or reasonably linkable to an individual” except for publicly available data, employment data, aggregated data, and de-identified data. Parenthetically, I would note the latter two exceptions would seem to be incentives for companies to hold personal information in the aggregate or in a de-identified state as much as possible so as to avoid triggering the requirements of the SAFE DATA Act.

“Sensitive covered data” would be any of the following (and, my apologies, the list is long):

  • A unique, government-issued identifier, such as a Social Security number, passport number, or driver’s license number, that is not required to be displayed to the public.
  • Any covered data that describes or reveals the diagnosis or treatment of the past, present, or future physical health, mental health, or disability of an individual.
  • A financial account number, debit card number, credit card number, or any required security or access code, password, or credentials allowing access to any such account.
  • Covered data that is biometric information.
  • A persistent identifier.
  • Precise geolocation information (defined elsewhere as anything within 1750 feet)
  • The contents of an individual’s private communications, such as emails, texts, direct messages, or mail, or the identity of the parties subject to such communications, unless the covered entity is the intended recipient of the communication (meaning metadata is fair game; and this can be incredibly valuable. Just ask the National Security Agency)
  • Account log-in credentials such as a user name or email address, in combination with a password or security question and answer that would permit access to an online account.
  • Covered data revealing an individual’s racial or ethnic origin, or religion in a manner inconsistent with the individual’s reasonable expectation regarding the processing or transfer of such information. (of course, this sort of qualifying language always makes me think according to whose definition of “reasonable expectation”)
  • Covered data revealing the sexual orientation or sexual behavior of an individual in a manner inconsistent with the individual’s reasonable expectation regarding the processing or transfer of such information. (See the previous clause)
  • Covered data about the online activities of an individual that addresses or reveals a category of covered data described in another subparagraph of this paragraph. (I suppose this is intended as a backstop against covered entities trying to backdoor their way into using sensitive covered data by claiming it is covered data from online activities.)
  • Covered data that is calendar information, address book information, phone or text logs, photos, or videos maintained for private use on an individual’s device.
  • Any covered data collected or processed by a covered entity for the purpose of identifying covered data described in another paragraph of this paragraph. (again, this seems aimed at plugging a possible loophole in that ordinary covered data can probably be processed or combined with other covered data to arrive at some categories of “sensitive covered data.”)
  • Any other category of covered data designated by the Commission pursuant to a rulemaking under section 553 of title 5, United States Code (meaning the FTC can use normal rulemaking authority and not the shackles of the Magnuson-Moss rulemaking procedures to expand this definition as needed).

So, we have a subset of covered data that would be subject to consent requirements, including notice with a “clear description of the processing purpose for which the sensitive covered data will be processed,” that “clearly identif[ies] any processing purpose that is necessary to fulfill a request made by the individual,” that “include[s] a prominent heading that would enable a reasonable individual to easily identify the processing purpose for which consent is sought,” and that “clearly explain[s] the individual’s right to provide or withhold consent.”

Finally, the FTC may, but is not required to, “establish requirements for covered entities regarding clear and conspicuous procedures for allowing individuals to provide or withdraw affirmative express consent for the collection of sensitive covered data.” If the agency chooses to do so, it may use the normal notice and comment procedures available to virtually every other U.S. agency.

Covered entities must minimize collection, processing, and retention of covered data to “what is reasonably necessary, proportionate, and limited” except if permitted elsewhere in the SAFE DATA Act or another federal statute. Interestingly, the FTC would not be tasked with conducting a rulemaking but would instead need to issue guidelines with best practices on how covered entities would undertake such minimization.

Service providers must follow the direction of the covered entity with whom they are working and delete or deidentify data after they have finished working with it. Third parties may process covered data only for purposes consistent with the reasonable expectations of the individual to whom the data belong, but they do not need to obtain consent to process covered data or sensitive covered data. Covered entities, for their part, must perform due diligence to ensure that service providers and third parties will comply with the requirements particular to these two classes of entities. However, there is no obligation beyond due diligence and no suggestion of liability for the misdeeds and violations of service providers and third parties.

Large data holders would need to conduct periodic privacy impact analyses with an eye toward helping these entities improve their privacy policies. This class of covered entities comprises those that have processed or transferred the covered data of 8 million or more people in a given year or the sensitive covered data of 300,000 people.

The SAFE DATA Act would generally allow covered entities to collect, process, and transfer the covered data of people without their consent so long as these activities are reasonably necessary, proportionate, and limited to the following purposes:

  • To initiate or complete a transaction or to fulfill an order or provide a service specifically requested by an individual, including associated routine administrative activities such as billing, shipping, financial reporting, and accounting.
  • To perform internal system maintenance, diagnostics, product or service management, inventory management, and network management.
  • To prevent, detect, or respond to a security incident or trespassing, provide a secure environment, or maintain the safety and security of a product, service, or individual.
  • To protect against malicious, deceptive, fraudulent, or illegal activity.
  • To comply with a legal obligation or the establishment, exercise, analysis, or defense of legal claims or rights, or as required or specifically authorized by law.
  • To comply with a civil, criminal, or regulatory inquiry, investigation, subpoena, or summons by an Executive agency.
  • To cooperate with an Executive agency or a law enforcement official acting under the authority of an Executive or State agency concerning conduct or activity that the Executive agency or law enforcement official reasonably and in good faith believes may violate Federal, State, or local law, or pose a threat to public safety or national security.
  • To address risks to the safety of an individual or group of individuals, or to ensure customer safety, including by authenticating individuals in order to provide access to large venues open to the public.
  • To effectuate a product recall pursuant to Federal or State law.

People would not be able to opt out of the collection, processing, and transfer of covered data for these purposes. As mentioned earlier, U.S. residents would receive a limited right to opt out, and it is in Section 108 that one learns what a person cannot opt out of. It should go without saying that covered entities will interpret these terms as broadly as they can in order to forestall people from opting out. The performance of “internal system maintenance, diagnostics, product or service management, inventory management, and network management” seems like a potentially elastic category that could give cover to some covered entities.

Speaking of exceptions, small businesses would not need to heed the rights of individuals regarding their covered data, would not need to minimize their collection, processing, and transfer of covered data, and would not need to have data privacy and security officers. These are defined as entities that have gross annual revenues below $50 million, have processed the covered data of fewer than 1 million people, have fewer than 500 employees, and earn less than 50% of their revenue from transferring covered data. On its face, this seems like a very generous definition of a small business.
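Since the carve-out turns on four numeric thresholds that must all be satisfied, it can be expressed as a simple predicate. This is purely illustrative: the thresholds come from the bill text as summarized above, but the function name and inputs are my own, and the actual statutory definition has nuances (e.g., measurement periods) this sketch ignores.

```python
def is_small_business(annual_revenue_usd: float,
                      individuals_processed: int,
                      employees: int,
                      revenue_share_from_data_transfer: float) -> bool:
    """Rough test of the SAFE DATA Act's small-business exemption.

    All four conditions are conjunctive: failing any one of them
    means the entity is a fully regulated covered entity.
    """
    return (annual_revenue_usd < 50_000_000
            and individuals_processed < 1_000_000
            and employees < 500
            and revenue_share_from_data_transfer < 0.50)
```

Note how generous the conjunction is in practice: a firm with $49 million in revenue and 499 employees that monetizes the data of 999,999 people would still escape the bill's core obligations.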

The FTC would not be able to police processing and transferring of covered data that violates discrimination laws. Instead, the agency would need to transfer these matters to agencies of jurisdiction. The FTC would be required to use its 6(b) authority to “examin[e] the use of algorithms to process covered data in a manner that may violate Federal anti-discrimination laws” and then publish a report on its findings and guidance on how covered entities can avoid violating discrimination laws.

Moreover, the National Institute of Standards and Technology (NIST) must “develop and publish a definition of “digital content forgery” and accompanying explanatory materials.” One year afterwards, the FTC must “publish a report regarding the impact of digital content forgeries on individuals and competition.”

Data brokers would need to register with the FTC, which would then publish a registry of data brokers on its website.

There would be additional duties placed on covered entities. For example, these entities must “establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity of covered data.” However, financial services companies subject to and in compliance with Gramm-Leach-Bliley regulations would be deemed to be in compliance with these data security obligations. The same would be true of entities subject to and in compliance with the “Health Insurance Portability and Accountability Act” and “Health Information Technology for Economic and Clinical Health Act.” Additionally, the FTC may “issue regulations to identify processes for receiving and assessing information regarding vulnerabilities to covered data that are reported to the covered entity.”

The SAFE DATA Act has language new to federal privacy bills on “opaque algorithms.” Specifically, covered internet platforms would not be able to use opaque algorithms unless notice is provided to users and an input-transparent version of the algorithm is available to users. The term “covered internet platform” is broad and encompasses “any public-facing website, internet application, or mobile application, including a social network site, video sharing service, search engine, or content aggregation service.” An “opaque algorithm” is “an algorithmic ranking system that determines the order or manner that information is furnished to a user on a covered internet platform based, in whole or part, on user-specific data that was not expressly provided by the user to the platform for such purpose.”
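The distinction the bill draws can be made concrete with a toy sketch (entirely my own, not from the bill): an input-transparent ranking orders results using only data the user expressly supplied for that purpose, while an opaque one also folds in user-specific data the platform collected for other purposes, such as an inferred interest profile.

```python
def rank_input_transparent(items, query):
    """Order items purely by match against the user's explicit query."""
    return sorted(items, key=lambda item: -item["text"].count(query))

def rank_opaque(items, query, inferred_profile):
    """Same query match, but boosted by an inferred interest profile
    the user never expressly provided to the platform for ranking."""
    def score(item):
        return item["text"].count(query) + inferred_profile.get(item["topic"], 0.0)
    return sorted(items, key=score, reverse=True)
```

Under the bill, a platform could still run something like `rank_opaque`, but it would have to disclose that it does and offer users something like `rank_input_transparent` as an alternative.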

The bill makes it an unfair and deceptive practice for “large online operator[s]” “to design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.”

A covered entity must have

  • 1 or more qualified employees or contractors as data privacy officers; and
  • 1 or more qualified employees or contractors…as data security officers.

Moreover, “[a] covered entity shall maintain internal controls and reporting structures to ensure that appropriate senior management officials of the covered entity are involved in assessing risks and making decisions that implicate compliance with this Act.”

There are also provisions protecting whistleblowers inside covered entities that “voluntarily provide[] [“original information”] to the [FTC]…relating to non-compliance with, or any violation or alleged violation of, this Act or any regulation promulgated under this Act.”

As in virtually all the other bills, the FTC would be able to levy civil fines of more than $42,000 per violation, and state attorneys general would also be able to enforce the new privacy regime. However, the FTC would be able to intervene and take over an action if it chose, and if two or more state attorneys general brought cases regarding the same violations, the cases would be consolidated and heard in the federal court in the District of Columbia. The FTC would also get jurisdiction over common carriers and non-profits for purposes of enforcing the SAFE DATA Act.

And then there is new language in the SAFE DATA Act that seems aimed at addressing a pair of cases before the Supreme Court on the extent of the FTC’s power to seek and obtain certain monetary damages and equitable relief. The FTC has appealed an adverse ruling from the U.S. Court of Appeals for the Seventh Circuit while the other case is coming from the U.S. Court of Appeals for the Ninth Circuit.

As in the forerunner bill released last November, the FTC would be empowered to “approve voluntary consensus standards or certification programs that covered entities may use to comply with 1 or more provisions in this Act.” These provisions came from an Obama Administration privacy bill allowing for the development and usage of voluntary consensus-based standards for covered entities to comply with in lieu of the provisions of that bill.

The SAFE DATA Act would not impinge on existing federal privacy laws but would preempt all privacy laws at the state level. Ironically, the bill would not preempt state data breach notification laws; one would think that, if uniformity across the U.S. were a driving motivation, doing so would be desirable.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Gerd Altmann from Pixabay

Further Reading, Other Developments, and Coming Events (19 August)

Coming Events

  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” Through 21 August, the FTC “is seeking comment on a range of issues including:
    • How are companies currently implementing data portability? What are the different contexts in which data portability has been implemented?
    • What have been the benefits and costs of data portability? What are the benefits and costs of achieving data portability through regulation?
    • To what extent has data portability increased or decreased competition?
    • Are there research studies, surveys, or other information on the impact of data portability on consumer autonomy and trust?
    • Does data portability work better in some contexts than others (e.g., banking, health, social media)? Does it work better for particular types of information over others (e.g., information the consumer provides to the business vs. all information the business has about the consumer, information about the consumer alone vs. information that implicates others such as photos of multiple people, comment threads)?
    • Who should be responsible for the security of personal data in transit between businesses? Should there be data security standards for transmitting personal data between businesses? Who should develop these standards?
    • How do companies verify the identity of the requesting consumer before transmitting their information to another company?
    • How can interoperability among services best be achieved? What are the costs of interoperability? Who should be responsible for achieving interoperability?
    • What lessons and best practices can be learned from the implementation of the data portability requirements in the GDPR and CCPA? Has the implementation of these requirements affected competition and, if so, in what ways?”
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The United States (U.S.) Department of Commerce tightened its chokehold on Huawei’s access to United States’ semiconductors and chipsets vital to its equipment and services. This rule follows a May rule that significantly closed off Huawei’s access to the point that many analysts are projecting the People’s Republic of China company will run out of these crucial technologies sometime next year without a suitable substitute, meaning the company may not be able to sell its smartphone and other leading products. In its press release, the department asserted the new rule “further restricts Huawei from obtaining foreign made chips developed or produced from U.S. software or technology to the same degree as comparable U.S. chips.”
    • Secretary of Commerce Wilbur Ross argued “Huawei and its foreign affiliates have extended their efforts to obtain advanced semiconductors developed or produced from U.S. software and technology in order to fulfill the policy objectives of the Chinese Communist Party.” He contended “[a]s we have restricted its access to U.S. technology, Huawei and its affiliates have worked through third parties to harness U.S. technology in a manner that undermines U.S. national security and foreign policy interests…[and] [t]his multi-pronged action demonstrates our continuing commitment to impede Huawei’s ability to do so.”
    • The Department of Commerce’s Bureau of Industry and Security (BIS) stated in the final rule that it is “making three sets of changes to controls for Huawei and its listed non-U.S. affiliates under the Export Administration Regulations (EAR):
      • First, BIS is adding additional non-U.S. affiliates of Huawei to the Entity List because they also pose a significant risk of involvement in activities contrary to the national security or foreign policy interests of the United States.
      • Second, this rule removes a temporary general license for Huawei and its non-U.S. affiliates and replaces those provisions with a more limited authorization that will better protect U.S. national security and foreign policy interests.
      • Third, in response to public comments, this final rule amends General Prohibition Three, also known as the foreign-produced direct product rule, to revise the control over certain foreign-produced items recently implemented by BIS.”
    • BIS claimed “[t]hese revisions promote U.S. national security by limiting access to, and use of, U.S. technology to design and produce items outside the United States by entities that pose a significant risk of involvement in activities contrary to the national security or foreign policy interests of the United States.”
    • One technology analyst claimed “[t]he U.S. moves represent a significant tightening of restrictions over Huawei’s ability to procure semiconductors…[and] [t]hat puts into significant jeopardy its ability to continue manufacturing smartphones and base stations, which are its core products.”
  • The Office of Management and Budget (OMB) and the Office of Science and Technology Policy (OSTP) have released their annual guidance to United States department and agencies to direct their budget requests for FY 2022 with respect to research and development (R&D). OMB explained:
  • For FY2022, the five R&D budgetary priorities in this memorandum ensure that America remains at the global forefront of science and technology (S&T) discovery and innovation. The Industries of the Future (IotF) – artificial intelligence (AI), quantum information sciences (QIS), advanced communication networks/5G, advanced manufacturing, and biotechnology – remain the Administration’s top R&D priority. This includes fulfilling President Trump’s commitment to double non-defense AI and QIS funding by FY2022:
    • American Public Health Security and Innovation
    • American Leadership in the Industries of the Future and Related Technologies
    • American Security
    • American Energy and Environmental Leadership
    • American Space Leadership
  • In light of the significant health and economic disruption caused by the COVID-19 pandemic, the FY2022 memorandum includes a new R&D priority aimed at American Public Health Security and Innovation. This priority brings under a single, comprehensive umbrella biomedical and biotechnology R&D aimed at responding to the pandemic and ensuring the U.S. S&T enterprise is maximally prepared for any health-related threats.
  • Lastly, this memorandum also describes four high-priority crosscutting actions. These actions include research and related strategies that underpin the five R&D priorities and ensure departments and agencies deliver maximum return on investment to the American people:
    • Build the S&T Workforce of the Future
    • Optimize Research Environments and Results
    • Facilitate Multisector Partnerships and Technology Transfer
    • Leverage the Power of Data
  • Despite the Trump Administration touting its R&D priorities and achievements, the non-partisan Congressional Research Service noted
    • President Trump’s budget request for FY2021 includes approximately $142.2 billion for research and development (R&D) for FY 2021, $13.8 billion (8.8%) below the FY2020 enacted level of $156.0 billion. In constant FY 2020 dollars, the President’s FY 2021 R&D request would result in a decrease of $16.6 billion (10.6%) from the FY 2020 level.
  • Two key chairs of subcommittees of the Senate Commerce, Science, and Transportation Committee are pressing the Federal Trade Commission (FTC) to investigate TikTok’s data collection and processing practices. This Committee has primary jurisdiction over the FTC in the Senate and is a key stakeholder on data and privacy issues.
    • In their letter, Consumer Protection Subcommittee Chair Jerry Moran (R-KS) and Communications, Technology, Innovation Chair John Thune (R-SD) explained they are “are seeking specific answers from the FTC related to allegations from a Wall Street Journal article that described TikTok’s undisclosed collection and transmission of unique persistent identifiers from millions of U.S. consumers until November 2019…[that] also described questionable activity by the company as it relates to the transparency of these data collection activities, and the letter seeks clarity on these practices.”
    • Moran and Thune asserted “there are allegations that TikTok discretely collected media access control (MAC) addresses, commonly used for advertisement targeting purposes, through Google Android’s operating system under an “unusual layer of encryption” through November 2019.” They said “[g]iven these reports and their potential relevancy to the “Executive Order on Addressing the Threat Posed by TikTok,” we urge the Federal Trade Commission (FTC) to investigate the company’s consumer data collection and processing practices as they relate to these accusations and other possible harmful activities posed to consumers.”
    • If the FTC were to investigate, find wrongdoing, and seek civil fines against TikTok, the next owner may be left to pay as the White House’s order to ByteDance to sell the company within three months will almost certainly be consummated before any FTC action is completed.
  • Massachusetts Attorney General Maura Healey (D) has established a “Data Privacy and Security Division within her office to protect consumers from the surge of threats to the privacy and security of their data in an ever-changing digital economy.” Healey has been one of the United States’ more active attorneys general on data privacy and technology issues, including her suit and settlement with Equifax for its massive data breach.
    • Her office explained:
      • The Data Privacy and Security Division investigates online threats and the unfair or deceptive collection, use, and disclosure of consumers’ personal data through digital technologies. The Division aims to empower consumers in the digital economy, ensure that companies are protecting consumers’ personal data from breach, protect equal and open access to the internet, and protect consumers from data-driven technologies that unlawfully deny them fair access to socioeconomic opportunities. The Division embodies AG Healey’s commitment to continue and grow on this critical work and ensure that data-driven technologies operate lawfully for the benefit of all consumers.
  • A California appeals court ruled that Amazon can be held liable for defective products third parties sell on its website. The appellate court reversed the trial court, which had held Amazon could not be liable.
    • The appeals court recited the facts of the case:
      • Plaintiff Angela Bolger bought a replacement laptop computer battery on Amazon, the popular online shopping website operated by defendant Amazon.com, LLC. The Amazon listing for the battery identified the seller as “E-Life,” a fictitious name used on Amazon by Lenoge Technology (HK) Ltd. (Lenoge). Amazon charged Bolger for the purchase, retrieved the laptop battery from its location in an Amazon warehouse, prepared the battery for shipment in Amazon-branded packaging, and sent it to Bolger. Bolger alleges the battery exploded several months later, and she suffered severe burns as a result.
      • Bolger sued Amazon and several other defendants, including Lenoge. She alleged causes of action for strict products liability, negligent products liability, breach of implied warranty, breach of express warranty, and “negligence/negligent undertaking.”
    • The appeals court continued:
      • Amazon moved for summary judgment. It primarily argued that the doctrine of strict products liability, as well as any similar tort theory, did not apply to it because it did not distribute, manufacture, or sell the product in question. It claimed its website was an “online marketplace” and E-Life (Lenoge) was the product seller, not Amazon. The trial court agreed, granted Amazon’s motion, and entered judgment accordingly.
      • Bolger appeals. She argues that Amazon is strictly liable for defective products offered on its website by third-party sellers like Lenoge. In the circumstances of this case, we agree.
  • The National Institute of Standards and Technology (NIST) issued Special Publication 800-207, “Zero Trust Architecture,” that posits a different conceptual model for an organization’s cybersecurity than perimeter security. NIST claimed:
    • Zero trust security models assume that an attacker is present in the environment and that an enterprise-owned environment is no different—or no more trustworthy—than any nonenterprise-owned environment. In this new paradigm, an enterprise must assume no implicit trust and continually analyze and evaluate the risks to its assets and business functions and then enact protections to mitigate these risks. In zero trust, these protections usually involve minimizing access to resources (such as data and compute resources and applications/services) to only those subjects and assets identified as needing access as well as continually authenticating and authorizing the identity and security posture of each access request.
    • A zero trust architecture (ZTA) is an enterprise cybersecurity architecture that is based on zero trust principles and designed to prevent data breaches and limit internal lateral movement. This publication discusses ZTA, its logical components, possible deployment scenarios, and threats. It also presents a general road map for organizations wishing to migrate to a zero trust design approach and discusses relevant federal policies that may impact or influence a zero trust architecture.
    • ZT is not a single architecture but a set of guiding principles for workflow, system design and operations that can be used to improve the security posture of any classification or sensitivity level [FIPS199]. Transitioning to ZTA is a journey concerning how an organization evaluates risk in its mission and cannot simply be accomplished with a wholesale replacement of technology. That said, many organizations already have elements of a ZTA in their enterprise infrastructure today. Organizations should seek to incrementally implement zero trust principles, process changes, and technology solutions that protect their data assets and business functions by use case. Most enterprise infrastructures will operate in a hybrid zero trust/perimeter-based mode while continuing to invest in IT modernization initiatives and improve organization business processes.
  • The United Kingdom’s Government Communications Headquarters’ (GCHQ) National Cyber Security Centre (NCSC) released “Cyber insurance guidance” “for organisations of all sizes who are considering purchasing cyber insurance…not intended to be a comprehensive cyber insurance buyers guide, but instead focuses on the cyber security aspects of cyber insurance.” The NCSC stated “[i]f you are considering cyber insurance, these questions can be used to frame your discussions…[and] [t]his guidance focuses on standalone cyber insurance policies, but many of these questions may be relevant to cyber insurance where it is included in other policies.”

Further Reading

  • “I downloaded Covidwise, America’s first Bluetooth exposure-notification app. You should, too.” By Geoffrey Fowler – The Washington Post. The paper’s technology columnist blesses the Apple/Google Bluetooth exposure app and claims it protects privacy. One person on Twitter pointed out the Android version will not work unless location services are turned on, which is contrary to the claims made by Google and Apple, an issue the New York Times investigated last month. A number of European nations have pressed Google to remove this feature, and a Google spokesperson claimed the Android Bluetooth tracing capability did not use location services, begging the question why the prompt appears. Moreover, one of the apps Fowler names has had its own privacy issues as detailed by The Washington Post in May. As it turns out, Care19, a contact tracing app developed when the governor of North Dakota asked a friend who had designed an app for football fans to meet up, is violating its own privacy policy according to Jumbo, the maker of privacy software. Apparently, Care19 shares location and personal data with FourSquare when used on iPhones. Both Apple and state officials are at a loss to explain how this went unnoticed when the app was scrubbed for technical and privacy problems before being rolled out.
  • Truss leads China hawks trying to derail TikTok’s London HQ plan” By Dan Sabbagh – The Guardian. ByteDance’s plan to establish a headquarters in London is now under attack by members of the ruling Conservative party for the company’s alleged role in persecuting the Uighur minority in Xinjiang. ByteDance has been eager to move to London and also eager to avoid the treatment that another tech company from the People’s Republic of China has gotten in the United Kingdom (UK): Huawei. Nonetheless, this decision may turn political as the government’s reversal on Huawei and 5G did. Incidentally, if Microsoft does buy part of TikTok, it would be buying operations in four of the five Five Eyes nations but not the UK.
  • Human Rights Commission warns government over ‘dangerous’ use of AI” By Fergus Hunter – The Sydney Morning Herald. A cautionary tale regarding the use of artificial intelligence and algorithms in government decision-making. While this article nominally pertains to Australia’s Human Rights Commission advice to the country’s government, it is based, in large part, on a scandal in which an automated process illegally collected $721 million AUD from welfare beneficiaries. In the view of the Human Rights Commission, decision-making by humans is still preferable and more accurate than automated means.
  • “The Attack That Broke Twitter Is Hitting Dozens of Companies” By Andy Greenberg – WIRED. In the never-ending permutations of hacking, the past has become the present because the Twitter hackers used phone calls to talk their way into gaining access to a number of high-profile accounts (aka phone spear phishing). Other companies are suffering the same onslaught, proving the axiom that people may be the weakest link in cybersecurity. However, the phone calls are based on exacting research and preparation, as hackers scour the internet for information on their targets and the companies themselves. A similar hack was reportedly executed by the Democratic People’s Republic of Korea (DPRK) against Israeli defense firms.
  • “Miami Police Used Facial Recognition Technology in Protester’s Arrest” By Connie Fossi and Phil Prazan – NBC Miami. The Miami Police Department used Clearview AI to identify a protestor who allegedly injured an officer but did not divulge this fact to the accused or her attorney. The department’s policy on facial recognition technology bars officers from making arrests solely on the basis of identification through such a system. Given the error rates many facial recognition systems have experienced with identifying minorities and the use of masks during the pandemic, which further decreases accuracy, it is quite likely people will be wrongfully accused and convicted using this technology.
  • Big Tech’s Domination of Business Reaches New Heights” By Peter Eavis and Steve Lohr – The New York Times. Big tech has gotten larger, more powerful, and more indispensable in the United States (U.S.) during the pandemic, and one needs to go back to the railroads in the late 19th Century to find comparable companies. It is an open question whether their size and influence will change much no matter who is president of the U.S. next year.
  • License plate tracking for police set to go nationwide” By Alfred Ng – c/net. A de facto national license plate reader may soon be activated in the United States (U.S.). Flock Safety unveiled the “Total Analytics Law Officers Network,” (TALON) that will link its systems of cameras in more than 700 cities, allowing police departments to track cars across multiple jurisdictions. As the U.S. has no national laws regulating the use of this and other similar technologies, private companies may set policy for the country in the short term.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Dueling COVID-19 Privacy Bills Released

Democratic stakeholders answer a Republican proposal on how to regulate privacy issues raised by COVID-19 contact tracing. The proposals have little chance of enactment and are mostly about positioning.  

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

Late last week, a group of Democratic stakeholders on privacy and data security issues released the “Public Health Emergency Privacy Act” (S.3749), a bill that serves as a counterpoint to the “COVID-19 Consumer Data Protection Act” (S.3663), legislation introduced a few weeks ago to address the privacy issues raised by contact tracing for COVID-19. However, the Democratic bill contains a number of provisions many Republicans consider non-starters, such as a private right of action for individuals and no preemption of state laws. It is not likely this bill will advance in the Senate even though it may be moved in the House. S. 3749 was introduced by Senators Richard Blumenthal (D-CT) and Mark Warner (D-VA) and Representatives Anna Eshoo (D-CA), Jan Schakowsky (D-IL), and Suzan DelBene (D-WA).

In a way, the “Public Health Emergency Privacy Act” makes the case that “Health Insurance Portability and Accountability Act” (HIPAA) and “Health Information Technology for Economic and Clinical Health Act” (HITECH Act) regulations are inadequate to protect the privacy and data of people in the United States because these regulations apply only to healthcare providers, their business associates, and other named entities in the healthcare system. Entities collecting healthcare information, even very sensitive information, are not subject to HIPAA/HITECH regulations and would likely be regulated, to the extent they are at all, by the Federal Trade Commission (FTC) or state agencies and attorneys general.

The “Public Health Emergency Privacy Act” would cover virtually all entities, including government agencies, except for public health authorities (e.g. a state department of health or the Centers for Disease Control and Prevention), healthcare providers, service providers, and people acting in a household capacity. This remarkably broad definition likely reflects plans announced by governments around the world to eschew Google and Apple’s contact tracing app in favor of developing their own. It would also touch some efforts apart from contact tracing apps. Moreover, this is the first bill introduced recently that proposes to treat public and private entities the same way with respect to how and when they may collect personal data.

The types of data protected under the act are “emergency health data,” defined as “data linked or reasonably linkable to an individual or device, including data inferred or derived about the individual or device from other collected data provided such data is still linked or reasonably linkable to the individual or device, that concerns the public COVID–19 health emergency.” The bill then lists a sweeping and comprehensive set of examples of emergency health data. The term includes “information that reveals the past, present, or future physical or behavioral health or condition of, or provision of healthcare to, an individual” related to testing for COVID-19 and related genetic and biometric information. Geolocation and proximity information would also be covered by this definition. Finally, emergency health data encompasses “any other data collected from a personal device,” which seems to cover all data collection from a person’s phone, the primary means of contact tracing proposed thus far. However, it would appear that data on people collected at the household or higher level may not be subject to this definition and hence much of the bill’s protections.

The authority provided to covered entities is limited. First, collection, use, and disclosure of emergency health data are restricted to good faith public health purposes. And yet, this term is not defined in the act, and the FTC may need to define it during the expedited rulemaking the bill requires. However, it seems a fairly safe assumption the agency and courts would not construe using emergency health data for advertising as a good faith public health purpose. Next, covered entities must allow people to correct inaccurate information, and the entity itself has the duty to take reasonable efforts on its own to correct this information. Covered entities must implement reasonable safeguards to prevent discrimination on the basis of these data, which is a new obligation placed on covered entities in a privacy or data security bill. This provision may have been included to bar government agencies from collecting and using covered data for discriminatory purposes. However, critics of more expansive bills may see this as the camel’s nose under the tent for future privacy and data security bills. Finally, the only government agencies that may legally be provided emergency health data are public health authorities, and then only “for good faith public health purposes and in direct response to exigent circumstances.” Again, the limits of good faith public health purposes remain unclear, and such disclosure can occur only in exigent circumstances, which likely means when a person has contracted or been exposed to COVID-19.

Covered entities and service providers must implement appropriate measures to protect the security and confidentiality of emergency health data, but the third member of the usual triumvirate, availability, is not identified as requiring protection.

The “Public Health Emergency Privacy Act” outright bars a number of uses for emergency health data:

  • Commercial advertising, recommendations for e-commerce, or machine learning for these purposes;
  • Any discriminatory practices related to employment, finance, credit, insurance, housing, or education opportunities; and
  • Discriminating with respect to goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation

It bears noting that covered data cannot be collected or disclosed for these purposes either.

The act contains very strong consent language. Covered entities may not collect, use, or disclose emergency health data unless a person has provided express affirmative consent, which requires knowing, informed choice that cannot be obtained through the use of deceptive practices nor inferred through a person’s inaction. However, there are significant exceptions to this consent requirement that would allow covered entities to collect, use, or disclose these data, including

  • protecting against malicious, deceptive, fraudulent, or illegal activity;
  • detecting, responding to, or preventing information security incidents or threats; or
  • complying with a legal obligation.

But these purposes are valid only when necessary and solely for the fulfillment of the named purpose.

In a related vein, covered entities must allow people to revoke their consent, and revocation must take effect as soon as practicable but no later than 15 days afterward. Moreover, the person’s emergency health data must be destroyed or rendered not linkable within 30 days of revocation of consent.

Covered entities must provide clear and conspicuous notice before or at the point of data collection that explains how and why the emergency health data is collected, used, and disclosed. Moreover, it must also disclose the categories of recipients to whom a covered entity discloses covered data. This notice must also disclose the covered entity’s data retention policies and data security policies and practices but only with respect to emergency health data. The notice must also inform consumers on how they may exercise rights under the act and how to file a complaint with the FTC.

There would also be a public reporting requirement for those covered entities collecting, using, or disclosing the emergency health data of 100,000 or more people. Every three months, these entities would need to report aggregate figures on the number of people whose data was collected, used, and disclosed. These reports must also detail the categories of emergency health data collected, used, and disclosed and the categories of third parties with whom such data are shared.

The bill requires covered entities to destroy or render not linkable emergency health data 60 days after the Secretary of Health and Human Services declares the end of the COVID-19 public health emergency (or a state does so), or 60 days after collection.

The FTC would be given the daunting task of beginning a rulemaking 7 days after enactment “to ensure a covered organization that has collected, used, or disclosed emergency health data before the date of enactment of this Act is in compliance with this Act, to the degree practicable” that must be completed within 45 days. There is also a provision requiring the Department of Health and Human Services (HHS) to issue guidance so that HIPAA/HITECH Act regulated entities do not need to meet duplicative requirements; moreover, the bill exempts these entities when operating under the HIPAA/HITECH Act regulations.

The FTC, state attorneys general, and individuals would be able to enforce this new act through a variety of means. The FTC would treat violations as if they were violations of a regulation barring a deceptive or unfair practice, meaning the agency could levy fines of more than $43,000 per violation in the first instance. The FTC could seek a range of other relief, including injunctions, restitution, disgorgement, and remediation. However, the FTC could go to federal court without having to consult with the Department of Justice, which is a departure from current law and almost all the privacy bills introduced in this Congress. The FTC’s jurisdiction would be broadened to include common carriers and non-profits for purposes of this bill, too. The FTC would receive normal notice and comment rulemaking authority that is very broad, as the agency would be free to promulgate any regulations it sees as necessary to effectuate this act.

State attorneys general would be able to seek all the same relief the FTC can so long as the latter is not already bringing an action. Moreover, any other state officials empowered by state statute to bring similar actions may do so for violations of this act.

People would be allowed to sue in either federal or state court over violations. The bill states that any violation is to be considered an injury in fact, to forestall any court from finding that a violation does not injure the person such that her suit cannot proceed. The act would allow people to recover between $100 and $1,000 for any negligent violation and between $500 and $5,000 for any reckless, willful, or intentional violation, with no cap on total damages. People may also ask for and receive reasonable attorney’s fees and costs and any equitable or declaratory relief a court sees fit to grant. Moreover, the bill would disallow all pre-lawsuit arbitration agreements or waivers of rights. Much of the right to sue granted to people by the “Public Health Emergency Privacy Act” will be opposed by many Republicans and industry stakeholders.

The enforcement section raises a few questions given that entities covered by the bill include government agencies. Presumably, the FTC could not enforce this act against government agencies for the very good reason that it does not have jurisdiction over them. However, does the private right of action waive the federal and state governments’ sovereign immunity? This is not clear and may need clarification if the bill is acted upon in either chamber of Congress.

This bill would not preempt state laws, which, if enacted, could subject covered entities to meeting more than one regime for collecting, using, and disclosing emergency health data.

Apart from contact tracing, the “Public Health Emergency Privacy Act” also bars the use of emergency health data to abridge a person’s right to vote, and it requires HHS, the United States Commission on Civil Rights, and the FTC to prepare and submit to Congress reports examining “the civil rights impact of the collection, use, and disclosure of health information in response to the COVID–19 public health emergency.”

Given the continued impasse over privacy legislation, it is little wonder that the bill unveiled a few weeks ago by four key Senate Republicans takes a broadly similar approach while differing in key aspects. Of course, there is no private right of action, and the bill expressly preempts contrary state laws.

Generally speaking, the structure of the “COVID–19 Consumer Data Protection Act of 2020” (S.3663) tracks with the bills that have been released thus far by the four sponsors: Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS) (See here for analysis of the “Consumer Data Privacy Act of 2019”) and Senators John Thune (R-SD), Jerry Moran (R-KS) (See here for analysis of the “Consumer Data Privacy and Security Act of 2020” (S.3456)), and Marsha Blackburn (R-TN) (See here for analysis of the “Balancing the Rights Of Web Surfers Equally and Responsibly Act of 2019” (BROWSER Act) (S. 1116)). In short, people would be provided with notice about what information the app collects, how it is processed, and with whom and under what circumstances this information will be shared. Then a person would be free to make an informed choice about whether or not she wants to consent and allow the app or technology to operate on her smartphone. The FTC and state attorneys general would enforce the new protections.

The scope of the information and entities covered is narrower than the Democratic bill. “Covered data” is “precise geolocation data, proximity data, a persistent identifier, and personal health information” but not aggregated data, business contact information, de-identified data, employee screening data, and publicly available information. Those entities covered by the bill are those already subject to FTC jurisdiction along with common carriers and non-profits, but government agencies are not, again unlike the bill put forth by Democrats. Entities must abide by this bill to the extent they “collect[], process[], or transfer[] such covered data, or determine[] the means and purposes for the collection, processing, or transfer of covered data.”

Another key definition is how an “individual” is determined, for it excludes any person acting in her role as an employee and almost all work-related capacities, making clear employers will not need to comply with respect to those working for them.

“Personal health information” is “information relating to an individual that is

  • genetic information of the individual; or
  • information relating to the diagnosis or treatment of past, present, or future physical, mental health, or disability of the individual; and
  • identifies, or is reasonably linkable to, the individual.”

And yet, this term excludes educational information subject to the “Family Educational Rights and Privacy Act of 1974” and information subject to regulations promulgated pursuant to the HIPAA/HITECH Acts.

The “COVID–19 Consumer Data Protection Act of 2020” bars the collection, processing, or transferring of covered data for a covered purpose unless prior notice is provided, a person has provided affirmative consent, and the covered entity has agreed not to collect, process, or transfer such covered data for any purpose other than those detailed in the notice. However, leaving aside the bill’s enumerated allowable purposes for which covered data may be collected with consent, the bill provides a number of exemptions from this general bar. For example, collection, processing, or transfers necessary for a covered entity to comply with another law are permissible apparently in the absence of a person’s consent. Moreover, a covered entity need not obtain consent for operational or administrative tasks not disclosed in the notice provided to people.

The act does spell out “covered purposes” for which covered entities may collect, process, or transfer covered data with consent after notice has been given:

  • Collecting, processing, or transferring the covered data of an individual to track the spread, signs, or symptoms of COVID–19.
  • Collecting, processing, or transferring the covered data of an individual to measure compliance with social distancing guidelines or other requirements related to COVID–19 that are imposed on individuals under a Federal, State, or local government order.
  • Collecting, processing, or transferring the covered data of an individual to conduct contact tracing for COVID–19 cases.

Covered entities would be required to publish a privacy policy detailing for which of the above covered purposes a person’s covered data would be collected, processed, or transferred. This policy must also detail the categories of entities that receive covered data as well as the covered entity’s data retention and data security policies.

There would be reporting requirements that would affect more covered entities than the Democratic bill’s. Accordingly, any covered entity collecting, processing, or transferring covered data for one of the enumerated covered purposes must issue a public report 30 days after enactment and then every 60 days thereafter.

Among other provisions in the bill, people would be able to revoke consent, a request that covered entities must honor within 14 days. Covered entities must also ensure the covered data are accurate, but this requirement falls a bit short of granting people the right to correct inaccurate data, as they would, instead, merely be able to report inaccuracies. There is no recourse if a covered entity chooses not to heed these reports. Covered entities would need to “delete or de-identify all covered data collected, processed, or transferred for a [covered purpose] when it is no longer being used for such purpose and is no longer necessary to comply with a Federal, State, or local legal obligation, or the establishment, exercise, or defense of a legal claim.” Even though there is the commandment to delete or de-identify, the timing as to when that happens seems somewhat open-ended, as some covered entities could find legal obligations to meet in order to keep the data on hand.

Covered entities must also minimize covered data by limiting collection, processing, and transferring to “what is reasonably necessary, proportionate, and limited to carry out [a covered purpose.]” The FTC must draft and release guidelines “recommending best practices for covered entities to minimize the collection, processing, and transfer of covered data.” However, these guidelines would most likely be advisory in nature and would not carry the force of law or regulation, leaving covered entities free to disregard some of the document if they choose. Covered entities must “establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity of such data,” a mandate broader than the duty imposed by the Democratic bill.

The FTC and state attorneys general would enforce the new regime. The FTC would be able to seek civil fines for first violations in the same way as under the Democrats’ privacy bill. However, unlike the other bill, the Republican bill would nullify any Federal Communications Commission (FCC) jurisdiction to the extent it conflicted with the FTC’s in enforcement of the new act. Presumably, this would address jurisdictional issues raised by placing common carriers under FTC jurisdiction when they are usually overseen by the FCC. Even though state laws are preempted, state attorneys general would be able to bring actions to enforce this act at the state level. And, as noted earlier, there is no private right of action.


Senate Commerce Republicans Vow To Introduce Privacy Bill To Govern COVID-19 Apps and Tech

Key Republican stakeholders on privacy legislation float a bill on COVID-19 relating to privacy that seems unlikely to garner the necessary Democratic buy-in to advance.  

Late last week, key Republicans on the Senate Commerce, Science, and Transportation Committee announced they would introduce the “COVID-19 Consumer Data Protection Act,” which would provide new privacy and data security protections for the use of COVID-19 contact tracing apps and similar technologies. To date, the text of the legislation has not been released, and so any analysis of the bill is derived from a short summary issued by the committee and reports from media outlets that have apparently been provided a copy of the bill.

Based on this information, to no great surprise, the basic structure of the bill tracks privacy and data protection legislation previously introduced by the co-sponsors of the new bill: Chair Roger Wicker (R-MS) (See here for analysis of the “Consumer Data Privacy Act of 2019”) and Senators John Thune (R-SD), Jerry Moran (R-KS) (See here for analysis of the “Consumer Data Privacy and Security Act of 2020” (S.3456)), and Marsha Blackburn (R-TN) (See here for analysis of the “Balancing the Rights Of Web Surfers Equally and Responsibly Act of 2019” (BROWSER Act) (S. 1116)). In short, people would be provided with notice about what information the app collects, how it is processed, and with whom and under what circumstances this information will be shared. Then a person would be free to make an informed choice about whether or not she wants to consent and allow the app or technology to operate on her smartphone. The Federal Trade Commission (FTC) and state attorneys general would enforce the new protections; as there was no mention of a private right of action, and given these Members’ opposition to such provisions, it is likely the bill does not provide such redress. Moreover, according to media reports, the bill would preempt contrary state laws, which would be another likely non-starter among Democrats.

In their press release, Wicker, Thune, Moran, and Blackburn claimed their bill “would provide all Americans with more transparency, choice, and control over the collection and use of their personal health, geolocation, and proximity data…[and] would also hold businesses accountable to consumers if they use personal data to fight the COVID-19 pandemic.”

Wicker, Thune, Moran, and Blackburn provided this summary of the “COVID-19 Consumer Data Protection Act:”

  • Require companies under the jurisdiction of the Federal Trade Commission to obtain affirmative express consent from individuals to collect, process, or transfer their personal health, geolocation, or proximity information for the purposes of tracking the spread of COVID-19.
  • Direct companies to disclose to consumers at the point of collection how their data will be handled, to whom it will be transferred, and how long it will be retained.
  • Establish clear definitions about what constitutes aggregate and de-identified data to ensure companies adopt certain technical and legal safeguards to protect consumer data from being re-identified.
  • Require companies to allow individuals to opt out of the collection, processing, or transfer of their personal health, geolocation, or proximity information.
  • Direct companies to provide transparency reports to the public describing their data collection activities related to COVID-19.
  • Establish data minimization and data security requirements for any personally identifiable information collected by a covered entity.
  • Require companies to delete or de-identify all personally identifiable information when it is no longer being used for the COVID-19 public health emergency.
  • Authorize state attorneys general to enforce the Act.

If such legislation were to pass, it would add to the patchwork of privacy and data security laws already enacted that are geared to certain sectors or populations (e.g. the “Health Insurance Portability and Accountability Act” (HIPAA) protects some healthcare information, and the “Children’s Online Privacy Protection Act” (COPPA) broadly protects children online).


Senate Democrats Release Privacy Principles

The ranking members of four Senate Committees have released their principles for any privacy legislation, many of which are likely to be rejected by Republicans and many industry stakeholders (e.g. no preemption of the “California Consumer Privacy Act” (AB 375) and a private right of action for consumers).

Nonetheless, Senators Maria Cantwell (D-WA), Dianne Feinstein (D-CA), Patty Murray (D-WA), and Sherrod Brown (D-OH) agreed to these principles, and reportedly Senate Minority Leader Chuck Schumer (D-NY) convened and facilitated the effort, which has come ahead of the release of any of the privacy bills that have been under development this year in the Senate.

Of course, the Senate Commerce, Science, and Transportation Committee had convened an informal working group late last year consisting of Cantwell, Chair Roger Wicker (R-MS) and Senators John Thune (R-SD), Jerry Moran (R-KS), Brian Schatz (D-HI), and Richard Blumenthal (D-CT) to hash out a privacy bill. However, like most other such efforts, the timeline for releasing bill text has been repeatedly pushed back even after Wicker and Cantwell tried working by themselves on a bill late in the summer. Additionally, Moran and Blumenthal, the chair and ranking member of the Manufacturing, Trade, and Consumer Protection Subcommittee, have been working on a bill for some time as well but without a timeline for releasing text.

And, the efforts at this committee are in parallel to those in other committees. Senate Judiciary Committee Chair Lindsey Graham (R-SC) has gotten his committee onto the field with hearings on the subject and has articulated his aim to play a role in crafting a bill. Likewise, the Senate Banking Committee has held hearings and is looking to participate in the process as well. But, as with Senate Commerce, no bills have been released.

Of course, it is easier to write out one’s principles than to draft legislation. And yet, the release of these desired policies elegantly puts down a marker for Senate Democrats at a time when the majority in the chamber is struggling to coalesce and release a privacy bill. The move also demonstrates cohesion among the top Democrats on four of the committees with a slice of jurisdiction over privacy and data security issues: Commerce, Banking, HELP, and Judiciary.