EDPB Issues FAQs On Privacy Shield Decision

While the EDPB does not provide definitive answers on how US entities looking to transfer EU personal data should proceed, the board offers its best thinking on what the path forward looks like.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

On 24 July, the European Data Protection Board (EDPB) addressed, in part, the implications of the recent decision that struck down the European Union-United States Privacy Shield, an agreement that had allowed US companies to transfer and process the personal data of EU citizens. The EDPB fully endorsed the view that the United States’ (US) surveillance regime, notably Section 702 of the “Foreign Intelligence Surveillance Act” (FISA) and Executive Order (EO) 12333, makes most transfers to the US illegal unless entities holding and using the data take extra steps to protect it. The EDPB references another means that may allow transfers to continue, but it generally requires informed and explicit consent from each EU person involved. Finally, the EDPB does not address whether the European Commission (EC) and the US can execute a third agreement that would be legal under EU law.

The EDPB, which is comprised of the European Union’s (EU) data protection authorities (DPAs), has formally adopted a document spelling out its view on whether data transfers to the US under Privacy Shield are still legal and how companies should proceed in using standard contractual clauses (SCCs) and Binding Corporate Rules (BCRs), two alternative means of transferring data aside from Privacy Shield. The EDPB’s views suggest the DPAs and supervisory authorities (SAs) in each EU nation will need to work on a case-by-case basis regarding the latter two means, for the EDPB stressed these are to be evaluated individually. Given recent criticism of how nations are funding and resourcing their DPAs, there may be capacity issues in managing this new work alongside existing enforcement and investigation matters. Moreover, the EDPB discusses use of the exceptions available in Article 49 of the General Data Protection Regulation (GDPR), stressing that most such transfers are to be occasional.

In last week’s decision, the Court of Justice of the European Union (CJEU) invalidated the European Commission’s adequacy decision on the EU-US Privacy Shield, thus throwing into question all transfers of personal data from the EU into the US that relied on this means. The CJEU was more circumspect in ruling on the use of standard contractual clauses (SCCs), another way to legally transfer personal data out of the EU in compliance with the bloc’s law. The court seems to suggest there may be cases in which the use of SCCs is inadequate given a country’s insufficient protections of the data of EU residents, especially with respect to national security and law enforcement surveillance. The EDPB issued a statement supporting the CJEU when the decision was handed down but has now adopted a more detailed explanation of its views on the implications of the decision for data controllers, data processors, other nations, and EU DPAs and SAs.

In “Frequently Asked Questions (FAQ) on the judgment of the CJEU in Case C-311/18 - Data Protection Commissioner v Facebook Ireland Ltd and Maximillian Schrems,” the EDPB explains its current thinking on the decision, much of which is built on existing guidance and interpretation of the GDPR. The EDPB explained that the FAQ “aims at presenting answers to some frequently asked questions received by SAs and will be developed and complemented along with further analysis, as the EDPB continues to examine and assess the judgment of the CJEU.”

Here are notable excerpts:

  • Is there any grace period during which I can keep on transferring data to the U.S. without assessing my legal basis for the transfer? No, the Court has invalidated the Privacy Shield Decision without maintaining its effects, because the U.S. law assessed by the Court does not provide an essentially equivalent level of protection to the EU. This assessment has to be taken into account for any transfer to the U.S.
  • I was transferring data to a U.S. data importer adherent to the Privacy Shield, what should I do now? Transfers on the basis of this legal framework are illegal. Should you wish to keep on transferring data to the U.S., you would need to check whether you can do so under the conditions laid down below.
  • I am using SCCs with a data importer in the U.S., what should I do? The Court found that U.S. law (i.e., Section 702 FISA and EO 12333) does not ensure an essentially equivalent level of protection. Whether or not you can transfer personal data on the basis of SCCs will depend on the result of your assessment, taking into account the circumstances of the transfers, and supplementary measures you could put in place. The supplementary measures along with SCCs, following a case-by-case analysis of the circumstances surrounding the transfer, would have to ensure that U.S. law does not impinge on the adequate level of protection they guarantee. If you come to the conclusion that, taking into account the circumstances of the transfer and possible supplementary measures, appropriate safeguards would not be ensured, you are required to suspend or end the transfer of personal data. However, if you are intending to keep transferring data despite this conclusion, you must notify your competent SA.
  • I am using Binding Corporate Rules (“BCRs”) with an entity in the U.S., what should I do? Given the judgment of the Court, which invalidated the Privacy Shield because of the degree of interference created by the law of the U.S. with the fundamental rights of persons whose data are transferred to that third country, and the fact that the Privacy Shield was also designed to bring guarantees to data transferred with other tools such as BCRs, the Court’s assessment applies as well in the context of BCRs, since U.S. law will also have primacy over this tool.
  • Whether or not you can transfer personal data on the basis of BCRs will depend on the result of your assessment, taking into account the circumstances of the transfers, and supplementary measures you could put in place. These supplementary measures along with BCRs, following a case-by-case analysis of the circumstances surrounding the transfer, would have to ensure that U.S. law does not impinge on the adequate level of protection they guarantee. If you come to the conclusion that, taking into account the circumstances of the transfer and possible supplementary measures, appropriate safeguards would not be ensured, you are required to suspend or end the transfer of personal data. However if you are intending to keep transferring data despite this conclusion, you must notify your competent SA.
  • Can I rely on one of the derogations of Article 49 GDPR to transfer data to the U.S.? It is still possible to transfer data from the EEA to the U.S. on the basis of derogations foreseen in Article 49 GDPR provided the conditions set forth in this Article apply. The EDPB refers to its guidelines on this provision. In particular, it should be recalled that when transfers are based on the consent of the data subject, it should be:
    • explicit,
    • specific for the particular data transfer or set of transfers (meaning that the data exporter must make sure to obtain specific consent before the transfer is put in place even if this occurs after the collection of the data has been made), and
    • informed, particularly as to the possible risks of the transfer (meaning the data subject should also be informed of the specific risks resulting from the fact that their data will be transferred to a country that does not provide adequate protection and that no adequate safeguards aimed at providing protection for the data are being implemented).
  • With regard to transfers necessary for the performance of a contract between the data subject and the controller, it should be borne in mind that personal data may only be transferred when the transfer is occasional. It would have to be established on a case-by-case basis whether data transfers would be determined as “occasional” or “non-occasional”. In any case, this derogation can only be relied upon when the transfer is objectively necessary for the performance of the contract.
  • In relation to transfers necessary for important reasons of public interest (which must be recognized in EU or Member States’ law), the EDPB recalls that the essential requirement for the applicability of this derogation is the finding of an important public interest and not the nature of the organisation, and that although this derogation is not limited to data transfers that are “occasional”, this does not mean that data transfers on the basis of the important public interest derogation can take place on a large scale and in a systematic manner. Rather, the general principle needs to be respected according to which the derogations as set out in Article 49 GDPR should not become “the rule” in practice, but need to be restricted to specific situations and each data exporter needs to ensure that the transfer meets the strict necessity test.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Maret H. from Pixabay

Further Reading and Other Developments (17 July)

The Technology Policy Update is being published daily during the week, and here are the Other Developments and Further Reading from this week.

Other Developments

  • Acting Senate Intelligence Committee Chair Marco Rubio (R-FL), Senate Foreign Relations Committee Chair Jim Risch (R-ID), and Senators Chris Coons (D-DE) and John Cornyn (R-TX) wrote Secretary of Commerce Wilbur Ross and Secretary of Defense Mark Esper “to ask that the Administration take immediate measures to bring the most advanced digital semiconductor manufacturing capabilities to the United States…[which] are critical to our American economic and national security and while our nation leads in the design of semiconductors, we rely on international manufacturing for advanced semiconductor fabrication.” This letter follows the Trump Administration’s May announcement that the Taiwan Semiconductor Manufacturing Company (TSMC) agreed to build a $12 billion plant in Arizona. It also bears noting that one of the amendments pending to the “National Defense Authorization Act for Fiscal Year 2021” (S.4049) would establish a grants program to stimulate semiconductor manufacturing in the US.
  • Senators Mark R. Warner (D-VA), Mazie K. Hirono (D-HI) and Bob Menendez (D-NJ) sent a letter to Facebook “regarding its failure to prevent the propagation of white supremacist groups online and its role in providing such groups with the organizational infrastructure and reach needed to expand.” They also “criticized Facebook for being unable or unwilling to enforce its own Community Standards and purge white supremacist and other violent extremist content from the site” and posed “a series of questions regarding Facebook’s policies and procedures against hate speech, violence, white supremacy and the amplification of extremist content.”
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) published the Pipeline Cyber Risk Mitigation Infographic, “[d]eveloped in coordination with the Transportation Security Administration (TSA),” which “outlines activities that pipeline owners/operators can undertake to improve their ability to prepare for, respond to, and mitigate against malicious cyber threats.”
  • Representative Kendra Horn (D-OK) and 10 other Democrats introduced legislation “requiring the U.S. government to identify, analyze, and combat efforts by the Chinese government to exploit the COVID-19 pandemic” that was endorsed by “[t]he broader Blue Dog Coalition” according to their press release. The “Preventing China from Exploiting COVID-19 Act” (H.R.7484) “requires the Director of National Intelligence—in coordination with the Secretaries of Defense, State, and Homeland Security—to prepare an assessment of the different ways in which the Chinese government has exploited or could exploit the pandemic, which originated in China, in order to advance China’s interests and to undermine the interests of the United States, its allies, and the rules-based international order.” Horn and her cosponsors stated “[t]he assessment must be provided to Congress within 90 days and posted in unclassified form on the DNI’s website.”
  • The Supreme Court of Canada upheld the “Genetic Non-Discrimination Act” and denied a challenge to the legality of the statute brought by the government of Quebec, the Attorney General of Canada, and others. The court found:
    • The pith and substance of the challenged provisions is to protect individuals’ control over their detailed personal information disclosed by genetic tests, in the broad areas of contracting and the provision of goods and services, in order to address Canadians’ fears that their genetic test results will be used against them and to prevent discrimination based on that information. This matter is properly classified within Parliament’s power over criminal law. The provisions are supported by a criminal law purpose because they respond to a threat of harm to several overlapping public interests traditionally protected by the criminal law — autonomy, privacy, equality and public health.
  • The U.S.-China Economic and Security Review Commission published a report “analyzing the evolution of U.S. multinational enterprises (MNE) operations in China from 2000 to 2017.” The Commission found MNEs’ operations in the People’s Republic of China “may indirectly erode the United States’ domestic industrial competitiveness and technological leadership relative to China” and “as U.S. MNE activity in China increasingly focuses on the production of high-end technologies, the risk that U.S. firms are unwittingly enabling China to achieve its industrial policy and military development objectives rises.”
  • The Federal Communications Commission (FCC) and Huawei filed their final briefs in their lawsuit before the United States Court of Appeals for the Fifth Circuit arising from the FCC’s designation of Huawei as a “covered company” for purposes of a rule that denies Universal Service Funds (USF) “to purchase or obtain any equipment or services produced or provided by a covered company posing a national security threat to the integrity of communications networks or the communications supply chain.” Huawei claimed in its brief that “[t]he rulemaking and “initial designation” rest on the FCC’s national security judgments…[b]ut such judgments fall far afield of the FCC’s statutory authority and competence.” Huawei also argued “[t]he USF rule, moreover, contravenes the Administrative Procedure Act (APA) and the Due Process Clause.” The FCC responded in its filing that “Huawei challenges the FCC’s decision to exclude carriers whose networks are vulnerable to foreign interference, contending that the FCC has neither statutory nor constitutional authority to make policy judgments involving “national security”…[but] [t]hese arguments are premature, as Huawei has not yet been injured by the Order.” The FCC added “Huawei’s claim that the Communications Act textually commits all policy determinations with national security implications to the President is demonstrably false.”
  • European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski released his Strategy for 2020-2024, “which will focus on Digital Solidarity.” Wiewiórowski explained that “three core pillars of the EDPS strategy outline the guiding actions and objectives for the organisation to the end of 2024:
    • Foresight: The EDPS will continue to monitor legal, social and technological advances around the world and engage with experts, specialists and data protection authorities to inform its work.
    • Action: To strengthen the EDPS’ supervision, enforcement and advisory roles, the EDPS will promote coherence in the activities of enforcement bodies in the EU and develop tools to assist the EU institutions, bodies and agencies to maintain the highest standards in data protection.
    • Solidarity: While promoting digital justice and privacy for all, the EDPS will also enforce responsible and sustainable data processing, to positively impact individuals and maximise societal benefits in a just and fair way.
  • Facebook released its Civil Rights Audit, an “investigation into Facebook’s policies and practices [that] began in 2018 at the behest and encouragement of the civil rights community and some members of Congress.” Those charged with conducting the audit explained that they “vigorously advocated for more and would have liked to see the company go further to address civil rights concerns in a host of areas that are described in detail in the report,” including but not limited to:
    • A stronger interpretation of its voter suppression policies — an interpretation that makes those policies effective against voter suppression and prohibits content like the Trump voting posts — and more robust and more consistent enforcement of those policies leading up to the US 2020 election.
    • More visible and consistent prioritization of civil rights in company decision-making overall.
    • More resources invested to study and address organized hate against Muslims, Jews and other targeted groups on the platform.
    • A commitment to go beyond banning explicit references to white separatism and white nationalism to also prohibit express praise, support and representation of white separatism and white nationalism even where the terms themselves are not used.
    • More concrete action and specific commitments to take steps to address concerns about algorithmic bias or discrimination.
    • They added that “[t]his report outlines a number of positive and consequential steps that the company has taken, but at this point in history, the Auditors are concerned that those gains could be obscured by the vexing and heartbreaking decisions Facebook has made that represent significant setbacks for civil rights.”
  • The National Security Commission on Artificial Intelligence (NSCAI) released a white paper titled “The Role of AI Technology in Pandemic Response and Preparedness” that “outlines a series of investments and initiatives that the United States must undertake to realize the full potential of AI to secure our nation against pandemics.” NSCAI noted it had released two previous white papers.
  • Secretary of Defense Mark Esper announced that Chief Technology Officer Michael J.K. Kratsios has “been designated to serve as Acting Under Secretary of Defense for Research and Engineering” even though he does not have a degree in science; the last Under Secretary held a PhD. However, Kratsios worked for venture capitalist Peter Thiel, who backed President Donald Trump when he ran for office in 2016.
  • The United States Department of Transportation’s Federal Railroad Administration (FRA) issued research “to develop a cyber security risk analysis methodology for communications-based connected railroad technologies…[and] [t]he use-case-specific implementation of the methodology can identify potential cyber attack threats, system vulnerabilities, and consequences of the attack – with risk assessment and identification of promising risk mitigation strategies.”
  • In a blog post, a National Institute of Standards and Technology (NIST) economist asserted cybercrime may be having a much larger impact on the United States’ economy than previously thought:
    • In a recent NIST report, I looked at losses in the U.S. manufacturing industry due to cybercrime by examining an underutilized dataset from the Bureau of Justice Statistics, which is the most statistically reliable data that I can find. I also extended this work to look at the losses in all U.S. industries. The data is from a 2005 survey of 36,000 businesses with 8,079 responses, which is also by far the largest sample that I could identify for examining aggregated U.S. cybercrime losses. Using this data, combined with methods for examining uncertainty in data, I extrapolated upper and lower bounds, putting 2016 U.S. manufacturing losses to be between 0.4% and 1.7% of manufacturing value-added or between $8.3 billion and $36.3 billion. The losses for all industries are between 0.9% and 4.1% of total U.S. gross domestic product (GDP), or between $167.9 billion and $770.0 billion. The lower bound is 40% higher than the widely cited, but largely unconfirmed, estimates from McAfee.
  • The Government Accountability Office (GAO) advised the Federal Communications Commission (FCC) that it needs a comprehensive strategy for implementing 5G across the United States. The GAO concluded:
    • FCC has taken a number of actions regarding 5G deployment, but it has not clearly developed specific and measurable performance goals and related measures–with the involvement of relevant stakeholders, including National Telecommunications and Information Administration (NTIA)–to manage the spectrum demands associated with 5G deployment. This makes FCC unable to demonstrate whether the progress being made in freeing up spectrum is achieving any specific goals, particularly as it relates to congested mid-band spectrum. Additionally, without having established specific and measurable performance goals with related strategies and measures for mitigating 5G’s potential effects on the digital divide, FCC will not be able to assess the extent to which its actions are addressing the digital divide or what actions would best help all Americans obtain access to wireless networks.
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) issued “Time Guidance for Network Operators, Chief Information Officers, and Chief Information Security Officers,” intended “to inform public and private sector organizations, educational institutions, and government agencies on time resilience and security practices in enterprise networks and systems…[and] to address gaps in available time testing practices, increasing awareness of time-related system issues and the linkage between time and cybersecurity.”
  • Fifteen Democratic Senators sent a letter to the Department of Defense, Office of the Director of National Intelligence (ODNI), Department of Homeland Security (DHS), Federal Bureau of Investigation (FBI), and U.S. Cyber Command, urging them “to take additional measures to fight influence campaigns aimed at disenfranchising voters, especially voters of color, ahead of the 2020 election.” They called on these agencies to take additional measures to ensure that:
    • The American people and political candidates are promptly informed about the targeting of our political processes by foreign malign actors, and that the public is provided regular periodic updates about such efforts leading up to the general election.
    • Members of Congress and congressional staff are appropriately and adequately briefed on continued findings and analysis involving election related foreign disinformation campaigns and the work of each agency and department to combat these campaigns.
    • Findings and analysis involving election related foreign disinformation campaigns are shared with civil society organizations and independent researchers to the maximum extent which is appropriate and permissible.
    • Secretary Esper and Director Ratcliffe implement a social media information sharing and analysis center (ISAC) to detect and counter information warfare campaigns across social media platforms as authorized by section 5323 of the Fiscal Year 2020 National Defense Authorization Act.
    • Director Ratcliffe implement the Foreign Malign Influence Response Center to coordinate a whole of government approach to combatting foreign malign influence campaigns as authorized by section 5322 of the Fiscal Year 2020 National Defense Authorization Act.
  • The Information Technology and Innovation Foundation (ITIF) unveiled an issue brief “Why New Calls to Subvert Commercial Encryption Are Unjustified” arguing “that government efforts to subvert encryption would negatively impact individuals and businesses.” ITIF offered these “key takeaways:”
    • Encryption gives individuals and organizations the means to protect the confidentiality of their data, but it has interfered with law enforcement’s ability to prevent and investigate crimes and foreign threats.
    • Technological advances have long frustrated some in the law enforcement community, giving rise to multiple efforts to subvert commercial use of encryption, from the Clipper Chip in the 1990s to the San Bernardino case two decades later.
    • Having failed in these prior attempts to circumvent encryption, some law enforcement officials are now calling on Congress to invoke a “nuclear option”: legislation banning “warrant-proof” encryption.
    • This represents an extreme and unjustified measure that would do little to take encryption out of the hands of bad actors, but it would make commercial products less secure for ordinary consumers and businesses and damage U.S. competitiveness.
  • The White House released an executive order in which President Donald Trump determined “that the Special Administrative Region of Hong Kong (Hong Kong) is no longer sufficiently autonomous to justify differential treatment in relation to the People’s Republic of China (PRC or China) under the particular United States laws and provisions thereof set out in this order.” Trump further determined “the situation with respect to Hong Kong, including recent actions taken by the PRC to fundamentally undermine Hong Kong’s autonomy, constitutes an unusual and extraordinary threat, which has its source in substantial part outside the United States, to the national security, foreign policy, and economy of the United States…[and] I hereby declare a national emergency with respect to that threat.” The executive order would continue the Administration’s process of changing policy to ensure Hong Kong is treated the same as the PRC.
  • President Donald Trump also signed a bill enacted in response to the People’s Republic of China’s (PRC) passage of legislation that the United States and others claim will strip Hong Kong of the protections the PRC agreed to maintain for 50 years after the United Kingdom (UK) handed over the city. The “Hong Kong Autonomy Act” “requires the imposition of sanctions on Chinese individuals and banks who are included in an annual State Department list found to be subverting Hong Kong’s autonomy” according to the bill’s sponsor, Representative Brad Sherman (D-CA).
  • Representative Stephen Lynch (D-MA), who chairs the House Oversight and Reform Committee’s National Security Subcommittee, sent letters to Apple and Google “after the Office of the Director of National Intelligence (ODNI) and the Federal Bureau of Investigation (FBI) confirmed that mobile applications developed, operated, or owned by foreign entities, including China and Russia, could potentially pose a national security risk to American citizens and the United States” according to his press release, which noted that:
    • Apple confirmed that it does not require developers to submit “information on where user data (if any such data is collected by the developer’s app) will be housed” and that it “does not decide what user data a third-party app can access, the user does.”
    • Google stated that it does “not require developers to provide the countries in which their mobile applications will house user data” and acknowledged that “some developers, especially those with a global user base, may store data in multiple countries.”
    • Lynch is seeking “commitments from Apple and Google to require information from application developers about where user data is stored, and to make users aware of that information prior to downloading the application on their mobile devices.”
  • Minnesota Attorney General Keith Ellison announced a settlement with Frontier Communications that “concludes the three major investigations and lawsuits that the Attorney General’s office launched into Minnesota’s major telecoms providers for deceptive, misleading, and fraudulent practices.” The Office of the Attorney General (OAG) stated:
    • Based on its investigation, the Attorney General’s Office alleged that Frontier used a variety of deceptive and misleading practices to overcharge its customers, such as: billing customers more than they were quoted by Frontier’s agents; failing to disclose fees and surcharges in its sales presentations and advertising materials; and billing customers for services that were not delivered.
    • The OAG “also alleged that Frontier sold Minnesotans expensive internet services with so-called “maximum speed” ratings that were not attainable, and that Frontier improperly advertised its service as “reliable,” when in fact it did not provide enough bandwidth for customers to consistently receive their expected service.”
  • The European Data Protection Board (EDPB) issued guidelines “on the criteria of the Right to be Forgotten in the search engines cases under the GDPR” that “focuses solely on processing by search engine providers and delisting requests submitted by data subjects,” even though Article 17 of the General Data Protection Regulation applies to all data controllers. The EDPB explained “[t]his paper is divided into two topics:
    • The first topic concerns the grounds a data subject can rely on for a delisting request sent to a search engine provider pursuant to Article 17.1 GDPR.
    • The second topic concerns the exceptions to the Right to request delisting according to Article 17.3 GDPR.
  • The Australian Competition & Consumer Commission (ACCC) “is seeking views on draft Rules and accompanying draft Privacy Impact Assessment that authorise third parties who are accredited at the ‘unrestricted’ level to collect Consumer Data Right (CDR) data on behalf of another accredited person.” The ACCC explained “[t]his will allow accredited persons to utilise other accredited parties to collect CDR data and provide other services that facilitate the provision of goods and services to consumers.” In a March explanatory statement, the ACCC stated “[t]he CDR is an economy-wide reform that will apply sector-by-sector, starting with the banking sector…[and] [t]he objective of the CDR is to provide individual and business consumers (consumers) with the ability to efficiently and conveniently access specified data held about them by businesses (data holders), and to authorise the secure disclosure of that data to third parties (accredited data recipients) or to themselves.” The ACCC noted “[t]he CDR is regulated by both the ACCC and the Office of the Australian Information Commissioner (OAIC) as it concerns both competition and consumer matters as well as the privacy and confidentiality of consumer data.” Input is due by 20 July.
  • The Office of the Inspector General (OIG) for the Department of the Interior (Interior) found that even though the agency spends $1.4 billion annually on cybersecurity, “[g]uarding against increasing cybersecurity threats” remains one of Interior’s top challenges. The OIG asserted Interior “continues to struggle to implement an enterprise information technology (IT) security program that balances compliance, cost, and risk while enabling bureaus to meet their diverse missions.”
  • In a summary of its larger investigation into “Security over Information Technology Peripheral Devices at Select Office of Science Locations,” the Department of Energy’s Office of the Inspector General (OIG) stated that it “identified weaknesses related to access controls and configuration settings” for peripheral devices (e.g. thumb drives, printers, scanners, and other connected devices) “similar in type to those identified in prior evaluations of the Department’s unclassified cybersecurity program.”
  • The House Homeland Security Committee’s Cybersecurity, Infrastructure Protection, and Innovation Subcommittee Ranking Member John Katko (R-NY) introduced “a comprehensive national cybersecurity improvement package,” according to his press release, consisting of these bills:
    • The “Cybersecurity and Infrastructure Security Agency Director and Assistant Directors Act:”  This bipartisan measure takes steps to improve guidance and long-term strategic planning by stabilizing the CISA Director and Assistant Directors positions. Specifically, the bill:
      • Creates a 5-year term for the CISA Director, with a limit of 2 terms. The term of office for the current Director begins on the date the Director began to serve.
      • Elevates the Director to the equivalent of a Deputy Secretary and Military Service Secretaries.
      • Depoliticizes the Assistant Director positions, appointed by the Secretary of the Department of Homeland Security (DHS), categorizing them as career public servants. 
    • The “Strengthening the Cybersecurity and Infrastructure Security Agency Act of 2020:” This measure mandates a comprehensive review of CISA in an effort to strengthen its operations, improve coordination, and increase oversight of the agency. Specifically, the bill:
      • Requires CISA to review how additional appropriations could be used to support programs for national risk management, federal information systems management, and public-private cybersecurity and integration. It also requires a review of workforce structure and current facilities and projected needs. 
      • Mandates that CISA provide a report to the House and Senate Homeland Security Committees within one year of enactment. CISA must also provide a report and recommendations to GSA on facility needs.
      • Requires GSA to provide a review of CISA’s facilities needs to the Administration and the House and Senate Committees within 30 days of the Congressional report.
    • The “CISA Public-Private Talent Exchange Act:” This bill requires CISA to create a public-private workforce program to facilitate the exchange of ideas, strategies, and concepts between federal and private sector cybersecurity professionals. Specifically, the bill:
      • Establishes a public-private cyber exchange program allowing government and industry professionals to work in one another’s field.
      • Expands existing private outreach and partnership efforts. 
  • The Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) is ordering United States federal civilian agencies “to apply the July 2020 Security Update for Windows Servers running DNS (CVE-2020-1350), or the temporary registry-based workaround if patching is not possible within 24 hours.” CISA stated “[t]he software update addresses a significant vulnerability where a remote attacker could exploit it to take control of an affected system and run arbitrary code in the context of the Local System Account.” CISA Director Christopher Krebs explained “due to the wide prevalence of Windows Server in civilian Executive Branch agencies, I’ve determined that immediate action is necessary, and federal departments and agencies need to take this remote code execution vulnerability in Windows Server’s Domain Name System (DNS) particularly seriously.”
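    The temporary registry-based workaround CISA’s directive references can be sketched as follows. This is an illustrative rendering of the mitigation Microsoft published for CVE-2020-1350 (the registry path and `TcpReceivePacketSize` value come from that guidance), intended for an elevated command prompt on an affected Windows DNS server, not a substitute for applying the security update:

    ```shell
    :: Workaround for CVE-2020-1350 when the July 2020 patch cannot be applied
    :: immediately -- cap the size of inbound TCP-based DNS messages so the
    :: oversized responses the exploit relies on are rejected.
    reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\DNS\Parameters" ^
        /v TcpReceivePacketSize /t REG_DWORD /d 0xFF00 /f

    :: Restart the DNS service for the change to take effect.
    net stop DNS && net start DNS
    ```

    The workaround is meant to be temporary: once the security update is installed, Microsoft advises removing the `TcpReceivePacketSize` value and restarting the DNS service again.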
  • The United States (US) Department of State has imposed “visa restrictions on certain employees of Chinese technology companies that provide material support to regimes engaging in human rights abuses globally,” an action aimed at Huawei. In its statement, the Department stated “Companies impacted by today’s action include Huawei, an arm of the Chinese Communist Party’s (CCP) surveillance state that censors political dissidents and enables mass internment camps in Xinjiang and the indentured servitude of its population shipped all over China.” The Department claimed “[c]ertain Huawei employees provide material support to the CCP regime that commits human rights abuses.”
  • Earlier in the month, the US Departments of State, Treasury, Commerce, and Homeland Security issued an “advisory to highlight the harsh repression in Xinjiang.” The agencies explained
    • Businesses, individuals, and other persons, including but not limited to academic institutions, research service providers, and investors (hereafter “businesses and individuals”), that choose to operate in Xinjiang or engage with entities that use labor from Xinjiang elsewhere in China should be aware of reputational, economic, and, in certain instances, legal, risks associated with certain types of involvement with entities that engage in human rights abuses, which could include Withhold Release Orders (WROs), civil or criminal investigations, and export controls.
  • The United Kingdom’s National Cyber Security Centre (NCSC), Canada’s Communications Security Establishment (CSE), the United States’ National Security Agency (NSA), and the United States’ Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) issued a joint advisory on a Russian hacking organization that has “targeted various organisations involved in COVID-19 vaccine development in Canada, the United States and the United Kingdom, highly likely with the intention of stealing information and intellectual property relating to the development and testing of COVID-19 vaccines.” The agencies named APT29 (also known as ‘the Dukes’ or ‘Cozy Bear’), “a cyber espionage group, almost certainly part of the Russian intelligence services,” as the culprit behind “custom malware known as ‘WellMess’ and ‘WellMail.’”
    • This alert follows May advisories issued by Australia, the US, and the UK on hacking threats related to the pandemic. Australia’s Department of Foreign Affairs and Trade (DFAT) and the Australian Cyber Security Centre (ACSC) issued “Advisory 2020-009: Advanced Persistent Threat (APT) actors targeting Australian health sector organisations and COVID-19 essential services” that asserted “APT groups may be seeking information and intellectual property relating to vaccine development, treatments, research and responses to the outbreak as this information is now of higher value and priority globally.” CISA and NCSC issued a joint advisory for the healthcare sector, especially companies and entities engaged in fighting COVID-19. The agencies stated that they have evidence that Advanced Persistent Threat (APT) groups “are exploiting the COVID-19 pandemic as part of their cyber operations.” In an unclassified public service announcement, the Federal Bureau of Investigation (FBI) and CISA named the People’s Republic of China as a nation waging a cyber campaign against U.S. COVID-19 researchers. The agencies stated they “are issuing this announcement to raise awareness of the threat to COVID-19-related research.”
  • The National Initiative for Cybersecurity Education (NICE) has released for comment Draft National Institute of Standards and Technology (NIST) Special Publication (SP) 800-181 Revision 1, Workforce Framework for Cybersecurity (NICE Framework), with comments due by 28 August. The draft features several updates, including:
    • an updated title to be more inclusive of the variety of workers who perform cybersecurity work,
    • definition and normalization of key terms,
    • principles that facilitate agility, flexibility, interoperability, and modularity,
    • introduction of competencies.
  • Representatives Glenn Thompson (R-PA), Collin Peterson (D-MN), and James Comer (R-KY) sent a letter to the Federal Communications Commission (FCC) “questioning the Commission’s April 20, 2020 Order granting Ligado’s application to deploy a terrestrial nationwide network to provide 5G services.”
  • The European Commission (EC) is asking for feedback on part of its recently released data strategy by 31 July. The EC stated it is aiming “to create a single market for data, where data from public bodies, business and citizens can be used safely and fairly for the common good…[and] [t]his initiative will draw up rules for common European data spaces (covering areas like the environment, energy and agriculture) to:
    • make better use of publicly held data for research for the common good
    • support voluntary data sharing by individuals
    • set up structures to enable key organisations to share data.”
  • The United Kingdom’s Parliament is asking for feedback on its legislative proposal to regulate Internet of Things (IoT) devices. The Department for Digital, Culture, Media & Sport explained “the obligations within the government’s proposed legislative framework would fall mainly on the manufacturer if they are based in the UK, or if not based in the UK, on their UK representative.” The Department is also “developing an enforcement approach with relevant stakeholders to identify an appropriate enforcement body to be granted day to day responsibility and operational control of monitoring compliance with the legislation.” The Department also touted the publication of the European Telecommunications Standards Institute’s (ETSI) “security baseline for Internet-connected consumer devices,” which “provides a basis for future Internet of Things product certification schemes.”
  • Facebook issued a white paper, titled “CHARTING A WAY FORWARD: Communicating Towards People-Centered and Accountable Design About Privacy,” in which the company states its desire to be involved in shaping a United States privacy law (See below for an article on this). Facebook concluded:
    • Facebook recognizes the responsibility we have to make sure that people are informed about the data that we collect, use, and share.
    • That’s why we support globally consistent comprehensive privacy laws and regulations that, among other things, establish people’s basic rights to be informed about how their information is collected, used, and shared, and impose obligations for organizations to do the same, including the obligation to build internal processes that maintain accountability.
    • As improvements to technology challenge historic approaches to effective communications with people about privacy, companies and regulators need to keep up with changing times.
    • To serve the needs of a global community, on both the platforms that exist now and those that are yet to be developed, we want to work with regulators, companies, and other interested third parties to develop new ways of informing people about their data, empowering them to make meaningful choices, and holding ourselves accountable.
    • While we don’t have all the answers, there are many opportunities for businesses and regulators to embrace modern design methods, new opportunities for better collaboration, and innovative ways to hold organizations accountable.
  • Four Democratic Senators sent Facebook a letter “about reports that Facebook has created fact-checking exemptions for people and organizations who spread disinformation about the climate crisis on its social media platform” following a New York Times article this week on the social media giant’s practices regarding climate disinformation. Even though the company has moved aggressively to take down false and inaccurate COVID-19 posts, climate disinformation lives on the platform largely unmolested for a couple of reasons. First, Facebook marks these sorts of posts as opinion and takes the approach that opinions should be judged under an absolutist free speech regime. Moreover, Facebook asserts posts of this sort do not pose any imminent harm and therefore do not need to be taken down. So, despite having teams of fact checkers able to vet posts of demonstrably untrue information, Facebook chooses not to deploy them against climate disinformation, most likely because material that elicits strong reactions from users drives engagement that, in turn, drives advertising dollars. Senators Elizabeth Warren (D-MA), Tom Carper (D-DE), Sheldon Whitehouse (D-RI), and Brian Schatz (D-HI) argued “[i]f Facebook is truly ‘committed to fighting the spread of false news on Facebook and Instagram,’ the company must immediately acknowledge in its fact-checking process that the climate crisis is not a matter of opinion and act to close loopholes that allow climate disinformation to spread on its platform.” They posed a series of questions to Facebook CEO Mark Zuckerberg on these practices, requesting answers by 31 July.
  • A Canadian court has found that the Canadian Security Intelligence Service (CSIS) “admittedly collected information in a manner that is contrary to this foundational commitment and then relied on that information in applying for warrants under the Canadian Security Intelligence Service Act, RSC 1985, c C-23 [CSIS Act]” according to a court summary of its redacted decision. The court further stated “[t]he Service and the Attorney General also admittedly failed to disclose to the Court the Service’s reliance on information that was likely collected unlawfully when seeking warrants, thereby breaching the duty of candour owed to the Court.” The court added “[t]his is not the first time this Court has been faced with a breach of candour involving the Service…[and] [t]he events underpinning this most recent breach were unfolding as recommendations were being implemented by the Service and the Attorney General to address previously identified candour concerns.” CSIS was found to have illegally collected and used metadata in a 2016 case concerning its conduct between 2006 and 2016. In response to the most recent ruling, CSIS is vowing to implement a range of reforms. The National Security and Intelligence Review Agency (NSIRA) is pledging the same.
  • The United Kingdom’s National Police Chiefs’ Council (NPCC) announced the withdrawal of “[t]he ‘Digital device extraction – information for complainants and witnesses’ form and ‘Digital Processing Notice’ (‘the relevant forms’) circulated to forces in February 2019 [that] are not sufficient for their intended purpose.” In mid-June, the UK’s data protection authority, the Information Commissioner’s Office (ICO) unveiled its “finding that police data extraction practices vary across the country, with excessive amounts of personal data often being extracted, stored, and made available to others, without an appropriate basis in existing data protection law.” This withdrawal was also due, in part, to a late June Court of Appeal decision.  
  • A range of public interest and advocacy organizations sent a letter to Speaker of the House Nancy Pelosi (D-CA) and House Minority Leader Kevin McCarthy (R-CA) noting “there are intense efforts underway to do exactly that, via current language in the House and Senate versions of the FY2021 National Defense Authorization Act (NDAA) that ultimately seek to reverse the FCC’s recent bipartisan and unanimous approval of Ligado Networks’ regulatory plans.” They urged them to “not endorse efforts by the Department of Defense and its allies to veto commercial spectrum authorizations…[and] [t]he FCC has proven itself to be the expert agency on resolving spectrum disputes based on science and engineering and should be allowed to do the job Congress authorized it to do.” In late April, the FCC’s “decision authorize[d] Ligado to deploy a low-power terrestrial nationwide network in the 1526-1536 MHz, 1627.5-1637.5 MHz, and 1646.5-1656.5 MHz bands that will primarily support Internet of Things (IoT) services.” The agency argued the order “provides regulatory certainty to Ligado, ensures adjacent band operations, including Global Positioning System (GPS), are sufficiently protected from harmful interference, and promotes more efficient and effective use of [the U.S.’s] spectrum resources by making available additional spectrum for advanced wireless services, including 5G.”
  • The European Data Protection Supervisor (EDPS) rendered his opinion on the European Commission’s White Paper on Artificial Intelligence: a European approach to excellence and trust and recommended the following for the European Union’s (EU) regulation of artificial intelligence (AI):
    • applies both to EU Member States and to EU institutions, offices, bodies and agencies;
    • is designed to protect from any negative impact, not only on individuals, but also on communities and society as a whole;
    • proposes a more robust and nuanced risk classification scheme, ensuring any significant potential harm posed by AI applications is matched by appropriate mitigating measures;
    • includes an impact assessment clearly defining the regulatory gaps that it intends to fill;
    • avoids overlap of different supervisory authorities and includes a cooperation mechanism.
    • Regarding remote biometric identification, the EDPS supports the idea of a moratorium on the deployment, in the EU, of automated recognition in public spaces of human features, not only of faces but also of gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioural signals, so that an informed and democratic debate can take place and until the moment when the EU and Member States have all the appropriate safeguards, including a comprehensive legal framework in place to guarantee the proportionality of the respective technologies and systems for the specific use case.
  • The Bundesamt für Verfassungsschutz (BfV), Germany’s domestic security agency, released a summary of its annual report in which it claimed:
    • The Russian Federation, the People’s Republic of China, the Islamic Republic of Iran and the Republic of Turkey remain the main countries engaged in espionage activities and trying to exert influence on Germany.
    • The ongoing digital transformation and the increasingly networked nature of our society increases the potential for cyber attacks, worsening the threat of cyber espionage and cyber sabotage.
    • The intelligence services of the Russian Federation and the People’s Republic of China in particular carry out cyber espionage activities against German agencies. One of their tasks is to boost their own economies with the help of information gathered by the intelligence services. This type of information-gathering campaign severely threatens the success and development opportunities of German companies.
    • To counteract this threat, Germany has a comprehensive cyber security architecture in place, which is operated by a number of different authorities. The BfV plays a major role in investigating and defending against cyber threats by detecting attacks, attributing them to specific attackers, and using the knowledge gained from this to draw up prevention strategies. The National Cyber Response Centre, in which the BfV plays a key role, was set up to consolidate the co-operation between the competent agencies. The National Cyber Response Centre aims to optimise the exchange of information between state agencies and to improve the co-ordination of protective and defensive measures against potential IT incidents.

Further Reading

  • “Trump confirms cyberattack on Russian trolls to deter them during 2018 midterms” – The Washington Post. In an interview with former George W. Bush speechwriter Marc Thiessen, President Donald Trump confirmed he ordered a widely reported retaliatory attack on the Russian Federation’s Internet Research Agency as a means of preventing interference during the 2018 mid-term election. Trump claimed the attack he ordered was the first action the United States took against Russian hacking, even though his predecessor warned Russian President Vladimir Putin to stop such activities and imposed sanctions at the end of 2016. The timing of Trump’s revelation is interesting given the ongoing furor over reports of Russian bounties paid to Taliban fighters for killing Americans that the Trump Administration may have known of but did little or nothing to stop.
  • “Germany proposes first-ever use of EU cyber sanctions over Russia hacking” – Deutsche Welle. Germany is looking to use the European Union’s (EU) cyber sanctions powers against Russia for the alleged 2015 hack that exfiltrated 16 GB of data from the Bundestag’s systems, including from Chancellor Angela Merkel’s office. Germany has been alleging that Fancy Bear (aka APT28) and Russia’s military intelligence service, the GRU, carried out the attack. Germany has circulated its case for sanctions to other EU nations and EU leadership. In 2017, the European Council declared “[t]he EU diplomatic response to malicious cyber activities will make full use of measures within the Common Foreign and Security Policy, including, if necessary, restrictive measures…[and] [a] joint EU response to malicious cyber activities would be proportionate to the scope, scale, duration, intensity, complexity, sophistication and impact of the cyber activity.”
  • “Wyden Plans Law to Stop Cops From Buying Data That Would Need a Warrant” – VICE. Following on a number of reports that federal, state, and local law enforcement agencies are essentially sidestepping the Fourth Amendment through buying location and other data from people’s smartphones, Senator Ron Wyden (D-OR) is going to draft legislation that would seemingly close what he, and other civil libertarians, are calling a loophole to the warrant requirement.
  • “Amazon Backtracks From Demand That Employees Delete TikTok” – The New York Times. Amazon first instructed its employees on 11 July to remove ByteDance’s app, TikTok, from company devices and then reversed course the same day, claiming the email had been erroneously sent out. The strange episode capped another tumultuous week for ByteDance as the Trump Administration is intensifying pressure in a number of ways on the company, which officials claim is subject to the laws of the People’s Republic of China and hence must share information with the government in Beijing. ByteDance counters that the app marketed in the United States is run through a subsidiary not subject to PRC law. ByteDance also said it would no longer offer the app in Hong Kong after a change in PRC law extended Beijing’s reach into the former British colony. TikTok was also recently banned in India as part of a larger struggle between India and the PRC. Additionally, the Democratic National Committee warned staff about using the app this week, too.
  • “Is it time to delete TikTok? A guide to the rumors and the real privacy risks.” – The Washington Post. A columnist and a security specialist found ByteDance’s app vacuums up information from users, but so do Facebook and other similar apps. They scrutinized TikTok’s privacy policy and where the data went, and they could not say with certainty that it goes to and stays on servers in the US and Singapore.
  • “California investigating Google for potential antitrust violations” – Politico. California Attorney General Xavier Becerra is going to conduct his own investigation of Google, separate from the investigation of the company’s advertising practices being conducted by virtually every other state in the United States. It was unclear why Becerra opted against joining the larger probe launched in September 2019. Of course, the Trump Administration’s Department of Justice is also investigating Google and could file suit as early as this month.
  • “How May Google Fight an Antitrust Case? Look at This Little-Noticed Paper” – The New York Times. In a filing with the Australian Competition and Consumer Commission (ACCC), Google claimed it does not control the online advertising market, a claim it says is borne out by a number of indicia that argue against a monopolistic situation. The company is likely to make the same case to the United States’ government in its antitrust inquiry. However, similar arguments did not gain traction before the European Commission, which levied a €1.49 billion fine for “breaching EU antitrust rules” in March 2019.
  • “Who Gets the Banhammer Now?” – The New York Times. This article examines possible motives for the recent wave of action by social media platforms to police a fraction of the extreme and hateful speech activists and others have been asking them to take down for years. This piece makes the argument that social media platforms are businesses and operate as such, and expecting them to behave as de facto public squares dedicated to civil political and societal discourse is more or less how we ended up where we are.
  • “TikTok goes tit-for-tat in appeal to MPs: ‘stop political football’” – The Australian. ByteDance is lobbying hard in Canberra to talk Members of Parliament out of possibly banning TikTok as the United States has said it is considering. While ByteDance claims the data collected on users in Australia is sent to the US or Singapore, some experts are arguing that just maintaining and improving the app would necessarily result in some non-People’s Republic of China (PRC) user data making its way back to the PRC. As Australia’s relationship with the PRC has grown more fraught, with allegations PRC hackers infiltrated Parliament and the Prime Minister all but saying PRC hackers were targeting hospitals and medical facilities, the government in Canberra could follow India’s lead and ban the app.
  • “Calls for inquiry over claims Catalan lawmaker’s phone was targeted” – The Guardian. British and Spanish newspapers are reporting that an official in Catalonia who favors separating the region from Spain may have had his smartphone compromised with industrial-grade spyware typically used only by law enforcement and counterterrorism agencies. The President of the Parliament of Catalonia, Roger Torrent, claims his phone was hacked for domestic political purposes, a claim other Catalan leaders have echoed. A spokesperson for the Spanish government said “[t]he government has no evidence that the speaker of the Catalan parliament has been the victim of a hack or theft involving his mobile.” However, the University of Toronto’s CitizenLab, the entity that researched and claimed that Israeli firm NSO Group’s spyware was deployed via WhatsApp to spy on a range of journalists, officials, and dissidents, often by their own governments, confirmed that Torrent’s phone was compromised.
  • “While America Looks Away, Autocrats Crack Down on Digital News Sites” – The New York Times. The Trump Administration’s combative relationship with the media in the United States may be encouraging other nations to crack down on digital media outlets trying to hold those governments to account.
  • “How Facebook Handles Climate Disinformation” – The New York Times. This article examines why climate disinformation persists on the platform even as Facebook moves aggressively against false and inaccurate COVID-19 posts: the company classifies such posts as opinion exempt from fact-checking and asserts they pose no imminent harm, most likely because material that elicits strong reactions from users drives engagement that, in turn, drives advertising dollars.
  • “Here’s how President Trump could go after TikTok” – The Washington Post. This piece lays out two means the Trump Administration could employ to press ByteDance in the immediate future: use of the May 2019 Executive Order “Securing the Information and Communications Technology and Services Supply Chain” or the Committee on Foreign Investment in the United States process examining ByteDance’s acquisition of the app Musical.ly that became TikTok. Left unmentioned in this article is the possibility of the Federal Trade Commission (FTC) examining its 2019 settlement with ByteDance over violations of the “Children’s Online Privacy Protection Act” (COPPA).
  • “You’re Doomscrolling Again. Here’s How to Snap Out of It.” – The New York Times. If you find yourself endlessly looking through social media feeds, this piece explains why and how you might stop doing so.
  • “UK selling spyware and wiretaps to 17 repressive regimes including Saudi Arabia and China” – The Independent. There are allegations that the British government has ignored its own regulations on selling equipment and systems that can be used for surveillance and spying to other governments with spotty human rights records. Specifically, the United Kingdom (UK) has sold £75m worth of such technology to countries that non-governmental organizations (NGOs) rate as “not free.” The buyers include nations such as the People’s Republic of China (PRC), the Kingdom of Saudi Arabia, Bahrain, and others. Not surprisingly, NGOs and the opposition Labour party are calling for an investigation and changes.
  • “Google sued for allegedly tracking users in apps even after opting out” – c/net. Boies Schiller Flexner filed suit, in what will undoubtedly seek to become a class action, alleging Google continued to track users even after they turned off tracking features. This follows a suit filed by the same firm against Google in June, claiming its browser Chrome still tracks people when they switch to incognito mode.
  • “Secret Trump order gives CIA more powers to launch cyberattacks” – Yahoo! News. It turns out that in addition to signing National Security Presidential Memorandum (NSPM) 13, which revamped and eased offensive cyber operations for the Department of Defense, President Donald Trump signed a presidential finding that has allowed the Central Intelligence Agency (CIA) to launch its own offensive cyber attacks, mainly at Russia and Iran, according to unnamed former United States (US) officials cited in this blockbuster story. Now, the decision to commence with an attack is not vetted by the National Security Council; rather, the CIA makes the decision. Consequently, there have been a number of attacks on US adversaries that until now have not been associated with the US. And the CIA is apparently not informing the National Security Agency or Cyber Command of its operations, raising the risk of US cyber forces working at cross purposes or against one another in cyberspace. Moreover, a recently released report blamed the lax security environment at the CIA for a massive exfiltration of hacking tools released by Wikileaks.
  • “Facebook’s plan for privacy laws? ‘Co-creating’ them with Congress” – Protocol. In concert with the release of a new white paper, Facebook Deputy Chief Privacy Officer Rob Sherman sat for an interview in which he pledged the company’s willingness to work with Congress to co-develop a national privacy law. However, he would not comment on any of the many privacy bills released thus far or the policy contours of a bill Facebook would favor, except for advocating for an enhanced notice and consent regime under which people would be better informed about how their data is being used. Sherman also shrugged off suggestions Facebook may not be welcome given its record of privacy violations. Finally, it bears mention that similar efforts by other companies at the state level have not succeeded as of yet. For example, Microsoft’s efforts in Washington state have not borne fruit in the passage of a privacy law.
  • “Deepfake used to attack activist couple shows new disinformation frontier” – Reuters. We are at the beginning of a new age of disinformation in which fake photographs and video will be used to wage campaigns against nations, causes, and people. An activist and his wife were accused of being terrorist sympathizers by a purported university student who turned out to be an elaborate ruse created by someone or some group looking to defame the couple. Small errors gave away the ruse this time, but advances in technology are likely to make detection all the harder.
  • “Biden, billionaires and corporate accounts targeted in Twitter hack” – The Washington Post. Policymakers and security experts were alarmed when the accounts of major figures like Bill Gates and Barack Obama were hacked yesterday by a group seeking to sell bitcoin. They argue Twitter was lucky this time and that a more ideologically motivated enemy may seek to cause havoc, say, in the United States’ coming election. A number of experts claim the penetration of the platform must have involved internal controls for so many high-profile accounts to be taken over at the same time.
  • “TikTok Enlists Army of Lobbyists as Suspicions Over China Ties Grow” – The New York Times. ByteDance’s payments for lobbying services in Washington doubled between the last quarter of 2019 and the first quarter of 2020, as the company has retained more than 35 lobbyists to push back against the Trump Administration’s rhetoric and policy changes. The company is fighting against a floated proposal to ban the TikTok app on national security grounds, which would cut the company off from another of its top markets after India banned it and scores of other apps from the People’s Republic of China. Even if the Administration does not bar use of the app in the United States, the company is facing legislation, to be acted upon next week by a Senate committee, that would ban its use on federal networks and devices. Moreover, ByteDance’s acquisition of the app that became TikTok is facing a retrospective review by an inter-agency committee for national security considerations that could result in an unwinding of the deal. Additionally, the Federal Trade Commission (FTC) has been urged to review ByteDance’s compliance with a 2019 settlement resolving allegations that the company violated regulations protecting the privacy of children, a review that could result in multi-billion dollar liability if wrongdoing is found.
  • “Why Google and Facebook Are Racing to Invest in India” – Foreign Policy. With New Delhi banning 59 apps and platforms from the People’s Republic of China (PRC), two American firms have invested in an Indian giant with an eye toward the nearly 500 million Indians not yet online. Reliance Industries’ Jio Platforms has sold stakes to Google and Facebook worth $4.5 billion and $5.7 billion, respectively, giving them prized positions as the company looks to expand into 5G and other online ventures. This will undoubtedly give the United States’ online giants a leg up in vying with competitors in the world’s second most populous nation.
  • “‘Outright Lies’: Voting Misinformation Flourishes on Facebook” – ProPublica. In this piece published with First Draft, “a global nonprofit that researches misinformation,” an analysis of the most popular claims made about voting by mail shows that many of them are inaccurate or false, thus violating the platform’s terms of service. Yet Facebook had done nothing to remove them or mark them as inaccurate until this article was being written.
  • “Inside America’s Secretive $2 Billion Research Hub” – Forbes. Using contract information obtained through Freedom of Information requests and interviews, light is shined on the little-known non-profit MITRE Corporation, which has been helping the United States government address numerous technological problems since the late 1950s. The article uncovers some of its latest, federally funded projects that are raising eyebrows among privacy advocates: technology to lift people’s fingerprints from social media pictures, technology to scan and copy Internet of Things (IoT) devices from a distance, a scanner to read a person’s DNA, and others.
  • “The FBI Is Secretly Using A $2 Billion Travel Company As A Global Surveillance Tool” – Forbes. In his second blockbuster article in a week, Forbes reporter Thomas Brewster exposes how the United States (US) government is using questionable court orders to gather travel information from the three companies that essentially provide airlines, hotels, and other travel entities with back-end reservation and booking functions. The three companies, one of which, Sabre, is a US multinational, have masses of information on you if you have ever traveled, and US law enforcement agencies, namely the Federal Bureau of Investigation, are using a 1789 statute to obtain orders, which all three companies must obey, for information used in tracking suspects. Allegedly, this capability has only been used to track terror suspects but will now reportedly be used for COVID-19 tracking.
  • “With Trump CIA directive, the cyber offense pendulum swings too far” – Yahoo! News. Former United States (US) National Coordinator for Security, Infrastructure Protection, and Counter-terrorism Richard Clarke argues against the Central Intelligence Agency (CIA) having carte blanche in conducting cyber operations without the review or input of other federal agencies. He suggests that the CIA in particular, and agencies in general, tend to push their authority to the extreme, which in this case could lead to incidents and lasting precedents in cyberspace that may haunt the US. Clarke also intimated that it may have been the CIA and not Israel that launched cyber attacks on infrastructure facilities in Tehran this month and last.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Europe’s Highest Court Strikes Down Privacy Shield

The agreement that has allowed US companies to transfer the personal data of EU residents to the US was found invalid under EU law. The EU’s highest court seemed to indicate that standard contractual clauses, a frequently used means of transferring data, may be acceptable.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

In the second major ruling from the European Union’s (EU) highest court this week, the court earlier today invalidated the agreement that has allowed multinational corporations and others to transfer the personal data of EU citizens to the United States (US) for commercial purposes since 2016. The court did not, however, strike down standard contractual clauses, the means by which many such transfers occur. This is the second case an Austrian privacy activist has brought alleging that Facebook’s transfer of his personal data to the US violated European law because US law, especially its surveillance programs, results in less protection and fewer rights. The first case resulted in the previous transfer agreement being found illegal, and now this case has ended in much the same outcome. The full import of this ruling is not immediately clear.

Maximillian Schrems filed a complaint against Facebook with the Data Protection Commission (DPC) in 2013, alleging that the company’s transfer of his personal data violated his rights under EU law because of the mass US surveillance revealed by former National Security Agency (NSA) contractor Edward Snowden. Ultimately, this case resulted in a 2015 Court of Justice of the European Union (CJEU) ruling that invalidated the Safe Harbor agreement under which the personal data of EU residents was transferred to the US by commercial concerns. The EU and US executed a follow-on agreement, the EU-US Privacy Shield, that was designed to address some of the problems the CJEU turned up, and the US passed a law, the “Judicial Redress Act of 2015” (P.L. 114-126), to provide EU citizens a way to exercise their EU rights in US courts via the “Privacy Act of 1974.”

However, Schrems continued and soon sought to challenge the legality of the European Commission’s signing off on the Privacy Shield agreement, the adequacy decision issued in 2016, and also the use of standard contractual clauses (SCC) by companies for the transfer of personal data to the US. The European Data Protection Board (EDPB) explained in a recent decision on Denmark’s SCC that

  • According to Article 28(3) General Data Protection Regulation (GDPR), the processing by a data processor shall be governed by a contract or other legal act under Union or Member State law that is binding on the processor with regard to the controller, setting out a set of specific aspects to regulate the contractual relationship between the parties. These include the subject-matter and duration of the processing, its nature and purpose, the type of personal data and categories of data subjects, among others.
  • Under Article 28(6) GDPR, without prejudice to an individual contract between the data controller and the data processor, the contract or the other legal act referred in paragraphs (3) and (4) of Article 28 GDPR may be based, wholly or in part on SCCs.

In a summary of its decision, the CJEU explained

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

The CJEU found

  • Regarding the level of protection required in respect of such a transfer, the Court holds that the requirements laid down for such purposes by the GDPR concerning appropriate safeguards, enforceable rights and effective legal remedies must be interpreted as meaning that data subjects whose personal data are transferred to a third country pursuant to standard data protection clauses must be afforded a level of protection essentially equivalent to that guaranteed within the EU by the GDPR, read in the light of the Charter. In those circumstances, the Court specifies that the assessment of that level of protection must take into consideration both the contractual clauses agreed between the data exporter established in the EU and the recipient of the transfer established in the third country concerned and, as regards any access by the public authorities of that third country to the data transferred, the relevant aspects of the legal system of that third country.
  • Regarding the supervisory authorities’ obligations in connection with such a transfer, the Court holds that, unless there is a valid Commission adequacy decision, those competent supervisory authorities are required to suspend or prohibit a transfer of personal data to a third country where they take the view, in the light of all the circumstances of that transfer, that the standard data protection clauses are not or cannot be complied with in that country and that the protection of the data transferred that is required by EU law cannot be ensured by other means, where the data exporter established in the EU has not itself suspended or put an end to such a transfer.

The CJEU stated “the limitations on the protection of personal data arising from the domestic law of the US on the access and use by US public authorities of such data transferred from the EU to that third country, which the Commission assessed in [its 2016 adequacy decision], are not circumscribed in a way that satisfies requirements that are essentially equivalent to those required under EU law, by the principle of proportionality, in so far as the surveillance programmes based on those provisions are not limited to what is strictly necessary.”

The CJEU found the process put in place by the US government to handle complaints inadequate. The 2016 Privacy Shield resulted in the creation of an Ombudsperson post to which EU citizens could submit their complaints. This position is currently held by Under Secretary of State for Economic Growth, Energy, and the Environment Keith Krach.

The CJEU stated “the Ombudsperson mechanism referred to in that decision does not provide data subjects with any cause of action before a body which offers guarantees substantially equivalent to those required by EU law, such as to ensure both the independence of the Ombudsperson provided for by that mechanism and the existence of rules empowering the Ombudsperson to adopt decisions that are binding on the US intelligence services.”

The decision on SCCs is more ambiguous, as the circumstances under which they can be used are not entirely clear. In its decision, the CJEU made clear that SCCs are not necessarily legal under EU law:

although there are situations in which, depending on the law and practices in force in the third country concerned, the recipient of such a transfer is in a position to guarantee the necessary protection of the data solely on the basis of standard data protection clauses, there are others in which the content of those standard clauses might not constitute a sufficient means of ensuring, in practice, the effective protection of personal data transferred to the third country concerned. That is the case, in particular, where the law of that third country allows its public authorities to interfere with the rights of the data subjects to which that data relates.

Reaction from the parties was mixed, particularly on what the CJEU’s ruling means for SCCs even though there was agreement that the Privacy Shield will soon no longer govern data transfers from the EU to the US.

The DPC issued a statement in which it asserted

Today’s judgment provides just that, firmly endorsing the substance of the concerns expressed by the DPC (and by the Irish High Court) to the effect that EU citizens do not enjoy the level of protection demanded by EU law when their data is transferred to the United States. In that regard, while the judgment most obviously captures Facebook’s transfers of data relating to Mr Schrems, it is of course the case that its scope extends far beyond that, addressing the position of EU citizens generally.

The DPC added

So, while in terms of the points of principle in play, the Court has endorsed the DPC’s position, it has also ruled that the SCCs transfer mechanism used to transfer data to countries worldwide is, in principle, valid, although it is clear that, in practice, the application of the SCCs transfer mechanism to transfers of personal data to the United States is now questionable. This is an issue that will require further and careful examination, not least because assessments will need to be made on a case by case basis.

At a press conference, EC Vice-President Věra Jourová claimed the “CJEU declared the Privacy Shield decision invalid, but also confirmed that the standard contractual clauses remain a valid tool for the transfer of personal data to processors established in third countries.” She asserted “[t]his means that the transatlantic data flows can continue, based on the broad toolbox for international transfers provided by the GDPR, for instance binding corporate rules or SCCs.” Jourová contended with regard to next steps, “[w]e are not starting from scratch…[and] [o]n the contrary, the Commission has already been working intensively to ensure that this toolbox is fit for purpose, including the modernisation of the Standard Contractual Clauses.” Jourová stated “we will be working closely with our American counterparts, based on today’s ruling.”

European Commissioner for Justice Didier Reynders stated

  • First, I welcome the fact that the Court confirmed the validity of our Decision on SCCs.
    • We have been working already for some time on modernising these clauses and ensuring that our toolbox for international data transfers is fit for purpose.
    • Standard Contractual Clauses are in fact the most used tool for international transfers of personal data and we wanted to ensure they can be used by businesses and fully in line with EU law.
    • We are now advanced with this work and we will of course take into account the requirements of judgement.
    • We will work with the European Data Protection Board, as well as the 27 EU Member States. It will be very important to start the process to have a formal approval to modernise the Standard Contractual Clauses as soon as possible. We have been in an ongoing process about such a modernisation for some time, but with an attention to the different elements of the decision of the Court today.
  • My second point: The Court has invalidated the Privacy Shield. We have to study the judgement in detail and carefully assess the consequences of this invalidation.

Reynders stated that “[i]n the meantime, transatlantic data flows between companies can continue using other mechanisms for international transfers of personal data available under the GDPR.”

In a statement, US Secretary of Commerce Wilbur Ross said

While the Department of Commerce is deeply disappointed that the court appears to have invalidated the European Commission’s adequacy decision underlying the EU-U.S. Privacy Shield, we are still studying the decision to fully understand its practical impacts.

Ross continued

We have been and will remain in close contact with the European Commission and European Data Protection Board on this matter and hope to be able to limit the negative consequences to the $7.1 trillion transatlantic economic relationship that is so vital to our respective citizens, companies, and governments. Data flows are essential not just to tech companies—but to businesses of all sizes in every sector. As our economies continue their post-COVID-19 recovery, it is critical that companies—including the 5,300+ current Privacy Shield participants—be able to transfer data without interruption, consistent with the strong protections offered by Privacy Shield.

The Department of Commerce stated it “will continue to administer the Privacy Shield program, including processing submissions for self-certification and re-certification to the Privacy Shield Frameworks and maintaining the Privacy Shield List.” The agency added “[t]oday’s decision does not relieve participating organizations of their Privacy Shield obligations.”


FTC Settles A Pair of Privacy Shield Cases

The FTC imposes 20 year commitments for two companies who were not meeting their requirements in terms of transferring the personal data of EU residents out of Europe.

The Federal Trade Commission (FTC) has announced its second Privacy Shield violation settlement in the last few weeks, each of which will impose obligations over the next 20 years so long as the United States (US) companies involved choose to transfer and process the data of European Union (EU) citizens and residents. The 2016 agreement requires US entities to self-certify compliance, subject to enforcement by the FTC for most companies, and violations are punished under Section 5 of the FTC Act, which prohibits deceptive practices. The agreement requires a range of practices of those companies that choose to participate, including heeding standards for notice, consent, accountability for onward transfers, data security, data integrity, and purpose limitation. A failure to fully comply represents a violation subject to enforcement.

In the settlement announced this week, the FTC alleged that Ortho-Clinical Diagnostics, Inc. claimed it “participated in the Privacy Shield framework and complied with the program’s requirements, even though the company had allowed its certification to lapse in 2018” according to the agency’s press release. The FTC added

After Ortho’s certification lapsed, the Department of Commerce warned the company to either remove the claims or take steps to recertify its participation in the Privacy Shield program, which the company failed to do, the complaint alleges. The FTC also alleges Ortho violated the Privacy Shield principles by failing to verify annually that statements about its Privacy Shield practices were accurate. In addition, it also failed to comply with a Privacy Shield requirement that it affirm that the company would continue to apply Privacy Shield protections to personal information collected while participating in the program, according to the complaint.

In a Consent Agreement set to run for 20 years, Ortho-Clinical Diagnostics, Inc. “whether acting directly or indirectly, in connection with the advertising, marketing, promotion, offering for sale, or sale of any product or service, must affirm to the Department of Commerce, within ten (10) days after the effective date of this Order and on an annual basis thereafter for as long as it retains such information, that it will

1. continue to apply the EU-U.S. Privacy Shield framework principles to the personal information it received while it participated in the Privacy Shield; or

2. protect the information by another means authorized under EU (for the EU-U.S. Privacy Shield framework) or Swiss (for the Swiss-U.S. Privacy Shield framework) law, including by using a binding corporate rule or a contract that fully reflects the requirements of the relevant standard contractual clauses adopted by the European Commission

If the company decides not to participate in the Privacy Shield, it must delete all data within 10 days.

The FTC meted out a stiffer penalty to NTT Global Data Centers, Inc., formerly known as RagingWire Data Centers, for Privacy Shield compliance violations. The company “must hire a third-party assessor to verify that it is adhering to its Privacy Shield promises if it plans to participate in the framework” per the FTC’s press release. The FTC explained

In a complaint filed in November 2019, the FTC alleged that, between January 2017 and October 2018, RagingWire claimed in its online privacy policy and marketing materials that the company participated in the Privacy Shield framework and complied with the program’s requirements. In fact, the FTC alleged, the company’s certification lapsed in January 2018 and it failed to comply with certain Privacy Shield requirements while it was a participant in the program. The FTC also alleged that, upon allowing its certification to lapse, RagingWire failed to take the necessary steps to confirm that it would comply with its continuing obligations relating to data received pursuant to the framework.

In the 20 year Consent Order with NTT Global Data Centers, the FTC stipulated

no later than 120 days after the effective date of this Order and for so long as Respondent is a self-certified participant in Privacy Shield, Respondent and its officers, agents, employees, and attorneys, and all other persons in active concert or participation with any of them, who receive actual notice of this Order, whether acting directly or indirectly, in connection with the advertisement, marketing, promotion, offering for sale, or sale of any product or service, shall obtain an annual outside compliance review from an independent third-party assessor approved by the Associate Director for the Division of Enforcement of the Bureau of Consumer Protection at the Federal Trade Commission, that demonstrates that the assertions Respondent makes about its Privacy Shield practices are true, and that those Privacy Shield practices have been implemented as represented and in accord with the Privacy Shield Principles. (emphasis added).

NTT Global Data Centers must also

1. continue to apply the EU-U.S. Privacy Shield framework principles to the personal information it received while it participated in the Privacy Shield; or

2. protect the information by another means authorized under EU (for the EU-U.S. Privacy Shield framework) or Swiss (for the Swiss-U.S. Privacy Shield framework) law, including by using a binding corporate rule or a contract that fully reflects the requirements of the relevant standard contractual clauses adopted by the European Commission

The FTC split over the Consent Order against NTT Global Data Centers, with Commissioner Rohit Chopra dissenting for these reasons:

  • American businesses that participate in the EU-U.S. Privacy Shield Framework should not have to compete with those that break their privacy promises.
  • The FTC charged a data center company with violating their Privacy Shield commitments, but our proposed settlement does not even attempt to adequately remedy the harm to the market.
  • The evidence in the record raises serious concerns that customers looking to follow the law relied on the company’s representations and may be locked into long-term contracts.
  • A quick settlement with a small firm for an inadvertent mistake may be appropriate, but it is inadequate for a dishonest, large firm violating a core pillar of Privacy Shield.
  • We must consider seeking additional remedies, including rights to renegotiate contracts, disgorgement of ill-gotten revenue and data, and notice and redress for customers.

Chair Joe Simons and Commissioners Noah Joshua Phillips and Christine Wilson argued in their majority statement that

Commissioner Chopra would ask us to reject a settlement that protects consumers and furthers our Privacy Shield goals, to instead continue litigation during an ongoing pandemic. There is no need and doing so would unnecessarily divert resources from other important matters, including investigations of other substantive violations of Privacy Shield. We do not support moving the goalposts in this manner and for this reason vote to accept the settlement, which not just accords with but exceeds the relief the Commission unanimously sought to obtain at the outset of the case.

Despite these and other Privacy Shield enforcement actions, it is likely EU officials will still find US enforcement lacking. The European Data Protection Board (EDPB or Board) released its most recent annual assessment of the Privacy Shield in December 2019 and again found both the agreement itself and implementation wanting. There was some overlap between the concerns of the EDPB and the European Commission (EC) as detailed in its recently released third assessment of the Privacy Shield, but the EDPB discusses areas that were either omitted from or downplayed in the EC’s report. The EDPB’s authority is persuasive with respect to Privacy Shield and carries weight with the EC; however, its concerns as detailed in previous annual reports have pushed the EC to demand changes, including but not limited to, pushing the Trump Administration to nominate Board Members to the Privacy and Civil Liberties Oversight Board (PCLOB) and the appointment of a new Ombudsperson to handle complaints about how the U.S. Intelligence Community is handling the personal data of EU citizens.

In January 2019, in the “EU-U.S. Privacy Shield – Second Annual Joint Review,” the EDPB noted some progress by the US in implementing the EU-U.S. Privacy Shield. However, the EU’s Data Protection Authorities (DPA) and the EDPB took issue with a number of shortcomings in US implementation, many of which had been noted in previous analyses of US efforts to ensure that US companies adhere to the Privacy Shield’s principles. Notably, the EDPB found problems with the assurances provided by the US government regarding the collection and use of personal data by national security and law enforcement agencies. The EDPB also found problems with how the Department of Commerce and FTC are enforcing the Privacy Shield in the US against commercial entities.


EDPB Opines Encryption Ban Would Endanger A Nation’s Compliance with GDPR

As the US and others call on technology companies to develop the means to crack encrypted communications, an EU entity argues any nation with such a law would likely not meet the GDPR’s requirements.

In response to a letter from a Member of the European Parliament (MEP), the European Data Protection Board (EDPB) articulated its view that any nation that implements an “encryption ban” would endanger its compliance with the General Data Protection Regulation (GDPR), possibly resulting in companies domiciled in such countries not being able to transfer and process the personal data of EU citizens. However, as always, it bears noting that the EDPB’s view may not carry the day with the European Commission, Parliament, and courts.

The EDPB’s letter comes amidst another push by the Trump Administration, Republican allies in Congress, and other nations to have technology companies develop workarounds or backdoors to their end-to-end encrypted devices, apps, and systems. The proponents of this change claim online child sexual predators, terrorists, and other criminals are using products and services like WhatsApp, Telegram, and iPhones to defeat legitimate, targeted government surveillance and enforcement. They reason that unless technology companies abandon their unnecessarily absolutist position and work toward a technological solution, the number of bad actors communicating in ways that cannot be broken (aka “going dark”) will increase, allowing for greater crime and wrongdoing.

On the other side of the issue, technology companies, civil liberties and privacy experts, and computer scientists argue that any weakening of or backdoors to encryption will eventually be stolen and exposed, making it easier for criminals to hack, steal, and exfiltrate. They assert the internet and digital age are built on secure communications and threatening this central feature would wreak havoc beyond the crimes the US and other governments are seeking to prevent.

The EDPB stated

Any ban on encryption or provisions weakening encryption would undermine the GDPR obligations on the concerned controllers and processors for an effective implementation of both data protection principles and the appropriate technical and organisational measures. Similar considerations apply to transfers to controllers or processors in any third countries adopting such bans or provisions. Security measures are therefore specifically mentioned among the elements the European Commission must take into account when assessing the adequacy of the level of protection in a third country. In the absence of such a decision, transfers are subject to appropriate safeguards or may be based on derogations; in any case the security of the personal data has to be ensured at all times.

The EDPB opined “that any encryption ban would seriously undermine compliance with the GDPR.” The EDPB continued, “[m]ore specifically, whatever the instrument used, it would represent a major obstacle in recognising a level of protection essentially equivalent to that ensured by the applicable data protection law in the EU, and would seriously question the ability of the concerned controllers and processors to comply with the security obligation of the regulation.”

The EDPB’s view is being articulated at a time when, as noted, a number of nations led by the United States (US) continue to press technology companies to allow them access to communications, apps, platforms, and devices that are encrypted. Last year, the US, United Kingdom, Australia, New Zealand, and Canada (the so-called Five Eyes nations) met, and in one of the resulting communiques, the Five Eyes ministers asserted that

We are concerned where companies deliberately design their systems in a way that precludes any form of access to content, even in cases of the most serious crimes. This approach puts citizens and society at risk by severely eroding a company’s ability to identify and respond to the most harmful illegal content, such as child sexual exploitation and abuse, terrorist and extremist material and foreign adversaries’ attempts to undermine democratic values and institutions, as well as law enforcement agencies’ ability to investigate serious crime.

The five nations contended that “[t]ech companies should include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can obtain access to data in a readable and usable format.” The Five Eyes also claimed that “[t]hose companies should also embed the safety of their users in their system designs, enabling them to take action against illegal content…[and] [a]s part of this, companies and Governments must work together to ensure that the implications of changes to their services are well understood and that those changes do not compromise public safety.”

The Five Eyes applauded “approaches like Mark Zuckerberg’s public commitment to consulting Governments on Facebook’s recent proposals to apply end-to-end encryption to its messaging services…[and] [t]hese engagements must be substantive and genuinely influence design decisions.”

The Five Eyes added

We share concerns raised internationally, inside and outside of government, about the impact these changes could have on protecting our most vulnerable citizens, including children, from harm. More broadly, we call for detailed engagement between governments, tech companies, and other stakeholders to examine how proposals of this type can be implemented without negatively impacting user safety, while protecting cyber security and user privacy, including the privacy of victims.

In October 2019, in an open letter to Facebook CEO Mark Zuckerberg, US Attorney General William P. Barr, United Kingdom Home Secretary Priti Patel, Australia’s Minister for Home Affairs Peter Dutton, and then acting US Homeland Security Secretary Kevin McAleenan asked “that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services without ensuring that there is no reduction to user safety and without including a means for lawful access to the content of communications to protect our citizens.” In Facebook’s December 2019 response, Facebook Vice President and WhatsApp Head Will Cathcart and Facebook Vice President and Messenger Head Stan Chudnovsky stated “[c]ybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere…[and] [t]he ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.”

However, one of the Five Eyes nations has already taken legislative action to force technology companies and individuals to cooperate with law enforcement investigations in ways that could threaten encryption. In December 2018, Australia enacted the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). As the Office of Australia’s Information Commissioner (OAIC) wrote of TOLA, “[t]he powers permitted under the Act have the potential to significantly weaken important privacy rights and protections under the Privacy Act…[and] [t]he encryption technology that can obscure criminal communications and pose a threat to national security is the same technology used by ordinary citizens to exercise their legitimate rights to privacy.”

In a related development, this week, Australia’s Independent National Security Legislation Monitor (INSLM) issued its report on TOLA. The Parliamentary Joint Committee on Intelligence and Security had requested that the INSLM review the statute, and so the INSLM engaged in a lengthy review, including input from the public. As explained in the report’s preface, the “INSLM independently reviews the operation, effectiveness and implications of national security and counter-terrorism laws; and considers whether the laws contain appropriate protections for individual rights, remain proportionate to terrorism or national security threats, and remain necessary.”

INSLM claimed

In this report I reject the notion that there is a binary choice that must be made between the effectiveness of agencies’ surveillance powers in the digital age on the one hand and the security of the internet on the other. Rather, I conclude that what is necessary is a law which allows agencies to meet technological challenges, such as those caused by encryption, but in a proportionate way and with proper rights protection. Essentially this can be done by updating traditional safeguards to meet those same technological challenges – notably, those who are trusted to authorise intrusive search and surveillance powers must be able to understand the technological context in which those powers operate, and their consequences. If, but only if, the key recommendations I set out in this report in this regard are adopted, TOLA will be such a law.

INSLM stated “[t]he essential effects of TOLA are as follows:

a. Schedule 1 gives police and intelligence agencies new powers to agree or require significant industry assistance from communications providers.

b. Schedules 2, 3 and 4 update existing powers and, in some cases, extended them to new agencies.

c. Schedule 5 gives the Australian Security Intelligence Organisation (ASIO) significant new powers to seek and receive both voluntary and compulsory assistance.

INSLM found

  • In relation to Schedule 1, for the reasons set out in greater detail in the report, Technical Assistance Notices (TANs) and Technical Capability Notices (TCNs) should be authorised by a body which is independent of the issuing agency or government. These are powers designed to compel a Designated Communications Provider (DCP) to reveal private information or data of its customers and therefore the usual practice of independent authorisation should apply.
  • I am satisfied that the computer access warrant and associated powers conferred by Schedule 2 are both necessary and proportionate, subject to some amendments.
  • I am generally satisfied that the powers conferred by Schedules 3 and 4 are both necessary and proportionate, but there are some matters that should be addressed and further monitored.
  • I have concluded that Schedule 5 should be amended to limit its breadth and clarify its scope.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by OpenClipart-Vectors from Pixabay

European Commission Releases Its First Review of the GDPR

While emphasizing the positive developments, the EC calls for more work to help nations and DPAs better and more uniformly endorse the law.


The European Commission (EC) submitted its two-year review of the General Data Protection Regulation (GDPR), which took effect across the European Union in May 2018. This review was required to occur two years after the new cross-border data protection structure took effect, and the GDPR requires further reviews every four years after the first review has been completed. It bears noting that the EC opted to exceed its statutory mandate in the report by covering more than international transfers of EU personal data and how well nations and DPAs are using the coordination and cooperation mechanisms to ensure uniform, effective enforcement across the EU.

Overall, the EC touts what it frames as successes of the GDPR and calls for EU member states, data protection authorities (DPA), and the European Data Protection Board (EDPB or Board) to address and resolve a host of ongoing issues that make enforcement of and compliance with the GDPR more difficult. For example, the EC flags the resources and independence EU nations are providing their DPAs as a major issue, as many of the regulatory bodies lack the funding, technical capability, and staffing to fully regulate the data rights and obligations enshrined in the GDPR. Another issue the EC discusses at some length is the differing national data protection laws, many of which conflict with or do not fully implement the GDPR.

In terms of a top-line summary, the EC claimed

  • The general view is that two years after it started to apply, the GDPR has successfully met its objectives of strengthening the protection of the individual’s right to personal data protection and guaranteeing the free flow of personal data within the EU. However, a number of areas for future improvement have also been identified.
  • Like most stakeholders and data protection authorities, the Commission is of the view that it would be premature at this stage to draw definite conclusions regarding the application of the GDPR.
  • It is likely that most of the issues identified by Member States and stakeholders will benefit from more experience in applying the GDPR in the coming years.
  • Nevertheless, this report highlights the challenges encountered so far in applying the GDPR and sets out possible ways to address them.
  • Notwithstanding its focus is on the two issues highlighted in Article 97(2) of the GDPR, namely international transfers and the cooperation and consistency mechanisms, this evaluation and review takes a broader approach to also address issues which have been raised by various actors during the last two years.

Among its other findings, the EC asserted

  • However, developing a truly common European data protection culture between data protection authorities is still an on-going process. Data protection authorities have not yet made full use of the tools the GDPR provides, such as joint operations that could lead to joint investigations. At times, finding a common approach meant moving to the lowest common denominator and as a result, opportunities to foster more harmonisation were missed.
  • Stakeholders generally welcome the guidelines from the Board and request additional ones on key concepts of the GDPR, but also point to inconsistencies between the national guidance and the Board guidelines. They underline the need for more practical advice, in particular more concrete examples, and the need for data protection authorities to be equipped with the necessary human, technical and financial resources to effectively carry out their tasks.

The EC called for greater funding and resources for DPAs to enforce the GDPR, especially in Ireland and Luxembourg, which serve as the EU headquarters for a number of large technology companies:

Data protection authorities play an essential role in ensuring that the GDPR is enforced at national level and that the cooperation and consistency mechanisms within the Board functions effectively, including in particular the one-stop-shop mechanism for cross-border cases. Member States are therefore called upon to provide them with adequate resources as required by the GDPR.

The EC wrapped up the GDPR review by drawing a roadmap of sorts for future actions:

Based on this evaluation of the application of the GDPR since May 2018, the actions listed below have been identified as necessary to support its application. The Commission will monitor their implementation also in view of the forthcoming evaluation report in 2024.

The EC offered the following as ongoing or future actions to more fully realize implementation and enforcement of the GDPR to be undertaken by EU states, EU DPAs, the EC, the EDPB, and stakeholders in the EU and elsewhere:

Implementing and complementing the legal framework

Member States should

  • complete the alignment of their sectoral laws to the GDPR;
  • consider limiting the use of specification clauses which might create fragmentation and jeopardise the free flow of data within the EU;
  • assess whether national law implementing the GDPR is in all circumstances within the margins provided for Member State legislation.

The Commission will

  • pursue bilateral exchanges with Member States on the compliance of national laws with the GDPR, including on the independence and resources of national data protection authorities; make use of all the tools at its disposal, including infringement procedures, to ensure that Member States comply with the GDPR;
  • support further exchanges of views and national practices between Member States on topics which are subject to further specification at national level so as to reduce the level of fragmentation of the single market, such as processing of personal data relating to health and research, or which are subject to balancing with other rights such as the freedom of expression;
  • support a consistent application of the data protection framework in relation to new technologies to support innovation and technological developments;
  • use the GDPR Member States Expert Group (established during the transitory phase before the GDPR became applicable) to facilitate discussions and sharing of experience between Member States and with the Commission;
  • explore whether, in the light of further experience and relevant case-law, proposing possible future targeted amendments to certain provisions of the GDPR might be appropriate, in particular regarding records of processing by SMEs that do not have the processing of personal data as their core business (low risk), and the possible harmonisation of the age of children consent in relation to information society services.

Making the new governance system deliver its full potential

The Board and data protection authorities are invited to

  • develop efficient arrangements between data protection authorities regarding the functioning of the cooperation and consistency mechanisms, including on procedural aspects, building on the expertise of its members and by strengthening the involvement of its secretariat;
  • support harmonisation in applying and enforcing the GDPR using all means at its disposal, including by further clarifying key concepts of the GDPR, and ensuring that national guidance is fully in line with guidelines adopted by the Board;
  • encourage the use of all tools provided for in the GDPR to ensure that it is applied consistently;
  • step up cooperation among data protection authorities, for instance by conducting joint investigations.

The Commission will

  • continue to closely monitor the effective and full independence of national data protection authorities;
  • encourage cooperation between regulators (in particular in fields such as competition, electronic communications, security of network and information systems and consumer policy);
  • support the reflection within the Board on the procedures applied by the national data protection authorities in order to improve the cooperation on the cross-border cases.

Member States shall

  • allocate resources to data protection authorities that are sufficient for them to perform their tasks.

Supporting stakeholders

The Board and data protection authorities are invited to

  • adopt further guidelines which are practical, easily understandable, and which provide clear answers and avoid ambiguities on issues related to the application of the GDPR, for example on processing children’s data and data subject rights, including the exercise of the right of access and the right to erasure, consulting stakeholders in the process;
  • review the guidelines when further clarifications are necessary in the light of experience and developments including in the case law of the Court of Justice;
  • develop practical tools, such as harmonised forms for data breaches and simplified records of processing activities, to help low-risk SMEs meeting their obligations.

The Commission will

  • provide standard contractual clauses both for international transfers and the controller/processor-relationship;
  • provide for tools clarifying/supporting the application of data protection rules to children;
  • in line with the Data Strategy, explore practical means to facilitate increased use of the right to portability by individuals, such as by giving them more control over who can access and use machine-generated data;
  • support standardisation/certification in particular on cybersecurity aspects through the cooperation between the European Union Agency for Cybersecurity (ENISA), the data protection authorities and the Board;
  • when appropriate, make use of its right to request the Board to prepare guidelines and opinions on specific issues of importance to stakeholders;
  • when necessary provide guidance, while fully respecting the role of the Board;
  • support the activities of data protection authorities that facilitate implementation of GDPR obligations by SMEs, through financial support, especially for practical guidance and digital tools that can be replicated in other Member States.

Encouraging innovation

The Commission will

  • monitor the application of the GDPR to new technologies, also taking into account of possible future initiatives in the field of artificial intelligence and under the Data Strategy;
  • encourage, including through financial support, the drafting of EU codes of conduct in the area of health and research;
  • closely follow the development and the use of apps in the context of the COVID-19 pandemic.

The Board is invited to

  • issue guidelines on the application of the GDPR in the area of scientific research, artificial intelligence, blockchain, and possible other technological developments;
  • review the guidelines when further clarifications are necessary in the light of technological development.

Further developing the toolkit for data transfers

The Commission will

  • pursue adequacy dialogues with interested third countries, in line with the strategy set out in its 2017 Communication ‘Exchanging and Protecting Personal Data in a Globalised World’, including where possible by covering data transfers to criminal law enforcement authorities (under the Data Protection Law Enforcement Directive) and other public authorities; this includes finalisation of the adequacy process with the Republic of Korea as soon as possible;
  • finalise the ongoing evaluation of the existing adequacy decisions and report to the European Parliament and the Council;
  • finalise the work on the modernisation of the standard contractual clauses, with a view to updating them in light of the GDPR, covering all relevant transfer scenarios and better reflecting modern business practices.

The Board is invited to

  • further clarify the interplay between the rules on international data transfers (Chapter V) with the GDPR’s territorial scope of application (Article 3);
  • ensure effective enforcement against operators established in third countries falling within the GDPR’s territorial scope of application, including as regards the appointment of a representative where applicable (Article 27);
  • streamline the assessment and eventual approval of binding corporate rules with a view to speed up the process;
  • complete the work on the architecture, procedures and assessment criteria for codes of conduct and certification mechanisms as tools for data transfers.

Promoting convergence and developing international cooperation

The Commission will

  • support ongoing reform processes in third countries on new or modernised data protection rules by sharing experience and best practices;
  • engage with African partners to promote regulatory convergence and support capacity-building of supervisory authorities as part of the digital chapter of the new EU-Africa partnership;
  • assess how cooperation between private operators and law enforcement authorities could be facilitated, including by negotiating bilateral and multilateral frameworks for data transfers in the context of access by foreign criminal law enforcement authorities to electronic evidence, to avoid conflicts of law while ensuring appropriate data protection safeguards;
  • engage with international and regional organisations such as the OECD, ASEAN or the G20 to promote trusted data flows based on high data protection standards, including in the context of the Data Flow with Trust initiative;
  • set up a ‘Data Protection Academy’ to facilitate and support exchanges between European and international regulators;
  • promote international enforcement cooperation between supervisory authorities, including through the negotiation of cooperation and mutual assistance agreements.

EC staff released a working document more detailed than the EC’s report and broader than the mandate in Article 97 of the GDPR:

Although its focus is on the two issues highlighted in Article 97(2) of the GDPR, namely international transfers and the cooperation and consistency mechanisms, this evaluation takes a broader approach in order to address issues which have been raised by various actors during the last two years.

EC staff highlighted the number and types of enforcement actions, taking care to stress their deterrent effect, perhaps in part to counter criticism that the fines levied have often been a fraction of the statutory ceiling. Of course, this sort of argument is hard to dispute, for how does one prove or disprove a negative (i.e., all the GDPR violations that were averted because regulated entities feared being punished in a fashion similar to entities subject to enforcement actions)? EC staff asserted:

The GDPR establishes independent data protection authorities and provides them with harmonised and strengthened enforcement powers. Since the GDPR applies, those authorities have been using a wide range of corrective powers provided for in the GDPR, such as administrative fines (22 EU/EEA authorities), warnings and reprimands (23), orders to comply with data subject’s requests (26), orders to bring processing operations into compliance with the GDPR (27), and orders to rectify, erase or restrict processing (17). Around half of the data protection authorities (13) have imposed temporary or definitive limitations on processing, including bans. This demonstrates a conscious use of all corrective measures provided for in the GDPR; the data protection authorities did not shy away from imposing administrative fines in addition to or instead of other corrective measures, depending on the circumstances of individual cases.


Image by Biljana Jovanovic from Pixabay

Further Reading and Other Developments (4 July)


Other Developments

  • The Senate invoked cloture on the nomination of acting Office of Management and Budget (OMB) Director Russell Vought to be confirmed in that role and will vote on the nomination on 20 July. OMB has been without a Senate-confirmed Director since Mick Mulvaney, who was named acting White House Chief of Staff in January 2019, formally resigned at the end of March; Vought has served as the acting OMB head since Mulvaney moved to the White House.
  • The United States Federal Chief Information Officer (CIO) Suzette Kent announced she is stepping down in July, and Deputy Federal CIO Maria Roat is expected to be named acting Federal CIO. Given the Trump Administration’s approach to submitting nominations to the Senate for confirmation and the Senate’s truncated work schedule due to the election, it is likely no nomination will be made this year. Kent technically held the position of Administrator of the Office of Electronic Government within the Office of Management and Budget (OMB), and her portfolio included a range of technology-related matters: cybersecurity, information technology (IT) policy and procurement, workforce, data security, data management, and others.
  • The General Services Administration (GSA) announced the next step in “establish[ing] a program to procure commercial products through commercial e-commerce portals for purposes of enhancing competition, expediting procurement, enabling market research, and ensuring reasonable pricing of commercial products.” GSA stated that “award[ing] contracts to three e-marketplace platform providers…Amazon Business, Fisher Scientific, and Overstock.com, Inc.…allows GSA to test the use of commercial e-commerce portals for purchases below the micro-purchase threshold of $10,000 using a proof-of-concept (for up to three years).” Section 846 of the 2018 National Defense Authorization Act (P.L. 115-91) directed GSA to implement such a program, and the agency claimed in a blog posting:
    • These contracts and platforms will be available to federal agencies as part of a governmentwide effort to modernize the buying experience for agencies and help them gain insights into open-market online spend occurring outside of existing contracts.  It is estimated that open market purchases on government purchase cards represent an addressable market of $6 billion annually.
    • The goal of the proof of concept is to provide a modern buying solution for federal customers and increase transparency on agency spending that’s already taking place with better data through this solution. Further, this solution leverages the government’s buying power and increases supply chain security awareness with a governmentwide approach.
  • In response to the ongoing and growing advertising boycott, Facebook announced in a press release some changes to the platform’s policies regarding voter suppression and hateful content. CEO Mark Zuckerberg stated “Three weeks ago, I committed to reviewing our policies ahead of the 2020 elections…[and] [t]hat work is ongoing, but today I want to share some new policies to connect people with authoritative information about voting, crack down on voter suppression, and fight hate speech:
    • 1. Providing Authoritative Information on Voting During the Pandemic
      • Last week, we announced the largest voting information campaign in American history, with the goal of helping 4 million people register to vote. As part of this, we’re creating a Voting Information Center to share authoritative information on how and when you can vote, including voter registration, voting by mail and early voting. During a pandemic when people may be afraid of going to polls, sharing authoritative information on voting by mail will be especially important. We’ll be showing the Voting Information Center at the top of the Facebook and Instagram apps over the coming months.
    • 2. Additional Steps to Fight Voter Suppression
      • Since the most dangerous voter suppression campaigns can be local and run in the days immediately before an election, we’re going to use our Elections Operations Center to quickly respond and remove false claims about polling conditions in the 72 hours leading into election day. Learning from our experience fighting Covid misinformation, we will partner with and rely on state election authorities to help determine the accuracy of information and what is potentially dangerous. We know this will be challenging in practice as facts on the ground may be uncertain and we don’t want to remove accurate information about challenges people are experiencing, but we’re building our operation to be able to respond quickly.
      • We will also ban posts that make false claims saying ICE agents are checking for immigration papers at polling places, which is a tactic used to discourage voting. We’ll also remove any threats of coordinated interference, like someone saying “My friends and I will be doing our own monitoring of the polls to make sure only the right people vote”, which can be used to intimidate voters. We will continue to review our voter suppression policies on an ongoing basis as part of our work on voter engagement and racial justice.
    • 3. Creating a Higher Standard for Hateful Content in Ads
      • This week’s study from the EU showed that Facebook acts faster and removes a greater percent of hate speech on our services than other major internet platforms, including YouTube and Twitter. We’ve invested heavily in both AI systems and human review teams so that now we identify almost 90% of the hate speech we remove before anyone even reports it to us. We’ve also set the standard in our industry by publishing regular transparency reports so people can hold us accountable for progress. We will continue investing in this work and will commit whatever resources are necessary to improve our enforcement.
      • We believe there is a public interest in allowing a wider range of free expression in people’s posts than in paid ads. We already restrict certain types of content in ads that we allow in regular posts, but we want to do more to prohibit the kind of divisive and inflammatory language that has been used to sow discord. So today we’re prohibiting a wider category of hateful content in ads. Specifically, we’re expanding our ads policy to prohibit claims that people from a specific race, ethnicity, national origin, religious affiliation, caste, sexual orientation, gender identity or immigration status are a threat to the physical safety, health or survival of others. We’re also expanding our policies to better protect immigrants, migrants, refugees and asylum seekers from ads suggesting these groups are inferior or expressing contempt, dismissal or disgust directed at them.
    • 4. Labeling Newsworthy Content
      • A handful of times a year, we leave up content that would otherwise violate our policies if the public interest value outweighs the risk of harm. Often, seeing speech from politicians is in the public interest, and in the same way that news outlets will report what a politician says, we think people should generally be able to see it for themselves on our platforms.
      • We will soon start labeling some of the content we leave up because it is deemed newsworthy, so people can know when this is the case. We’ll allow people to share this content to condemn it, just like we do with other problematic content, because this is an important part of how we discuss what’s acceptable in our society — but we’ll add a prompt to tell people that the content they’re sharing may violate our policies.
      • To clarify one point: there is no newsworthiness exemption to content that incites violence or suppresses voting. Even if a politician or government official says it, if we determine that content may lead to violence or deprive people of their right to vote, we will take that content down. Similarly, there are no exceptions for politicians in any of the policies I’m announcing here today.
  • On 30 June, Facebook banned the boogaloo movement from its platform. The company “designat[ed] a violent US-based anti-government network under our Dangerous Individuals and Organizations policy and disrupting it on our services…[and] [a]s a result, this violent network is banned from having a presence on our platform and we will remove content praising, supporting or representing it.”
  • The United States Department of Commerce suspended “regulations affording preferential treatment to Hong Kong… including the availability of export license exceptions.” The Trump Administration took this latest action in its trade war with the People’s Republic of China (PRC) because of “the Chinese Communist Party’s imposition of new security measures on Hong Kong” and “the risk that sensitive U.S. technology will be diverted to the People’s Liberation Army or Ministry of State Security has increased, all while undermining the territory’s autonomy.” The United States Department of State added “the United States will today end exports of U.S.-origin defense equipment and will take steps toward imposing the same restrictions on U.S. defense and dual-use technologies to Hong Kong as it does for China.”
  • The Democratic National Committee (DNC) updated its “social media comparative analysis to reflect changes companies have made in recent months to their counter disinformation and election integrity policies.” The DNC is working with Facebook/Instagram, Twitter, Google/YouTube, and now Snapchat “to combat platform manipulation and train our campaigns on how best to secure their accounts and protect their brands against disinformation.”
  • The Office of the Privacy Commissioner of Canada (OPC) and three provincial privacy agencies announced an investigation “into a Tim Hortons mobile ordering application after media reports raised concerns about how the app may be collecting and using data about people’s movements as they go about their daily activities.” A journalist made a request to Tim Hortons under Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) and learned the company’s app had logged his longitude and latitude coordinates over 2,700 times in five months, sometimes when he was not using the app, even though the company has claimed it tracks users only while the app is in use. Moreover, Tim Hortons combines its data with data from sister brands owned by Restaurant Brands International, such as Burger King and Popeyes.
  • The United Kingdom’s Information Commissioner’s Office (ICO) released an “investigation report into the use of mobile phone extraction (MPE) by police forces when conducting criminal investigations in England and Wales” which “found that police data extraction practices vary across the country, with excessive amounts of personal data often being extracted and stored without an appropriate basis in existing data protection law.” The ICO made a range of recommendations, many of which will require a legislative revamp of the laws that currently govern these practices.
  • Ireland’s Data Protection Commission released its “2018-2020 Regulatory Activity Under GDPR” and listed the following enforcement actions under the General Data Protection Regulation:
    • An Garda Síochána – reprimand and corrective powers applied in accordance with the Data Protection Act 2018.
    • Tusla, the Child and Family Agency – reprimand and fine applied in accordance with the Data Protection Act 2018.
    • Tusla, the Child and Family Agency – reprimand and fine applied in accordance with the Data Protection Act 2018.
    • Twitter – inquiry completed and draft decision forwarded to the concerned EU data protection authorities in accordance with Article 60 of the GDPR.
    • DEASP – enforcement notice issued regarding the use of the Public Services Card (currently under appeal).
    • 59 Section 10 decisions issued.
    • 15,000 breach notifications assessed and concluded.
    • 9 litigation cases concluded in the Irish Courts.
    • Hearing in CJEU Standard Contractual Clauses case brought by DPC to Irish High Court.
    • 80% of cases received under the GDPR have been concluded.
  • The National Telecommunications and Information Administration (NTIA) issued its “American Broadband Initiative Progress Report,” an update on a Trump Administration inter-agency effort, begun in 2019, to implement “a cohesive government-wide strategy to reform broadband deployment.” NTIA highlighted the following accomplishments:
    • Through the ReConnect program, as of March 2020, the U.S. Department of Agriculture (USDA) awarded over $744 million in funds to support more than 80 broadband projects benefiting more than 430,000 rural residents in 34 states. The Federal Communications Commission (FCC) and USDA also established processes to coordinate awards for rural broadband deployment to ensure that USDA-funded grants do not overlap with the FCC’s $20 billion Rural Digital Opportunity Fund (RDOF) or the $9 billion 5G Fund for Rural America.
    • The Department of the Interior (DOI) launched a Joint Overview-Established Locations (JOEL) mapping tool to make site locations visible to service providers looking to locate equipment on Federal property, and added new data layers from the General Services Administration, the U.S. Forest Service, and U.S. Postal Service. Since its release, the map has been viewed 4,294 times, averaging 7 views per day.
    • In June 2019, the General Services Administration (GSA) published the FY 2018 Federal Real Property Profile (FRPP) public data set, updated with a set of filters allowing users to identify Federal property that could be candidates for communications infrastructure installation. This publicly available data now includes the height of buildings and facilities and the elevation above mean sea level, helping the communications industry to determine a structure’s suitability for siting communications facilities. In June 2020, GSA will update the FRPP public data set with more current data from FY 2019.
    • In March 2019, the Department of Commerce’s NTIA updated its website with information about Federal Agencies’ permitting processes and funding information to provide easier, “one-stop” access to the information. NTIA continues to update this information with support from Agencies.
    • In September 2019, NTIA completed the first phase of its National Broadband Availability Map (NBAM), a geographic information system platform that allows for the visualization and analysis of federal, state, and commercially available data sets. As of June 2020, the NBAM program includes 18 states that are partnering on this critical broadband data platform.
    • In February 2020, GSA and USDA’s Forest Service (FS) finalized a revised Standard Form (SF-299), making this Common Application Form suitable for telecommunications purposes.

Further Reading

  • “Google will start paying some publishers for news articles” – The Verge. In part because of pressure from regulators in Australia and France, Google will begin paying some news outlets for articles. This could be the start of a larger trend of online platforms compensating media outlets, which have long argued this should be the case. However, similar systems in Germany and Spain earlier this decade failed to bolster the media in those countries financially, and Google responded to the Spanish statute by shutting down its News platform in that country.
  • “Trump’s strike at Twitter risks collateral damage inside the executive branch” – Politico. One aspect of the Trump Administration executive order on online platforms is that it subjects federal agencies’ online advertising and marketing to additional Office of Management and Budget and Department of Justice review. If fully implemented, this process could derail a number of agency initiatives ranging from military recruitment to fighting drug addiction.
  • “Column: With its Sprint merger in the bag, T-Mobile is already backing away from its promises” – The Los Angeles Times. Critics of the T-Mobile-Sprint merger have pounced on a recent filing with the California Public Utilities Commission in which the company asked for two additional years to build out its 5G network, despite making this a signal promise in selling California Attorney General Xavier Becerra on the deal. Likewise, the company is trying to renegotiate its promise to create 1,000 new jobs in the state.
  • “Facebook policy changes fail to quell advertiser revolt as Coca-Cola pulls ads” – The Guardian. Despite Facebook CEO Mark Zuckerberg’s announcement of policy changes (see Other Developments above), advertisers continue to join a widening boycott that some companies are applying across all major social media platforms. Unilever, Coca-Cola, Hershey’s, Honda, and others have joined the movement. The majority of Facebook’s income comes from advertising, so a sustained boycott could do more than push down the company’s share value. And the changes announced at the end of last week do not seem to have impressed the boycott’s organizers. It would be notable if pressure on companies advertising on Facebook effects more change than pressure from the right and left in the United States, European Union, and elsewhere.
  • “Trump administration tells Facebook, Twitter to act against calls to topple statues, commit violent acts” – The Washington Post. The Department of Homeland Security sent letters late last week to the largest technology companies, asserting they may have played a role in “burglary, arson, aggravated assault, rioting, looting, and defacing public property” by allowing people to post on or use their platforms. The thrust of the argument seems to be that Twitter, Facebook, Apple, Google, and other companies should have done more to prevent people from posting and sharing material that allegedly resulted in violence. Acting Secretary of Homeland Security Chad Wolf argued, “In the wake of George Floyd’s death, America faced an unprecedented threat from violent extremists seeking to co-opt the tragedy of his death for illicit purposes.” The letters did not mention President Donald Trump’s tweets that seem to encourage authorities to use violence against protestors. Moreover, they are of a piece with the recent executive order in that there is scant legal basis for an action seemingly designed to cow the social media platforms.
  • “Twitch, Reddit crack down on Trump-linked content as industry faces reckoning” – Politico. Two platforms acted against President Donald Trump and his supporters for violating the platforms’ terms of service and rules. The irony here is that the recent executive order on social media platforms seeks to hold them accountable for not operating according to their terms of service.
  • “Inside Facebook’s fight against European regulation” – Politico. Drawing on previously unavailable European Commission documents on meetings with, and positions of, Facebook, this article traces the slow evolution of the company’s stance in the European Union (EU) from opposing regulation to a public position ostensibly amenable to it. It is also perhaps a tale of lobbying tactics that work in Washington, DC, largely failing to gain traction in Brussels.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by congerdesign from Pixabay

Further Reading and Other Developments (29 June)


Other Developments

  • The Senate Commerce, Science, and Transportation Committee held an oversight hearing on the Federal Communications Commission (FCC) with the FCC Chair and four Commissioners.
  • New Zealand’s Parliament passed the “Privacy Act 2020,” a major update of its 1993 statute that will, according to New Zealand’s Privacy Commissioner, do the following:
    • Mandatory notification of harmful privacy breaches. If organisations or businesses have a privacy breach that poses a risk of serious harm, they are required to notify the Privacy Commissioner and affected parties. This change brings New Zealand in line with international best practice.
    • Introduction of compliance orders. The Commissioner may issue compliance notices to require compliance with the Privacy Act. Failure to follow a compliance notice could result in a fine of up to $10,000.
    • Binding access determinations. If an organisation or business refuses to make personal information available upon request, the Commissioner will have the power to demand release.
    • Controls on the disclosure of information overseas. Before disclosing New Zealanders’ personal information overseas, New Zealand organisations or businesses will need to ensure those overseas entities have similar levels of privacy protection to those in New Zealand.
    • New criminal offences. It will be an offence to mislead an organisation or business in a way that affects someone’s personal information or to destroy personal information if a request has been made for it.  The maximum fine for these offences is $10,000.
    • Explicit application to businesses whether or not they have a legal or physical presence in New Zealand. If an international digital platform is carrying on business in New Zealand with New Zealanders’ personal information, there will be no question that it is obliged to comply with New Zealand law, regardless of where it or its servers are based.
  • The United States’ National Archives’ Information Security Oversight Office (ISOO) submitted its annual report to the White House and found:
    • Our Government’s ability to protect and share Classified National Security Information and Controlled Unclassified Information (CUI) continues to present serious challenges to our national security. While dozens of agencies now use various advanced technologies to accomplish their missions, a majority of them still rely on antiquated information security management practices. These practices have not kept pace with the volume of digital data that agencies create and these problems will worsen if we do not revamp our data collection methods for overseeing information security programs across the Government. We must collect and analyze data that more accurately reflects the true health of these programs in the digital age.
    • However, ISOO noted progress on efforts to better secure and protect CUI but added “[f]ull implementation will require additional resources, including dedicated funds and more full-time staff.”
    • Regarding classified information, ISOO found “Classified National Security Information policies and practices remain outdated and are unable to keep pace with the volume of digital data that agencies create.”
  • The Australian Strategic Policy Institute’s International Cyber Policy Centre released its most recent “Covid-19 Disinformation & Social Media Manipulation” report, titled “ID2020, Bill Gates and the Mark of the Beast: how Covid-19 catalyses existing online conspiracy movements”:
    • Against the backdrop of the global Covid-19 pandemic, billionaire philanthropist Bill Gates has become the subject of a diverse and rapidly expanding universe of conspiracy theories. As an example, a recent poll found that 44% of Republicans and 19% of Democrats in the US now believe that Gates is linked to a plot to use vaccinations as a pretext to implant microchips into people. And it’s not just America: 13% of Australians believe that Bill Gates played a role in the creation and spread of the coronavirus, and among young Australians it’s 20%. Protests around the world, from Germany to Melbourne, have included anti-Gates chants and slogans.
    • This report takes a close look at a particular variant of the Gates conspiracy theories, which is referred to here as the ID2020 conspiracy (named after the non-profit ID2020 Alliance, which the conspiracy theorists claim has a role in the narrative), as a case study for examining the dynamics of online conspiracy theories on Covid-19. Like many conspiracy theories, that narrative builds on legitimate concerns, in this case about privacy and surveillance in the context of digital identity systems, and distorts them in extreme and unfounded ways.
  • The Pandemic Response Accountability Committee (PRAC) released “Top Challenges Facing Federal Agencies: COVID-19 Emergency Relief and Response Efforts” for those agencies that received the bulk of funds under the “Coronavirus Aid, Relief, and Economic Security (CARES) Act” (P.L. 116-136). PRAC is housed within the Council of the Inspectors General on Integrity and Efficiency (CIGIE) and is comprised of “21 Offices of Inspector General (OIG) overseeing agencies who received the bulk of the emergency funding.” PRAC stated:
    • CIGIE previously has identified information technology (IT) security and management as a long-standing, serious, and ubiquitous challenge that impacts agencies across the government, highlighting agencies’ dependence on reliable and secure IT systems to perform their mission-critical functions.  Key areas of concern have included safeguarding federal systems against cyberattacks and insider threats, modernizing and managing federal IT systems, ensuring continuity of operations, and recruiting and retaining a highly skilled cybersecurity workforce.  
    • These concerns remain a significant challenge, but are impacted by (1) widespread reliance on maximum telework to continue agency operations during the pandemic, which has strained agency networks and shifted IT resources, and (2) additional opportunities and targets for cyberattacks created by remote access to networks and increases in online financial activity.
  • Following the completion of a European Union-People’s Republic of China summit, European Commission President Ursula von der Leyen pointed to a number of ongoing technology-related issues between the EU and the PRC, including:
    • [W]e continue to have an unbalanced trade and investment relationship. We have not made the progress we aimed for in last year’s Summit statement in addressing market access barriers. We need to follow up on these commitments urgently. And we also need to have more ambition on the Chinese side in order to conclude negotiations on an investment agreement. These two actions would address the asymmetry in our respective market access and would improve the level playing field between us. In order to conclude the investment agreement, we would need in particular substantial commitments from China on the behaviour of state-owned enterprises, transparency in subsidies, and transparency on the topic of forced technology transfers.
    • We have raised these issues at the same time with President Xi and Premier Li that we expect that China will show the necessary level of ambition to conclude these negotiations by the end of this year. I think it is important that we have now a political, high-level approach on these topics.
    • I have also made it clear that China needs to engage seriously on a reform of the World Trade Organization, in particular on the future negotiations on industrial subsidies. This is the relevant framework where we have to work together on the topic – and it is a difficult topic – but this is the framework, which we have to establish to have common binding rules we agree on.
    • And we must continue to work on tackling Chinese overcapacity, for example in the steel and metal sectors, and in high technology. Here for us it is important that China comes back to the international negotiation table, that we sit down there and find solutions.
    • We also pointed out the importance of the digital transformation and its highly assertive approach to the security, the resilience and the stability of digital networks, systems and value chains. We have seen cyberattacks on hospitals and dedicated computing centres. Likewise, we have seen a rise of online disinformation. We pointed out clearly that this cannot be tolerated.
  • United States Secretary of State Mike Pompeo issued a statement titled “The Tide Is Turning Toward Trusted 5G Vendors,” in which he claimed:
    • The tide is turning against Huawei as citizens around the world are waking up to the danger of the Chinese Communist Party’s surveillance state. Huawei’s deals with telecommunications operators around the world are evaporating, because countries are only allowing trusted vendors in their 5G networks. Examples include the Czech Republic, Poland, Sweden, Estonia, Romania, Denmark, and Latvia. Recently, Greece agreed to use Ericsson rather than Huawei to develop its 5G infrastructure.
  • Germany’s highest court, the Bundesgerichtshof (BGH), ruled against Facebook’s claim that the country’s antitrust regulator was wrong in finding that the company had abused its dominant position by combining data on German nationals and residents across its platforms. The matter now returns to a lower German court, which is expected to heed the higher court’s ruling and allow the Bundeskartellamt’s restrictions on Facebook’s activity.
  • France’s Conseil d’État upheld the Commission nationale de l’informatique et des libertés’ (CNIL) 2019 fine of €50 million of Google under the General Data Protection Regulation (GDPR) “for lack of transparency, inadequate information and lack of valid consent regarding the ads personalization.”
  • A Virginia court ruled against House Intelligence Committee Ranking Member Devin Nunes (R-CA) in his suit against Twitter and Liz Mair, a Republican consultant, and Twitter accounts @devincow and @DevinNunesMom regarding alleged defamation.
  • The California Secretary of State has listed as qualified for November’s ballot the initiative to add the “California Privacy Rights Act” to state law, which would, in large part, amend the “California Consumer Privacy Act” (CCPA) (AB 375).

Further Reading

  • “Wrongfully Accused by an Algorithm” – The New York Times. In what should have been predictable and foreseeable given the error rate of many facial recognition algorithms in correctly identifying people of color, an African American man was wrongly identified by this technology, causing him to be arrested. Practitioners and experts stress that positive identifications are supposed to be only one piece of evidence, but in this case it was the only evidence police had. After a store loss-prevention specialist agreed that a person in a low-grade photo was the likely shoplifter, police arrested the man. Eventually, the charges were dismissed, initially without prejudice, leaving open the possibility of future prosecution, but later the district attorney cleared all charges and expunged the arrest.
  • “Pentagon Says it Needs ‘More Time’ Fixing JEDI Contract” – Nextgov. The saga of the Department of Defense’s Joint Enterprise Defense Infrastructure cloud contract continues. Amazon and Microsoft will need to submit revised bids for the possibly $10 billion procurement as the Department of Defense (DOD) tries to cure the problems turned up by a federal court in the suit brought by Amazon. These bids would be evaluated later this summer, according to a recent DOD court filing. The next award of this contract could trigger another bid protest, just as the first award caused Amazon to challenge Microsoft’s victory.
  • “EU pushing ahead with digital tax despite U.S. resistance, top official says” – Politico. At an Atlantic Council event, European Commission Executive Vice President Margrethe Vestager stated the European Union will move ahead with an EU-wide digital services tax despite the recent withdrawal of the United States from talks on such a tax. The Organization for Economic Co-operation and Development had convened multilateral talks to resolve differences on how a global digital services tax would function, with most of the nations involved arguing for a 2% tax assessed in the nation where the transaction occurs rather than where the company is headquartered. EU officials claim agreement was within reach when the US removed itself from the talks. An EU-wide tax is of a piece with a more aggressive stance taken by the EU towards US technology companies, a number of which are currently under investigation for antitrust and anti-competitive behaviors.
  • “Verizon joins ad boycott of Facebook over hateful content” – Associated Press. The telecommunications company joined a number of other companies in pulling their advertising from Facebook in a boycott organized by the Anti-Defamation League (ADL), the NAACP, Sleeping Giants, Color Of Change, Free Press, and Common Sense. The #StopHateforProfit campaign “asks large Facebook advertisers to show they will not support a company that puts profit over safety,” and thus far a number of companies are doing just that, including Eddie Bauer, Patagonia, North Face, Ben & Jerry’s, and others. In a statement, a Facebook spokesperson said “[o]ur conversations with marketers and civil rights organizations are about how, together, we can be a force for good.” While Facebook has changed course due to this and other pressure regarding content posted or ads placed on its platform, most recently removing a Trump campaign ad with Nazi imagery, the company has not changed its position on allowing political ads containing lies.
  • “The UK’s contact tracing app fiasco is a master class in mismanagement” – MIT Technology Review. This after-action report on the United Kingdom’s National Health Service’s (NHS) efforts to build its own COVID-19 contact tracing app is grim. The NHS is essentially scrapping its work and opting for the Google/Apple API, although the government in London claims “we will now be taking forward a solution that brings together the work on our app and the Google/Apple solution.” A far too ambitious plan married to organizational chaos led to the collapse of the NHS effort.
  • “Trump administration sees no loophole in new Huawei curb” – Reuters. Despite repeated arguments by trade experts that the most recent United States Department of Commerce regulations will not cut off Huawei’s access to high-technology components, Secretary of Commerce Wilbur Ross claimed “[t]he Department of Commerce does not see any loopholes in this rule…[and] [w]e reaffirm that we will implement the rule aggressively and pursue any attempt to evade its intent.”
  • “Defense Department produces list of Chinese military-linked companies” – Axios. Likely in response to a letter sent last year by Senate Minority Leader Chuck Schumer (D-NY) and Senator Tom Cotton (R-AR), the Department of Defense (DOD) has finally fulfilled a requirement in the FY 1999 National Defense Authorization Act (NDAA) to update a list of “those persons operating directly or indirectly in the United States or any of its territories and possessions that are Communist Chinese military companies.” The DOD has now compiled a list of People’s Republic of China (PRC) entities linked to the PRC military. This provision in the FY 1999 NDAA also grants the President authority to “exercise International Emergency Economic Powers Act (IEEPA) authorities” against listed entities, which could include serious sanctions.
  • “Andrew Yang is pushing Big Tech to pay users for data” – The Verge. Former candidate for the Democratic presidential nomination Andrew Yang has launched the Data Dividend Project, “a movement dedicated to taking back control of our personal data: our data is our property, and if we allow companies to use it, we should get paid for it.” Its “primary objective is to establish and enforce data property rights under laws such as the California Consumer Privacy Act (CCPA), which went into effect on January 1, 2020.” California Governor Gavin Newsom proposed a similar program in very vague terms in a State of the State address but never followed up on it, and Senator John Kennedy (R-LA) has introduced the “Own Your Own Data Act” (S. 806) to provide people with rights to sell their personal data.


Photo by Retha Ferguson from Pexels

Senate Democratic Stakeholder Floats Privacy Discussion Draft

The top Democrat on one committee has released a bill that would scrap the notice and consent model and strictly limit what information can be collected, processed, and shared.


On 18 June, Senate Banking, Housing, and Urban Affairs Ranking Member Sherrod Brown (D-OH) released a discussion draft of a federal privacy bill that “rejects the current, ineffective ‘consent’ model for privacy, and instead places strict limits on the collection, use, and sharing of Americans’ personal data.” The “Data Accountability and Transparency Act of 2020” may shift the debate on privacy legislation, as other recent bills and developments have moved the window of what stakeholders believe possible on the sufficiency of the notice and consent model. Like a few other bills, Brown’s legislation would establish a new agency to regulate privacy at the federal level, rejecting the idea of expanding the Federal Trade Commission’s jurisdiction. The package also addresses an issue that has grown in visibility over the last month or so: facial recognition technology. Most privacy bills have not sought to fold that technology into their regulatory frameworks. However, at present, election-year politics compounded by the ongoing pandemic and protests in the United States may further diminish the already flagging chances of enacting federal privacy legislation this year.

In his press release, Brown claimed his bill “creates a new framework that would give Americans the power to hold corporations, big tech, and the government responsible for how they collect and protect personal data.” He claimed “[t]he bill rejects the current, ineffective ‘consent’ model for privacy, and instead places strict limits on the collection, use, and sharing of Americans’ personal data…[and] contains strong civil rights protections to ensure personal information is not used for discriminatory purposes, as well as a ban on the use of facial recognition technology.” Brown added that the “Data Accountability and Transparency Act of 2020” “also establishes a new independent agency dedicated to protecting Americans’ privacy rights.”

Brown stated that “[s]pecifically, the Data Accountability and Transparency Act of 2020 would:

  • Ban the collection, use, or sharing of personal data unless specifically allowed by law;
  • Ban the use of facial recognition technology;
  • Prohibit the use of personal data to discriminate in housing, employment, credit, insurance, and public accommodations;
  • Require anyone using decision-making algorithms to provide new accountability reports;
  • Create a new, independent agency dedicated to protecting individuals’ privacy and implementing DATA 2020, with rulemaking, supervisory, and enforcement authority, the ability to issue civil penalties for violations of the Act, and a dedicated Office of Civil Rights to protect individuals from discrimination;
  • Empower individuals and state attorneys general to enforce privacy protections, without preempting more protective state laws; and
  • Require CEO certification of compliance with the Act, with potential criminal and civil penalties for the CEO and Board of Directors.

Brown had begun this process with the chair of the Senate Banking, Housing, and Urban Affairs Committee by exploring possible bipartisan privacy legislation likely within the jurisdiction of their committee. In February 2019, Brown and Chair Mike Crapo (R-ID) requested “feedback from interested stakeholders on the collection, use and protection of sensitive information by financial regulators and private companies.” Crapo and Brown stated:

The collection, use and protection of personally identifiable information and other sensitive information by financial regulators and private financial companies (including third-parties that share information with financial regulators and private financial companies) is something that deserves close scrutiny.  Americans are rightly concerned about how their data is collected and used, and how such data is secured and protected.  The collection and use of personally identifiable information will be a major focus of the Banking Committee moving forward. 

However, the quotes from Crapo and Brown in the joint press release suggested they may not have been entirely aligned on the scope of potential privacy legislation. Crapo asserted “it is worth examining how the Fair Credit Reporting Act should work in a digital economy, and whether certain data brokers and other firms serve a function similar to the original consumer reporting agencies.” By contrast, Brown remarked that “[i]n the year and a half since the Equifax breach, the country has learned that financial and technology companies are collecting huge stockpiles of sensitive personal data, but fail over and over to protect Americans’ privacy.” Brown added that “Congress should make it easy for consumers to find out who is collecting personal information about them, and give consumers power over how that data is used, stored and distributed.”

Crapo provided further insight into his preferred model by which the federal government would regulate privacy at an October 2019 hearing titled “Data Ownership: Exploring Implications for Data Privacy Rights and Data Valuation.” Crapo noted that “[t]his Committee has held a series of data privacy hearings exploring possible frameworks for facilitating privacy rights to consumers….[and] [n]early all have included references to data as a new currency or commodity.” He stated that “[t]he next question, then, is who owns it?” Crapo stated that “[t]here has been much debate about the concept of data ownership, the monetary value of personal information and its potential role in data privacy.” He asserted that “[s]ome have argued that privacy and control over information could benefit from applying an explicit property right to personal data, similar to owning a home or protecting intellectual property…[and yet] [o]thers contend the very nature of data is different from that of other tangible assets or goods.”

Crapo stated that “[s]till, it is difficult to ignore the concept of data ownership that appears in existing data privacy frameworks.” He said that “[f]or example, the European Union’s General Data Protection Regulation, or GDPR, grants an individual the right to request and access personally identifiable information that has been collected about them.” Crapo contended that “[t]here is an inherent element of ownership in each of these rights, and it is necessary to address some of the difficulties of ownership when certain rights are exercised, such as whether information could pertain to more than one individual, or if individual ownership applies in the concept of derived data.” He stated that “[a]ssociated with concepts about data ownership or control is the value of personal data being used in the marketplace, and the opportunities for individuals to benefit from its use.”

Crapo asserted that “Senators [John] Kennedy (R-LA) and [Mark] Warner (D-VA) have both led on these issues, with Senator Kennedy introducing legislation that would grant an explicit property right over personal data (i.e. the “Own Your Own Data Act” (S. 806)), and Senator Warner introducing legislation that would give consumers more information about the value of their personal data and how it is being used in the economy (i.e. the “Designing Accounting Safeguards To Help Broaden Oversight and Regulations on Data” (S. 1951)).” Crapo contended that “[a]s the Banking Committee continues exploring ways to give individuals real control over their data, it is important to learn more about what relationship exists between true data ownership and individuals’ degree of control over their personal information; how a property right would work for different types of personal information; how data ownership interacts with existing privacy laws, including the Gramm-Leach-Bliley Act, the Fair Credit Reporting Act and GDPR; and different ways that companies use personal data, how personal data could be reliably valued and what that means for privacy.” (See here for more analysis of both bills.)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Exposure Notification Privacy Act Introduced

A third COVID-19 privacy bill has been unveiled in the Senate that may be more about messaging and positioning on broader privacy legislation. In any event, the odds of such legislation being enacted in the near term are not high.


This week, a third COVID-19 privacy bill was released that occupies a middle ground between the other two bills. However, despite its bipartisan sponsorship and middle-ground approach, it is still not likely Congress will enact either targeted privacy legislation or broader, national privacy legislation this year. And yet, a number of the bill’s requirements track more closely with the Democratic bill released last month, suggesting the ground may be shifting under some of the outstanding issues. For example, the bill would not preempt state laws, and while it would not create a new federal cause of action a person could use to sue a company for violations, it expressly preserves all existing state and federal avenues a person could use to litigate.

On 3 June, Senate Commerce, Science and Transportation Committee Ranking Member Maria Cantwell (D-WA) and Senator Bill Cassidy (R-LA) introduced the “Exposure Notification Privacy Act” (S. 3861) with Senator Amy Klobuchar (D-MN) cosponsoring. The Senators released a section-by-section analysis and a summary of the bill, too. This bill follows the “Public Health Emergency Privacy Act” (S. 3749) and the “COVID-19 Consumer Data Protection Act” (S. 3663), bills that take approaches aligned with Democratic and Republican thinking on privacy, respectively. (See here for more analysis.)

The key term in the Exposure Notification Privacy Act is “automated exposure notification service” (AENS), for it determines what constitutes “covered data,” and hence what the bill’s protections cover, and it seems fairly targeted to address only those apps or services created to track contacts for purposes of reducing the spread of COVID-19. This term is defined as:

  • a website, online service, online application, mobile application, or mobile operating system
  • offered in interstate commerce in the United States
  • designed, in part or in full, specifically to be used for, or marketed for, the purpose of digitally notifying, in an automated manner, an individual who may have become exposed to an infectious disease

And yet, because covered data is limited to information “collected, processed, or transferred in connection with an AENS,” it is a reasonable reading of this language that an entity obtaining information from a data broker in order to track COVID-19 would fall outside the definition of covered data. The same would seem to be true of social media platforms that collect and process data from their users incidentally to their main business of monetizing those data. This seems like a fairly large loophole, one that would mean the “Exposure Notification Privacy Act” focuses tightly on technology programs, apps, and platforms used mostly to track and prevent infectious diseases with the voluntary, knowing consent of users.

An AENS would need to obtain express, affirmative consent, given after a person receives conspicuous, easy-to-understand notice about data collection, usage, processing, and transfer. There must also be a conspicuous means of withdrawing such consent. In any event, a person with an “authorized diagnosis” would control whether this information is processed by the AENS.

AENS and platform operators must publish “a privacy policy that provides a detailed and accurate representation of that person or entity’s covered data collection, processing, and transfer activities in connection with such person or entity’s AENS or the facilitation of such service.” These privacy policies must divulge “each category of covered data the person or entity collects and the limited allowable processing purposes for which such covered data is collected” and

  • “a description of the person or entity’s covered data minimization and retention policies;
  • how an individual can exercise the individual rights described in this title;
  • a description of the person or entity’s covered data security policies.”

As an aside, platform operators are entities “other than a service provider who provides an operating system that includes features supportive of an AENS and facilitates the use or distribution of such AENS to the extent the technology is not used by the platform operator as an AENS.” And so, platform operators might be Google, Apple, Microsoft, or a handful of others to the extent their operating systems are supporting an AENS in its purpose of tracking infectious diseases. Hence, some of the bill’s requirements would be imposed on such entities.

Of course, the bill text does not limit this measure just to COVID-19 and extends it to all infectious diseases, which is perhaps a nod to a new normal in which many Americans have apps on their phones or wearables on their bodies designed to guard against contracting the flu or other, less dangerous viruses. (See the further reading section below for an article on Fitbit and other apps and platforms that may be poised to do just this, as well as a wearable Singapore may debut shortly.)

There are restrictions on who may receive covered data from an AENS. These entities may only transfer covered data to alert individuals of possible exposure if they opted in, to a public health authority, to service providers to maintain, fix, or improve the system or for security purposes, or to comply with a legal action. The bill also seeks to assuage fears that the sensitive information of people collected for the purposes of combatting infectious diseases could be transferred to and used by law enforcement and surveillance agencies. The legislation explains “[i]t shall be unlawful for any person, entity, or Executive agency to transfer covered data to any Executive agency unless the information is transferred in connection with an investigation or enforcement proceeding under this Act.” Consequently, while it would appear the Centers for Disease Control and Prevention (CDC) would be able to transfer covered data to the Federal Trade Commission (FTC) for an investigation, it could not do the same with the Federal Bureau of Investigation (FBI). In this vein, Executive agencies could only process or transfer covered data for a health purpose related to infectious diseases or in connection with an FTC or state investigation or enforcement action. However, this limitation does not seem to bar a state public health authority from conducting such a transfer to a state law enforcement agency.

There are data minimization responsibilities an AENS would need to meet. An AENS may not “collect or process any covered data…beyond the minimum amount necessary to implement an AENS for public health purposes; or…for any commercial purpose.” This would seem to limit an AENS to collecting, processing, and sharing personal information strictly necessary for the purpose of tracking infectious diseases. Likewise, an AENS must delete a person’s covered data upon request and on a rolling basis per public health authority guidance. Service providers working with an AENS must comply with the latter’s direction to delete covered data.

AENS must “establish, implement, and maintain data security practices to protect the confidentiality, integrity, availability, and accessibility of covered data…[that] be consistent with standards generally accepted by experts in the information security field.” The bill further specifies that such practices must include identifying and assessing risks, corrective and preventive actions for risks, and notification if an AENS is breached. The bill would also ban discrimination on the basis of covered data collected or processed by an AENS or on the basis of a person’s decision not to use an AENS.

As a means of providing oversight, the Privacy and Civil Liberties Oversight Board (PCLOB) would have its mandate enlarged to include “health-related epidemics,” meaning the Board could investigate and issue reports on how well or poorly the act is being implemented with respect to privacy and civil liberties.  To this end, within one year of enactment, PCLOB “shall issue a report, which shall be publicly available to the greatest extent possible, assessing the impact on privacy and civil liberties of Government activities in response to the public health emergency related to the Coronavirus 2019 (COVID–19), and making recommendations for how the Government should mitigate the threats posed by such emergency.”

AENS must also collaborate with public health authorities, which are federal and state agencies charged with protecting and ensuring public health. AENS could only collect, process, and transfer actual diagnoses of an infectious disease and could not do so with potential or presumptive diagnoses. AENS would be charged with issuing public guidance to help people understand the notifications of the system and any limitations with respect to accuracy and reliability. Moreover, AENS must also publish metrics (i.e. “measures of the effectiveness of the service”), including adoption rates. Presumably these latter two requirements would allow for greater transparency and also greater insight into how widely an app or platform is being adopted.

There are a few unexpected wrinkles, however. For example, the act only bars deceptive acts, and not unfair ones, a deviation from Section 5 of the Federal Trade Commission (FTC) Act that necessitates language in the bill to this effect rather than the usual reference to 15 U.S.C. 45. The bill also places a positive duty on service providers to report violations of the act by either an AENS or a public health authority to those entities. It is possible that if such a report accurately depicted a violation that the AENS or public health authority then neglected to remedy, the enforcers of the act would have an easier case to make that a violation occurred.

As mentioned, the FTC would police and enforce the act with an enlarged jurisdiction that includes common carriers and non-profits. The agency would treat violations as if they were violations of an FTC regulation barring unfair or deceptive practices, which allows the agency to seek civil fines for first offenses. The FTC would not, however, receive rulemaking authority, and should regulations be needed, the agency would be forced to use the cumbersome Magnuson-Moss process.

However, and like the “Public Health Emergency Privacy Act,” the FTC would receive explicit authority to go to court itself instead of having to work through the Department of Justice (DOJ), which is currently the case. That this new wrinkle has appeared in two recent bills largely sponsored by Democrats suggests this may be a new demand for targeted and national privacy legislation and also may reflect diminished faith in the DOJ to vigorously enforce privacy legislation.

State attorneys general could enforce the act in the same ways as the FTC, meaning civil penalties would be possible in the first instance. State attorneys general may also bring concurrent state claims, alleging violations under state laws. And so, the bill does not preempt state laws, as a section of the bill goes to some length to stress.

Interestingly, while the bill does not create a private right of action, it suggests a possible way of resolving that sticking point in negotiations between Republicans and Democrats. The bill stresses that it does not foreclose any existing common law federal and state rights of action and would therefore allow people to use any existing law to sue covered entities. This would allow tort suits and other suits to move forward. That Cassidy has cosponsored legislation with this language does not necessarily indicate this is now the will of the Senate Republican Conference.
