EDPB Issues FAQs On Privacy Shield Decision

While the EDPB does not provide absolute answers on how US entities looking to transfer EU personal data should proceed, the board offers its best thinking on what the path forward looks like.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

On 24 July, the European Data Protection Board (EDPB) addressed, in part, the implications of the recent decision that struck down the European Union-United States Privacy Shield, an agreement that had allowed US companies to transfer and process the personal data of EU citizens. The EDPB fully endorsed the view that the United States’ (US) surveillance regime, notably Section 702 of the “Foreign Intelligence Surveillance Act” (FISA) and Executive Order (EO) 12333, makes most transfers to the US illegal unless, perhaps, entities holding and using the data take extra steps to protect it. The EDPB references another means that may allow transfers to continue but that generally requires informed and explicit consent from each and every EU person involved. Finally, the EDPB does not address whether the European Commission (EC) and the US could execute a third agreement that would be legal under EU law.

The EDPB, which is comprised of the European Union’s (EU) data protection authorities (DPAs), has formally adopted a document spelling out its view on whether data transfers to the US under Privacy Shield are still legal and how companies should proceed in using standard contractual clauses (SCCs) and Binding Corporate Rules (BCRs), two alternative means of transferring data aside from Privacy Shield. The EDPB’s views suggest the DPAs and supervisory authorities (SAs) in each EU nation are going to need to work on a case-by-case basis regarding the latter two means, for the EDPB stressed these are to be evaluated individually. Given recent criticism of how nations are funding and resourcing their DPAs, there may be capacity issues in managing this new work alongside existing enforcement and investigation matters. Moreover, the EDPB discusses use of the exceptions available in Article 49 of the General Data Protection Regulation (GDPR), stressing that most such transfers are to be occasional.

In last week’s decision, the Court of Justice of the European Union (CJEU) invalidated the European Commission’s adequacy decision on the EU-US Privacy Shield, thus throwing into question all transfers of personal data from the EU into the US that relied on this means. The CJEU was more circumspect in ruling on the use of standard contractual clauses (SCCs), another way to legally transfer personal data out of the EU in compliance with the bloc’s law. The court seems to suggest there may be cases in which the use of SCCs is inadequate given a third country’s insufficient protections for the data of EU residents, especially with respect to national security and law enforcement surveillance. The EDPB issued a statement supporting the CJEU when the decision was handed down but has now adopted a more detailed explanation of its views on the implications of the decision for data controllers, data processors, other nations, and EU DPAs and SAs.

In “Frequently Asked Questions (FAQ) on the judgment of the CJEU in Case C-311/18 - Data Protection Commissioner v Facebook Ireland Ltd and Maximillian Schrems,” the EDPB explains its current thinking on the decision, much of which is built on existing guidance and interpretation of the GDPR. The EDPB explained that the FAQ “aims at presenting answers to some frequently asked questions received by SAs and will be developed and complemented along with further analysis, as the EDPB continues to examine and assess the judgment of the CJEU.”

Here are notable excerpts:

  • Is there any grace period during which I can keep on transferring data to the U.S. without assessing my legal basis for the transfer? No, the Court has invalidated the Privacy Shield Decision without maintaining its effects, because the U.S. law assessed by the Court does not provide an essentially equivalent level of protection to the EU. This assessment has to be taken into account for any transfer to the U.S.
  • I was transferring data to a U.S. data importer adherent to the Privacy Shield, what should I do now? Transfers on the basis of this legal framework are illegal. Should you wish to keep on transferring data to the U.S., you would need to check whether you can do so under the conditions laid down below.
  • I am using SCCs with a data importer in the U.S., what should I do? The Court found that U.S. law (i.e., Section 702 FISA and EO 12333) does not ensure an essentially equivalent level of protection. Whether or not you can transfer personal data on the basis of SCCs will depend on the result of your assessment, taking into account the circumstances of the transfers, and supplementary measures you could put in place. The supplementary measures along with SCCs, following a case-by-case analysis of the circumstances surrounding the transfer, would have to ensure that U.S. law does not impinge on the adequate level of protection they guarantee. If you come to the conclusion that, taking into account the circumstances of the transfer and possible supplementary measures, appropriate safeguards would not be ensured, you are required to suspend or end the transfer of personal data. However, if you are intending to keep transferring data despite this conclusion, you must notify your competent SA.
  • I am using Binding Corporate Rules (“BCRs”) with an entity in the U.S., what should I do? Given the judgment of the Court, which invalidated the Privacy Shield because of the degree of interference created by the law of the U.S. with the fundamental rights of persons whose data are transferred to that third country, and the fact that the Privacy Shield was also designed to bring guarantees to data transferred with other tools such as BCRs, the Court’s assessment applies as well in the context of BCRs, since U.S. law will also have primacy over this tool.
  • Whether or not you can transfer personal data on the basis of BCRs will depend on the result of your assessment, taking into account the circumstances of the transfers, and supplementary measures you could put in place. These supplementary measures along with BCRs, following a case-by-case analysis of the circumstances surrounding the transfer, would have to ensure that U.S. law does not impinge on the adequate level of protection they guarantee. If you come to the conclusion that, taking into account the circumstances of the transfer and possible supplementary measures, appropriate safeguards would not be ensured, you are required to suspend or end the transfer of personal data. However, if you are intending to keep transferring data despite this conclusion, you must notify your competent SA.
  • Can I rely on one of the derogations of Article 49 GDPR to transfer data to the U.S.? It is still possible to transfer data from the EEA to the U.S. on the basis of derogations foreseen in Article 49 GDPR provided the conditions set forth in this Article apply. The EDPB refers to its guidelines on this provision. In particular, it should be recalled that when transfers are based on the consent of the data subject, it should be:
    • explicit,
    • specific for the particular data transfer or set of transfers (meaning that the data exporter must make sure to obtain specific consent before the transfer is put in place even if this occurs after the collection of the data has been made), and
    • informed, particularly as to the possible risks of the transfer (meaning the data subject should also be informed of the specific risks resulting from the fact that their data will be transferred to a country that does not provide adequate protection and that no adequate safeguards aimed at providing protection for the data are being implemented).
  • With regard to transfers necessary for the performance of a contract between the data subject and the controller, it should be borne in mind that personal data may only be transferred when the transfer is occasional. It would have to be established on a case-by-case basis whether data transfers would be determined as “occasional” or “non-occasional”. In any case, this derogation can only be relied upon when the transfer is objectively necessary for the performance of the contract.
  • In relation to transfers necessary for important reasons of public interest (which must be recognized in EU or Member States’ law), the EDPB recalls that the essential requirement for the applicability of this derogation is the finding of an important public interest and not the nature of the organisation, and that although this derogation is not limited to data transfers that are “occasional”, this does not mean that data transfers on the basis of the important public interest derogation can take place on a large scale and in a systematic manner. Rather, the general principle needs to be respected according to which the derogations as set out in Article 49 GDPR should not become “the rule” in practice, but need to be restricted to specific situations and each data exporter needs to ensure that the transfer meets the strict necessity test.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading and Other Developments (17 July)

The Technology Policy Update is being published daily during the week, and here are the Other Developments and Further Reading from this week.

Other Developments

  • Acting Senate Intelligence Committee Chair Marco Rubio (R-FL), Senate Foreign Relations Committee Chair Jim Risch (R-ID), and Senators Chris Coons (D-DE) and John Cornyn (R-TX) wrote Secretary of Commerce Wilbur Ross and Secretary of Defense Mike Esper “to ask that the Administration take immediate measures to bring the most advanced digital semiconductor manufacturing capabilities to the United States…[which] are critical to our American economic and national security and while our nation leads in the design of semiconductors, we rely on international manufacturing for advanced semiconductor fabrication.” This letter follows the Trump Administration’s May announcement that the Taiwan Semiconductor Manufacturing Company (TSMC) agreed to build a $12 billion plant in Arizona. It also bears note that one of the amendments pending to the “National Defense Authorization Act for Fiscal Year 2021” (S.4049) would establish a grants program to stimulate semiconductor manufacturing in the US.
  • Senators Mark R. Warner (D-VA), Mazie K. Hirono (D-HI) and Bob Menendez (D-NJ) sent a letter to Facebook “regarding its failure to prevent the propagation of white supremacist groups online and its role in providing such groups with the organizational infrastructure and reach needed to expand.” They also “criticized Facebook for being unable or unwilling to enforce its own Community Standards and purge white supremacist and other violent extremist content from the site” and posed “a series of questions regarding Facebook’s policies and procedures against hate speech, violence, white supremacy and the amplification of extremist content.”
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) published the Pipeline Cyber Risk Mitigation Infographic that was “[d]eveloped in coordination with the Transportation Security Administration (TSA)…[that] outlines activities that pipeline owners/operators can undertake to improve their ability to prepare for, respond to, and mitigate against malicious cyber threats.”
  • Representative Kendra Horn (D-OK) and 10 other Democrats introduced legislation “requiring the U.S. government to identify, analyze, and combat efforts by the Chinese government to exploit the COVID-19 pandemic” that was endorsed by “[t]he broader Blue Dog Coalition” according to their press release. The “Preventing China from Exploiting COVID-19 Act” (H.R.7484) “requires the Director of National Intelligence—in coordination with the Secretaries of Defense, State, and Homeland Security—to prepare an assessment of the different ways in which the Chinese government has exploited or could exploit the pandemic, which originated in China, in order to advance China’s interests and to undermine the interests of the United States, its allies, and the rules-based international order.” Horn and her cosponsors stated “[t]he assessment must be provided to Congress within 90 days and posted in unclassified form on the DNI’s website.”
  • The Supreme Court of Canada upheld the “Genetic Non-Discrimination Act” and denied a challenge to the legality of the statute brought by the government of Quebec, the Attorney General of Canada, and others. The court found:
    • The pith and substance of the challenged provisions is to protect individuals’ control over their detailed personal information disclosed by genetic tests, in the broad areas of contracting and the provision of goods and services, in order to address Canadians’ fears that their genetic test results will be used against them and to prevent discrimination based on that information. This matter is properly classified within Parliament’s power over criminal law. The provisions are supported by a criminal law purpose because they respond to a threat of harm to several overlapping public interests traditionally protected by the criminal law — autonomy, privacy, equality and public health.
  • The U.S.-China Economic and Security Review Commission published a report “analyzing the evolution of U.S. multinational enterprises (MNE) operations in China from 2000 to 2017.” The Commission found MNE’s operations in the People’s Republic of China “may indirectly erode the United States’ domestic industrial competitiveness and technological leadership relative to China” and “as U.S. MNE activity in China increasingly focuses on the production of high-end technologies, the risk that U.S. firms are unwittingly enabling China to achieve its industrial policy and military development objectives rises.”
  • The Federal Communications Commission (FCC) and Huawei filed their final briefs in their lawsuit before the United States Court of Appeals for the Fifth Circuit arising from the FCC’s designation of Huawei as a “covered company” for purposes of a rule that denies Universal Service Funds (USF) “to purchase or obtain any equipment or services produced or provided by a covered company posing a national security threat to the integrity of communications networks or the communications supply chain.” Huawei claimed in its brief that “[t]he rulemaking and “initial designation” rest on the FCC’s national security judgments…[b]ut such judgments fall far afield of the FCC’s statutory authority and competence.” Huawei also argued “[t]he USF rule, moreover, contravenes the Administrative Procedure Act (APA) and the Due Process Clause.” The FCC responded in its filing that “Huawei challenges the FCC’s decision to exclude carriers whose networks are vulnerable to foreign interference, contending that the FCC has neither statutory nor constitutional authority to make policy judgments involving “national security”…[but] [t]hese arguments are premature, as Huawei has not yet been injured by the Order.” The FCC added “Huawei’s claim that the Communications Act textually commits all policy determinations with national security implications to the President is demonstrably false.”
  • European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski released his Strategy for 2020-2024, “which will focus on Digital Solidarity.” Wiewiórowski explained that “three core pillars of the EDPS strategy outline the guiding actions and objectives for the organisation to the end of 2024:
    • Foresight: The EDPS will continue to monitor legal, social and technological advances around the world and engage with experts, specialists and data protection authorities to inform its work.
    • Action: To strengthen the EDPS’ supervision, enforcement and advisory roles the EDPS will promote coherence in the activities of enforcement bodies in the EU and develop tools to assist the EU institutions, bodies and agencies to maintain the highest standards in data protection.
    • Solidarity: While promoting digital justice and privacy for all, the EDPS will also enforce responsible and sustainable data processing, to positively impact individuals and maximise societal benefits in a just and fair way.
  • Facebook released a Civil Rights Audit, an “investigation into Facebook’s policies and practices [that] began in 2018 at the behest and encouragement of the civil rights community and some members of Congress.” Those charged with conducting the audit explained that they “vigorously advocated for more and would have liked to see the company go further to address civil rights concerns in a host of areas that are described in detail in the report” including but not limited to
    • A stronger interpretation of its voter suppression policies — an interpretation that makes those policies effective against voter suppression and prohibits content like the Trump voting posts — and more robust and more consistent enforcement of those policies leading up to the US 2020 election.
    • More visible and consistent prioritization of civil rights in company decision-making overall.
    • More resources invested to study and address organized hate against Muslims, Jews and other targeted groups on the platform.
    • A commitment to go beyond banning explicit references to white separatism and white nationalism to also prohibit express praise, support and representation of white separatism and white nationalism even where the terms themselves are not used.
    • More concrete action and specific commitments to take steps to address concerns about algorithmic bias or discrimination.
    • They added that “[t]his report outlines a number of positive and consequential steps that the company has taken, but at this point in history, the Auditors are concerned that those gains could be obscured by the vexing and heartbreaking decisions Facebook has made that represent significant setbacks for civil rights.”
  • The National Security Commission on Artificial Intelligence (NSCAI) released a white paper titled “The Role of AI Technology in Pandemic Response and Preparedness” that “outlines a series of investments and initiatives that the United States must undertake to realize the full potential of AI to secure our nation against pandemics.” NSCAI noted it had released two previous white papers.
  • Secretary of Defense Mark Esper announced that Chief Technology Officer Michael J.K. Kratsios has “been designated to serve as Acting Under Secretary of Defense for Research and Engineering” even though he does not hold a degree in science; the last Under Secretary held a PhD. Kratsios previously worked for venture capitalist Peter Thiel, who backed President Donald Trump in his 2016 campaign.
  • The United States’ Department of Transportation’s Federal Railroad Administration (FRA) issued research “to develop a cyber security risk analysis methodology for communications-based connected railroad technologies…[and] [t]he use-case-specific implementation of the methodology can identify potential cyber attack threats, system vulnerabilities, and consequences of the attack– with risk assessment and identification of promising risk mitigation strategies.”
  • In a blog post, a National Institute of Standards and Technology (NIST) economist asserted cybercrime may be having a much larger impact on the United States’ economy than previously thought:
    • In a recent NIST report, I looked at losses in the U.S. manufacturing industry due to cybercrime by examining an underutilized dataset from the Bureau of Justice Statistics, which is the most statistically reliable data that I can find. I also extended this work to look at the losses in all U.S. industries. The data is from a 2005 survey of 36,000 businesses with 8,079 responses, which is also by far the largest sample that I could identify for examining aggregated U.S. cybercrime losses. Using this data, combined with methods for examining uncertainty in data, I extrapolated upper and lower bounds, putting 2016 U.S. manufacturing losses to be between 0.4% and 1.7% of manufacturing value-added or between $8.3 billion and $36.3 billion. The losses for all industries are between 0.9% and 4.1% of total U.S. gross domestic product (GDP), or between $167.9 billion and $770.0 billion. The lower bound is 40% higher than the widely cited, but largely unconfirmed, estimates from McAfee.
  • The Government Accountability Office (GAO) advised the Federal Communications Commission (FCC) that it needs a comprehensive strategy for implementing 5G across the United States. The GAO concluded
    • FCC has taken a number of actions regarding 5G deployment, but it has not clearly developed specific and measurable performance goals and related measures–with the involvement of relevant stakeholders, including National Telecommunications and Information Administration (NTIA)–to manage the spectrum demands associated with 5G deployment. This makes FCC unable to demonstrate whether the progress being made in freeing up spectrum is achieving any specific goals, particularly as it relates to congested mid-band spectrum. Additionally, without having established specific and measurable performance goals with related strategies and measures for mitigating 5G’s potential effects on the digital divide, FCC will not be able to assess the extent to which its actions are addressing the digital divide or what actions would best help all Americans obtain access to wireless networks.
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) issued “Time Guidance for Network Operators, Chief Information Officers, and Chief Information Security Officers” “to inform public and private sector organizations, educational institutions, and government agencies on time resilience and security practices in enterprise networks and systems…[and] to address gaps in available time testing practices, increasing awareness of time-related system issues and the linkage between time and cybersecurity.”
  • Fifteen Democratic Senators sent a letter to the Department of Defense, Office of the Director of National Intelligence (ODNI), Department of Homeland Security (DHS), Federal Bureau of Investigations (FBI), and U.S. Cyber Command, urging them “to take additional measures to fight influence campaigns aimed at disenfranchising voters, especially voters of color, ahead of the 2020 election.” They called on these agencies to take “additional measures” to ensure that:
    • The American people and political candidates are promptly informed about the targeting of our political processes by foreign malign actors, and that the public is provided regular periodic updates about such efforts leading up to the general election.
    • Members of Congress and congressional staff are appropriately and adequately briefed on continued findings and analysis involving election related foreign disinformation campaigns and the work of each agency and department to combat these campaigns.
    • Findings and analysis involving election related foreign disinformation campaigns are shared with civil society organizations and independent researchers to the maximum extent which is appropriate and permissible.
    • Secretary Esper and Director Ratcliffe implement a social media information sharing and analysis center (ISAC) to detect and counter information warfare campaigns across social media platforms as authorized by section 5323 of the Fiscal Year 2020 National Defense Authorization Act.
    • Director Ratcliffe implement the Foreign Malign Influence Response Center to coordinate a whole of government approach to combatting foreign malign influence campaigns as authorized by section 5322 of the Fiscal Year 2020 National Defense Authorization Act.
  • The Information Technology and Innovation Foundation (ITIF) unveiled an issue brief “Why New Calls to Subvert Commercial Encryption Are Unjustified” arguing “that government efforts to subvert encryption would negatively impact individuals and businesses.” ITIF offered these “key takeaways:”
    • Encryption gives individuals and organizations the means to protect the confidentiality of their data, but it has interfered with law enforcement’s ability to prevent and investigate crimes and foreign threats.
    • Technological advances have long frustrated some in the law enforcement community, giving rise to multiple efforts to subvert commercial use of encryption, from the Clipper Chip in the 1990s to the San Bernardino case two decades later.
    • Having failed in these prior attempts to circumvent encryption, some law enforcement officials are now calling on Congress to invoke a “nuclear option”: legislation banning “warrant-proof” encryption.
    • This represents an extreme and unjustified measure that would do little to take encryption out of the hands of bad actors, but it would make commercial products less secure for ordinary consumers and businesses and damage U.S. competitiveness.
  • The White House released an executive order in which President Donald Trump determined “that the Special Administrative Region of Hong Kong (Hong Kong) is no longer sufficiently autonomous to justify differential treatment in relation to the People’s Republic of China (PRC or China) under the particular United States laws and provisions thereof set out in this order.” Trump further determined “the situation with respect to Hong Kong, including recent actions taken by the PRC to fundamentally undermine Hong Kong’s autonomy, constitutes an unusual and extraordinary threat, which has its source in substantial part outside the United States, to the national security, foreign policy, and economy of the United States…[and] I hereby declare a national emergency with respect to that threat.” The executive order would continue the Administration’s process of changing policy to ensure Hong Kong is treated the same as the PRC.
  • President Donald Trump also signed a bill passed in response to the People’s Republic of China (PRC) passing legislation the United States and others claim will strip Hong Kong of the protections the PRC agreed to maintain for 50 years after the United Kingdom (UK) handed over the city. The “Hong Kong Autonomy Act” “requires the imposition of sanctions on Chinese individuals and banks who are included in an annual State Department list found to be subverting Hong Kong’s autonomy” according to the bill’s sponsor Representative Brad Sherman (D-CA).
  • Representative Stephen Lynch, who chairs House Oversight and Reform Committee’s National Security Subcommittee, sent letters to Apple and Google “after the Office of the Director of National Intelligence (ODNI) and the Federal Bureau of Investigation (FBI) confirmed that mobile applications developed, operated, or owned by foreign entities, including China and Russia, could potentially pose a national security risk to American citizens and the United States” according to his press release. He noted in letters sent by the technology companies to the Subcommittee that:
    • Apple confirmed that it does not require developers to submit “information on where user data (if any such data is collected by the developer’s app) will be housed” and that it “does not decide what user data a third-party app can access, the user does.”
    • Google stated that it does “not require developers to provide the countries in which their mobile applications will house user data” and acknowledged that “some developers, especially those with a global user base, may store data in multiple countries.”
    • Lynch is seeking “commitments from Apple and Google to require information from application developers about where user data is stored, and to make users aware of that information prior to downloading the application on their mobile devices.”
  • Minnesota Attorney General Keith Ellison announced a settlement with Frontier Communications that “concludes the three major investigations and lawsuits that the Attorney General’s office launched into Minnesota’s major telecoms providers for deceptive, misleading, and fraudulent practices.” The Office of the Attorney General (OAG) stated
    • Based on its investigation, the Attorney General’s Office alleged that Frontier used a variety of deceptive and misleading practices to overcharge its customers, such as: billing customers more than they were quoted by Frontier’s agents; failing to disclose fees and surcharges in its sales presentations and advertising materials; and billing customers for services that were not delivered.
    • The OAG “also alleged that Frontier sold Minnesotans expensive internet services with so-called “maximum speed” ratings that were not attainable, and that Frontier improperly advertised its service as “reliable,” when in fact it did not provide enough bandwidth for customers to consistently receive their expected service.”
  • The European Data Protection Board (EDPB) issued guidelines “on the criteria of the Right to be Forgotten in the search engines cases under the GDPR” that “focuses solely on processing by search engine providers and delisting requests submitted by data subjects” even though Article 17 of the General Data Protection Regulation applies to all data controllers. The EDPB explained, “This paper is divided into two topics:
    • The first topic concerns the grounds a data subject can rely on for a delisting request sent to a search engine provider pursuant to Article 17.1 GDPR.
    • The second topic concerns the exceptions to the Right to request delisting according to Article 17.3 GDPR.
  • The Australian Competition & Consumer Commission (ACCC) “is seeking views on draft Rules and accompanying draft Privacy Impact Assessment that authorise third parties who are accredited at the ‘unrestricted’ level to collect Consumer Data Right (CDR) data on behalf of another accredited person.” The ACCC explained “[t]his will allow accredited persons to utilise other accredited parties to collect CDR data and provide other services that facilitate the provision of goods and services to consumers.” In a March explanatory statement, the ACCC stated “[t]he CDR is an economy-wide reform that will apply sector-by-sector, starting with the banking sector…[and] [t]he objective of the CDR is to provide individual and business consumers (consumers) with the ability to efficiently and conveniently access specified data held about them by businesses (data holders), and to authorise the secure disclosure of that data to third parties (accredited data recipients) or to themselves.” The ACCC noted “[t]he CDR is regulated by both the ACCC and the Office of the Australian Information Commissioner (OAIC) as it concerns both competition and consumer matters as well as the privacy and confidentiality of consumer data.” Input is due by 20 July.
  • Office of the Inspector General (OIG) for the Department of the Interior (Interior) found that even though the agency spends $1.4 billion annually on cybersecurity “[g]uarding against increasing cybersecurity threats” remains one of Interior’s top challenges. The OIG asserted Interior “continues to struggle to implement an enterprise information technology (IT) security program that balances compliance, cost, and risk while enabling bureaus to meet their diverse missions.”
  • In a summary of its larger investigation into “Security over Information Technology Peripheral Devices at Select Office of Science Locations,” the Department of Energy’s Office of the Inspector General (OIG) reported that it “identified weaknesses related to access controls and configuration settings” for peripheral devices (e.g. thumb drives, printers, scanners, and other connected devices) “similar in type to those identified in prior evaluations of the Department’s unclassified cybersecurity program.”
  • The House Homeland Security Committee’s Cybersecurity, Infrastructure Protection, and Innovation Subcommittee Ranking Member John Katko (R-NY) introduced “a comprehensive national cybersecurity improvement package,” according to his press release, consisting of these bills:
    • The “Cybersecurity and Infrastructure Security Agency Director and Assistant Directors Act:”  This bipartisan measure takes steps to improve guidance and long-term strategic planning by stabilizing the CISA Director and Assistant Directors positions. Specifically, the bill:
      • Creates a 5-year term for the CISA Director, with a limit of 2 terms. The term of office for the current Director begins on the date the Director began to serve.
      • Elevates the Director to the equivalent of a Deputy Secretary and Military Service Secretaries.
      • Depoliticizes the Assistant Director positions, appointed by the Secretary of the Department of Homeland Security (DHS), categorizing them as career public servants. 
    • The “Strengthening the Cybersecurity and Infrastructure Security Agency Act of 2020:” This measure mandates a comprehensive review of CISA in an effort to strengthen its operations, improve coordination, and increase oversight of the agency. Specifically, the bill:
      • Requires CISA to review how additional appropriations could be used to support programs for national risk management, federal information systems management, and public-private cybersecurity and integration. It also requires a review of workforce structure and current facilities and projected needs. 
      • Mandates that CISA provide a report to the House and Senate Homeland Committees within one year of enactment. CISA must also provide a report and recommendations to GSA on facility needs. 
      • Requires GSA to provide a review to the Administration and the House and Senate Committees on CISA facility needs within 30 days of the Congressional report. 
    • The “CISA Public-Private Talent Exchange Act:” This bill requires CISA to create a public-private workforce program to facilitate the exchange of ideas, strategies, and concepts between federal and private sector cybersecurity professionals. Specifically, the bill:
      • Establishes a public-private cyber exchange program allowing government and industry professionals to work in one another’s field.
      • Expands existing private outreach and partnership efforts. 
  • The Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) is ordering United States federal civilian agencies “to apply the July 2020 Security Update for Windows Servers running DNS (CVE-2020-1350), or the temporary registry-based workaround if patching is not possible within 24 hours.” CISA stated “[t]he software update addresses a significant vulnerability where a remote attacker could exploit it to take control of an affected system and run arbitrary code in the context of the Local System Account.” CISA Director Christopher Krebs explained “due to the wide prevalence of Windows Server in civilian Executive Branch agencies, I’ve determined that immediate action is necessary, and federal departments and agencies need to take this remote code execution vulnerability in Windows Server’s Domain Name System (DNS) particularly seriously.”
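As context for the 24-hour deadline: public advisories describe CVE-2020-1350 (“SIGRed”) as exploitable via a malicious DNS response larger than 64 KB delivered over TCP, and the registry-based workaround CISA references mitigates it by capping the largest TCP-based DNS message the server will accept at 0xFF00 bytes. The snippet below is only an illustrative sketch of that size-cap logic, not Microsoft’s actual mitigation code:

```python
# Illustrative sketch of the size cap behind the CVE-2020-1350 registry
# workaround. Per public advisories, the vulnerable parsing path in the
# Windows DNS Server requires a TCP-based response larger than 64 KB, so
# capping accepted messages at 0xFF00 (65,280) bytes blocks the trigger.

MAX_TCP_DNS_BYTES = 0xFF00  # recommended TcpReceivePacketSize value


def accept_tcp_dns_message(payload: bytes, cap: int = MAX_TCP_DNS_BYTES) -> bool:
    """Return True if a TCP DNS message is within the configured size cap."""
    return len(payload) <= cap


# A normal-sized response passes; an oversized (potentially malicious)
# response exceeding the cap is rejected before parsing.
print(accept_tcp_dns_message(b"\x00" * 512))      # normal response -> True
print(accept_tcp_dns_message(b"\x00" * 0x10000))  # >64 KB response -> False
```

The actual documented workaround, per Microsoft’s guidance, is to set the `TcpReceivePacketSize` DWORD value to `0xFF00` under `HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\DNS\Parameters` and restart the DNS service.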
  • The United States (US) Department of State has imposed “visa restrictions on certain employees of Chinese technology companies that provide material support to regimes engaging in human rights abuses globally,” an action aimed at Huawei. In its statement, the Department stated “Companies impacted by today’s action include Huawei, an arm of the Chinese Communist Party’s (CCP) surveillance state that censors political dissidents and enables mass internment camps in Xinjiang and the indentured servitude of its population shipped all over China.” The Department claimed “[c]ertain Huawei employees provide material support to the CCP regime that commits human rights abuses.”
  • Earlier in the month, the US Departments of State, Treasury, Commerce, and Homeland Security issued an “advisory to highlight the harsh repression in Xinjiang.” The agencies explained:
    • Businesses, individuals, and other persons, including but not limited to academic institutions, research service providers, and investors (hereafter “businesses and individuals”), that choose to operate in Xinjiang or engage with entities that use labor from Xinjiang elsewhere in China should be aware of reputational, economic, and, in certain instances, legal, risks associated with certain types of involvement with entities that engage in human rights abuses, which could include Withhold Release Orders (WROs), civil or criminal investigations, and export controls.
  • The United Kingdom’s National Cyber Security Centre (NCSC), Canada’s Communications Security Establishment (CSE), the United States’ National Security Agency (NSA), and the United States’ Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) issued a joint advisory stating that a Russian hacking organization has “targeted various organisations involved in COVID-19 vaccine development in Canada, the United States and the United Kingdom, highly likely with the intention of stealing information and intellectual property relating to the development and testing of COVID-19 vaccines.” The agencies named APT29 (also known as ‘the Dukes’ or ‘Cozy Bear’), “a cyber espionage group, almost certainly part of the Russian intelligence services,” as the culprit behind “custom malware known as ‘WellMess’ and ‘WellMail.’”
    • This alert follows May advisories issued by Australia, the US, and the UK on hacking threats related to the pandemic. Australia’s Department of Foreign Affairs and Trade (DFAT) and the Australian Cyber Security Centre (ACSC) issued “Advisory 2020-009: Advanced Persistent Threat (APT) actors targeting Australian health sector organisations and COVID-19 essential services” that asserted “APT groups may be seeking information and intellectual property relating to vaccine development, treatments, research and responses to the outbreak as this information is now of higher value and priority globally.” CISA and NCSC issued a joint advisory for the healthcare sector, especially companies and entities engaged in fighting COVID-19. The agencies stated that they have evidence that Advanced Persistent Threat (APT) groups “are exploiting the COVID-19 pandemic as part of their cyber operations.” In an unclassified public service announcement, the Federal Bureau of Investigation (FBI) and CISA named the People’s Republic of China as a nation waging a cyber campaign against U.S. COVID-19 researchers. The agencies stated they “are issuing this announcement to raise awareness of the threat to COVID-19-related research.”
  • The National Initiative for Cybersecurity Education (NICE) has released a draft National Institute of Standards and Technology (NIST) Special Publication (SP) for comment, with input due by 28 August. Draft NIST SP 800-181 Revision 1, Workforce Framework for Cybersecurity (NICE Framework), features several updates, including:
    • an updated title to be more inclusive of the variety of workers who perform cybersecurity work,
    • definition and normalization of key terms,
    • principles that facilitate agility, flexibility, interoperability, and modularity,
    • introduction of competencies.
  • Representatives Glenn Thompson (R-PA), Collin Peterson (D-MN), and James Comer (R-KY) sent a letter to the Federal Communications Commission (FCC) “questioning the Commission’s April 20, 2020 Order granting Ligado’s application to deploy a terrestrial nationwide network to provide 5G services.”
  • The European Commission (EC) is asking for feedback on part of its recently released data strategy by 31 July. The EC stated it is aiming “to create a single market for data, where data from public bodies, business and citizens can be used safely and fairly for the common good…[and] [t]his initiative will draw up rules for common European data spaces (covering areas like the environment, energy and agriculture) to:
    • make better use of publicly held data for research for the common good
    • support voluntary data sharing by individuals
    • set up structures to enable key organisations to share data.”
  • The United Kingdom’s Parliament is asking for feedback on its legislative proposal to regulate Internet of Things (IoT) devices. The Department for Digital, Culture, Media & Sport explained “the obligations within the government’s proposed legislative framework would fall mainly on the manufacturer if they are based in the UK, or if not based in the UK, on their UK representative.” The Department is also “developing an enforcement approach with relevant stakeholders to identify an appropriate enforcement body to be granted day to day responsibility and operational control of monitoring compliance with the legislation.” The Department also touted the publication of the European Telecommunications Standards Institute’s (ETSI) “security baseline for Internet-connected consumer devices,” which “provides a basis for future Internet of Things product certification schemes.”
  • Facebook issued a white paper, titled “CHARTING A WAY FORWARD: Communicating Towards People-Centered and Accountable Design About Privacy,” in which the company states its desire to be involved in shaping a United States privacy law (See below for an article on this). Facebook concluded:
    • Facebook recognizes the responsibility we have to make sure that people are informed about the data that we collect, use, and share.
    • That’s why we support globally consistent comprehensive privacy laws and regulations that, among other things, establish people’s basic rights to be informed about how their information is collected, used, and shared, and impose obligations for organizations to do the same, including the obligation to build internal processes that maintain accountability.
    • As improvements to technology challenge historic approaches to effective communications with people about privacy, companies and regulators need to keep up with changing times.
    • To serve the needs of a global community, on both the platforms that exist now and those that are yet to be developed, we want to work with regulators, companies, and other interested third parties to develop new ways of informing people about their data, empowering them to make meaningful choices, and holding ourselves accountable.
    • While we don’t have all the answers, there are many opportunities for businesses and regulators to embrace modern design methods, new opportunities for better collaboration, and innovative ways to hold organizations accountable.
  • Four Democratic Senators sent Facebook a letter “about reports that Facebook has created fact-checking exemptions for people and organizations who spread disinformation about the climate crisis on its social media platform” following a New York Times article this week on the platform’s practices regarding climate disinformation. Even though the social media giant has moved aggressively to take down false and inaccurate COVID-19 posts, climate disinformation lives on the platform largely unmolested for a couple of reasons. First, Facebook marks these sorts of posts as opinion and takes the approach that opinions should be judged under an absolutist free speech regime. Moreover, Facebook asserts posts of this sort do not pose any imminent harm and therefore do not need to be taken down. Despite having teams of fact checkers that could vet posts of demonstrably untrue information, Facebook chooses not to do so, most likely because material that elicits strong reactions from users drives engagement that, in turn, drives advertising dollars. Senators Elizabeth Warren (D-MA), Tom Carper (D-DE), Sheldon Whitehouse (D-RI), and Brian Schatz (D-HI) argued “[i]f Facebook is truly ‘committed to fighting the spread of false news on Facebook and Instagram,’ the company must immediately acknowledge in its fact-checking process that the climate crisis is not a matter of opinion and act to close loopholes that allow climate disinformation to spread on its platform.” They posed a series of questions to Facebook CEO Mark Zuckerberg on these practices, requesting answers by 31 July.
  • A Canadian court has found that the Canadian Security Intelligence Service (CSIS) “admittedly collected information in a manner that is contrary to this foundational commitment and then relied on that information in applying for warrants under the Canadian Security Intelligence Service Act, RSC 1985, c C-23 [CSIS Act]” according to a court summary of its redacted decision. The court further stated “[t]he Service and the Attorney General also admittedly failed to disclose to the Court the Service’s reliance on information that was likely collected unlawfully when seeking warrants, thereby breaching the duty of candour owed to the Court.” The court added “[t]his is not the first time this Court has been faced with a breach of candour involving the Service…[and] [t]he events underpinning this most recent breach were unfolding as recommendations were being implemented by the Service and the Attorney General to address previously identified candour concerns.” CSIS was found to have illegally collected and used metadata in a 2016 case on its conduct between 2006 and 2016. In response to the most recent ruling, CSIS is vowing to implement a range of reforms. The National Security and Intelligence Review Agency (NSIRA) is pledging the same.
  • The United Kingdom’s National Police Chiefs’ Council (NPCC) announced the withdrawal of “[t]he ‘Digital device extraction – information for complainants and witnesses’ form and ‘Digital Processing Notice’ (‘the relevant forms’) circulated to forces in February 2019 [that] are not sufficient for their intended purpose.” In mid-June, the UK’s data protection authority, the Information Commissioner’s Office (ICO), unveiled its “finding that police data extraction practices vary across the country, with excessive amounts of personal data often being extracted, stored, and made available to others, without an appropriate basis in existing data protection law.” This withdrawal was also due, in part, to a late June Court of Appeal decision.
  • A range of public interest and advocacy organizations sent a letter to Speaker of the House Nancy Pelosi (D-CA) and House Minority Leader Kevin McCarthy (R-CA) noting “there are intense efforts underway to do exactly that, via current language in the House and Senate versions of the FY2021 National Defense Authorization Act (NDAA) that ultimately seek to reverse the FCC’s recent bipartisan and unanimous approval of Ligado Networks’ regulatory plans.” They urged them to “not endorse efforts by the Department of Defense and its allies to veto commercial spectrum authorizations…[and] [t]he FCC has proven itself to be the expert agency on resolving spectrum disputes based on science and engineering and should be allowed to do the job Congress authorized it to do.” In late April, the FCC’s “decision authorize[d] Ligado to deploy a low-power terrestrial nationwide network in the 1526-1536 MHz, 1627.5-1637.5 MHz, and 1646.5-1656.5 MHz bands that will primarily support Internet of Things (IoT) services.” The agency argued the order “provides regulatory certainty to Ligado, ensures adjacent band operations, including Global Positioning System (GPS), are sufficiently protected from harmful interference, and promotes more efficient and effective use of [the U.S.’s] spectrum resources by making available additional spectrum for advanced wireless services, including 5G.”
  • The European Data Protection Supervisor (EDPS) rendered his opinion on the European Commission’s White Paper on Artificial Intelligence: a European approach to excellence and trust and recommended that the European Union’s (EU) future regulatory framework for artificial intelligence (AI):
    • applies both to EU Member States and to EU institutions, offices, bodies and agencies;
    • is designed to protect from any negative impact, not only on individuals, but also on communities and society as a whole;
    • proposes a more robust and nuanced risk classification scheme, ensuring any significant potential harm posed by AI applications is matched by appropriate mitigating measures;
    • includes an impact assessment clearly defining the regulatory gaps that it intends to fill; and
    • avoids overlap of different supervisory authorities and includes a cooperation mechanism.
    • Regarding remote biometric identification, the EDPS supports the idea of a moratorium on the deployment, in the EU, of automated recognition in public spaces of human features, not only of faces but also of gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioural signals, so that an informed and democratic debate can take place and until the moment when the EU and Member States have all the appropriate safeguards, including a comprehensive legal framework in place to guarantee the proportionality of the respective technologies and systems for the specific use case.
  • The Bundesamt für Verfassungsschutz (BfV), Germany’s domestic security agency, released a summary of its annual report in which it claimed:
    • The Russian Federation, the People’s Republic of China, the Islamic Republic of Iran and the Republic of Turkey remain the main countries engaged in espionage activities and trying to exert influence on Germany.
    • The ongoing digital transformation and the increasingly networked nature of our society increases the potential for cyber attacks, worsening the threat of cyber espionage and cyber sabotage.
    • The intelligence services of the Russian Federation and the People’s Republic of China in particular carry out cyber espionage activities against German agencies. One of their tasks is to boost their own economies with the help of information gathered by the intelligence services. This type of information-gathering campaign severely threatens the success and development opportunities of German companies.
    • To counteract this threat, Germany has a comprehensive cyber security architecture in place, which is operated by a number of different authorities. The BfV plays a major role in investigating and defending against cyber threats by detecting attacks, attributing them to specific attackers, and using the knowledge gained from this to draw up prevention strategies. The National Cyber Response Centre, in which the BfV plays a key role, was set up to consolidate the co-operation between the competent agencies. The National Cyber Response Centre aims to optimise the exchange of information between state agencies and to improve the co-ordination of protective and defensive measures against potential IT incidents.

Further Reading

  • “Trump confirms cyberattack on Russian trolls to deter them during 2018 midterms” – The Washington Post. In an interview with former George W. Bush speechwriter Marc Thiessen, President Donald Trump confirmed he ordered a widely reported retaliatory attack on the Russian Federation’s Internet Research Agency as a means of preventing interference during the 2018 mid-term election. Trump claimed the attack was the first action the United States took against Russian hacking, even though his predecessor warned Russian President Vladimir Putin to stop such activities and imposed sanctions at the end of 2016. The timing of Trump’s revelation is interesting given the ongoing furor over reports of Russian bounties paid to Taliban fighters for killing Americans that the Trump Administration may have known of but did little or nothing to stop.
  • “Germany proposes first-ever use of EU cyber sanctions over Russia hacking” – Deutsche Welle. Germany is looking to use the European Union’s (EU) cyber sanctions powers against Russia for the alleged 2015 exfiltration of 16 GB of data from the Bundestag’s systems, including from Chancellor Angela Merkel’s office. Germany has alleged that Fancy Bear (aka APT28) and Russia’s military intelligence service, the GRU, carried out the attack. Germany has circulated its case for sanctions to other EU nations and EU leadership. In 2017, the European Council declared “[t]he EU diplomatic response to malicious cyber activities will make full use of measures within the Common Foreign and Security Policy, including, if necessary, restrictive measures…[and] [a] joint EU response to malicious cyber activities would be proportionate to the scope, scale, duration, intensity, complexity, sophistication and impact of the cyber activity.”
  • “Wyden Plans Law to Stop Cops From Buying Data That Would Need a Warrant” – VICE. Following on a number of reports that federal, state, and local law enforcement agencies are essentially sidestepping the Fourth Amendment through buying location and other data from people’s smartphones, Senator Ron Wyden (D-OR) is going to draft legislation that would seemingly close what he, and other civil libertarians, are calling a loophole to the warrant requirement.
  • “Amazon Backtracks From Demand That Employees Delete TikTok” – The New York Times. Amazon first instructed its employees on 11 July to remove ByteDance’s app, TikTok, from company devices and then reversed course the same day, claiming the email had been erroneously sent out. The strange episode capped another tumultuous week for ByteDance as the Trump Administration is intensifying pressure in a number of ways on the company, which officials claim is subject to the laws of the People’s Republic of China and hence must share information with the government in Beijing. ByteDance counters that the app marketed in the United States is offered through a subsidiary not subject to PRC law. ByteDance also said it would no longer offer the app in Hong Kong after the PRC’s change in law extended the PRC’s reach into the former British colony. TikTok was also recently banned in India as part of a larger struggle between India and the PRC. Additionally, the Democratic National Committee warned staff about using the app this week, too.
  • “Is it time to delete TikTok? A guide to the rumors and the real privacy risks.” – The Washington Post. A columnist and security specialist found ByteDance’s app vacuums up information from users, but so do Facebook and other similar apps. They scrutinized TikTok’s privacy policy and where the data went, and they could not say with certainty that it goes to and stays on servers in the US and Singapore.
  • “California investigating Google for potential antitrust violations” – Politico. California Attorney General Xavier Becerra is going to conduct his own investigation of Google, apart from the investigation of the company’s advertising practices being conducted by virtually every other state in the United States. It was unclear why Becerra opted against joining the larger probe launched in September 2019. Of course, the Trump Administration’s Department of Justice is also investigating Google and could file suit as early as this month.
  • “How May Google Fight an Antitrust Case? Look at This Little-Noticed Paper” – The New York Times. In a filing with the Australian Competition and Consumer Commission (ACCC), Google claimed it does not control the online advertising market, a position it says is borne out by a number of indicia that argue against a monopolistic situation. The company is likely to make the same case to the United States’ government in its antitrust inquiry. However, similar arguments did not gain traction before the European Commission, which levied a €1.49 billion fine for “breaching EU antitrust rules” in March 2019.
  • “Who Gets the Banhammer Now?” – The New York Times. This article examines possible motives for the recent wave of action by social media platforms to police a fraction of the extreme and hateful speech activists and others have been asking them to take down for years. The piece argues that social media platforms are businesses and operate as such, and that expecting them to behave as de facto public squares dedicated to civil political and societal discourse is more or less how we ended up where we are.
  • “TikTok goes tit-for-tat in appeal to MPs: ‘stop political football’” – The Australian. ByteDance is lobbying hard in Canberra to talk members of Parliament out of possibly banning TikTok, as the United States has said it is considering. While ByteDance claims the data collected on users in Australia is sent to the US or Singapore, some experts argue that just maintaining and improving the app would necessarily result in some non-People’s Republic of China (PRC) user data making its way back to the PRC. As Australia’s relationship with the PRC has grown more fraught, with allegations that PRC hackers infiltrated Parliament and the Prime Minister all but saying PRC hackers were targeting hospitals and medical facilities, the government in Canberra could follow India’s lead and ban the app.
  • “Calls for inquiry over claims Catalan lawmaker’s phone was targeted” – The Guardian. British and Spanish newspapers are reporting that an official in Catalonia who favors separating the region from Spain may have had his smartphone compromised with industrial grade spyware typically used only by law enforcement and counterterrorism agencies. The President of the Parliament of Catalonia, Roger Torrent, claims his phone was hacked for domestic political purposes, a claim other Catalan leaders have echoed. A spokesperson for the Spanish government said “[t]he government has no evidence that the speaker of the Catalan parliament has been the victim of a hack or theft involving his mobile.” However, the University of Toronto’s CitizenLab, the entity that researched and claimed that Israeli firm NSO Group’s spyware was deployed via WhatsApp to spy on a range of journalists, officials, and dissidents, often by their own governments, confirmed that Torrent’s phone was compromised.
  • “While America Looks Away, Autocrats Crack Down on Digital News Sites” – The New York Times. The Trump Administration’s combative relationship with the media in the United States may be encouraging other nations to crack down on digital media outlets trying to hold those governments to account.
  • “How Facebook Handles Climate Disinformation” – The New York Times. This is the article referenced above in the item on the Senators’ letter to Facebook. It explains why climate disinformation persists largely unmolested on the platform even as Facebook aggressively removes false COVID-19 posts: the company marks such posts as opinion to be judged under an absolutist free speech regime and asserts they pose no imminent harm.
  • “Here’s how President Trump could go after TikTok” – The Washington Post. This piece lays out two means the Trump Administration could employ to press ByteDance in the immediate future: use of the May 2019 Executive Order “Securing the Information and Communications Technology and Services Supply Chain” or the Committee on Foreign Investment in the United States process examining ByteDance’s acquisition of the app Musical.ly that became TikTok. Left unmentioned in this article is the possibility of the Federal Trade Commission (FTC) examining its 2019 settlement with ByteDance over violations of the “Children’s Online Privacy Protection Act” (COPPA).
  • “You’re Doomscrolling Again. Here’s How to Snap Out of It.” – The New York Times. If you find yourself endlessly looking through social media feeds, this piece explains why and how you might stop doing so.
  • “UK selling spyware and wiretaps to 17 repressive regimes including Saudi Arabia and China” – The Independent. There are allegations that the British government has ignored its own regulations on selling equipment and systems that can be used for surveillance and spying to other governments with spotty human rights records. Specifically, the United Kingdom (UK) has sold £75m worth of such equipment to countries that non-governmental organizations (NGOs) rate as “not free.” The claims involve nations such as the People’s Republic of China (PRC), the Kingdom of Saudi Arabia, Bahrain, and others. Not surprisingly, NGOs and the minority Labour party are calling for an investigation and changes.
  • “Google sued for allegedly tracking users in apps even after opting out” – c/net. Boies Schiller Flexner filed suit, in what will undoubtedly be a bid for class action status, over Google’s alleged practice of continuing to track users even when they turned off tracking features. This follows a suit filed by the same firm against Google in June, claiming its Chrome browser still tracks people when they switch to incognito mode.
  • “Secret Trump order gives CIA more powers to launch cyberattacks” – Yahoo! News. It turns out that in addition to signing National Security Presidential Memorandum (NSPM) 13, which revamped and eased offensive cyber operations for the Department of Defense, President Donald Trump signed a presidential finding that has allowed the Central Intelligence Agency (CIA) to launch its own offensive cyber attacks, mainly at Russia and Iran, according to unnamed former United States (US) officials cited in this blockbuster story. Now, the decision to commence an attack is not vetted by the National Security Council; rather, the CIA makes the decision. Consequently, there have been a number of attacks on US adversaries that until now have not been associated with the US. And the CIA is apparently not informing the National Security Agency or Cyber Command of its operations, raising the risk of US cyber forces working at cross purposes or against one another in cyberspace. Moreover, a recently released report blamed the lax security environment at the CIA for a massive exfiltration of hacking tools released by Wikileaks.
  • “Facebook’s plan for privacy laws? ‘Co-creating’ them with Congress” – Protocol. In concert with the release of a new white paper, Facebook Deputy Chief Privacy Officer Rob Sherman sat for an interview in which he pledged the company’s willingness to work with Congress to co-develop a national privacy law. However, he would not comment on any of the many privacy bills released thus far or the policy contours of a bill Facebook would favor, except for advocating for an enhanced notice and consent regime under which people would be better informed about how their data is being used. Sherman also shrugged off suggestions Facebook may not be welcome given its record of privacy violations. Finally, it bears mention that similar efforts by other companies at the state level have not succeeded as of yet. For example, Microsoft’s efforts in Washington state have not borne fruit in the passage of a privacy law.
  • “Deepfake used to attack activist couple shows new disinformation frontier” – Reuters. We are at the beginning of a new age of disinformation in which fake photographs and video will be used to wage campaigns against nations, causes, and people. An activist and his wife were accused of being terrorist sympathizers by a purported university student who was apparently an elaborate ruse created by someone or some group looking to defame the couple. Small errors gave away the ruse this time, but advances in technology are likely to make detection all the harder.
  • “Biden, billionaires and corporate accounts targeted in Twitter hack” – The Washington Post. Policymakers and security experts were alarmed when the accounts of major figures like Bill Gates and Barack Obama were hacked yesterday by a group soliciting bitcoin. They argue Twitter was lucky this time and that a more ideologically motivated enemy may seek to cause havoc, say, with the United States’ coming election. A number of experts are claiming the penetration of the platform must have compromised internal controls for so many high-profile accounts to be taken over at the same time.
  • “TikTok Enlists Army of Lobbyists as Suspicions Over China Ties Grow” – The New York Times. ByteDance’s payments for lobbying services in Washington doubled between the last quarter of 2019 and the first quarter of 2020, as the company has retained more than 35 lobbyists to push back against the Trump Administration’s rhetoric and policy changes. The company is fighting against a floated proposal to ban the TikTok app on national security grounds, which would cut the company off from another of its top markets after India banned it and scores of other apps from the People’s Republic of China. Even if the Administration does not bar use of the app in the United States, the company is facing legislation that would ban its use on federal networks and devices, which will be acted upon next week by a Senate committee. Moreover, ByteDance’s acquisition of the app that became TikTok is facing a retrospective review by an inter-agency committee for national security considerations that could result in an unwinding of the deal. Additionally, the Federal Trade Commission (FTC) has been urged to review ByteDance’s compliance with a 2019 settlement over violations of regulations protecting children’s privacy, which could result in multi-billion dollar liability if wrongdoing is found.
  • “Why Google and Facebook Are Racing to Invest in India” – Foreign Policy. With New Delhi banning 59 apps and platforms from the People’s Republic of China (PRC), two American firms have invested in an Indian giant with an eye toward the nearly 500 million Indians not yet online. Reliance Industries’ Jio Platforms has sold stakes to Google and Facebook worth $4.5 billion and $5.7 billion, respectively, giving them prized positions as the company looks to expand into 5G and other online ventures. This will undoubtedly give the United States’ online giants a leg up in vying with competitors in the world’s second most populous nation.
  • “‘Outright Lies’: Voting Misinformation Flourishes on Facebook” – ProPublica. In this piece published with First Draft, “a global nonprofit that researches misinformation,” an analysis of the most popular claims made about mail voting shows that many of them are inaccurate or false, thus violating the platform’s terms of service; yet Facebook did nothing to remove them or mark them as inaccurate until this article was being written.
  • “Inside America’s Secretive $2 Billion Research Hub” – Forbes. Using contract information obtained through Freedom of Information requests and interviews, light is shed on the little-known non-profit MITRE Corporation that has been helping the United States government address numerous technological problems since the late 1950s. The article uncovers some of its latest, federally funded projects that are raising eyebrows among privacy advocates: technology to lift people’s fingerprints from social media pictures, technology to scan and copy Internet of Things (IoT) devices from a distance, a scanner to read a person’s DNA, and others.
  • “The FBI Is Secretly Using A $2 Billion Travel Company As A Global Surveillance Tool” – Forbes. In his second blockbuster article in a week, Forbes reporter Thomas Brewster exposes how the United States (US) government is using questionable court orders to gather travel information from the three companies that essentially provide airlines, hotels, and other travel entities with back-end functions for reservations and bookings. The three companies, one of which, Sabre, is a US multinational, have masses of information on you if you have ever traveled, and US law enforcement agencies, namely the Federal Bureau of Investigation, are using a 1789 statute to obtain orders, which all three companies must obey, for information to track suspects. Allegedly, this capability has only been used to track terror suspects but will now reportedly be used for COVID-19 tracking.
  • “With Trump CIA directive, the cyber offense pendulum swings too far” – Yahoo! News. Former United States (US) National Coordinator for Security, Infrastructure Protection, and Counter-terrorism Richard Clarke argues against the Central Intelligence Agency (CIA) having carte blanche in conducting cyber operations without the review or input of other federal agencies. He suggests that the CIA in particular, and agencies in general, tend to push their authority to the extreme, which in this case could lead to incidents and lasting precedents in cyberspace that may haunt the US. Clarke also intimated that it may have been the CIA, and not Israel, that launched cyber attacks on infrastructure facilities in Tehran this month and last.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Europe’s Highest Court Strikes Down Privacy Shield

The agreement that has been allowing US companies to transfer the personal data of EU residents to the US was found to be invalid under EU law. The EU’s highest court seemed to indicate standard contractual clauses, a frequently used means to transfer data, may be acceptable.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

In the second major ruling from the European Union’s (EU) highest court this week, earlier today the court invalidated the agreement that since 2016 has allowed multinational corporations and others to transfer the personal data of EU citizens to the United States (US) for commercial purposes. The court did not, however, find standard contractual clauses, the means by which many such transfers occur, to be illegal. This is the second case an Austrian privacy activist has brought alleging that Facebook was transferring his personal data to the US in violation of European law because US law, especially its surveillance programs, afforded less protection and fewer rights. The first case resulted in the previous transfer agreement being found illegal, and now this case has resulted in much the same outcome. The import of this ruling is not immediately clear.

Maximillian Schrems filed a complaint against Facebook with the Data Protection Commission (DPC) in 2013, alleging that the company’s transfer of his personal data violated his rights under EU law because of the mass US surveillance revealed by former National Security Agency (NSA) contractor Edward Snowden. Ultimately, this case resulted in a 2015 Court of Justice of the European Union (CJEU) ruling that invalidated the Safe Harbor agreement under which the personal data of EU residents was transferred to the US by commercial concerns. The EU and US executed a follow-on agreement, the EU-US Privacy Shield, that was designed to address some of the problems the CJEU turned up, and the US passed a law, the “Judicial Redress Act of 2015” (P.L. 114-126), to provide EU citizens a way to exercise their EU rights in US courts via the “Privacy Act of 1974.”

However, Schrems continued and soon sought to challenge the legality of the European Commission’s approval of the Privacy Shield agreement (the adequacy decision issued in 2016) and also the use of standard contractual clauses (SCC) by companies for the transfer of personal data to the US. The European Data Protection Board (EDPB) explained in a recent decision on Denmark’s SCCs that

  • According to Article 28(3) General Data Protection Regulation (GDPR), the processing by a data processor shall be governed by a contract or other legal act under Union or Member State law that is binding on the processor with regard to the controller, setting out a set of specific aspects to regulate the contractual relationship between the parties. These include the subject-matter and duration of the processing, its nature and purpose, the type of personal data and categories of data subjects, among others.
  • Under Article 28(6) GDPR, without prejudice to an individual contract between the data controller and the data processor, the contract or the other legal act referred in paragraphs (3) and (4) of Article 28 GDPR may be based, wholly or in part on SCCs.

In a summary of its decision, the CJEU explained

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

The CJEU found

  • Regarding the level of protection required in respect of such a transfer, the Court holds that the requirements laid down for such purposes by the GDPR concerning appropriate safeguards, enforceable rights and effective legal remedies must be interpreted as meaning that data subjects whose personal data are transferred to a third country pursuant to standard data protection clauses must be afforded a level of protection essentially equivalent to that guaranteed within the EU by the GDPR, read in the light of the Charter. In those circumstances, the Court specifies that the assessment of that level of protection must take into consideration both the contractual clauses agreed between the data exporter established in the EU and the recipient of the transfer established in the third country concerned and, as regards any access by the public authorities of that third country to the data transferred, the relevant aspects of the legal system of that third country.
  • Regarding the supervisory authorities’ obligations in connection with such a transfer, the Court holds that, unless there is a valid Commission adequacy decision, those competent supervisory authorities are required to suspend or prohibit a transfer of personal data to a third country where they take the view, in the light of all the circumstances of that transfer, that the standard data protection clauses are not or cannot be complied with in that country and that the protection of the data transferred that is required by EU law cannot be ensured by other means, where the data exporter established in the EU has not itself suspended or put an end to such a transfer.

The CJEU stated “the limitations on the protection of personal data arising from the domestic law of the US on the access and use by US public authorities of such data transferred from the EU to that third country, which the Commission assessed in [its 2016 adequacy decision], are not circumscribed in a way that satisfies requirements that are essentially equivalent to those required under EU law, by the principle of proportionality, in so far as the surveillance programmes based on those provisions are not limited to what is strictly necessary.”

The CJEU found the process put in place by the US government to handle complaints inadequate. The 2016 Privacy Shield resulted in the creation of an Ombudsperson post to which EU citizens could submit their complaints. This position is currently held by Under Secretary of State for Economic Growth, Energy, and the Environment Keith Krach.

The CJEU stated “the Ombudsperson mechanism referred to in that decision does not provide data subjects with any cause of action before a body which offers guarantees substantially equivalent to those required by EU law, such as to ensure both the independence of the Ombudsperson provided for by that mechanism and the existence of rules empowering the Ombudsperson to adopt decisions that are binding on the US intelligence services.”

The decision on SCCs is more ambiguous, as the circumstances under which they can be used are not entirely clear. In its decision, the CJEU made clear that SCCs are not necessarily legal under EU law:

although there are situations in which, depending on the law and practices in force in the third country concerned, the recipient of such a transfer is in a position to guarantee the necessary protection of the data solely on the basis of standard data protection clauses, there are others in which the content of those standard clauses might not constitute a sufficient means of ensuring, in practice, the effective protection of personal data transferred to the third country concerned. That is the case, in particular, where the law of that third country allows its public authorities to interfere with the rights of the data subjects to which that data relates.

Reaction from the parties was mixed, particularly on what the CJEU’s ruling means for SCCs even though there was agreement that the Privacy Shield will soon no longer govern data transfers from the EU to the US.

The DPC issued a statement in which it asserted

Today’s judgment provides just that, firmly endorsing the substance of the concerns expressed by the DPC (and by the Irish High Court) to the effect that EU citizens do not enjoy the level of protection demanded by EU law when their data is transferred to the United States. In that regard, while the judgment most obviously captures Facebook’s transfers of data relating to Mr Schrems, it is of course the case that its scope extends far beyond that, addressing the position of EU citizens generally.

The DPC added

So, while in terms of the points of principle in play, the Court has endorsed the DPC’s position, it has also ruled that the SCCs transfer mechanism used to transfer data to countries worldwide is, in principle, valid, although it is clear that, in practice, the application of the SCCs transfer mechanism to transfers of personal data to the United States is now questionable. This is an issue that will require further and careful examination, not least because assessments will need to be made on a case by case basis.

At a press conference, EC Vice-President Věra Jourová claimed the “CJEU declared the Privacy Shield decision invalid, but also confirmed that the standard contractual clauses remain a valid tool for the transfer of personal data to processors established in third countries.” She asserted “[t]his means that the transatlantic data flows can continue, based on the broad toolbox for international transfers provided by the GDPR, for instance binding corporate rules or SCCs.” Jourová contended with regard to next steps, “[w]e are not starting from scratch…[and] [o]n the contrary, the Commission has already been working intensively to ensure that this toolbox is fit for purpose, including the modernisation of the Standard Contractual Clauses.” Jourová stated “we will be working closely with our American counterparts, based on today’s ruling.”

European Commissioner for Justice Didier Reynders stated

  • First, I welcome the fact that the Court confirmed the validity of our Decision on SCCs.
    • We have been working already for some time on modernising these clauses and ensuring that our toolbox for international data transfers is fit for purpose.
    • Standard Contractual Clauses are in fact the most used tool for international transfers of personal data and we wanted to ensure they can be used by businesses and fully in line with EU law.
    • We are now advanced with this work and we will of course take into account the requirements of judgement.
    • We will work with the European Data Protection Board, as well as the 27 EU Member States. It will be very important to start the process to have a formal approval to modernise the Standard Contractual Clauses as soon as possible. We have been in an ongoing process about such a modernisation for some time, but with an attention to the different elements of the decision of the Court today.
  • My second point: The Court has invalidated the Privacy Shield. We have to study the judgement in detail and carefully assess the consequences of this invalidation.

Reynders stated that “[i]n the meantime, transatlantic data flows between companies can continue using other mechanisms for international transfers of personal data available under the GDPR.”

In a statement, US Secretary of Commerce Wilbur Ross said

While the Department of Commerce is deeply disappointed that the court appears to have invalidated the European Commission’s adequacy decision underlying the EU-U.S. Privacy Shield, we are still studying the decision to fully understand its practical impacts.

Ross continued

We have been and will remain in close contact with the European Commission and European Data Protection Board on this matter and hope to be able to limit the negative consequences to the $7.1 trillion transatlantic economic relationship that is so vital to our respective citizens, companies, and governments. Data flows are essential not just to tech companies—but to businesses of all sizes in every sector. As our economies continue their post-COVID-19 recovery, it is critical that companies—including the 5,300+ current Privacy Shield participants—be able to transfer data without interruption, consistent with the strong protections offered by Privacy Shield.

The Department of Commerce stated it “will continue to administer the Privacy Shield program, including processing submissions for self-certification and re-certification to the Privacy Shield Frameworks and maintaining the Privacy Shield List.” The agency added “[t]oday’s decision does not relieve participating organizations of their Privacy Shield obligations.”


Image by harakir from Pixabay

Further Reading and Other Developments (11 July)


Other Developments

  • The United States District Court for the District of Maine denied a motion by a number of telecommunications trade associations to enjoin enforcement of a new Maine law instituting privacy practices for internet service providers (ISP) in the state that limits information collection and processing. The plaintiffs claimed the 2017 repeal of the Federal Communications Commission’s (FCC) 2016 ISP Privacy Order preempted states from implementing their own privacy rules for ISPs. In its decision, the court denied the plaintiffs’ motion and will proceed to decide the merits of the case.
  • The European Data Protection Board (EDPB) has debuted a “One-Stop-Shop” register “containing decisions taken by national supervisory authorities following the One-Stop-Shop cooperation procedure (Art. 60 GDPR).” The EDPB explained “[u]nder the GDPR, Supervisory Authorities have a duty to cooperate on cases with a cross-border component to ensure a consistent application of the regulation – the so-called one-stop-shop (OSS) mechanism…[and] [u]nder the OSS, the Lead Supervisory Authority (LSA) is in charge of preparing the draft decisions and works together with the concerned SAs to reach consensus.” Hence this new repository will contain the decisions on which EU data protection authorities have cooperated in addressing alleged GDPR violations that reach across the borders of EU nations.
  • The chair of the House Energy and Commerce Committee and three subcommittee chairs wrote Facebook, Google, and Twitter asking the companies “provide the Committee with monthly reports similar in scope to what you are providing the European Commission regarding your COVID-19 disinformation efforts as they relate to United States users of your platform.” They are also asking that the companies brief them and staff on 22 July on these efforts. Given the Committee’s focus on disinformation, it is quite possible these monthly reports and the briefing could be the basis of more hearings and/or legislation. Chair Frank Pallone, Jr. (D-NJ), Oversight and Investigations Subcommittee Chair Diana DeGette (D-CO), Communications and Technology Subcommittee Chair Mike Doyle (D-PA) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) signed the letters.
  • Reports indicate the Federal Trade Commission (FTC) and Department of Justice (DOJ) are reviewing the February 2019 $5.7 million settlement between the FTC and TikTok for violating the Children’s Online Privacy Protection Act (COPPA). In May 2020, a number of public advocacy groups filed a complaint with the FTC, asking the agency to investigate whether TikTok has “complied with the consent decree.” If TikTok has violated the order, it could face huge fines, as the FTC and DOJ could seek a range of financial penalties. This seems to be another front in the escalating conflict between the United States and the People’s Republic of China.
  • Tech Inquiry, an organization that “seek[s] to combat abuses in the tech industry through coupling concerned tech workers with relevant members of civil society,” revealed “an in-depth analysis of all public US federal (sub)contracting data over the last four and a half years to estimate the rankings of tech companies, both in and out of Silicon Valley, as contractors with the military, law enforcement, and diplomatic arms of the United States.” Tech Inquiry claimed “[o]ur analysis shows a diversity of contracting postures (see Tables 2 and 3), not a systemic divide from Washington. Within a substantial list of name-brand tech companies, only Facebook, Apple, and Twitter look to be staying out of major military and law enforcement contracts.”
  • The United States Secret Service announced the formation of a new Cyber Fraud Task Force (CFTF) which merges “its Electronic Crimes Task Forces (ECTFs) and Financial Crimes Task Forces (FCTFs) into a single unified network.” The rationale given for the merger is “the line between cyber and financial crimes has steadily blurred, to the point today where the two – cyber and financial crimes – cannot be effectively disentangled.”
  • The United States Election Assistance Commission (EAC) held a virtual public hearing, “Lessons Learned from the 2020 Primary Elections” “to discuss the administration of primary elections during the coronavirus pandemic.”
  • The National Council of Statewide Interoperability Coordinators (NCSWIC), a program administered by the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), released its “NCSWIC Strategic Plan and Implementation Guide,” “a stakeholder-driven, multi-jurisdictional, and multi-disciplinary plan to enhance interoperable and emergency communications.” NCSWIC contended “[t]he plan is a critical mid-range (three-year) tool to help NCSWIC and its partners prioritize resources, strengthen governance, identify future investments, and address interoperability gaps.”
  • Access Now is pressing “video conferencing platforms” other than Zoom to issue “regular transparency reports… clarifying exactly how they protect personal user data and enforce policies related to freedom of expression.”

Further Reading

  • “India bans 59 Chinese apps, including TikTok and WeChat, after deadly border clash” – South China Morning Post. As a seeming extension of the military skirmish India and the People’s Republic of China (PRC) engaged in, a number of PRC apps have been banned by the Indian government, raising the question of whether there will be further escalation between the world’s two most populous nations. India is TikTok’s biggest market, with more than 120 million users in the South Asian country, and a range of other apps and platforms also have millions of users. Most of the smartphones used in India are made by PRC entities. Moreover, if New Delhi joins Washington’s war on Huawei, ZTE, and other PRC companies, the cumulative effect could significantly affect the PRC’s global technological ambitions.
  • “Huawei data flows under fire in German court case” – POLITICO. A former Huawei employee in Germany has sued the company, alleging violations of the General Data Protection Regulation (GDPR) through the company’s use of standard contractual clauses. This person requested the data the company had collected from him and the reasons for doing so. Huawei claimed it had deleted the data. A German court’s decision that Huawei had violated the GDPR is being appealed. However, some bigger issues are raised by the case, including growing unease within the European Union that firms from the People’s Republic of China are possibly illegally transferring and processing EU citizens’ data, and a case before Europe’s highest court in which the legality of standard contractual clauses may be determined as early as this month.
  • “Deutsche Telekom under pressure after reports on Huawei reliance” – POLITICO. A German newspaper reported on confidential documents showing that Deutsche Telekom deepened its relationship with Huawei as the United States’ government was pressuring its allies and other nations to stop using the equipment and services of the company. The German telecommunications company denied the claims, and a number of German officials expressed surprise and dismay, opining that the government of Chancellor Angela Merkel should act more swiftly to implement legislation to secure Germany’s networks.
  • “Inside the Plot to Kill the Open Technology Fund” – Vice. According to critics, the Trump Administration’s remaking of the United States (US) Agency for Global Media (USAGM) is threatening the mission and effectiveness of the Open Technology Fund (OTF), a US government non-profit designed to help dissidents and endangered populations throughout the world. The OTF has funded a number of open technology projects, including the Signal messaging app, but the new USAGM head, Michael Pack, is pressing for closed source technology.
  • “How Police Secretly Took Over a Global Phone Network for Organized Crime” – Vice. European law enforcement agencies penetrated and compromised an encrypted messaging service in Europe, leading to a number of arrests and seizures of drugs. Encrochat had billed itself as completely secure, but hackers with the French government broke into the system and laid bare the details of numerous crimes. And this is only the latest encrypted app marketed to criminals, meaning others will soon step into the void created when Encrochat shut down.
  • “Virus-Tracing Apps Are Rife With Problems. Governments Are Rushing to Fix Them.” – The New York Times. In numerous nations around the world, the rush to design and distribute contact tracing apps to fight COVID-19 has resulted in a host of problems predicted by information technology professionals and privacy, civil liberties, and human rights advocates. Some apps collect too much information, many are not secure, and some do not seem to perform their intended tasks. Moreover, without mass adoption, the utility of an app is questionable at best. Some countries have sought to improve and perfect their apps in response to criticism, but others are continuing to use them and even mandate that their citizens and residents do so.
  • “Hong Kong Security Law Sets Stage for Global Internet Fight” – The New York Times. After the People’s Republic of China (PRC) passed a new law that strips many of the protections Hong Kong enjoyed, technology companies are caught in a bind, for now Hong Kong may well start demanding they hand over data on people living in Hong Kong, or employees could face jail time. Moreover, the data demands made of companies like Google or Facebook could pertain to people anywhere in the world. Companies that comply with Beijing’s wishes would likely face turbulence in Washington, and vice versa. TikTok said it would withdraw from Hong Kong altogether.


Image by Gino Crescoli from Pixabay

EDPB Opines Encryption Ban Would Endanger A Nation’s Compliance with GDPR

As the US and others call on technology companies to develop the means to crack encrypted communications, an EU entity argues any nation with such a law would likely not meet the GDPR’s requirements.


In response to a letter from a Member of the European Parliament (MEP), the European Data Protection Board (EDPB) articulated its view that any nation that implements an “encryption ban” would endanger its compliance with the General Data Protection Regulation (GDPR) and possibly result in companies domiciled in those countries not being able to transfer and process the personal data of EU citizens. However, as always, it bears note that the EDPB’s view may not carry the day with the European Commission, Parliament, and courts.

The EDPB’s letter comes amidst another push by the Trump Administration, Republican allies in Congress, and other nations to have technology companies develop workarounds or backdoors to their end-to-end encrypted devices, apps, and systems. The proponents of this change claim online child sexual predators, terrorists, and other criminals are using products and services like WhatsApp, Telegram, and iPhones to defeat legitimate, targeted government surveillance and enforcement. They reason that unless technology companies abandon their unnecessarily absolutist position and work towards a technological solution, the number of bad actors communicating in ways that cannot be broken (aka “going dark”) will increase, allowing for greater crime and wrongdoing.

On the other side of the issue, technology companies, civil liberties and privacy experts, and computer scientists argue that any weakening of or backdoors to encryption will eventually be stolen and exposed, making it easier for criminals to hack, steal, and exfiltrate. They assert the internet and digital age are built on secure communications and threatening this central feature would wreak havoc beyond the crimes the US and other governments are seeking to prevent.

The EDPB stated

Any ban on encryption or provisions weakening encryption would undermine the GDPR obligations on the concerned controllers and processors for an effective implementation of both data protection principles and the appropriate technical and organisational measures. Similar considerations apply to transfers to controllers or processors in any third countries adopting such bans or provisions. Security measures are therefore specifically mentioned among the elements the European Commission must take into account when assessing the adequacy of the level of protection in a third country. In the absence of such a decision, transfers are subject to appropriate safeguards or may be based on derogations; in any case the security of the personal data has to be ensured at all times.

The EDPB opined “that any encryption ban would seriously undermine compliance with the GDPR.” The EDPB continued, “[m]ore specifically, whatever the instrument used, it would represent a major obstacle in recognising a level of protection essentially equivalent to that ensured by the applicable data protection law in the EU, and would seriously question the ability of the concerned controllers and processors to comply with the security obligation of the regulation.”
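The security objection summarized above — that a single lawful-access key becomes a single point of failure for every user, not just the targets of a warrant — can be illustrated with a deliberately simplified sketch. This is a toy one-time-pad construction, not real cryptography, and the “escrow” design is a hypothetical stand-in for the kinds of access mechanisms under debate:

```python
# Toy illustration (NOT real cryptography): why a single "lawful access"
# escrow key weakens an end-to-end design for everyone.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(a ^ b for a, b in zip(data, key))

# End-to-end: only sender and recipient hold the message key.
message = b"meet at noon"
msg_key = secrets.token_bytes(len(message))   # one-time pad
ciphertext = xor(message, msg_key)

# Hypothetical escrow variant: the provider also stores every message key
# wrapped under one master key so authorities can request decryption later.
escrow_master = secrets.token_bytes(len(msg_key))
wrapped_key = xor(msg_key, escrow_master)

# Anyone who steals that single master key can unwrap EVERY message key —
# the compromise is systemic, not targeted.
recovered_key = xor(wrapped_key, escrow_master)
assert xor(ciphertext, recovered_key) == message  # plaintext exposed
```

The sketch makes the asymmetry concrete: the end-to-end design has no artifact whose theft exposes all traffic, while the escrow design does.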

The EDPB’s view is being articulated at a time when, as noted, a number of nations led by the United States (US) continue to press technology companies to allow them access to communications, apps, platforms, and devices that are encrypted. Last year, the US, United Kingdom, Australia, New Zealand, and Canada (the so-called Five Eyes nations) met, and in one of the resulting communiques the Five Eyes ministers asserted that

We are concerned where companies deliberately design their systems in a way that precludes any form of access to content, even in cases of the most serious crimes. This approach puts citizens and society at risk by severely eroding a company’s ability to identify and respond to the most harmful illegal content, such as child sexual exploitation and abuse, terrorist and extremist material and foreign adversaries’ attempts to undermine democratic values and institutions, as well as law enforcement agencies’ ability to investigate serious crime.

The five nations contended that “[t]ech companies should include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can obtain access to data in a readable and usable format.” The Five Eyes also claimed that “[t]hose companies should also embed the safety of their users in their system designs, enabling them to take action against illegal content…[and] [a]s part of this, companies and Governments must work together to ensure that the implications of changes to their services are well understood and that those changes do not compromise public safety.”

The Five Eyes applauded “approaches like Mark Zuckerberg’s public commitment to consulting Governments on Facebook’s recent proposals to apply end-to-end encryption to its messaging services…[and] [t]hese engagements must be substantive and genuinely influence design decisions.”

The Five Eyes added

We share concerns raised internationally, inside and outside of government, about the impact these changes could have on protecting our most vulnerable citizens, including children, from harm. More broadly, we call for detailed engagement between governments, tech companies, and other stakeholders to examine how proposals of this type can be implemented without negatively impacting user safety, while protecting cyber security and user privacy, including the privacy of victims.

In October 2019, in an open letter to Facebook CEO Mark Zuckerberg, US Attorney General William P. Barr, United Kingdom Home Secretary Priti Patel, Australia’s Minister for Home Affairs Peter Dutton, and then acting US Homeland Security Secretary Kevin McAleenan asked “that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services without ensuring that there is no reduction to user safety and without including a means for lawful access to the content of communications to protect our citizens.” In Facebook’s December 2019 response, Facebook Vice President and WhatsApp Head Will Cathcart and Facebook Vice President and Messenger Head Stan Chudnovsky stated “[c]ybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere…[and] [t]he ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.”

However, one of the Five Eyes nations has already taken legislative action to force technology companies and individuals to cooperate with law enforcement investigations in ways that could threaten encryption. In December 2018, Australia enacted the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). As the Office of the Australian Information Commissioner (OAIC) wrote of TOLA, “[t]he powers permitted under the Act have the potential to significantly weaken important privacy rights and protections under the Privacy Act…[and] [t]he encryption technology that can obscure criminal communications and pose a threat to national security is the same technology used by ordinary citizens to exercise their legitimate rights to privacy.”

In a related development, this week, Australia’s Independent National Security Legislation Monitor (INSLM) issued its report on the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). The Parliamentary Joint Committee on Intelligence and Security had requested that the INSLM review the statute, and so the INSLM engaged in a lengthy review, including input from the public. As explained in the report’s preface, the “INSLM independently reviews the operation, effectiveness and implications of national security and counter-terrorism laws; and considers whether the laws contain appropriate protections for individual rights, remain proportionate to terrorism or national security threats, and remain necessary.”

INSLM claimed

In this report I reject the notion that there is a binary choice that must be made between the effectiveness of agencies’ surveillance powers in the digital age on the one hand and the security of the internet on the other. Rather, I conclude that what is necessary is a law which allows agencies to meet technological challenges, such as those caused by encryption, but in a proportionate way and with proper rights protection. Essentially this can be done by updating traditional safeguards to meet those same technological challenges – notably, those who are trusted to authorise intrusive search and surveillance powers must be able to understand the technological context in which those powers operate, and their consequences. If, but only if, the key recommendations I set out in this report in this regard are adopted, TOLA will be such a law.

INSLM stated “[t]he essential effects of TOLA are as follows:

a. Schedule 1 gives police and intelligence agencies new powers to agree or require significant industry assistance from communications providers.

b. Schedules 2, 3 and 4 update existing powers and, in some cases, extend them to new agencies.

c. Schedule 5 gives the Australian Security Intelligence Organisation (ASIO) significant new powers to seek and receive both voluntary and compulsory assistance.

INSLM found

  • In relation to Schedule 1, for the reasons set out in greater detail in the report, Technical Assistance Notices (TANs) and Technical Capability Notices (TCNs) should be authorised by a body which is independent of the issuing agency or government. These are powers designed to compel a Designated Communications Provider (DCP) to reveal private information or data of its customers and therefore the usual practice of independent authorisation should apply.
  • I am satisfied that the computer access warrant and associated powers conferred by Schedule 2 are both necessary and proportionate, subject to some amendments.
  • I am generally satisfied that the powers conferred by Schedules 3 and 4 are both necessary and proportionate, but there are some matters that should be addressed and further monitored.
  • I have concluded that Schedule 5 should be amended to limit its breadth and clarify its scope.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by OpenClipart-Vectors from Pixabay

European Commission Releases Its First Review of the GDPR

While emphasizing the positive developments, the EC calls for more work to help nations and DPAs better and more uniformly endorse the law.

The European Commission submitted its two-year review of the General Data Protection Regulation (GDPR), which took effect across the European Union in May 2018. The GDPR required this review two years after the new cross-border data protection structure took effect and requires further reviews every four years after the first review has been completed. It bears note that the EC opted to exceed its statutory mandate, for the report covers more than the two required topics: international transfers of EU personal data and how well nations and DPAs are using the cooperation and consistency mechanisms to ensure uniform, effective enforcement across the EU.

Overall, the EC touts what it frames as successes of the GDPR and calls for EU member states, data protection authorities (DPA), and the European Data Protection Board (EDPB or Board) to address and resolve a host of ongoing issues that make enforcement of and compliance with the GDPR more difficult. For example, the EC flags the resources and independence EU nations are providing their DPAs as a major issue, as many of the regulatory bodies lack the funding, technical capability, and human power to fully regulate the data rights and obligations enshrined in the GDPR. Another issue the EC discusses at some length is the differing national data protection laws, many of which conflict with or do not fully implement the GDPR.

In terms of a top-line summary, the EC claimed

  • The general view is that two years after it started to apply, the GDPR has successfully met its objectives of strengthening the protection of the individual’s right to personal data protection and guaranteeing the free flow of personal data within the EU. However, a number of areas for future improvement have also been identified.
  • Like most stakeholders and data protection authorities, the Commission is of the view that it would be premature at this stage to draw definite conclusions regarding the application of the GDPR.
  • It is likely that most of the issues identified by Member States and stakeholders will benefit from more experience in applying the GDPR in the coming years.
  • Nevertheless, this report highlights the challenges encountered so far in applying the GDPR and sets out possible ways to address them.
  • Notwithstanding its focus is on the two issues highlighted in Article 97(2) of the GDPR, namely international transfers and the cooperation and consistency mechanisms, this evaluation and review takes a broader approach to also address issues which have been raised by various actors during the last two years.

Among its other findings, the EC asserted

  • However, developing a truly common European data protection culture between data protection authorities is still an on-going process. Data protection authorities have not yet made full use of the tools the GDPR provides, such as joint operations that could lead to joint investigations. At times, finding a common approach meant moving to the lowest common denominator and as a result, opportunities to foster more harmonisation were missed.
  • Stakeholders generally welcome the guidelines from the Board and request additional ones on key concepts of the GDPR, but also point to inconsistencies between the national guidance and the Board guidelines. They underline the need for more practical advice, in particular more concrete examples, and the need for data protection authorities to be equipped with the necessary human, technical and financial resources to effectively carry out their tasks.

The EC called for greater funding and resources for DPAs to enforce the GDPR, especially in Ireland and Luxembourg, which serve as the EU headquarters for a number of large technology companies:

Data protection authorities play an essential role in ensuring that the GDPR is enforced at national level and that the cooperation and consistency mechanisms within the Board functions effectively, including in particular the one-stop-shop mechanism for cross-border cases. Member States are therefore called upon to provide them with adequate resources as required by the GDPR.

The EC wrapped up the GDPR review by drawing a roadmap of sorts for future actions:

Based on this evaluation of the application of the GDPR since May 2018, the actions listed below have been identified as necessary to support its application. The Commission will monitor their implementation also in view of the forthcoming evaluation report in 2024.

The EC offered the following as ongoing or future actions to more fully realize implementation and enforcement of the GDPR to be undertaken by EU states, EU DPAs, the EC, the EDPB, and stakeholders in the EU and elsewhere:

Implementing and complementing the legal framework

Member States should

  • complete the alignment of their sectoral laws to the GDPR;
  • consider limiting the use of specification clauses which might create fragmentation and jeopardise the free flow of data within the EU;
  • assess whether national law implementing the GDPR is in all circumstances within the margins provided for Member State legislation.

The Commission will

  • pursue bilateral exchanges with Member States on the compliance of national laws with the GDPR, including on the independence and resources of national data protection authorities; make use of all the tools at its disposal, including infringement procedures, to ensure that Member States comply with the GDPR;
  • support further exchanges of views and national practices between Member States on topics which are subject to further specification at national level so as to reduce the level of fragmentation of the single market, such as processing of personal data relating to health and research, or which are subject to balancing with other rights such as the freedom of expression;
  • support a consistent application of the data protection framework in relation to new technologies to support innovation and technological developments;
  • use the GDPR Member States Expert Group (established during the transitory phase before the GDPR became applicable) to facilitate discussions and sharing of experience between Member States and with the Commission;
  • explore whether, in the light of further experience and relevant case-law, proposing possible future targeted amendments to certain provisions of the GDPR might be appropriate, in particular regarding records of processing by SMEs that do not have the processing of personal data as their core business (low risk), and the possible harmonisation of the age of children consent in relation to information society services.

Making the new governance system deliver its full potential

The Board and data protection authorities are invited to

  • develop efficient arrangements between data protection authorities regarding the functioning of the cooperation and consistency mechanisms, including on procedural aspects, building on the expertise of its members and by strengthening the involvement of its secretariat;
  • support harmonisation in applying and enforcing the GDPR using all means at its disposal, including by further clarifying key concepts of the GDPR, and ensuring that national guidance is fully in line with guidelines adopted by the Board;
  • encourage the use of all tools provided for in the GDPR to ensure that it is applied consistently;
  • step up cooperation among data protection authorities, for instance by conducting joint investigations.

The Commission will

  • continue to closely monitor the effective and full independence of national data protection authorities;
  • encourage cooperation between regulators (in particular in fields such as competition, electronic communications, security of network and information systems and consumer policy);
  • support the reflection within the Board on the procedures applied by the national data protection authorities in order to improve the cooperation on the cross-border cases.

Member States shall

  • allocate resources to data protection authorities that are sufficient for them to perform their tasks.

Supporting stakeholders

The Board and data protection authorities are invited to

  • adopt further guidelines which are practical, easily understandable, and which provide clear answers and avoid ambiguities on issues related to the application of the GDPR, for example on processing children’s data and data subject rights, including the exercise of the right of access and the right to erasure, consulting stakeholders in the process;
  • review the guidelines when further clarifications are necessary in the light of experience and developments including in the case law of the Court of Justice;
  • develop practical tools, such as harmonised forms for data breaches and simplified records of processing activities, to help low-risk SMEs meeting their obligations.

The Commission will

  • provide standard contractual clauses both for international transfers and the controller/processor-relationship;
  • provide for tools clarifying/supporting the application of data protection rules to children;
  • in line with the Data Strategy, explore practical means to facilitate increased use of the right to portability by individuals, such as by giving them more control over who can access and use machine-generated data;
  • support standardisation/certification in particular on cybersecurity aspects through the cooperation between the European Union Agency for Cybersecurity (ENISA), the data protection authorities and the Board;
  • when appropriate, make use of its right to request the Board to prepare guidelines and opinions on specific issues of importance to stakeholders;
  • when necessary provide guidance, while fully respecting the role of the Board;
  • support the activities of data protection authorities that facilitate implementation of GDPR obligations by SMEs, through financial support, especially for practical guidance and digital tools that can be replicated in other Member States.

Encouraging innovation

The Commission will

  • monitor the application of the GDPR to new technologies, also taking into account of possible future initiatives in the field of artificial intelligence and under the Data Strategy;
  • encourage, including through financial support, the drafting of EU codes of conduct in the area of health and research;
  • closely follow the development and the use of apps in the context of the COVID-19 pandemic.

The Board is invited to

  • issue guidelines on the application of the GDPR in the area of scientific research, artificial intelligence, blockchain, and possible other technological developments;
  • review the guidelines when further clarifications are necessary in the light of technological development.

Further developing the toolkit for data transfers

The Commission will

  • pursue adequacy dialogues with interested third countries, in line with the strategy set out in its 2017 Communication ‘Exchanging and Protecting Personal Data in a Globalised World’, including where possible by covering data transfers to criminal law enforcement authorities (under the Data Protection Law Enforcement Directive) and other public authorities; this includes finalisation of the adequacy process with the Republic of Korea as soon as possible;
  • finalise the ongoing evaluation of the existing adequacy decisions and report to the European Parliament and the Council;
  • finalise the work on the modernisation of the standard contractual clauses, with a view to updating them in light of the GDPR, covering all relevant transfer scenarios and better reflecting modern business practices.

The Board is invited to

  • further clarify the interplay between the rules on international data transfers (Chapter V) with the GDPR’s territorial scope of application (Article 3);
  • ensure effective enforcement against operators established in third countries falling within the GDPR’s territorial scope of application, including as regards the appointment of a representative where applicable (Article 27);
  • streamline the assessment and eventual approval of binding corporate rules with a view to speed up the process;
  • complete the work on the architecture, procedures and assessment criteria for codes of conduct and certification mechanisms as tools for data transfers.

Promoting convergence and developing international cooperation

The Commission will

  • support ongoing reform processes in third countries on new or modernised data protection rules by sharing experience and best practices;
  • engage with African partners to promote regulatory convergence and support capacity-building of supervisory authorities as part of the digital chapter of the new EU-Africa partnership;
  • assess how cooperation between private operators and law enforcement authorities could be facilitated, including by negotiating bilateral and multilateral frameworks for data transfers in the context of access by foreign criminal law enforcement authorities to electronic evidence, to avoid conflicts of law while ensuring appropriate data protection safeguards;
  • engage with international and regional organisations such as the OECD, ASEAN or the G20 to promote trusted data flows based on high data protection standards, including in the context of the Data Flow with Trust initiative;
  • set up a ‘Data Protection Academy’ to facilitate and support exchanges between European and international regulators;
  • promote international enforcement cooperation between supervisory authorities, including through the negotiation of cooperation and mutual assistance agreements.

EC staff released a working document more detailed than the EC’s report and broader than the mandate in Article 97 of the GDPR:

Although its focus is on the two issues highlighted in Article 97(2) of the GDPR, namely international transfers and the cooperation and consistency mechanisms, this evaluation takes a broader approach in order to address issues which have been raised by various actors during the last two years.

EC staff highlighted the number and types of enforcement actions, taking care to stress their deterrent effect, in part, perhaps, to counter criticism that the fines levied have often been a fraction of the statutory ceiling. Of course, this sort of argument is hard to dispute, for how does one prove or disprove a negative (i.e., all the GDPR violations that were averted because regulated entities feared being punished in a fashion similar to entities subject to enforcement actions)? EC staff asserted:

The GDPR establishes independent data protection authorities and provides them with harmonised and strengthened enforcement powers. Since the GDPR applies, those authorities have been using a wide range of corrective powers provided for in the GDPR, such as administrative fines (22 EU/EEA authorities), warnings and reprimands (23), orders to comply with data subject’s requests (26), orders to bring processing operations into compliance with the GDPR (27), and orders to rectify, erase or restrict processing (17). Around half of the data protection authorities (13) have imposed temporary or definitive limitations on processing, including bans. This demonstrates a conscious use of all corrective measures provided for in the GDPR; the data protection authorities did not shy away from imposing administrative fines in addition to or instead of other corrective measures, depending on the circumstances of individual cases.

Image by Biljana Jovanovic from Pixabay

EC Calls For EU-Wide Approach on Big Data and COVID-19

The European Commission (EC) met in Brussels last week and issued a recommendation outlining what it hopes will be a unified approach throughout the European Union (EU) on how smartphones and data are used to fight the spread of COVID-19. The EC laid out an ambitious timeline and explained “[t]he first priority for the Toolbox should be a pan-European approach for COVID-19 mobile applications, to be developed together by Member States and the Commission, by 15 April 2020.” The EC added that “[t]he European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) will be associated to the process.” The EC stated “[t]he second priority for the Toolbox should be a common approach for the use of anonymised and aggregated mobility data necessary” for a range of purposes to model, predict, and track the virus throughout the EU. On this second priority, the EC is calling for measures to ensure anonymization, safeguarding, and permanent deletion after these data are no longer needed.
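For a sense of what “anonymised and aggregated” mobility data can look like in practice, here is a minimal sketch of one common disclosure-control step: collapsing individual observations into coarse region-and-hour cells and suppressing small counts so no individual’s movements are recoverable. The region names, threshold, and record format are illustrative assumptions, not anything the EC specifies:

```python
# Hedged sketch of small-count suppression on aggregated mobility data.
from collections import Counter

K = 10  # minimum cell size; cells below this are suppressed (assumed value)

def aggregate(records, k=K):
    """records: iterable of (region, hour) observations from many devices.

    Returns only the (region, hour) cells seen at least k times, so rare
    patterns that could identify a person never leave the aggregator.
    """
    counts = Counter(records)
    return {cell: n for cell, n in counts.items() if n >= k}

raw = [("Brussels", 9)] * 25 + [("Ghent", 9)] * 3  # 3 < K: suppressed
print(aggregate(raw))  # {('Brussels', 9): 25}
```

Aggregation of this kind is only one layer; the EC’s recommendation also calls for safeguarding the data while held and permanently deleting it once no longer needed.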

It bears note that the EC is working within the structure provided by the General Data Protection Regulation (GDPR), Directive 2002/58/EC on Privacy and Electronic Communications (the ePrivacy Directive), and other statutes and regulations. In its press release, the EC asserted “[t]o support Member States, the Commission will provide guidance including on data protection and privacy implications…[and] is in close contact with the EDPB for an overview of the processing of personal data at national level in the context of the coronavirus crisis.” The EC also remarked on its 23 March call with the heads of EU telecommunications companies and GSMA, their association, which “also covered the need to collect anonymised mobile metadata to help analysing the patterns of diffusion of the coronavirus, in a way that is fully compliant with the GDPR and ePrivacy legislation.”

The EC declared

The public health crisis caused by the current COVID-19 pandemic (hereinafter, ‘COVID-19 crisis’) is compelling the Union and the Member States to face an unprecedented challenge to its health care systems, way of life, economic stability and values. No single Member State can succeed alone in combating the COVID-19 crisis. An exceptional crisis of such magnitude requires determined action of all Member States and EU institutions and bodies working together in a genuine spirit of solidarity.

The EC continued

Since the beginning of the COVID-19 crisis, a variety of mobile applications have been developed, some of them by public authorities, and there have been calls from Member States and the private sector for coordination at Union level, including to address cybersecurity, security and privacy concerns. These applications tend to serve three general functions:

(i) informing and advising citizens and facilitating the organisation of medical follow-up of persons with symptoms, often combined with a self-diagnosis questionnaire;

(ii) warning people who have been in proximity to an infected person in order to interrupt infection chains and preventing resurgence of infections in the reopening phase; and

(iii) monitoring and enforcement of quarantine of infected persons, possibly combined with features assessing their health condition during the quarantine period.

Certain applications are available to the general public, while others only to closed user groups directed at tracing contacts in the workplace. The effectiveness of these applications has generally not been evaluated. Information and symptom-checker apps may be useful to raise awareness of citizens. However, expert opinion suggests that applications aiming to inform and warn users seem to be the most promising to prevent the propagation of the virus, taking into account also their more limited impact on privacy, and several Member States are currently exploring their use.

The EC found

A common Union approach to the COVID-19 crisis has also become necessary since measures taken in certain countries, such as the geolocation-based tracking of individuals, the use of technology to rate an individual’s level of health risk and the centralisation of sensitive data, raise questions from the viewpoint of several fundamental rights and freedoms guaranteed in the EU legal order, including the right to privacy and the right to the protection of personal data. In any event, pursuant to the Charter of Fundamental Rights of the Union, restrictions on the exercise of the fundamental rights and freedoms laid down therein must be justified and proportionate. Any such restrictions should, in particular, be temporary, in that they remain strictly limited to what is necessary to combat the crisis and do not continue to exist, without an adequate justification, after the crisis has passed.

The EC explained the purpose of the recommendation:

(1)  This recommendation sets up a process for developing a common approach, referred to as a Toolbox, to use digital means to address the crisis. The Toolbox will consist of practical measures for making effective use of technologies and data, with a focus on two areas in particular:

(1)  A pan-European approach for the use of mobile applications, coordinated at Union level, for empowering citizens to take effective and more targeted social distancing measures, and for warning, preventing and contact tracing to help limit the propagation of the COVID-19 disease. This will involve a methodology for monitoring and sharing assessments of the effectiveness of these applications, their interoperability and cross-border implications, and their respect for security, privacy and data protection; and

(2)  A common scheme for using anonymized and aggregated data on mobility of populations in order (i) to model and predict the evolution of the disease, (ii) to monitor the effectiveness of decision-making by Member States’ authorities on measures such as social distancing and confinement, and (iii) to inform a coordinated strategy for exiting from the COVID-19 crisis.

(2)  Member States should take these actions as a matter of urgency and in close coordination with other Member States, the Commission and other relevant stakeholders, and without prejudice to the competences of the Member States in the domain of public health. They should ensure that all actions are taken in accordance with Union law, in particular law on medical devices and the right to privacy and the protection of personal data along with other rights and freedoms enshrined in the Charter of Fundamental Rights of the Union. The Toolbox will be complemented by Commission guidance, including guidance on the data protection and privacy implications of the use of mobile warning and prevention applications.

The EC added “[t]he EDPB and the EDPS should also be closely involved to ensure the Toolbox integrates data protection and privacy-by-design principles.”

The EC stated that “[t]he first priority for the Toolbox should be a pan-European approach for COVID-19 mobile applications, to be developed together by Member States and the Commission, by 15 April 2020” that “should consist of:

(1) specifications to ensure the effectiveness of mobile information, warning and tracing applications for combating COVID-19 from the medical and technical point of view;

(2)  measures to prevent proliferation of applications that are not compatible with Union law, to support requirements for accessibility for persons with disabilities, and for interoperability and promotion of common solutions, not excluding a potential pan-European application;

(3)  governance mechanisms to be applied by public health authorities and cooperation with the ECDC;

(4)  the identification of good practices and mechanisms for exchange of information on the functioning of the applications; and

(5)  sharing data with relevant epidemiological public bodies and public health research institutions, including aggregated data to ECDC.”

The EC also stated that this approach “should be guided by privacy and data protection principles,” including:

(1)  safeguards ensuring respect for fundamental rights and prevention of stigmatization, in particular applicable rules governing protection of personal data and confidentiality of communications;

(2)  preference for the least intrusive yet effective measures, including the use of proximity data and the avoidance of processing data on location or movements of individuals, and the use of anonymised and aggregated data where possible;

(3)  technical requirements concerning appropriate technologies (e.g. Bluetooth Low Energy) to establish device proximity, encryption, data security, storage of data on the mobile device, possible access by health authorities and data storage;

(4)  effective cybersecurity requirements to protect the availability, authenticity, integrity, and confidentiality of data;

(5)  the expiration of measures taken and the deletion of personal data obtained through these measures when the pandemic is declared to be under control, at the latest;

(6)  uploading of proximity data in case of a confirmed infection and appropriate methods of warning persons who have been in close contact with the infected person, who shall remain anonymous; and

(7)  transparency requirements on the privacy settings to ensure trust into the applications.

The EC added it “will publish guidance further specifying privacy and data protection principles in the light of practical considerations arising from the development and implementation of the Toolbox.”

Technology Policy Update (10 April)

Here are the articles from this edition:

  • “Paper” Hearing on COVID-19 and Big Data
  • DOD Revises Cybersecurity Model For Contractors; Accreditation Body Holds Webinar
  • EC Calls For EU-Wide Approach on Big Data and COVID-19
  • EU’s Data Supervisor Calls For Limits On Using Data In Fighting COVID-19
  • EDPB Fast Tracks Privacy and Processing Guidance For COVID-19
  • Warner Asks OMB For Uniform Guidance On Contractors
  • OCR Announces HIPAA Enforcement Discretion
  • Executive Order Formalizes Review of Foreign Investment in Telecommunications
  • CISA Guides Agencies On Telework Best Practices and Security

EDPB Details Ongoing Concerns About EU-U.S. Privacy Shield

The European Data Protection Board (EDPB or Board), an entity consisting of the European Union’s (EU) data protection authorities, has released its annual assessment of the EU-U.S. Privacy Shield and again finds both the agreement itself and its implementation wanting. There was some overlap between the concerns of the EDPB and the European Commission (EC) as detailed in the EC’s recently released third assessment of Privacy Shield, but the EDPB discusses areas that were either omitted from or downplayed in the EC’s report. The EDPB’s authority is persuasive with respect to Privacy Shield and carries weight with the EC; its concerns as detailed in previous annual reports have pushed the EC to demand changes, including, but not limited to, pressing the Trump Administration to nominate Board Members to the Privacy and Civil Liberties Oversight Board (PCLOB) and to appoint a new Ombudsperson to handle complaints about how the U.S. Intelligence Community handles the personal data of EU citizens. Conceivably, this EDPB assessment could create more pressure for the Department of Commerce (Commerce) and the Federal Trade Commission (FTC) to engage in more stringent oversight of entities attesting to adherence to Privacy Shield in the transfer and processing of the personal data of EU citizens, including FTC actions alleging violations of Section 5 of the FTC Act against entities that claim to be certified or in compliance but are found not to be (as the agency did in four recent cases).

The EDPB took issue with how Commerce is conducting spot reviews of businesses’ adherence to Privacy Shield and how the FTC is enforcing the regime. In the EDPB’s view, these checks are mostly formal and do not delve into whether a business is actually complying with Privacy Shield’s requirements to protect the personal data of EU citizens. In particular, the EDPB criticized the lack of oversight of so-called onward transfers of EU citizens’ data from the EU through the U.S. into other countries that may not offer the protections required in the EU. The EDPB called for closer scrutiny of this practice by Commerce and for an examination of the contracts U.S. companies enter into with entities in third countries to ensure Privacy Shield’s requirements are being met. The EDPB renewed its concerns about the EU’s and U.S.’s differing readings of how human resources (HR) data are to be treated, namely that EU employees would not be able to avail themselves of the same protections once their data have been transferred to the U.S. The EDPB also expressed concern about how Commerce handles lapsed Privacy Shield certifications, noting that such entities are still listed as certified, and pushed for a reformed recertification regime.

The EDPB also expressed its “opinion that it is important that the [EC] continues monitoring cases related to automated decision making and profiling and to contemplate the possibility to foresee specific rules concerning automated decision making to provide sufficient safeguards, including the right to know the logic involved and to challenge the decision obtaining human intervention when the decision significantly affects him or her.” Finally, the EDPB noted “the remaining issues with respect to certain elements of the commercial part of the Privacy Shield adequacy decision as already raised in the WP 29’s Opinion 01/2016 in particular regarding the absence or the limitation to the rights of the data subjects (i.e. right to object, right to access, right to be informed for HR processing), the absence of key definitions, the application of the principles when it comes to “processors”, the lack of guarantees on transfers for regulatory purpose in the field of medical context, the lack of specific rules on automated decision making and the overly broad exemption for publicly available information.” The EDPB stated “[t]hose remain valid.”

The EDPB also took issue with U.S. law enforcement and national security treatment of EU citizens’ personal data. The Board asserted that nothing had changed in the U.S. legal landscape since last year’s review but recounted its concerns, chiefly that under Title VII of the Foreign Intelligence Surveillance Act (FISA) and Executive Order (EO) 12333, indiscriminate collection and analysis of EU citizens’ data could occur with minimal oversight and little to no redress, contrary to EU law. However, the EDPB lauded “the now fully functional Privacy and Civil Liberties Oversight Board (PCLOB),” even though many of the PCLOB’s crucial reviews of U.S. surveillance practices were classified and therefore off-limits to the Board, notably its forthcoming review of EO 12333, which provides an alternative basis for the Intelligence Community to conduct surveillance. Nonetheless, the EDPB called overall for more safeguards that would make U.S. surveillance activities more targeted. The EDPB also decried how standing requirements in federal courts have effectively blunted the redress available to EU citizens under the Privacy Act of 1974. The Board also enumerated its concerns about the Ombudsperson mechanism, which “provides the only way for EU individuals to ask for a verification that the relevant authorities have complied with the requirements of this instrument by asking the Ombudsperson to refer the matter to the competent authorities, which include the Inspector General, to check the internal policies of these authorities.” The EDPB questioned the impartiality and independence of the current Ombudsperson, Under Secretary of State for Economic Growth, Energy, and the Environment Keith Krach, and asserted it “still doubts that the powers of the Ombudsperson to remedy non-compliance vis-a-vis the intelligence authorities are sufficient, as his “power” seems to be limited to decide not to confirm compliance towards the petitioner.”

The EDPB detailed its “significant concerns that need to be addressed by both the Commission and the U.S. authorities:”

  • As regards the commercial aspects, the absence of substantial checks remains a concern of the EDPB. Other areas that require further attention are the application of the Privacy Shield requirements regarding onward transfers, HR data and the application of the principles when it comes to processors, as well as the recertification process. More generally, the members of the Review Team would benefit from a broader access to non-public information, concerning commercial aspects and ongoing investigations. In addition, the EDPB recalls the remaining issues with respect to certain elements of the commercial part of the Privacy Shield adequacy decision as already raised in the WP 29’s Opinion 01/2016.
  • As regards the collection of data by public authorities, the EDPB can only encourage the PCLOB to issue and publish further reports. It regrets that on Section 702 FISA no general report is contemplated, to provide an assessment of the changes brought since the last reauthorization in 2018. The EDPB would be very interested on an additional report on PPD-28 to follow up on the first report including an assessment of how the safeguards of PPD-28 are applied. Finally, the EDPB underlines the importance of reports on Executive Order 12333, and regrets that those reports will most likely remain classified. In this regard, the EDPB stresses that the members of the review team only have access to the same documents as the general public. The EDPB recalls that the security cleared experts of the EDPB remain ready to review additional documents and discuss additional classified elements, in order to have more meaningful reviews, following the example of PNRs or TFTP reviews.
  • On the Ombudsperson mechanism, despite some new elements provided during this year’s review, especially on the procedural aspects in relation to the first case submitted to the Ombudsperson but declared inadmissible, as well as on hypothetical cases, the EDPB is still not in a position to conclude that the Ombudsperson is vested with sufficient powers to access information and to remedy non-compliance. Thus, it still cannot state that the Ombudsperson can be considered an “effective remedy before a tribunal” in the meaning of Art. 47 of the EU Charter of Fundamental Rights.
  • Finally, the EDPB recalls that the same concerns will be addressed by the Court of Justice of the European Union in cases that are still pending before it.”