Further Reading, Other Developments, and Coming Events (14 September)

Coming Events

  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7”:
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The House Homeland Security Committee will hold a hearing titled “Worldwide Threats to the Homeland” on 17 September with the following witnesses:
    • Chad Wolf, Acting Secretary, Department of Homeland Security
    • Christopher Wray, Director, Federal Bureau of Investigation
    • Christopher Miller, Director, National Counterterrorism Center (NCTC)
  • On 17 September, the House Energy and Commerce Committee’s Communications & Technology Subcommittee will hold a hearing titled “Trump FCC: Four Years of Lost Opportunities.”
  • The House Armed Services Committee’s Intelligence and Emerging Threats and Capabilities Subcommittee will hold a hearing titled “Interim Review of the National Security Commission on Artificial Intelligence Effort and Recommendations” with these witnesses:
    • Dr. Eric Schmidt, Chairman, National Security Commission on Artificial Intelligence
    • HON Robert Work, Vice Chairman, National Security Commission on Artificial Intelligence
    • HON Mignon Clyburn, Commissioner, National Security Commission on Artificial Intelligence
    • Dr. José-Marie Griffiths, Commissioner, National Security Commission on Artificial Intelligence
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” The agency has released its agenda and explained:
    • The workshop will also feature four panel discussions that will focus on: case studies on data portability rights in the European Union, India, and California; case studies on financial and health portability regimes; reconciling the benefits and risks of data portability; and the material challenges and solutions to realizing data portability’s potential.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September and has made available its agenda with these items:
    • Facilitating Shared Use in the 3.1-3.55 GHz Band. The Commission will consider a Report and Order that would remove the existing non-federal allocations from the 3.3-3.55 GHz band as an important step toward making 100 megahertz of spectrum in the 3.45-3.55 GHz band available for commercial use, including 5G, throughout the contiguous United States. The Commission will also consider a Further Notice of Proposed Rulemaking that would propose to add a co-primary, non-federal fixed and mobile (except aeronautical mobile) allocation to the 3.45-3.55 GHz band as well as service, technical, and competitive bidding rules for flexible-use licenses in the band. (WT Docket No. 19-348)
    • Expanding Access to and Investment in the 4.9 GHz Band. The Commission will consider a Sixth Report and Order that would expand access to and investment in the 4.9 GHz (4940-4990 MHz) band by providing states the opportunity to lease this spectrum to commercial entities, electric utilities, and others for both public safety and non-public safety purposes. The Commission also will consider a Seventh Further Notice of Proposed Rulemaking that would propose a new set of licensing rules and seek comment on ways to further facilitate access to and investment in the band. (WP Docket No. 07-100)
    • Improving Transparency and Timeliness of Foreign Ownership Review Process. The Commission will consider a Report and Order that would improve the timeliness and transparency of the process by which it seeks the views of Executive Branch agencies on any national security, law enforcement, foreign policy, and trade policy concerns related to certain applications filed with the Commission. (IB Docket No. 16-155)
    • Promoting Caller ID Authentication to Combat Spoofed Robocalls. The Commission will consider a Report and Order that would continue its work to implement the TRACED Act and promote the deployment of caller ID authentication technology to combat spoofed robocalls. (WC Docket No. 17-97)
    • Combating 911 Fee Diversion. The Commission will consider a Notice of Inquiry that would seek comment on ways to dissuade states and territories from diverting fees collected for 911 to other purposes. (PS Docket Nos. 20-291, 09-14)
    • Modernizing Cable Service Change Notifications. The Commission will consider a Report and Order that would modernize requirements for notices cable operators must provide subscribers and local franchising authorities. (MB Docket Nos. 19-347, 17-105)
    • Eliminating Records Requirements for Cable Operator Interests in Video Programming. The Commission will consider a Report and Order that would eliminate the requirement that cable operators maintain records in their online public inspection files regarding the nature and extent of their attributable interests in video programming services. (MB Docket No. 20-35, 17-105)
    • Reforming IP Captioned Telephone Service Rates and Service Standards. The Commission will consider a Report and Order, Order on Reconsideration, and Further Notice of Proposed Rulemaking that would set compensation rates for Internet Protocol Captioned Telephone Service (IP CTS), deny reconsideration of previously set IP CTS compensation rates, and propose service quality and performance measurement standards for captioned telephone services. (CG Docket Nos. 13-24, 03-123)
    • Enforcement Item. The Commission will consider an enforcement action.

Other Developments

  • After Ireland’s Data Protection Commission (DPC) directed Facebook to stop transferring the personal data of European Union citizens to the United States (U.S.), the company filed suit in Ireland’s High Court to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge. Earlier this summer, the Court of Justice of the European Union (CJEU) struck down the adequacy decision for the agreement between the European Union (EU) and United States (U.S.) that had provided the easiest means to transfer the personal data of EU citizens to the U.S. for processing under the General Data Protection Regulation (GDPR) (i.e. the EU-U.S. Privacy Shield). In the case known as Schrems II, the CJEU also cast doubt on whether standard contractual clauses (SCC) used to transfer personal data to the U.S. would pass muster given the grounds for finding the Privacy Shield inadequate: the U.S.’s surveillance regime and lack of meaningful redress for EU citizens. Consequently, it has appeared as if data protection authorities throughout the EU would need to revisit SCCs for transfers to the U.S., and it appears the DPC was looking to stop Facebook from using its SCCs. Facebook is apparently arguing in its suit that it will suffer “extremely significant adverse effects” if the DPC’s decision is implemented.
  • In a related development, the European Data Protection Board (EDPB) has established “a taskforce to look into complaints filed in the aftermath of the CJEU Schrems II judgement.” The EDPB noted the 101 identical complaints “lodged with EEA Data Protection Authorities against several controllers in the European Economic Area (EEA) member states regarding their use of Google/Facebook services which involve the transfer of personal data.” The Board added “[s]pecifically the complainants, represented by the NGO NOYB, claim that Google/Facebook transfer personal data to the U.S. relying on the EU-U.S. Privacy Shield or Standard Contractual Clauses and that according to the recent CJEU judgment in case C-311/18 the controller is unable to ensure an adequate protection of the complainants’ personal data.” The EDPB claimed “[t]he taskforce will analyse the matter and ensure a close cooperation among the members of the Board…[and] [t]his taskforce will prepare recommendations to assist controllers and processors with their duty to identify and implement appropriate supplementary measures to ensure adequate protection when transferring data to third countries.” EDPB Chair Andrea Jelinek cautioned “the implications of the judgment are wide-ranging, and the contexts of data transfers to third countries very diverse…[and] [t]herefore, there cannot be a one-size-fits-all, quick fix solution.” She added “[e]ach organisation will need to evaluate its own data processing operations and transfers and take appropriate measures.”
  • An Australian court ruled against Facebook in its efforts to dismiss a suit brought against the company for its role in retaining and providing personal data to Cambridge Analytica. A Federal Court of Australia dismissed Facebook’s filings to reverse a previous ruling that allowed the Office of the Australian Information Commissioner (OAIC) to sue Facebook’s United States and Irish entities.
    • In March, the OAIC filed suit in federal court in Australia, alleging the two companies transgressed the privacy rights of 311,127 Australians under Australia’s Privacy Act. The two companies could face liability as high as AUD$1.7 million per violation.
    • In its November 2018 report to Parliament titled “Investigation into the use of data analytics in political campaigns”, the United Kingdom’s Information Commissioner’s Office (ICO) explained
      • One key strand of our investigation involved allegations that an app, ultimately referred to as ‘thisisyourdigitallife’, was developed by Dr Aleksandr Kogan and his company Global Science Research (GSR) in order to harvest the data of up to 87 million global Facebook users, including one million in the UK. Some of this data was then used by Cambridge Analytica, to target voters during the 2016 US Presidential campaign process.
    • In its July 2018 report titled “Democracy disrupted? Personal information and political influence,” the ICO explained
      • The online targeted advertising model used by Facebook is very complex, and we believe a high level of transparency in relation to political advertising is vital. This is a classic big-data scenario: understanding what data is going into the system; how users’ actions on Facebook are determining what interest groups they are placed in; and then the rules that are fed into any dynamic algorithms that enable organisations to target individuals with specific adverts and messaging.
      • Our investigation found significant fair-processing concerns both in terms of the information available to users about the sources of the data that are being used to determine what adverts they see and the nature of the profiling taking place. There were further concerns about the availability and transparency of the controls offered to users over what ads and messages they receive. The controls were difficult to find and were not intuitive to the user if they wanted to control the political advertising they received. Whilst users were informed that their data would be used for commercial advertising, it was not clear that political advertising would take place on the platform.
      • The ICO also found that despite a significant amount of privacy information and controls being made available, overall they did not effectively inform the users about the likely uses of their personal information. In particular, more explicit information should have been made available at the first layer of the privacy policy. The user tools available to block or remove ads were also complex and not clearly available to users from the core pages they would be accessing. The controls were also limited in relation to political advertising.
  • The Australian Competition & Consumer Commission (ACCC) announced it “will be examining the experiences of Australian consumers, developers, suppliers and others in a new report scrutinising mobile app stores” according to the agency’s press release. The ACCC’s inquiry comes at the same time regulators in the United States and the European Union are investigating the companies for their app store practices, which could lead to enforcement actions. The ACCC is also looking to institute a code that would require Google and Facebook to pay Australian media outlets for content used on their platforms. The ACCC stated that “[i]ssues to be examined include the use and sharing of data by apps, the extent of competition between Google and Apple’s app stores, and whether more pricing transparency is needed in Australia’s mobile apps market.” The ACCC added:
    • Consumers are invited to share their experiences with buying and using apps through a short survey. The ACCC has also released an issues paper seeking views and feedback from app developers and suppliers.
    • In the issues paper, the ACCC explained “[p]otential outcomes” could be:
      • findings regarding structural, competitive or behavioural issues affecting the supply of apps
      • increased information about competition, pricing and other practices in the supply of apps and on app marketplaces
      • ACCC action to address any conduct that raises concerns under the Competition and Consumer Act 2010, and
      • recommendations to the Government for legislative reform to address systemic issues.
  • The Government Accountability Office (GAO) found that United States (U.S.) Customs and Border Protection (CBP) has implemented spotty, incomplete privacy measures in using facial recognition technology (FRT) at ports of entry.
    • The House Homeland Security Committee and the Senate Homeland Security and Governmental Affairs Committee asked the GAO
      • to review United States (U.S.) Customs and Border Protection (CBP) and Transportation Security Administration’s (TSA) facial recognition technology capabilities for traveler identity verification. This report addresses (1) the status of CBP’s testing and deployment of facial recognition technology at ports of entry, (2) the extent to which CBP’s use of facial recognition technology has incorporated privacy principles consistent with applicable laws and policies, (3) the extent to which CBP has assessed the accuracy and performance of its facial recognition capabilities at ports of entry, and (4) the status of TSA’s testing of facial recognition capabilities and the extent to which TSA’s facial recognition pilot tests incorporated privacy principles.
    • The GAO noted:
      • Most recently, in 2017, we reported that CBP had made progress in testing biometric exit capabilities, including facial recognition technology, but challenges continued to affect CBP’s efforts to develop and implement a biometric exit system, such as differences in the logistics and infrastructure among ports of entry. As we previously reported, CBP had tested various biometric technologies in different locations to determine which type of technology could be deployed on a large scale without disrupting legitimate travel and trade, while still meeting its mandate to implement a biometric entry-exit system. Based on the results of its testing, CBP concluded that facial recognition technology was the most operationally feasible and traveler-friendly option for a comprehensive biometric solution. Since then, CBP has prioritized testing and deploying facial recognition technology at airports (referred to as air exit), with seaports and land ports of entry to follow. These tests and deployments are part of CBP’s Biometric Entry-Exit Program.
      • As part of TSA’s mission to protect the nation’s transportation systems and to ensure freedom of movement for people and commerce, TSA has been exploring facial recognition technology for identity verification at airport checkpoints. Since 2017, TSA has conducted a series of pilot tests—some in partnership with CBP—to assess the feasibility of using facial recognition technology to automate traveler identity verification at airport security checkpoints. In April 2018, TSA signed a policy memorandum with CBP on the development and implementation of facial recognition capabilities at airports.
    • The GAO made recommendations to CBP:
      • The Commissioner of CBP should ensure that the Biometric Entry-Exit Program’s privacy notices contain complete and current information, including all of the locations where facial recognition is used and how travelers can request to opt out as appropriate. (Recommendation 1)
      • The Commissioner of CBP should ensure that the Biometric Entry-Exit Program’s privacy signage is consistently available at all locations where CBP is using facial recognition. (Recommendation 2)
      • The Commissioner of CBP should direct the Biometric Entry-Exit Program to develop and implement a plan to conduct privacy audits of its commercial partners’, contractors’, and vendors’ use of personally identifiable information. (Recommendation 3)
      • The Commissioner of CBP should develop and implement a plan to ensure that the biometric air exit capability meets its established photo capture requirement. (Recommendation 4)
      • The Commissioner of CBP should develop a process by which Biometric Entry-Exit program officials are alerted when the performance of air exit facial recognition falls below established thresholds. (Recommendation 5)
  • The United States (U.S.) Agency for Global Media (USAGM) is being sued by the Open Technology Fund (OTF), an entity it funds and oversees, over efforts to remove and replace the OTF’s officers and directors.
    • Previously, the United States Court of Appeals for the District of Columbia Circuit enjoined USAGM from “taking any action to remove or replace any officers or directors of the OTF,” pending the outcome of the suit, which is being expedited.
    • Additionally, USAGM CEO and Chair of the Board Michael Pack is being accused in two different letters of seeking to compromise the integrity and independence of two organizations he oversees. There have been media accounts of the Trump Administration’s remaking of USAGM in ways critics contend are threatening the mission and effectiveness of the Open Technology Fund (OTF), a U.S. government non-profit designed to help dissidents and endangered populations throughout the world. The head of the OTF has been removed, drawing the ire of Members of Congress, and other changes have been implemented that are counter to the organization’s mission. Likewise, there are allegations that politically motivated policy changes seek to remake the Voice of America (VOA) into a less independent entity.
      • In a letter to Pack, OTF argued that a number of recent actions Pack has undertaken have violated “firewall protections” in the organization’s grant agreement. They further argue that Pack is conflicted and should turn over the investigation to the United States (U.S.) Department of State’s Office of the Inspector General (OIG). OTF alleged the following:
        • 1. Attempts to compromise and undermine OTF’s independence: USAGM has repeatedly attempted to undermine OTF’s independence over the past several months.
        • 2. Attempts to compromise and undermine integrity: USAGM has also attempted to undermine the integrity of OTF by publicly making numerous false and misleading claims about OTF to the internet freedom community, the general public, and even to Congress.
        • 3. Attempts to compromise and undermine security: USAGM has attempted to undermine the security of OTF, our staff, and our project partners, many of whom operate in highly sensitive environments, by
          • 1) attempting to gain unauthorized and unsupervised access to our office space and
          • 2) by requesting vast amounts of sensitive information and documentation with no apparent grant-related purpose, and no regard for the security of that information and documentation
        • 4. Attempts to compromise and undermine privacy: Closely related to USAGM’s attempts to undermine OTF’s security, USAGM has also attempted to undermine the privacy of OTF’s staff and partners by requesting that OTF provide Personally Identifiable Information (PII) without a clearly articulated grant-related purpose, and with no guarantee that the PII will be handled in a secure manner.
        • 5. Attempts to compromise and undermine effectiveness: USAGM’s actions have undermined the effectiveness of OTF by:
          • 1) freezing and subsequently withholding $19,181,791 in congressionally appropriated funding from OTF, forcing OTF to issue stop-work orders to 49 of our 60 internet freedom projects;
          • 2) providing unjustified, duplicative, overbroad, and unduly burdensome requests for information and documentation, without any clear grant-related purpose, and with clearly unreasonable deadlines;
          • 3) attempting to divert and redirect funding obligated by USAGM to OTF in an effort to duplicate OTF’s work; and
          • 4) threatening to terminate OTF’s Grant Agreement.
    • OTF asserted
      • These actions individually serve to seriously undermine OTF’s organizational and programmatic effectiveness. In their combined aggregate they threaten to dismantle OTF’s basic ability to effectively carry out its congressionally mandated mission to the detriment of USAGM and the cause of internet freedom globally
    • A group of VOA journalists wrote to the entity’s acting director, asserting that Pack’s actions “risk crippling programs and projects for some countries that are considered national security priorities.” They added:
      • He has ordered the firing of contract journalists, with no valid reason, by cancelling their visas, forcing them back to home countries where the lives of some of them may be in jeopardy. Now the purge appears to be expanding to include U.S. permanent residents and even U.S. citizens, with Mr. Pack recklessly expressing that being a journalist is “a great cover for a spy.”
  • The Cyberspace Solarium Commission (CSC) issued its latest white paper to address a continuing problem for the United States’ government: how to attract or train a sufficient cyber workforce when private sector salaries are generally better. In “Growing A Stronger Federal Cyber Workforce,” the CSC claimed “Currently more than one in three public-sector cyber jobs sits open…[and] [f]illing these roles has been a persistent and intractable problem over the past decade, in large part due to a lack of coordination and leadership.” The CSC averred “[i]n the context of this pervasive challenge, the fundamental purpose of this paper is to outline the elements required for a coherent strategy that enables substantive and coordinated investment in cyber workforce development and calls for a sustained investment in that strategy.” The CSC then laid out “five elements to guide development of a federal cyber workforce strategy”:
    • Organize: Federal departments and agencies must have flexible tools for organizing and managing their workforce that can adapt to each organization’s individual mission while also providing coherence across the entirety of the federal government. To appropriately organize the federal cyber workforce, the CSC recommends properly identifying and utilizing cyber-specific occupational classifications to allow more tailored workforce policies, building a federal cyber service to provide clear and agile hiring authorities and other personnel management tools, and establishing coordination structures to provide clear leadership for federal workforce development efforts.
    • Recruit: Federal leaders must focus on the programs that make public service an attractive prospect to talented individuals. In many ways, the federal government’s greatest tool for recruitment is the mission and unique learning opportunities inherent in federal work. To capitalize on these advantages, the government should invest in existing programs such as CyberCorps: Scholarship for Service and the Centers of Academic Excellence, while also working to mitigate recruitment barriers that stem from the personnel security clearance process.
    • Develop: The federal government, like all cyber employers, cannot expect every new employee to have hands-on experience, a four-year degree, and a list of industry certifications. Rather, the federal government will be stronger if it draws from a broad array of educational backgrounds and creates opportunities for employees to gain knowledge and experience as they work. This effort will call for many innovative approaches, among which the Commission particularly recommends apprenticeship programs and upskilling opportunities to support cyber employee development.
    • Retain: Federal leaders should take a nuanced view of retention, recognizing that enabling talent to move flexibly between the public and private sectors enables a stronger cyber workforce overall. However, federal employers can take steps to encourage their employees to increase the time they spend in public service. Improving pay flexibility is a major consideration, but continuing the development of career pathways and providing interesting career development opportunities like rotational and exchange programs also can be critical. Of particular note, federal employers can increase retention of underrepresented groups through the removal of inequities and barriers to advancement in the workplace.
    • Stimulate growth: The federal government cannot simply recruit a larger share of the existing national talent pool. Rather, leaders must take steps to grow the talent pool itself in order to increase the numbers of those available for federal jobs. To promote growth of the talent pool nationwide, the federal government must first coordinate government efforts working toward this goal. Executive branch and congressional leaders should also invest in measures to promote diversity across the national workforce and incentivize research to provide a greater empirical understanding of cyber workforce dynamics. Finally, federal leaders must work to increase the military cyber workforce, which has a significant impact on the national cyber workforce because it serves as both a source and an employer of cyber talent.

Further Reading

  • “Oracle reportedly wins deal for TikTok’s US operations as ‘trusted tech partner’” By Tom Warren and Nick Statt – The Verge. ByteDance chose Oracle over Microsoft, but not to buy its operations in the United States (U.S.), Australia, Canada, and New Zealand. Instead, Oracle is proposing to be TikTok’s trusted technology partner, which appears to mean hosting TikTok’s U.S. operations and managing its data as a means of allaying the U.S. government’s concerns about access by the People’s Republic of China (PRC).
  • “Why Do Voting Machines Break on Election Day?” By Adrianne Jeffries – The Markup. This piece seeks to debunk the hype by explaining that most voting issues are minor and easily fixed, which may well be a welcome message in the United States (U.S.) given the lies and fretting about the security and accuracy of the coming election. Nonetheless, the mechanical and systemic problems encountered by some Americans do speak to the need to update voting laws and standards. Among other problems are the high barriers to entry for firms making and selling voting machines.
  • “Twitter steps up its fight against election misinformation” By Elizabeth Dwoskin – The Washington Post. Twitter and Google announced policy changes, as Facebook did last week, to help tamp down untrue claims and lies about voting and elections in the United States. Twitter will take a number of different approaches to handling lies and untrue assertions. If past is prologue, President Donald Trump may soon look to test the limits of this policy as he did shortly after Facebook announced its policy changes. Google will adjust searches on election day to place respected, fact-oriented organizations at the top of search results.
  • “China’s ‘hybrid war’: Beijing’s mass surveillance of Australia and the world for secrets and scandal” By Andrew Probyn and Matthew Doran – ABC News; “Zhenhua Data leak: personal details of millions around world gathered by China tech company” By Daniel Hurst in Canberra, Lily Kuo in Beijing and Charlotte Graham-McLay in Wellington – The Guardian. A massive database leaked to an American academic shows the breadth and range of information collected by a company in the People’s Republic of China (PRC) alleged to be working with the country’s military and security services. Zhenhua Data is denying any wrongdoing or anything untoward, but the database contains information on 2.4 million people, most of whom live in western nations and hold positions of influence and power, such as British and Australian Prime Ministers Boris Johnson and Scott Morrison. Academics claim this sort of compilation of information from public and private sources is unprecedented and would allow the PRC to run a range of influence operations.
  • “Europe Feels Squeeze as Tech Competition Heats Up Between U.S. and China” By Steven Erlanger and Adam Satariano – The New York Times. Structural challenges in the European Union (EU) and a lack of large technology companies have left the EU in a delicate position. It seeks to be the world’s de facto regulator but is having trouble keeping pace with the United States and the People’s Republic of China, the two dominant nations in technology.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by PixelAnarchy from Pixabay

EDPB Releases Guidance

The EDPB tries to clear up who is and is not a controller or processor and wades into the world of social media and targeting.

The European Data Protection Board (EDPB) has released two sets of draft guidelines for public comment to elucidate portions of the General Data Protection Regulation (GDPR).

In the draft Guidelines 07/2020 on the concepts of controller and processor in the GDPR, the EDPB is looking to update guidance issued by its forerunner body on the predecessor data protection regime regarding who is a controller, joint controller, and processor. Any clarification of these definitions would obviously change how entities will be regulated under the GDPR, and ideally, harmonize data protection and processing regulation across the EU. There is the suggestion in the document that there is not currently a standard construction of these definitions, causing the same entity to be regulated differently depending on the jurisdiction within the European Economic Area (EEA). The EDPB noted the guidelines were put together with the input of stakeholders, and it is possible that more input from a broader audience will result in a modified product.

This draft guidance is built on the principle of accountability as enshrined in the GDPR, meaning controllers and processors must not only comply with the GDPR but be able to demonstrate compliance with it. In fact, the EDPB asserts “[t]he aim of incorporating the accountability principle into the GDPR and making it a central principle was to emphasize that data controllers must implement appropriate and effective measures and be able to demonstrate compliance.” This emphasis suggests the EU encountered challenges with entities accepting accountability in data protection under the GDPR’s forerunner, Directive 95/46/EC. Moreover, the need to define as precisely as possible who is and is not a controller, joint controller, or processor is crucial to apportioning responsibility and culpability for noncompliance. These guidelines will therefore be a crucial starting point for both data protection authorities (DPA) and the entities collecting and processing the personal data of EU persons. In addition, the EDPB proposes to go beyond labels in determining who is a controller or processor by looking at what an entity is actually doing. By the same token, the Board makes clear the term controller should not be confused with the same term in other legal contexts and should be interpreted broadly to ensure the greatest possible data protection.

The EDPB claimed “[t]he main aim is to clarify the meaning of the concepts and to clarify the different roles and the distribution of responsibilities between these actors.” The EDPB stated

The Article 29 Working Party issued guidance on the concepts of controller/processor in its opinion 1/2010 (WP169) in order to provide clarifications and concrete examples with respect to these concepts. Since the entry into force of the GDPR, many questions have been raised regarding to what extent the GDPR brought changes to the concepts of controller and processor and their respective roles. Questions were raised in particular to the substance and implications of the concept of joint controllership (e.g. as laid down in Article 26 GDPR) and to the specific obligations for processors laid down in Chapter IV (e.g. as laid down in Article 28 GDPR). Therefore, and as the EDPB recognizes that the concrete application of the concepts needs further clarification, the EDPB now deems it necessary to give more developed and specific guidance in order to ensure a consistent and harmonised approach throughout the EU and the EEA. The present guidelines replace the previous opinion of Working Party 29 on these concepts (WP169).

The EDPB summarized the concepts of these terms and the interplay between entities:

  • Controller
    • In principle, there is no limitation as to the type of entity that may assume the role of a controller but in practice it is usually the organisation as such, and not an individual within the organisation (such as the CEO, an employee or a member of the board), that acts as a controller.
    • A controller is a body that decides certain key elements of the processing. Controllership may be defined by law or may stem from an analysis of the factual elements or circumstances of the case. Certain processing activities can be seen as naturally attached to the role of an entity (an employer to employees, a publisher to subscribers or an association to its members). In many cases, the terms of a contract can help identify the controller, although they are not decisive in all circumstances.
    • A controller determines the purposes and means of the processing, i.e. the why and how of the processing. The controller must decide on both purposes and means. However, some more practical aspects of implementation (“non-essential means”) can be left to the processor. It is not necessary that the controller actually has access to the data that is being processed to be qualified as a controller.
  • Joint controllers
    • The qualification as joint controllers may arise where more than one actor is involved in the processing. The GDPR introduces specific rules for joint controllers and sets a framework to govern their relationship. The overarching criterion for joint controllership to exist is the joint participation of two or more entities in the determination of the purposes and means of a processing operation. Joint participation can take the form of a common decision taken by two or more entities or result from converging decisions by two or more entities, where the decisions complement each other and are necessary for the processing to take place in such a manner that they have a tangible impact on the determination of the purposes and means of the processing. An important criterion is that the processing would not be possible without both parties’ participation in the sense that the processing by each party is inseparable, i.e. inextricably linked. The joint participation needs to include the determination of purposes on the one hand and the determination of means on the other hand.
  • Processor
    • A processor is a natural or legal person, public authority, agency or another body, which processes personal data on behalf of the controller. Two basic conditions for qualifying as processor exist: that it is a separate entity in relation to the controller and that it processes personal data on the controller’s behalf.
    • The processor must not process the data otherwise than according to the controller’s instructions. The controller’s instructions may still leave a certain degree of discretion about how to best serve the controller’s interests, allowing the processor to choose the most suitable technical and organisational means. A processor infringes the GDPR, however, if it goes beyond the controller’s instructions and starts to determine its own purposes and means of the processing. The processor will then be considered a controller in respect of that processing and may be subject to sanctions for going beyond the controller’s instructions.
  • Relationship between controller and processor
    • A controller must only use processors providing sufficient guarantees to implement appropriate technical and organisational measures so that the processing meets the requirements of the GDPR. Elements to be taken into account could be the processor’s expert knowledge (e.g. technical expertise with regard to security measures and data breaches); the processor’s reliability; the processor’s resources and the processor’s adherence to an approved code of conduct or certification mechanism.
    • Any processing of personal data by a processor must be governed by a contract or other legal act which shall be in writing, including in electronic form, and be binding. The controller and the processor may choose to negotiate their own contract including all the compulsory elements or to rely, in whole or in part, on standard contractual clauses.
    • The GDPR lists the elements that have to be set out in the processing agreement. The processing agreement should not, however, merely restate the provisions of the GDPR; rather, it should include more specific, concrete information as to how the requirements will be met and which level of security is required for the personal data processing that is the object of the processing agreement.
  • Relationship among joint controllers
    • Joint controllers shall in a transparent manner determine and agree on their respective responsibilities for compliance with the obligations under the GDPR. The determination of their respective responsibilities must in particular regard the exercise of data subjects’ rights and the duties to provide information. In addition to this, the distribution of responsibilities should cover other controller obligations such as regarding the general data protection principles, legal basis, security measures, data breach notification obligation, data protection impact assessments, the use of processors, third country transfers and contacts with data subjects and supervisory authorities.
    • Each joint controller has the duty to ensure that they have a legal basis for the processing and that the data are not further processed in a manner that is incompatible with the purposes for which they were originally collected by the controller sharing the data.
    • The legal form of the arrangement among joint controllers is not specified by the GDPR. For the sake of legal certainty, and in order to provide for transparency and accountability, the EDPB recommends that such arrangement be made in the form of a binding document such as a contract or other legal binding act under EU or Member State law to which the controllers are subject.
    • The arrangement shall duly reflect the respective roles and relationships of the joint controllers vis-à- vis the data subjects and the essence of the arrangement shall be made available to the data subject.
    • Irrespective of the terms of the arrangement, data subjects may exercise their rights in respect of and against each of the joint controllers. Supervisory authorities are not bound by the terms of the arrangement whether on the issue of the qualification of the parties as joint controllers or the designated contact point.

In the Guidelines 08/2020 on the targeting of social media users, the Board explained that the genesis of this guidance came from the EDPB itself. These guidelines are, in a sense, a more targeted version of the other draft guidelines the EDPB has issued for comment in that they seek to clarify the responsibilities, joint and otherwise, of social media companies and others operating in the targeted advertising universe. Consequently, these would apply to companies like Facebook, Twitter, and other social media platforms and to virtually any entity using such a platform to send a targeted advertisement to a user or group of users. However, the EDPB makes clear its concern with respect to these practices is not confined to the commercial world and explains at some length its concern that EU persons could be targeted with political materials, a common practice of the Russian Federation in a number of countries, including, in all likelihood, EU member states. The Board stated “[t]he main aim of these guidelines is therefore to clarify the roles and responsibilities among the social media provider and the targeter,” a term defined as those “that use social media services in order to direct specific messages at a set of social media users on the basis of specific parameters or criteria.”

The EDPB asserted

  • As part of their business model, many social media providers offer targeting services. Targeting services make it possible for natural or legal persons (“targeters”) to communicate specific messages to the users of social media in order to advance commercial, political, or other interests. A distinguishing characteristic of targeting is the perceived fit between the person or group being targeted and the message that is being delivered. The underlying assumption is that the better the fit, the higher the reception rate (conversion) and thus the more effective the targeting campaign (return on investment).
  • Mechanisms to target social media users have increased in sophistication over time. Organisations now have the ability to target individuals on the basis of a wide range of criteria. Such criteria may have been developed on the basis of personal data which users have actively provided or shared, such as their relationship status. Increasingly, however, targeting criteria are also developed on the basis of personal data which has been observed or inferred, either by the social media provider or by third parties, and collected (aggregated) by the platform or by other actors (e.g., data brokers) to support ad-targeting options. In other words, the targeting of social media users involves not just the act of “selecting” the individuals or groups of individuals that are the intended recipients of a particular message (the ‘target audience’), but rather it involves an entire process carried out by a set of stakeholders which results in the delivery of specific messages to individuals with social media accounts.
  • The combination and analysis of data originating from different sources, together with the potentially sensitive nature of personal data processed in the context of social media, creates risks to the fundamental rights and freedoms of individuals. From a data protection perspective, many risks relate to the possible lack of transparency and user control. For the individuals concerned, the underlying processing of personal data which results in the delivery of a targeted message is often opaque. Moreover, it may involve unanticipated or undesired uses of personal data, which raise questions not only concerning data protection law, but also in relation to other fundamental rights and freedoms. Recently, social media targeting has gained increased public interest and regulatory scrutiny in the context of democratic decision making and electoral processes.

The EDPB added

  • Taking into account the case law of the CJEU, as well as the provisions of the GDPR regarding joint controllers and accountability, the present guidelines offer guidance concerning the targeting of social media users, in particular as regards the responsibilities of targeters and social media providers. Where joint responsibility exists, the guidelines will seek to clarify what the distribution of responsibilities might look like between targeters and social media providers on the basis of practical examples.
  • The main aim of these guidelines is therefore to clarify the roles and responsibilities among the social media provider and the targeter. In order to do so, the guidelines also identify the potential risks for the rights and freedoms of individuals (section 3), the main actors and their roles (section 4), and tackles the application of key data protection requirements (such as lawfulness and transparency, DPIA, etc.) as well as key elements of arrangements between social media providers and the targeters.

The EDPB explained the means by which targeting occurs: “[s]ocial media users may be targeted on the basis of provided, observed or inferred data, as well as a combination thereof”:

  • Targeting individuals on the basis of provided data – “Provided data” refers to information actively provided by the data subject to the social media provider and/or the targeter. For example:
    • A social media user might indicate his or her age in the description of his or her user profile. The social media provider, in turn, might enable targeting on the basis of this criterion.
    • A targeter might use information provided by the data subject to the targeter in order to target that individual specifically, for example by means of customer data (such as an e-mail address list), to be matched with data already held on the social media platform, leading to all those users who match being targeted with advertising.
  • Targeting on the basis of observed data – Targeting of social media users can also take place on the basis of observed data. Observed data are data provided by the data subject by virtue of using a service or device. For example, a particular social media user might be targeted on the basis of:
    • his or her activity on the social media platform itself (for instance the content that the user has shared, consulted or liked);
    • the use of devices on which the social media’s application is executed (for instance GPS coordinates, mobile telephone number);
    • data obtained by a third-party application developer by using the application programming interfaces (APIs) or software development kits (SDKs) offered by social media providers;
    • data collected through third-party websites that have incorporated social plugins or pixels;
    • data collected through other third parties (e.g. parties with whom the data subject has interacted, purchased a product, subscribed to loyalty cards, …); or
    • data collected through services offered by companies owned or operated by the social media provider.

The EDPB added

Targeting on the basis of inferred data – “Inferred data” or “derived data” are created by the data controller on the basis of the data provided by the data subject or as observed by the controller. For example, a social media provider or a targeter might infer that an individual is likely to be interested in a certain activity or product on the basis of his or her web browsing behaviour and/or network connections.
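The customer-list matching described in the “provided data” example above is, in practice, often implemented as hashed matching so that raw addresses are never exchanged. The sketch below is a minimal illustration of that idea; the function names and the choice of SHA-256 are assumptions for illustration, not a description of any particular platform’s actual API.

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    # Normalize (trim whitespace, lowercase) before hashing so that
    # cosmetic differences do not prevent a match.
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

def match_customer_list(targeter_emails, platform_hashes):
    # The targeter uploads hashes of its customer e-mail list; the
    # platform intersects them with hashes of its own users' addresses,
    # and the resulting overlap becomes the target audience.
    uploaded = {normalize_and_hash(e) for e in targeter_emails}
    return uploaded & platform_hashes

# Hypothetical data for illustration only.
customers = ["Alice@example.com", " bob@example.org"]
platform_users = ["alice@example.com", "carol@example.net"]
platform_hashes = {normalize_and_hash(e) for e in platform_users}

audience = match_customer_list(customers, platform_hashes)
# audience contains only the hash of alice@example.com
```

Hashing limits what is exchanged between targeter and platform, but, as the guidelines stress, the matching operation itself is still processing of personal data for which controller and processor responsibilities must be allocated.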


EDPB Steps Into Twitter Investigation

The body that consists of and oversees the EU’s DPAs will use its power under the GDPR to resolve a dispute between agencies over the punishment for Twitter’s data breaches.

The European Data Protection Board (EDPB) will soon have the opportunity to use a key power for the first time since its inception in order to resolve a dispute among data protection authorities (DPA) in the European Union (EU). Unnamed DPAs have objected to a proposed ruling by Ireland’s Data Protection Commission (DPC), the lead DPA investigating Twitter’s 2018 and 2019 data breaches. Consequently, per the General Data Protection Regulation (GDPR), the disagreement has been handed off to the EDPB, and depending on how this matter is resolved, the body could decide Twitter’s punishment, including a possible fine of up to 4% of its worldwide revenue. What’s more, the DPC is the lead agency investigating Facebook’s WhatsApp and Instagram, among other large technology companies, and may have to relinquish those decisions as well if other DPAs disagree with the DPC’s proposed punishment for any wrongdoing.

The DPC submitted its draft decision on the Twitter breach to other DPAs in May in accordance with Article 60 of the GDPR. The DPC stated “[t]he draft decision focusses on whether Twitter International Company has complied with Articles 33(1) and 33(5) of the GDPR” (i.e., the provisions pertaining to data breaches and proper notification protocols). The DPC further explained

  • This draft decision is one of a number of significant developments in DPC inquiries into “big tech” companies this week. Deputy Commissioner Graham Doyle has confirmed that: “In addition to submitting this draft decision to other EU supervisory authorities, we have this week sent a preliminary draft decision to WhatsApp Ireland Limited for their final submissions which will be taken in to account by the DPC before preparing a draft decision in that matter also for Article 60 purposes.  The inquiry into WhatsApp Ireland examines its compliance with Articles 12 to 14 of the GDPR in terms of transparency including in relation to transparency around what information is shared with Facebook.“
  • The DPC has also completed the investigation phase of a complaint-based inquiry which focuses on Facebook Ireland’s obligations to establish a lawful basis for personal data processing. This inquiry is now in the decision-making phase at the DPC.

Article 65 of the GDPR provides that the EDPB will make a binding decision on an investigation where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned.” In this case, at least one DPA has raised an objection to the DPC’s draft decision, thus triggering Article 65. The EDPB then has one month to get two-thirds of its members to agree to a binding decision it may draft. If this is not achieved, the Board has another two weeks to get a simple majority, and if that also fails, EDPB Chair Andrea Jelinek alone may decide. Consequently, it is possible the EDPB redrafts the DPC decision and tries to get buy-in from the DPAs that make up the Board to support a stronger punishment of Twitter.
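The voting cascade just described can be modeled as a simple decision procedure. The sketch below follows the sequence given above (two-thirds within one month, simple majority within two further weeks, then the Chair); it is an illustrative model only, with hypothetical function and stage names, not an implementation of any official EDPB process.

```python
def article_65_outcome(votes_for: int, members: int, stage: int) -> str:
    # Stage 1: within one month, a two-thirds majority adopts the decision.
    if stage == 1:
        return "adopted" if votes_for * 3 >= members * 2 else "proceed to stage 2"
    # Stage 2: within two further weeks, a simple majority suffices.
    if stage == 2:
        return "adopted" if votes_for * 2 > members else "proceed to stage 3"
    # Stage 3: failing both, the Chair decides alone.
    return "chair decides"

# With, say, 27 voting members, 18 votes clear the two-thirds bar,
# while 14 votes would suffice at the simple-majority stage.
```

The integer comparisons (`votes_for * 3 >= members * 2`) avoid floating-point thresholds when checking the two-thirds requirement.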


Image by Roland Mey from Pixabay

EDPB Issues FAQs On Privacy Shield Decision

While the EDPB does not provide absolute answers on how US entities looking to transfer EU personal data should proceed, the body provides its best thinking on what the path forward looks like.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

On 24 July, the European Data Protection Board (EDPB) addressed, in part, the implications of the recent decision that struck down the European Union-United States Privacy Shield, an agreement that had allowed US companies to transfer and process the personal data of EU citizens. The EDPB fully endorsed the view that the United States’ (US) surveillance regime, notably Section 702 of the “Foreign Intelligence Surveillance Act” (FISA) and Executive Order (EO) 12333, makes most transfers to the US illegal except perhaps if entities holding and using the data take extra steps to protect it. The EDPB references another means that may allow transfers to continue but that generally requires informed and explicit consent from each and every EU person involved. Finally, the EDPB does not address whether the European Commission (EC) and the US are able to execute a third agreement that would be legal under EU law.

The EDPB, which is composed of the European Union’s (EU) data protection authorities (DPAs), has formally adopted a document spelling out its view on whether data transfers to the US under Privacy Shield are still legal and how companies should proceed in using standard contractual clauses (SCCs) and Binding Corporate Rules (BCR), two alternative means of transferring data aside from Privacy Shield. The EDPB’s views suggest the DPAs and supervisory authorities (SA) in each EU nation are going to need to work on a case-by-case basis regarding the latter two means, for the EDPB stressed these are to be evaluated individually. Given recent criticism of how nations are funding and resourcing their DPAs, there may be capacity issues in managing this new work alongside existing enforcement and investigation matters. Moreover, the EDPB discusses use of the exceptions available in Article 49 of the General Data Protection Regulation (GDPR), stressing that most such transfers are to be occasional.

In last week’s decision, the Court of Justice of the European Union (CJEU) invalidated the European Commission’s adequacy decision on the EU-US Privacy Shield, thus throwing into question all transfers of personal data from the EU into the US that relied on this means. The CJEU was more circumspect in ruling on the use of standard contractual clauses (SCC), another way to legally transfer personal data out of the EU in compliance with the bloc’s law. The court seems to suggest there may be cases in which the use of SCCs is insufficient given a country’s inadequate protections of the data of EU residents, especially with respect to national security and law enforcement surveillance. The EDPB issued a statement supporting the CJEU when the decision was handed down but has now adopted a more detailed explanation of its views on the implications of the decision for data controllers, data processors, other nations, and EU DPAs and SAs.

In “Frequently Asked Questions (FAQ) on the judgment of the CJEU in Case C-311/18 -Data Protection Commissioner v Facebook Ireland Ltd and Maximillian Schrems,” the EDPB explains its current thinking on the decision, much of which is built on existing guidance and interpretation of the GDPR. The EDPB explained that the FAQ “aims at presenting answers to some frequently asked questions received by SAs and will be developed and complemented along with further analysis, as the EDPB continues to examine and assess the judgment of the CJEU.”

Here are notable excerpts:

  • Is there any grace period during which I can keep on transferring data to the U.S. without assessing my legal basis for the transfer? No, the Court has invalidated the Privacy Shield Decision without maintaining its effects, because the U.S. law assessed by the Court does not provide an essentially equivalent level of protection to the EU. This assessment has to be taken into account for any transfer to the U.S.
  • I was transferring data to a U.S. data importer adherent to the Privacy Shield, what should I do now? Transfers on the basis of this legal framework are illegal. Should you wish to keep on transferring data to the U.S., you would need to check whether you can do so under the conditions laid down below.
  • I am using SCCs with a data importer in the U.S., what should I do? The Court found that U.S. law (i.e., Section 702 FISA and EO 12333) does not ensure an essentially equivalent level of protection. Whether or not you can transfer personal data on the basis of SCCs will depend on the result of your assessment, taking into account the circumstances of the transfers, and supplementary measures you could put in place. The supplementary measures along with SCCs, following a case-by-case analysis of the circumstances surrounding the transfer, would have to ensure that U.S. law does not impinge on the adequate level of protection they guarantee. If you come to the conclusion that, taking into account the circumstances of the transfer and possible supplementary measures, appropriate safeguards would not be ensured, you are required to suspend or end the transfer of personal data. However, if you are intending to keep transferring data despite this conclusion, you must notify your competent SA.
  • I am using Binding Corporate Rules (“BCRs”) with an entity in the U.S., what should I do? Given the judgment of the Court, which invalidated the Privacy Shield because of the degree of interference created by the law of the U.S. with the fundamental rights of persons whose data are transferred to that third country, and the fact that the Privacy Shield was also designed to bring guarantees to data transferred with other tools such as BCRs, the Court’s assessment applies as well in the context of BCRs, since U.S. law will also have primacy over this tool.
  • Whether or not you can transfer personal data on the basis of BCRs will depend on the result of your assessment, taking into account the circumstances of the transfers, and supplementary measures you could put in place. These supplementary measures along with BCRs, following a case-by-case analysis of the circumstances surrounding the transfer, would have to ensure that U.S. law does not impinge on the adequate level of protection they guarantee. If you come to the conclusion that, taking into account the circumstances of the transfer and possible supplementary measures, appropriate safeguards would not be ensured, you are required to suspend or end the transfer of personal data. However if you are intending to keep transferring data despite this conclusion, you must notify your competent SA.
  • Can I rely on one of the derogations of Article 49 GDPR to transfer data to the U.S.? It is still possible to transfer data from the EEA to the U.S. on the basis of derogations foreseen in Article 49 GDPR provided the conditions set forth in this Article apply. The EDPB refers to its guidelines on this provision. In particular, it should be recalled that when transfers are based on the consent of the data subject, it should be:
    • explicit,
    • specific for the particular data transfer or set of transfers (meaning that the data exporter must make sure to obtain specific consent before the transfer is put in place even if this occurs after the collection of the data has been made), and
    • informed, particularly as to the possible risks of the transfer (meaning the data subject should also be informed of the specific risks resulting from the fact that their data will be transferred to a country that does not provide adequate protection and that no adequate safeguards aimed at providing protection for the data are being implemented).
  • With regard to transfers necessary for the performance of a contract between the data subject and the controller, it should be borne in mind that personal data may only be transferred when the transfer is occasional. It would have to be established on a case-by-case basis whether data transfers would be determined as “occasional” or “non-occasional”. In any case, this derogation can only be relied upon when the transfer is objectively necessary for the performance of the contract.
  • In relation to transfers necessary for important reasons of public interest (which must be recognized in EU or Member States’ law), the EDPB recalls that the essential requirement for the applicability of this derogation is the finding of an important public interest and not the nature of the organisation, and that although this derogation is not limited to data transfers that are “occasional”, this does not mean that data transfers on the basis of the important public interest derogation can take place on a large scale and in a systematic manner. Rather, the general principle needs to be respected according to which the derogations as set out in Article 49 GDPR should not become “the rule” in practice, but need to be restricted to specific situations and each data exporter needs to ensure that the transfer meets the strict necessity test.


Image by Maret H. from Pixabay

Further Reading and Other Developments (17 July)

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

Speaking of which, the Technology Policy Update is being published daily during the week, and here are the Other Developments and Further Reading from this week.

Other Developments

  • Acting Senate Intelligence Committee Chair Marco Rubio (R-FL), Senate Foreign Relations Committee Chair Jim Risch (R-ID), and Senators Chris Coons (D-DE) and John Cornyn (R-TX) wrote Secretary of Commerce Wilbur Ross and Secretary of Defense Mike Esper “to ask that the Administration take immediate measures to bring the most advanced digital semiconductor manufacturing capabilities to the United States…[which] are critical to our American economic and national security and while our nation leads in the design of semiconductors, we rely on international manufacturing for advanced semiconductor fabrication.” This letter follows the Trump Administration’s May announcement that the Taiwan Semiconductor Manufacturing Corporation (TSMC) agreed to build a $12 billion plant in Arizona. It also bears note that one of the amendments pending to the “National Defense Authorization Act for Fiscal Year 2021” (S.4049) would establish a grants program to stimulate semiconductor manufacturing in the US.
  • Senators Mark R. Warner (D-VA), Mazie K. Hirono (D-HI) and Bob Menendez (D-NJ) sent a letter to Facebook “regarding its failure to prevent the propagation of white supremacist groups online and its role in providing such groups with the organizational infrastructure and reach needed to expand.” They also “criticized Facebook for being unable or unwilling to enforce its own Community Standards and purge white supremacist and other violent extremist content from the site” and posed “a series of questions regarding Facebook’s policies and procedures against hate speech, violence, white supremacy and the amplification of extremist content.”
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) published the Pipeline Cyber Risk Mitigation Infographic, “[d]eveloped in coordination with the Transportation Security Administration (TSA),” which “outlines activities that pipeline owners/operators can undertake to improve their ability to prepare for, respond to, and mitigate against malicious cyber threats.”
  • Representative Kendra Horn (D-OK) and 10 other Democrats introduced legislation “requiring the U.S. government to identify, analyze, and combat efforts by the Chinese government to exploit the COVID-19 pandemic” that was endorsed by “[t]he broader Blue Dog Coalition” according to their press release. The “Preventing China from Exploiting COVID-19 Act” (H.R.7484) “requires the Director of National Intelligence—in coordination with the Secretaries of Defense, State, and Homeland Security—to prepare an assessment of the different ways in which the Chinese government has exploited or could exploit the pandemic, which originated in China, in order to advance China’s interests and to undermine the interests of the United States, its allies, and the rules-based international order.” Horn and her cosponsors stated “[t]he assessment must be provided to Congress within 90 days and posted in unclassified form on the DNI’s website.”
  • The Supreme Court of Canada upheld the “Genetic Non-Discrimination Act” and denied a challenge to the legality of the statute brought by the government of Quebec, the Attorney General of Canada, and others. The court found:
    • The pith and substance of the challenged provisions is to protect individuals’ control over their detailed personal information disclosed by genetic tests, in the broad areas of contracting and the provision of goods and services, in order to address Canadians’ fears that their genetic test results will be used against them and to prevent discrimination based on that information. This matter is properly classified within Parliament’s power over criminal law. The provisions are supported by a criminal law purpose because they respond to a threat of harm to several overlapping public interests traditionally protected by the criminal law — autonomy, privacy, equality and public health.
  • The U.S.-China Economic and Security Review Commission published a report “analyzing the evolution of U.S. multinational enterprises (MNE) operations in China from 2000 to 2017.” The Commission found MNEs’ operations in the People’s Republic of China “may indirectly erode the United States’ domestic industrial competitiveness and technological leadership relative to China” and “as U.S. MNE activity in China increasingly focuses on the production of high-end technologies, the risk that U.S. firms are unwittingly enabling China to achieve its industrial policy and military development objectives rises.”
  • The Federal Communications Commission (FCC) and Huawei filed their final briefs in their lawsuit before the United States Court of Appeals for the Fifth Circuit arising from the FCC’s designation of Huawei as a “covered company” for purposes of a rule that denies Universal Service Funds (USF) “to purchase or obtain any equipment or services produced or provided by a covered company posing a national security threat to the integrity of communications networks or the communications supply chain.” Huawei claimed in its brief that “[t]he rulemaking and “initial designation” rest on the FCC’s national security judgments…[b]ut such judgments fall far afield of the FCC’s statutory authority and competence.” Huawei also argued “[t]he USF rule, moreover, contravenes the Administrative Procedure Act (APA) and the Due Process Clause.” The FCC responded in its filing that “Huawei challenges the FCC’s decision to exclude carriers whose networks are vulnerable to foreign interference, contending that the FCC has neither statutory nor constitutional authority to make policy judgments involving “national security”…[but] [t]hese arguments are premature, as Huawei has not yet been injured by the Order.” The FCC added “Huawei’s claim that the Communications Act textually commits all policy determinations with national security implications to the President is demonstrably false.”
  • European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski released his Strategy for 2020-2024, “which will focus on Digital Solidarity.” Wiewiórowski explained that “three core pillars of the EDPS strategy outline the guiding actions and objectives for the organisation to the end of 2024”:
    • Foresight: The EDPS will continue to monitor legal, social and technological advances around the world and engage with experts, specialists and data protection authorities to inform its work.
    • Action: To strengthen the EDPS’ supervision, enforcement and advisory roles the EDPS will promote coherence in the activities of enforcement bodies in the EU and develop tools to assist the EU institutions, bodies and agencies to maintain the highest standards in data protection.
    • Solidarity: While promoting digital justice and privacy for all, the EDPS will also enforce responsible and sustainable data processing, to positively impact individuals and maximise societal benefits in a just and fair way.
  • Facebook released a Civil Rights Audit, an “investigation into Facebook’s policies and practices [that] began in 2018 at the behest and encouragement of the civil rights community and some members of Congress.” Those charged with conducting the audit explained that they “vigorously advocated for more and would have liked to see the company go further to address civil rights concerns in a host of areas that are described in detail in the report,” including but not limited to:
    • A stronger interpretation of its voter suppression policies — an interpretation that makes those policies effective against voter suppression and prohibits content like the Trump voting posts — and more robust and more consistent enforcement of those policies leading up to the US 2020 election.
    • More visible and consistent prioritization of civil rights in company decision-making overall.
    • More resources invested to study and address organized hate against Muslims, Jews and other targeted groups on the platform.
    • A commitment to go beyond banning explicit references to white separatism and white nationalism to also prohibit express praise, support and representation of white separatism and white nationalism even where the terms themselves are not used.
    • More concrete action and specific commitments to take steps to address concerns about algorithmic bias or discrimination.
    • They added that “[t]his report outlines a number of positive and consequential steps that the company has taken, but at this point in history, the Auditors are concerned that those gains could be obscured by the vexing and heartbreaking decisions Facebook has made that represent significant setbacks for civil rights.”
  • The National Security Commission on Artificial Intelligence (NSCAI) released a white paper titled “The Role of AI Technology in Pandemic Response and Preparedness” that “outlines a series of investments and initiatives that the United States must undertake to realize the full potential of AI to secure our nation against pandemics.” NSCAI noted its previous two white papers.
  • Secretary of Defense Mark Esper announced that Chief Technology Officer of the United States Michael J.K. Kratsios has “been designated to serve as Acting Under Secretary of Defense for Research and Engineering,” even though he does not have a degree in science; the last Under Secretary held a PhD. Kratsios did, however, work for venture capitalist Peter Thiel, who backed President Donald Trump when he ran for office in 2016.
  • The United States’ Department of Transportation’s Federal Railroad Administration (FRA) issued research “to develop a cyber security risk analysis methodology for communications-based connected railroad technologies…[and] [t]he use-case-specific implementation of the methodology can identify potential cyber attack threats, system vulnerabilities, and consequences of the attack, with risk assessment and identification of promising risk mitigation strategies.”
  • In a blog post, a National Institute of Standards and Technology (NIST) economist asserted cybercrime may be having a much larger impact on the United States’ economy than previously thought:
    • In a recent NIST report, I looked at losses in the U.S. manufacturing industry due to cybercrime by examining an underutilized dataset from the Bureau of Justice Statistics, which is the most statistically reliable data that I can find. I also extended this work to look at the losses in all U.S. industries. The data is from a 2005 survey of 36,000 businesses with 8,079 responses, which is also by far the largest sample that I could identify for examining aggregated U.S. cybercrime losses. Using this data, combined with methods for examining uncertainty in data, I extrapolated upper and lower bounds, putting 2016 U.S. manufacturing losses to be between 0.4% and 1.7% of manufacturing value-added or between $8.3 billion and $36.3 billion. The losses for all industries are between 0.9% and 4.1% of total U.S. gross domestic product (GDP), or between $167.9 billion and $770.0 billion. The lower bound is 40% higher than the widely cited, but largely unconfirmed, estimates from McAfee.
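The percent-of-GDP figures in the NIST economist’s extrapolation can be sanity-checked with quick arithmetic. A sketch of that check, where the ~$18.71 trillion figure for 2016 U.S. GDP is an assumption supplied here (the dollar bounds are from the post):

```shell
# Back-of-the-envelope check: divide the reported dollar bounds by 2016 US GDP.
awk 'BEGIN {
  gdp  = 18.71e12            # assumed 2016 US GDP (not from the NIST report)
  low  = 167.9e9             # reported lower bound, all industries
  high = 770.0e9             # reported upper bound, all industries
  printf "lower bound: %.1f%% of GDP\n", 100 * low / gdp
  printf "upper bound: %.1f%% of GDP\n", 100 * high / gdp
}'
```

Under that GDP assumption the division reproduces the 0.9% and 4.1% figures quoted above.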
  • The Government Accountability Office (GAO) advised the Federal Communications Commission (FCC) that it needs a comprehensive strategy for implementing 5G across the United States. The GAO concluded
    • FCC has taken a number of actions regarding 5G deployment, but it has not clearly developed specific and measurable performance goals and related measures–with the involvement of relevant stakeholders, including National Telecommunications and Information Administration (NTIA)–to manage the spectrum demands associated with 5G deployment. This makes FCC unable to demonstrate whether the progress being made in freeing up spectrum is achieving any specific goals, particularly as it relates to congested mid-band spectrum. Additionally, without having established specific and measurable performance goals with related strategies and measures for mitigating 5G’s potential effects on the digital divide, FCC will not be able to assess the extent to which its actions are addressing the digital divide or what actions would best help all Americans obtain access to wireless networks.
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) issued “Time Guidance for Network Operators, Chief Information Officers, and Chief Information Security Officers” “to inform public and private sector organizations, educational institutions, and government agencies on time resilience and security practices in enterprise networks and systems…[and] to address gaps in available time testing practices, increasing awareness of time-related system issues and the linkage between time and cybersecurity.”
  • Fifteen Democratic Senators sent a letter to the Department of Defense, Office of the Director of National Intelligence (ODNI), Department of Homeland Security (DHS), Federal Bureau of Investigation (FBI), and U.S. Cyber Command, urging them “to take additional measures to fight influence campaigns aimed at disenfranchising voters, especially voters of color, ahead of the 2020 election.” They called on these agencies to take additional measures to ensure that:
    • The American people and political candidates are promptly informed about the targeting of our political processes by foreign malign actors, and that the public is provided regular periodic updates about such efforts leading up to the general election.
    • Members of Congress and congressional staff are appropriately and adequately briefed on continued findings and analysis involving election related foreign disinformation campaigns and the work of each agency and department to combat these campaigns.
    • Findings and analysis involving election related foreign disinformation campaigns are shared with civil society organizations and independent researchers to the maximum extent which is appropriate and permissible.
    • Secretary Esper and Director Ratcliffe implement a social media information sharing and analysis center (ISAC) to detect and counter information warfare campaigns across social media platforms as authorized by section 5323 of the Fiscal Year 2020 National Defense Authorization Act.
    • Director Ratcliffe implement the Foreign Malign Influence Response Center to coordinate a whole of government approach to combatting foreign malign influence campaigns as authorized by section 5322 of the Fiscal Year 2020 National Defense Authorization Act.
  • The Information Technology and Innovation Foundation (ITIF) unveiled an issue brief “Why New Calls to Subvert Commercial Encryption Are Unjustified” arguing “that government efforts to subvert encryption would negatively impact individuals and businesses.” ITIF offered these “key takeaways:”
    • Encryption gives individuals and organizations the means to protect the confidentiality of their data, but it has interfered with law enforcement’s ability to prevent and investigate crimes and foreign threats.
    • Technological advances have long frustrated some in the law enforcement community, giving rise to multiple efforts to subvert commercial use of encryption, from the Clipper Chip in the 1990s to the San Bernardino case two decades later.
    • Having failed in these prior attempts to circumvent encryption, some law enforcement officials are now calling on Congress to invoke a “nuclear option”: legislation banning “warrant-proof” encryption.
    • This represents an extreme and unjustified measure that would do little to take encryption out of the hands of bad actors, but it would make commercial products less secure for ordinary consumers and businesses and damage U.S. competitiveness.
  • The White House released an executive order in which President Donald Trump determined “that the Special Administrative Region of Hong Kong (Hong Kong) is no longer sufficiently autonomous to justify differential treatment in relation to the People’s Republic of China (PRC or China) under the particular United States laws and provisions thereof set out in this order.” Trump further determined “the situation with respect to Hong Kong, including recent actions taken by the PRC to fundamentally undermine Hong Kong’s autonomy, constitutes an unusual and extraordinary threat, which has its source in substantial part outside the United States, to the national security, foreign policy, and economy of the United States…[and] I hereby declare a national emergency with respect to that threat.” The executive order would continue the Administration’s process of changing policy to ensure Hong Kong is treated the same as the PRC.
  • President Donald Trump also signed a bill passed in response to the People’s Republic of China (PRC) passing legislation the United States and others claim will strip Hong Kong of the protections the PRC agreed to maintain for 50 years after the United Kingdom (UK) handed over the city. The “Hong Kong Autonomy Act” “requires the imposition of sanctions on Chinese individuals and banks who are included in an annual State Department list found to be subverting Hong Kong’s autonomy” according to the bill’s sponsor Representative Brad Sherman (D-CA).
  • Representative Stephen Lynch (D-MA), who chairs the House Oversight and Reform Committee’s National Security Subcommittee, sent letters to Apple and Google “after the Office of the Director of National Intelligence (ODNI) and the Federal Bureau of Investigation (FBI) confirmed that mobile applications developed, operated, or owned by foreign entities, including China and Russia, could potentially pose a national security risk to American citizens and the United States” according to his press release. Citing letters the technology companies sent to the Subcommittee, he noted that:
    • Apple confirmed that it does not require developers to submit “information on where user data (if any such data is collected by the developer’s app) will be housed” and that it “does not decide what user data a third-party app can access, the user does.”
    • Google stated that it does “not require developers to provide the countries in which their mobile applications will house user data” and acknowledged that “some developers, especially those with a global user base, may store data in multiple countries.”
    • Lynch is seeking “commitments from Apple and Google to require information from application developers about where user data is stored, and to make users aware of that information prior to downloading the application on their mobile devices.”
  • Minnesota Attorney General Keith Ellison announced a settlement with Frontier Communications that “concludes the three major investigations and lawsuits that the Attorney General’s office launched into Minnesota’s major telecoms providers for deceptive, misleading, and fraudulent practices.” The Office of the Attorney General (OAG) stated
    • Based on its investigation, the Attorney General’s Office alleged that Frontier used a variety of deceptive and misleading practices to overcharge its customers, such as: billing customers more than they were quoted by Frontier’s agents; failing to disclose fees and surcharges in its sales presentations and advertising materials; and billing customers for services that were not delivered.
    • The OAG “also alleged that Frontier sold Minnesotans expensive internet services with so-called “maximum speed” ratings that were not attainable, and that Frontier improperly advertised its service as “reliable,” when in fact it did not provide enough bandwidth for customers to consistently receive their expected service.”
  • The European Data Protection Board (EDPB) issued guidelines “on the criteria of the Right to be Forgotten in the search engines cases under the GDPR” that “focuses solely on processing by search engine providers and delisting requests submitted by data subjects” even though Article 17 of the General Data Protection Regulation applies to all data controllers. The EDPB explained, “This paper is divided into two topics:
    • The first topic concerns the grounds a data subject can rely on for a delisting request sent to a search engine provider pursuant to Article 17.1 GDPR.
    • The second topic concerns the exceptions to the Right to request delisting according to Article 17.3 GDPR.
  • The Australian Competition & Consumer Commission (ACCC) “is seeking views on draft Rules and accompanying draft Privacy Impact Assessment that authorise third parties who are accredited at the ‘unrestricted’ level to collect Consumer Data Right (CDR) data on behalf of another accredited person.” The ACCC explained “[t]his will allow accredited persons to utilise other accredited parties to collect CDR data and provide other services that facilitate the provision of goods and services to consumers.” In a March explanatory statement, the ACCC stated “[t]he CDR is an economy-wide reform that will apply sector-by-sector, starting with the banking sector…[and] [t]he objective of the CDR is to provide individual and business consumers (consumers) with the ability to efficiently and conveniently access specified data held about them by businesses (data holders), and to authorise the secure disclosure of that data to third parties (accredited data recipients) or to themselves.” The ACCC noted “[t]he CDR is regulated by both the ACCC and the Office of the Australian Information Commissioner (OAIC) as it concerns both competition and consumer matters as well as the privacy and confidentiality of consumer data.” Input is due by 20 July.
  • The Office of the Inspector General (OIG) for the Department of the Interior (Interior) found that even though the agency spends $1.4 billion annually on cybersecurity, “[g]uarding against increasing cybersecurity threats” remains one of Interior’s top challenges. The OIG asserted Interior “continues to struggle to implement an enterprise information technology (IT) security program that balances compliance, cost, and risk while enabling bureaus to meet their diverse missions.”
  • In a summary of its larger investigation into “Security over Information Technology Peripheral Devices at Select Office of Science Locations,” the Department of Energy’s Office of the Inspector General (OIG) reported that it “identified weaknesses related to access controls and configuration settings” for peripheral devices (e.g., thumb drives, printers, scanners, and other connected devices) “similar in type to those identified in prior evaluations of the Department’s unclassified cybersecurity program.”
  • The House Homeland Security Committee’s Cybersecurity, Infrastructure Protection, and Innovation Subcommittee Ranking Member John Katko (R-NY) introduced “a comprehensive national cybersecurity improvement package” according to his press release, consisting of these bills:
    • The “Cybersecurity and Infrastructure Security Agency Director and Assistant Directors Act:”  This bipartisan measure takes steps to improve guidance and long-term strategic planning by stabilizing the CISA Director and Assistant Directors positions. Specifically, the bill:
      • Creates a 5-year term for the CISA Director, with a limit of 2 terms. The term of office for the current Director begins on the date the Director began to serve.
      • Elevates the Director to the equivalent of a Deputy Secretary and Military Service Secretaries.
      • Depoliticizes the Assistant Director positions, appointed by the Secretary of the Department of Homeland Security (DHS), categorizing them as career public servants. 
    • The “Strengthening the Cybersecurity and Infrastructure Security Agency Act of 2020:” This measure mandates a comprehensive review of CISA in an effort to strengthen its operations, improve coordination, and increase oversight of the agency. Specifically, the bill:
      • Requires CISA to review how additional appropriations could be used to support programs for national risk management, federal information systems management, and public-private cybersecurity and integration. It also requires a review of workforce structure and current facilities and projected needs. 
      • Mandates that CISA provide a report to the House and Senate Homeland Security Committees within 1 year of enactment. CISA must also provide a report and recommendations to GSA on facility needs. 
      • Requires GSA to provide a review to the Administration and House and Senate Committees on CISA facilities needs within 30 days of the Congressional report. 
    • The “CISA Public-Private Talent Exchange Act:” This bill requires CISA to create a public-private workforce program to facilitate the exchange of ideas, strategies, and concepts between federal and private sector cybersecurity professionals. Specifically, the bill:
      • Establishes a public-private cyber exchange program allowing government and industry professionals to work in one another’s field.
      • Expands existing private outreach and partnership efforts. 
  • The Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) is ordering United States federal civilian agencies “to apply the July 2020 Security Update for Windows Servers running DNS (CVE-2020-1350), or the temporary registry-based workaround if patching is not possible within 24 hours.” CISA stated “[t]he software update addresses a significant vulnerability where a remote attacker could exploit it to take control of an affected system and run arbitrary code in the context of the Local System Account.” CISA Director Christopher Krebs explained “due to the wide prevalence of Windows Server in civilian Executive Branch agencies, I’ve determined that immediate action is necessary, and federal departments and agencies need to take this remote code execution vulnerability in Windows Server’s Domain Name System (DNS) particularly seriously.”
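    • For administrators who genuinely cannot patch within the 24-hour window, the registry-based workaround CISA references is the one Microsoft documented for CVE-2020-1350: cap the size of TCP-based DNS messages the server will accept, then restart the DNS service. A sketch of those documented commands, to be run from an elevated command prompt on the affected Windows Server (verify against Microsoft’s current guidance before applying):

```shell
:: Temporary mitigation for CVE-2020-1350, per Microsoft's guidance:
:: cap the largest TCP-based DNS message the server will accept at 0xFF00 bytes.
reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\DNS\Parameters" /v TcpReceivePacketSize /t REG_DWORD /d 0xFF00 /f

:: Restart the DNS service so the new setting takes effect.
net stop DNS && net start DNS
```

This is a stopgap, not a fix: once the July 2020 security update is installed, the registry value should be removed and the service restarted again.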
  • The United States (US) Department of State has imposed “visa restrictions on certain employees of Chinese technology companies that provide material support to regimes engaging in human rights abuses globally,” an action aimed at Huawei. In its statement, the Department stated “Companies impacted by today’s action include Huawei, an arm of the Chinese Communist Party’s (CCP) surveillance state that censors political dissidents and enables mass internment camps in Xinjiang and the indentured servitude of its population shipped all over China.” The Department claimed “[c]ertain Huawei employees provide material support to the CCP regime that commits human rights abuses.”
  • Earlier in the month, the US Departments of State, Treasury, Commerce, and of Homeland Security issued an “advisory to highlight the harsh repression in Xinjiang.” The agencies explained
    • Businesses, individuals, and other persons, including but not limited to academic institutions, research service providers, and investors (hereafter “businesses and individuals”), that choose to operate in Xinjiang or engage with entities that use labor from Xinjiang elsewhere in China should be aware of reputational, economic, and, in certain instances, legal, risks associated with certain types of involvement with entities that engage in human rights abuses, which could include Withhold Release Orders (WROs), civil or criminal investigations, and export controls.
  • The United Kingdom’s National Cyber Security Centre (NCSC), Canada’s Communications Security Establishment (CSE), the United States’ National Security Agency (NSA) and the United States’ Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) issued a joint advisory stating that a Russian hacking organization has “targeted various organisations involved in COVID-19 vaccine development in Canada, the United States and the United Kingdom, highly likely with the intention of stealing information and intellectual property relating to the development and testing of COVID-19 vaccines.” The agencies named APT29 (also known as ‘the Dukes’ or ‘Cozy Bear’), “a cyber espionage group, almost certainly part of the Russian intelligence services,” as the culprit behind “custom malware known as ‘WellMess’ and ‘WellMail.’”
    • This alert follows May advisories issued by Australia, the US, and the UK on hacking threats related to the pandemic. Australia’s Department of Foreign Affairs and Trade (DFAT) and the Australian Cyber Security Centre (ACSC) issued “Advisory 2020-009: Advanced Persistent Threat (APT) actors targeting Australian health sector organisations and COVID-19 essential services” that asserted “APT groups may be seeking information and intellectual property relating to vaccine development, treatments, research and responses to the outbreak as this information is now of higher value and priority globally.” CISA and NCSC issued a joint advisory for the healthcare sector, especially companies and entities engaged in fighting COVID-19. The agencies stated that they have evidence that Advanced Persistent Threat (APT) groups “are exploiting the COVID-19 pandemic as part of their cyber operations.” In an unclassified public service announcement, the Federal Bureau of Investigation (FBI) and CISA named the People’s Republic of China as a nation waging a cyber campaign against U.S. COVID-19 researchers. The agencies stated they “are issuing this announcement to raise awareness of the threat to COVID-19-related research.”
  • The National Initiative for Cybersecurity Education (NICE) has released for public comment a draft National Institute of Standards and Technology (NIST) Special Publication (SP), with comments due by 28 August. Draft NIST SP 800-181 Revision 1, Workforce Framework for Cybersecurity (NICE Framework), features several updates, including:
    • an updated title to be more inclusive of the variety of workers who perform cybersecurity work,
    • definition and normalization of key terms,
    • principles that facilitate agility, flexibility, interoperability, and modularity,
    • introduction of competencies.
  • Representatives Glenn Thompson (R-PA), Collin Peterson (D-MN), and James Comer (R-KY) sent a letter to the Federal Communications Commission (FCC) “questioning the Commission’s April 20, 2020 Order granting Ligado’s application to deploy a terrestrial nationwide network to provide 5G services.”
  • The European Commission (EC) is asking for feedback on part of its recently released data strategy by 31 July. The EC stated it is aiming “to create a single market for data, where data from public bodies, business and citizens can be used safely and fairly for the common good…[and] [t]his initiative will draw up rules for common European data spaces (covering areas like the environment, energy and agriculture) to:
    • make better use of publicly held data for research for the common good
    • support voluntary data sharing by individuals
    • set up structures to enable key organisations to share data.
  • The United Kingdom’s Parliament is asking for feedback on its legislative proposal to regulate Internet of Things (IoT) devices. The Department for Digital, Culture, Media & Sport explained “the obligations within the government’s proposed legislative framework would fall mainly on the manufacturer if they are based in the UK, or if not based in the UK, on their UK representative.” The Department is also “developing an enforcement approach with relevant stakeholders to identify an appropriate enforcement body to be granted day to day responsibility and operational control of monitoring compliance with the legislation.” The Department also touted the publication of the European Telecommunications Standards Institute’s (ETSI) standard, which establishes a “security baseline for Internet-connected consumer devices and provides a basis for future Internet of Things product certification schemes.”
  • Facebook issued a white paper, titled “CHARTING A WAY FORWARD: Communicating Towards People-Centered and Accountable Design About Privacy,” in which the company states its desire to be involved in shaping a United States privacy law (See below for an article on this). Facebook concluded:
    • Facebook recognizes the responsibility we have to make sure that people are informed about the data that we collect, use, and share.
    • That’s why we support globally consistent comprehensive privacy laws and regulations that, among other things, establish people’s basic rights to be informed about how their information is collected, used, and shared, and impose obligations for organizations to do the same, including the obligation to build internal processes that maintain accountability.
    • As improvements to technology challenge historic approaches to effective communications with people about privacy, companies and regulators need to keep up with changing times.
    • To serve the needs of a global community, on both the platforms that exist now and those that are yet to be developed, we want to work with regulators, companies, and other interested third parties to develop new ways of informing people about their data, empowering them to make meaningful choices, and holding ourselves accountable.
    • While we don’t have all the answers, there are many opportunities for businesses and regulators to embrace modern design methods, new opportunities for better collaboration, and innovative ways to hold organizations accountable.
  • Four Democratic Senators sent Facebook a letter “about reports that Facebook has created fact-checking exemptions for people and organizations who spread disinformation about the climate crisis on its social media platform” following a New York Times article this week on the social media giant’s practices regarding climate disinformation. Even though the social media giant has moved aggressively to take down false and inaccurate COVID-19 posts, climate disinformation lives on the social media platform largely unmolested for a couple of reasons. First, Facebook marks these sorts of posts as opinion and takes the approach that opinions should be judged under an absolutist free speech regime. Moreover, Facebook asserts posts of this sort do not pose any imminent harm and therefore do not need to be taken down. Despite having teams of fact checkers able to vet posts of demonstrably untrue information, Facebook chooses not to do so, most likely because material that elicits strong reactions from users drives engagement that, in turn, drives advertising dollars. Senators Elizabeth Warren (D-MA), Tom Carper (D-DE), Sheldon Whitehouse (D-RI) and Brian Schatz (D-HI) argued “[i]f Facebook is truly ‘committed to fighting the spread of false news on Facebook and Instagram,’ the company must immediately acknowledge in its fact-checking process that the climate crisis is not a matter of opinion and act to close loopholes that allow climate disinformation to spread on its platform.” They posed a series of questions to Facebook CEO Mark Zuckerberg on these practices, requesting answers by 31 July.
  • A Canadian court has found that the Canadian Security Intelligence Service (CSIS) “admittedly collected information in a manner that is contrary to this foundational commitment and then relied on that information in applying for warrants under the Canadian Security Intelligence Service Act, RSC 1985, c C-23 [CSIS Act]” according to a court summary of its redacted decision. The court further stated “[t]he Service and the Attorney General also admittedly failed to disclose to the Court the Service’s reliance on information that was likely collected unlawfully when seeking warrants, thereby breaching the duty of candour owed to the Court.” The court added “[t]his is not the first time this Court has been faced with a breach of candour involving the Service…[and] [t]he events underpinning this most recent breach were unfolding as recommendations were being implemented by the Service and the Attorney General to address previously identified candour concerns.” CSIS was found to have illegally collected and used metadata in a 2016 case concerning its conduct between 2006 and 2016. In response to the most recent ruling, CSIS is vowing to implement a range of reforms. The National Security and Intelligence Review Agency (NSIRA) is pledging the same.
  • The United Kingdom’s National Police Chiefs’ Council (NPCC) announced the withdrawal of “[t]he ‘Digital device extraction – information for complainants and witnesses’ form and ‘Digital Processing Notice’ (‘the relevant forms’) circulated to forces in February 2019 [that] are not sufficient for their intended purpose.” In mid-June, the UK’s data protection authority, the Information Commissioner’s Office (ICO) unveiled its “finding that police data extraction practices vary across the country, with excessive amounts of personal data often being extracted, stored, and made available to others, without an appropriate basis in existing data protection law.” This withdrawal was also due, in part, to a late June Court of Appeal decision.  
  • A range of public interest and advocacy organizations sent a letter to Speaker of the House Nancy Pelosi (D-CA) and House Minority Leader Kevin McCarthy (R-CA) noting “there are intense efforts underway to do exactly that, via current language in the House and Senate versions of the FY2021 National Defense Authorization Act (NDAA) that ultimately seek to reverse the FCC’s recent bipartisan and unanimous approval of Ligado Networks’ regulatory plans.” They urged them to “not endorse efforts by the Department of Defense and its allies to veto commercial spectrum authorizations…[and] [t]he FCC has proven itself to be the expert agency on resolving spectrum disputes based on science and engineering and should be allowed to do the job Congress authorized it to do.” In late April, the FCC’s “decision authorize[d] Ligado to deploy a low-power terrestrial nationwide network in the 1526-1536 MHz, 1627.5-1637.5 MHz, and 1646.5-1656.5 MHz bands that will primarily support Internet of Things (IoT) services.” The agency argued the order “provides regulatory certainty to Ligado, ensures adjacent band operations, including Global Positioning System (GPS), are sufficiently protected from harmful interference, and promotes more efficient and effective use of [the U.S.’s] spectrum resources by making available additional spectrum for advanced wireless services, including 5G.”
  • The European Data Protection Supervisor (EDPS) rendered his opinion on the European Commission’s White Paper on Artificial Intelligence: a European approach to excellence and trust and recommended the following for the European Union’s (EU) regulation of artificial intelligence (AI):
    • applies both to EU Member States and to EU institutions, offices, bodies and agencies;
    • is designed to protect from any negative impact, not only on individuals, but also on communities and society as a whole;
    • proposes a more robust and nuanced risk classification scheme, ensuring any significant potential harm posed by AI applications is matched by appropriate mitigating measures;
    • includes an impact assessment clearly defining the regulatory gaps that it intends to fill; and
    • avoids overlap of different supervisory authorities and includes a cooperation mechanism.
    • Regarding remote biometric identification, the EDPS supports the idea of a moratorium on the deployment, in the EU, of automated recognition in public spaces of human features, not only of faces but also of gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioural signals, so that an informed and democratic debate can take place and until the moment when the EU and Member States have all the appropriate safeguards, including a comprehensive legal framework in place to guarantee the proportionality of the respective technologies and systems for the specific use case.
  • The Bundesamt für Verfassungsschutz (BfV), Germany’s domestic security agency, released a summary of its annual report in which it claimed:
    • The Russian Federation, the People’s Republic of China, the Islamic Republic of Iran and the Republic of Turkey remain the main countries engaged in espionage activities and trying to exert influence on Germany.
    • The ongoing digital transformation and the increasingly networked nature of our society increase the potential for cyber attacks, worsening the threat of cyber espionage and cyber sabotage.
    • The intelligence services of the Russian Federation and the People’s Republic of China in particular carry out cyber espionage activities against German agencies. One of their tasks is to boost their own economies with the help of information gathered by the intelligence services. This type of information-gathering campaign severely threatens the success and development opportunities of German companies.
    • To counteract this threat, Germany has a comprehensive cyber security architecture in place, which is operated by a number of different authorities. The BfV plays a major role in investigating and defending against cyber threats by detecting attacks, attributing them to specific attackers, and using the knowledge gained from this to draw up prevention strategies. The National Cyber Response Centre, in which the BfV plays a key role, was set up to consolidate the co-operation between the competent agencies. The National Cyber Response Centre aims to optimise the exchange of information between state agencies and to improve the co-ordination of protective and defensive measures against potential IT incidents.

Further Reading

  • “Trump confirms cyberattack on Russian trolls to deter them during 2018 midterms” – The Washington Post. In an interview with former George W. Bush speechwriter Marc Thiessen, President Donald Trump confirmed he ordered a widely reported retaliatory attack on the Russian Federation’s Internet Research Agency as a means of preventing interference during the 2018 mid-term election. Trump claimed the attack was the first action the United States took against Russian hacking, even though his predecessor warned Russian President Vladimir Putin to stop such activities and imposed sanctions at the end of 2016. The timing of Trump’s revelation is interesting given the ongoing furor over reports of Russian bounties paid to Taliban fighters for killing Americans that the Trump Administration may have known of but did little or nothing to stop.
  • “Germany proposes first-ever use of EU cyber sanctions over Russia hacking” – Deutsche Welle. Germany is looking to use the European Union’s (EU) cyber sanctions powers against Russia for its alleged 2015 exfiltration of 16 GB of data from the Bundestag’s systems, including from Chancellor Angela Merkel’s office. Germany has been alleging that Fancy Bear (aka APT28) and Russia’s military secret service GRU carried out the attack. Germany has circulated its case for sanctions to other EU nations and EU leadership. In 2017, the European Council declared “[t]he EU diplomatic response to malicious cyber activities will make full use of measures within the Common Foreign and Security Policy, including, if necessary, restrictive measures…[and] [a] joint EU response to malicious cyber activities would be proportionate to the scope, scale, duration, intensity, complexity, sophistication and impact of the cyber activity.”
  • “Wyden Plans Law to Stop Cops From Buying Data That Would Need a Warrant” – VICE. Following on a number of reports that federal, state, and local law enforcement agencies are essentially sidestepping the Fourth Amendment through buying location and other data from people’s smartphones, Senator Ron Wyden (D-OR) is going to draft legislation that would seemingly close what he, and other civil libertarians, are calling a loophole to the warrant requirement.
  • “Amazon Backtracks From Demand That Employees Delete TikTok” – The New York Times. Amazon first instructed its employees on 11 July to remove ByteDance’s app, TikTok, from company devices and then reversed course the same day, claiming the email had been erroneously sent out. The strange episode capped another tumultuous week for ByteDance as the Trump Administration is intensifying pressure in a number of ways on the company, which officials claim is subject to the laws of the People’s Republic of China and hence must share information with the government in Beijing. ByteDance counters that the app marketed in the United States is offered through a subsidiary not subject to PRC law. ByteDance also said it would no longer offer the app in Hong Kong after a change in PRC law extended the PRC’s reach into the former British colony. TikTok was also recently banned in India as part of a larger struggle between India and the PRC. The Democratic National Committee also warned staff about using the app this week.
  • “Is it time to delete TikTok? A guide to the rumors and the real privacy risks.” – The Washington Post. A columnist and security specialist found ByteDance’s app vacuums up information from users, but so do Facebook and other similar apps. They scrutinized TikTok’s privacy policy and where the data went, and they could not say with certainty that it goes to and stays on servers in the US and Singapore.
  • “California investigating Google for potential antitrust violations” – Politico. California Attorney General Xavier Becerra is going to conduct his own investigation of Google, apart from the investigation of the company’s advertising practices being conducted by virtually every other state in the United States. It was unclear why Becerra opted against joining the larger probe launched in September 2019. Of course, the Trump Administration’s Department of Justice is also investigating Google and could file suit as early as this month.
  • “How May Google Fight an Antitrust Case? Look at This Little-Noticed Paper” – The New York Times. In a filing with the Australian Competition and Consumer Commission (ACCC), Google claimed it does not control the online advertising market, pointing to a number of indicia that argue against a monopolistic situation. The company is likely to make the same case to the United States’ government in its antitrust inquiry. However, similar arguments did not gain traction before the European Commission, which levied a €1.49 billion fine for “breaching EU antitrust rules” in March 2019.
  • “Who Gets the Banhammer Now?” – The New York Times. This article examines possible motives for the recent wave of action by social media platforms to police a fraction of the extreme and hateful speech activists and others have been asking them to take down for years. This piece argues that social media platforms are businesses and operate as such, and that expecting them to behave as de facto public squares dedicated to civil political and societal discourse is more or less how we ended up where we are.
  • “TikTok goes tit-for-tat in appeal to MPs: ‘stop political football’” – The Australian. ByteDance is lobbying hard in Canberra to talk Members of Parliament out of possibly banning TikTok as the United States has said it is considering. While ByteDance claims the data collected on users in Australia is sent to the US or Singapore, some experts are arguing that just maintaining and improving the app would necessarily result in some non-People’s Republic of China (PRC) user data making its way back to the PRC. As Australia’s relationship with the PRC has grown more fraught, with allegations PRC hackers infiltrated Parliament and the Prime Minister all but saying PRC hackers were targeting hospitals and medical facilities, the government in Canberra could follow India’s lead and ban the app.
  • “Calls for inquiry over claims Catalan lawmaker’s phone was targeted” – The Guardian. British and Spanish newspapers are reporting that an official in Catalonia who favors separating the region from Spain may have had his smartphone compromised with industrial grade spyware typically used only by law enforcement and counterterrorism agencies. The President of the Parliament of Catalonia Roger Torrent claims his phone was hacked for domestic political purposes, a claim other Catalan leaders have echoed. A spokesperson for the Spanish government said “[t]he government has no evidence that the speaker of the Catalan parliament has been the victim of a hack or theft involving his mobile.” However, the University of Toronto’s CitizenLab, the entity that researched and claimed that Israeli firm NSO Group’s spyware was deployed via WhatsApp to spy on a range of journalists, officials, and dissidents, often by their own governments, confirmed that Torrent’s phone was compromised.
  • “While America Looks Away, Autocrats Crack Down on Digital News Sites” – The New York Times. The Trump Administration’s combative relationship with the media in the United States may be encouraging other nations to crack down on digital media outlets trying to hold those governments to account.
  • “How Facebook Handles Climate Disinformation” – The New York Times. Even though the social media giant has moved aggressively to take down false and inaccurate COVID-19 posts, climate disinformation lives on the social media platform largely unmolested for a couple of reasons. First, Facebook marks these sorts of posts as opinion and takes the approach that opinions should be judged under an absolutist free speech regime. Moreover, Facebook asserts posts of this sort do not pose any imminent harm and therefore do not need to be taken down. Despite having teams of fact checkers able to vet posts of demonstrably untrue information, Facebook chooses not to do so, most likely because material that elicits strong reactions from users drives engagement that, in turn, drives advertising dollars.
  • “Here’s how President Trump could go after TikTok” – The Washington Post. This piece lays out two means the Trump Administration could employ to press ByteDance in the immediate future: use of the May 2019 Executive Order “Securing the Information and Communications Technology and Services Supply Chain” or the Committee on Foreign Investment in the United States process examining ByteDance’s acquisition of Musical.ly, the app that became TikTok. Left unmentioned in this article is the possibility of the Federal Trade Commission (FTC) revisiting its 2019 settlement with ByteDance over violations of the “Children’s Online Privacy Protection Act” (COPPA).
  • “You’re Doomscrolling Again. Here’s How to Snap Out of It.” – The New York Times. If you find yourself endlessly looking through social media feeds, this piece explains why and how you might stop doing so.
  • “UK selling spyware and wiretaps to 17 repressive regimes including Saudi Arabia and China” – The Independent. There are allegations that the British government has ignored its own regulations on selling equipment and systems that can be used for surveillance and spying to other governments with spotty human rights records. Specifically, the United Kingdom (UK) has sold £75m of such equipment to countries that non-governmental organizations (NGOs) rate as “not free.” The claims include nations such as the People’s Republic of China (PRC), the Kingdom of Saudi Arabia, Bahrain, and others. Not surprisingly, NGOs and the minority Labour party are calling for an investigation and changes.
  • “Google sued for allegedly tracking users in apps even after opting out” – c/net. Boies Schiller Flexner filed suit, in what will undoubtedly seek to become a class action, over Google allegedly continuing to track users even when they turned off tracking features. This follows a suit filed by the same firm against Google in June, claiming its browser Chrome still tracks people when they switch to incognito mode.
  • “Secret Trump order gives CIA more powers to launch cyberattacks” – Yahoo! News. It turns out that in addition to signing National Security Presidential Memorandum (NSPM) 13, which revamped and eased offensive cyber operations for the Department of Defense, President Donald Trump signed a presidential finding that has allowed the Central Intelligence Agency (CIA) to launch its own offensive cyber attacks, mainly at Russia and Iran, according to unnamed former United States (US) officials cited in this blockbuster story. Now, the decision to commence an attack is not vetted by the National Security Council; rather, the CIA makes the decision. Consequently, there have been a number of attacks on US adversaries that until now have not been associated with the US. And, the CIA is apparently not informing the National Security Agency or Cyber Command of its operations, raising the risk of US cyber forces working at cross purposes or against one another in cyberspace. Moreover, a recently released report blamed the lax security environment at the CIA for a massive exfiltration of hacking tools released by Wikileaks.
  • “Facebook’s plan for privacy laws? ‘Co-creating’ them with Congress” – Protocol. In concert with the release of a new white paper, Facebook Deputy Chief Privacy Officer Rob Sherman sat for an interview in which he pledged the company’s willingness to work with Congress to co-develop a national privacy law. However, he would not comment on any of the many privacy bills released thus far or the policy contours of a bill Facebook would favor, except for advocating an enhanced notice and consent regime under which people would be better informed about how their data is being used. Sherman also shrugged off suggestions Facebook may not be welcome given its record of privacy violations. Finally, it bears mention that similar efforts by other companies at the state level have not yet succeeded. For example, Microsoft’s efforts in Washington state have not borne fruit in the passage of a privacy law.
  • “Deepfake used to attack activist couple shows new disinformation frontier” – Reuters. We are at the beginning of a new age of disinformation in which fake photographs and video will be used to wage campaigns against nations, causes, and people. An activist and his wife were accused of being terrorist sympathizers by a purported university student who was apparently an elaborate ruse created by someone or some group looking to defame the couple. Small errors gave away the ruse this time, but advances in technology are likely to make detection all the harder.
  • “Biden, billionaires and corporate accounts targeted in Twitter hack” – The Washington Post. Policymakers and security experts were alarmed when the accounts of major figures like Bill Gates and Barack Obama were hacked yesterday by a group seeking to sell bitcoin. They argue Twitter was lucky this time and a more ideologically motivated enemy may seek to cause havoc, say, on the United States’ coming election. A number of experts are claiming the penetration of the platform must have involved internal controls for so many high profile accounts to be taken over at the same time.
  • “TikTok Enlists Army of Lobbyists as Suspicions Over China Ties Grow” – The New York Times. ByteDance’s payments for lobbying services in Washington doubled between the last quarter of 2019 and the first quarter of 2020, as the company has retained more than 35 lobbyists to push back against the Trump Administration’s rhetoric and policy changes. The company is fighting against a floated proposal to ban the TikTok app on national security grounds, which would cut the company off from another of its top markets after India banned it and scores of other apps from the People’s Republic of China. Even if the Administration does not bar use of the app in the United States, the company is facing legislation that would ban its use on federal networks and devices that will be acted upon next week by a Senate committee. Moreover, ByteDance’s acquisition of the app that became TikTok is facing a retrospective review by an inter-agency committee for national security considerations that could result in an unwinding of the deal. Finally, the Federal Trade Commission (FTC) has been urged to review ByteDance’s compliance with a 2019 settlement over allegations that the company violated regulations protecting the privacy of children, which could result in multi-billion dollar liability if wrongdoing is found.
  • “Why Google and Facebook Are Racing to Invest in India” – Foreign Policy. With New Delhi banning 59 apps and platforms from the People’s Republic of China (PRC), two American firms have invested in an Indian giant with an eye toward the nearly 500 million Indians not yet online. Reliance Industries’ Jio Platforms has sold stakes to Google and Facebook worth $4.5 billion and $5.7 billion, respectively, giving them prized positions as the company looks to expand into 5G and other online ventures. This will undoubtedly give a leg up to the United States’ online giants in vying with competitors in the world’s second most populous nation.
  • “‘Outright Lies’: Voting Misinformation Flourishes on Facebook” – ProPublica. In this piece published with First Draft, “a global nonprofit that researches misinformation,” an analysis of the most popular claims made about mail voting shows that many of them are inaccurate or false, thus violating the platform’s terms of service, yet Facebook did nothing to remove them or mark them as inaccurate until this article was being written.
  • “Inside America’s Secretive $2 Billion Research Hub” – Forbes. Using contract information obtained through Freedom of Information requests and interviews, light is shined on the little-known non-profit MITRE Corporation that has been helping the United States government address numerous technological problems since the late 1950s. The article uncovers some of its latest, federally funded projects that are raising eyebrows among privacy advocates: technology to lift people’s fingerprints from social media pictures, technology to scan and copy Internet of Things (IoT) devices from a distance, a scanner to read a person’s DNA, and others.
  • “The FBI Is Secretly Using A $2 Billion Travel Company As A Global Surveillance Tool” – Forbes. In his second blockbuster article in a week, Forbes reporter Thomas Brewster exposes how the United States (US) government is using questionable court orders to gather travel information from the three companies that essentially provide airlines, hotels, and other travel entities with back-end functions for reservations and bookings. The three companies, one of whom, Sabre, is a US multinational, have masses of information on you if you have ever traveled, and US law enforcement agencies, namely the Federal Bureau of Investigation, are using a 1789 statute to obtain orders, which all three companies must obey, for information used to track suspects. Allegedly, this capability has only been used to track terror suspects but will now reportedly be used for COVID-19 tracking.
  • “With Trump CIA directive, the cyber offense pendulum swings too far” – Yahoo! News. Former United States (US) National Coordinator for Security, Infrastructure Protection, and Counter-terrorism Richard Clarke argues against the Central Intelligence Agency (CIA) having carte blanche in conducting cyber operations without the review or input of other federal agencies. He suggests that the CIA in particular, and agencies in general, tend to push their authority to the extreme, which in this case could lead to incidents and lasting precedents in cyberspace that may haunt the US. Clarke also intimated that it may have been the CIA and not Israel that launched cyber attacks on infrastructure facilities in Tehran this month and last.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Europe’s Highest Court Strikes Down Privacy Shield

The agreement that has been allowing US companies to transfer the personal data of EU residents to the US was found to be invalid under EU law. The EU’s highest court did, however, seem to indicate that standard contractual clauses, a frequently used means to transfer data, may be acceptable.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

In the second major ruling from the European Union (EU) this week, earlier today, its highest court invalidated the agreement that has allowed multinational corporations and others to transfer the personal data of EU citizens to the United States (US) for commercial purposes since 2016. The court did not, however, find standard contractual clauses, the means by which many such transfers occur, to be illegal. This is the second case an Austrian privacy activist has brought alleging that Facebook was transferring his personal data to the US in violation of European law because US law, especially its surveillance programs, resulted in less protection and fewer rights. The first case resulted in the previous transfer agreement being found illegal, and now this case has resulted in much the same outcome. The import of this ruling is not immediately clear.

Maximillian Schrems filed a complaint against Facebook with the Data Protection Commission (DPC) in 2013, alleging that the company’s transfer of his personal data violated his rights under EU law because of the mass US surveillance revealed by former National Security Agency (NSA) contractor Edward Snowden. Ultimately, this case resulted in a 2015 Court of Justice of the European Union (CJEU) ruling that invalidated the Safe Harbor agreement under which the personal data of EU residents was transferred to the US by commercial concerns. The EU and US executed a follow-on agreement, the EU-US Privacy Shield, that was designed to address some of the problems the CJEU turned up, and the US passed a law, the “Judicial Redress Act of 2015” (P.L. 114-126), to provide EU citizens a way to exercise their EU rights in US courts via the “Privacy Act of 1974.”

However, Schrems continued and soon sought to challenge the legality of the European Commission’s signing off on the Privacy Shield agreement, the adequacy decision issued in 2016, and also the use of standard contractual clauses (SCC) by companies for the transfer of personal data to the US. The European Data Protection Board (EDPB) explained in a recent decision on Denmark’s SCC that

  • According to Article 28(3) General Data Protection Regulation (GDPR), the processing by a data processor shall be governed by a contract or other legal act under Union or Member State law that is binding on the processor with regard to the controller, setting out a set of specific aspects to regulate the contractual relationship between the parties. These include the subject-matter and duration of the processing, its nature and purpose, the type of personal data and categories of data subjects, among others.
  • Under Article 28(6) GDPR, without prejudice to an individual contract between the data controller and the data processor, the contract or the other legal act referred in paragraphs (3) and (4) of Article 28 GDPR may be based, wholly or in part on SCCs.

In a summary of its decision, the CJEU explained

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

The CJEU found

  • Regarding the level of protection required in respect of such a transfer, the Court holds that the requirements laid down for such purposes by the GDPR concerning appropriate safeguards, enforceable rights and effective legal remedies must be interpreted as meaning that data subjects whose personal data are transferred to a third country pursuant to standard data protection clauses must be afforded a level of protection essentially equivalent to that guaranteed within the EU by the GDPR, read in the light of the Charter. In those circumstances, the Court specifies that the assessment of that level of protection must take into consideration both the contractual clauses agreed between the data exporter established in the EU and the recipient of the transfer established in the third country concerned and, as regards any access by the public authorities of that third country to the data transferred, the relevant aspects of the legal system of that third country.
  • Regarding the supervisory authorities’ obligations in connection with such a transfer, the Court holds that, unless there is a valid Commission adequacy decision, those competent supervisory authorities are required to suspend or prohibit a transfer of personal data to a third country where they take the view, in the light of all the circumstances of that transfer, that the standard data protection clauses are not or cannot be complied with in that country and that the protection of the data transferred that is required by EU law cannot be ensured by other means, where the data exporter established in the EU has not itself suspended or put an end to such a transfer.

The CJEU stated “the limitations on the protection of personal data arising from the domestic law of the US on the access and use by US public authorities of such data transferred from the EU to that third country, which the Commission assessed in [its 2016 adequacy decision], are not circumscribed in a way that satisfies requirements that are essentially equivalent to those required under EU law, by the principle of proportionality, in so far as the surveillance programmes based on those provisions are not limited to what is strictly necessary.”

The CJEU found the process put in place by the US government to handle complaints inadequate. The 2016 Privacy Shield resulted in the creation of an Ombudsperson post to which EU citizens could submit complaints. This position is currently held by Under Secretary of State for Economic Growth, Energy, and the Environment Keith Krach.

The CJEU stated “the Ombudsperson mechanism referred to in that decision does not provide data subjects with any cause of action before a body which offers guarantees substantially equivalent to those required by EU law, such as to ensure both the independence of the Ombudsperson provided for by that mechanism and the existence of rules empowering the Ombudsperson to adopt decisions that are binding on the US intelligence services.”

The decision on SCCs is more ambiguous, as it is not entirely clear under what circumstances they may be used. In its decision, the CJEU made clear that SCCs are not necessarily sufficient under EU law:

although there are situations in which, depending on the law and practices in force in the third country concerned, the recipient of such a transfer is in a position to guarantee the necessary protection of the data solely on the basis of standard data protection clauses, there are others in which the content of those standard clauses might not constitute a sufficient means of ensuring, in practice, the effective protection of personal data transferred to the third country concerned. That is the case, in particular, where the law of that third country allows its public authorities to interfere with the rights of the data subjects to which that data relates.

Reaction from the parties was mixed, particularly on what the CJEU’s ruling means for SCCs even though there was agreement that the Privacy Shield will soon no longer govern data transfers from the EU to the US.

The DPC issued a statement in which it asserted

Today’s judgment provides just that, firmly endorsing the substance of the concerns expressed by the DPC (and by the Irish High Court) to the effect that EU citizens do not enjoy the level of protection demanded by EU law when their data is transferred to the United States. In that regard, while the judgment most obviously captures Facebook’s transfers of data relating to Mr Schrems, it is of course the case that its scope extends far beyond that, addressing the position of EU citizens generally.

The DPC added

So, while in terms of the points of principle in play, the Court has endorsed the DPC’s position, it has also ruled that the SCCs transfer mechanism used to transfer data to countries worldwide is, in principle, valid, although it is clear that, in practice, the application of the SCCs transfer mechanism to transfers of personal data to the United States is now questionable. This is an issue that will require further and careful examination, not least because assessments will need to be made on a case by case basis.

At a press conference, EC Vice-President Věra Jourová claimed the “CJEU declared the Privacy Shield decision invalid, but also confirmed that the standard contractual clauses remain a valid tool for the transfer of personal data to processors established in third countries.” She asserted “[t]his means that the transatlantic data flows can continue, based on the broad toolbox for international transfers provided by the GDPR, for instance binding corporate rules or SCCs.” Jourová contended with regard to next steps, “[w]e are not starting from scratch…[and] [o]n the contrary, the Commission has already been working intensively to ensure that this toolbox is fit for purpose, including the modernisation of the Standard Contractual Clauses.” Jourová stated “we will be working closely with our American counterparts, based on today’s ruling.”

European Commissioner for Justice Didier Reynders stated

  • First, I welcome the fact that the Court confirmed the validity of our Decision on SCCs.
    • We have been working already for some time on modernising these clauses and ensuring that our toolbox for international data transfers is fit for purpose.
    • Standard Contractual Clauses are in fact the most used tool for international transfers of personal data and we wanted to ensure they can be used by businesses and are fully in line with EU law.
    • We are now advanced with this work and we will of course take into account the requirements of the judgment.
    • We will work with the European Data Protection Board, as well as the 27 EU Member States. It will be very important to start the process to have a formal approval to modernise the Standard Contractual Clauses as soon as possible. We have been in an ongoing process about such a modernisation for some time, but with an attention to the different elements of the decision of the Court today.
  • My second point: The Court has invalidated the Privacy Shield. We have to study the judgement in detail and carefully assess the consequences of this invalidation.

Reynders stated that “[i]n the meantime, transatlantic data flows between companies can continue using other mechanisms for international transfers of personal data available under the GDPR.”

In a statement, US Secretary of Commerce Wilbur Ross said

While the Department of Commerce is deeply disappointed that the court appears to have invalidated the European Commission’s adequacy decision underlying the EU-U.S. Privacy Shield, we are still studying the decision to fully understand its practical impacts.

Ross continued

We have been and will remain in close contact with the European Commission and European Data Protection Board on this matter and hope to be able to limit the negative consequences to the $7.1 trillion transatlantic economic relationship that is so vital to our respective citizens, companies, and governments. Data flows are essential not just to tech companies—but to businesses of all sizes in every sector. As our economies continue their post-COVID-19 recovery, it is critical that companies—including the 5,300+ current Privacy Shield participants—be able to transfer data without interruption, consistent with the strong protections offered by Privacy Shield.

The Department of Commerce stated it “will continue to administer the Privacy Shield program, including processing submissions for self-certification and re-certification to the Privacy Shield Frameworks and maintaining the Privacy Shield List.” The agency added “[t]oday’s decision does not relieve participating organizations of their Privacy Shield obligations.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by harakir from Pixabay

FTC Settles A Pair of Privacy Shield Cases

The FTC imposes 20-year commitments on two companies that were not meeting their requirements for transferring the personal data of EU residents out of Europe.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

The Federal Trade Commission (FTC) has announced its second Privacy Shield violation settlement in the last few weeks that will impose obligations over the next 20 years so long as the United States (US) companies choose to transfer and process the data of European Union (EU) citizens and residents. The 2016 agreement requires US entities to self-certify compliance, subject to enforcement by the FTC for most companies, and violations are punished under Section 5 of the FTC Act’s prohibition against deceptive practices. The agreement requires a range of practices for those companies that choose to participate, including heeding standards for notice, consent, accountability for onward transfers, data security, data integrity, and purpose limitation. A failure to fully comply represents a violation subject to enforcement.

In the settlement announced this week, the FTC alleged Ortho-Clinical Diagnostics, Inc. claimed it “participated in the Privacy Shield framework and complied with the program’s requirements, even though the company had allowed its certification to lapse in 2018,” according to the agency’s press release. The FTC added

After Ortho’s certification lapsed, the Department of Commerce warned the company to either remove the claims or take steps to recertify its participation in the Privacy Shield program, which the company failed to do, the complaint alleges. The FTC also alleges Ortho violated the Privacy Shield principles by failing to verify annually that statements about its Privacy Shield practices were accurate. In addition, it also failed to comply with a Privacy Shield requirement that it affirm that the company would continue to apply Privacy Shield protections to personal information collected while participating in the program, according to the complaint.

In a Consent Agreement set to run for 20 years, Ortho-Clinical Diagnostics, Inc. “whether acting directly or indirectly, in connection with the advertising, marketing, promotion, offering for sale, or sale of any product or service, must affirm to the Department of Commerce, within ten (10) days after the effective date of this Order and on an annual basis thereafter for as long as it retains such information, that it will

1. continue to apply the EU-U.S. Privacy Shield framework principles to the personal information it received while it participated in the Privacy Shield; or

2. protect the information by another means authorized under EU (for the EU-U.S. Privacy Shield framework) or Swiss (for the Swiss-U.S. Privacy Shield framework) law, including by using a binding corporate rule or a contract that fully reflects the requirements of the relevant standard contractual clauses adopted by the European Commission

If the company decides not to participate in the Privacy Shield, it must delete all data within 10 days.

The FTC meted out a stiffer penalty to NTT Global Data Centers, Inc., formerly known as RagingWire Data Centers, for Privacy Shield compliance violations. The company “must hire a third-party assessor to verify that it is adhering to its Privacy Shield promises if it plans to participate in the framework” per the FTC’s press release. The FTC explained

In a complaint filed in November 2019, the FTC alleged that, between January 2017 and October 2018, RagingWire claimed in its online privacy policy and marketing materials that the company participated in the Privacy Shield framework and complied with the program’s requirements. In fact, the FTC alleged, the company’s certification lapsed in January 2018 and it failed to comply with certain Privacy Shield requirements while it was a participant in the program. The FTC also alleged that, upon allowing its certification to lapse, RagingWire failed to take the necessary steps to confirm that it would comply with its continuing obligations relating to data received pursuant to the framework.

In the 20-year Consent Order with NTT Global Data Centers, the FTC stipulated

no later than 120 days after the effective date of this Order and for so long as Respondent is a self-certified participant in Privacy Shield, Respondent and its officers, agents, employees, and attorneys, and all other persons in active concert or participation with any of them, who receive actual notice of this Order, whether acting directly or indirectly, in connection with the advertisement, marketing, promotion, offering for sale, or sale of any product or service, shall obtain an annual outside compliance review from an independent third-party assessor approved by the Associate Director for the Division of Enforcement of the Bureau of Consumer Protection at the Federal Trade Commission, that demonstrates that the assertions Respondent makes about its Privacy Shield practices are true, and that those Privacy Shield practices have been implemented as represented and in accord with the Privacy Shield Principles. (emphasis added).

NTT Global Data Centers must also

1. continue to apply the EU-U.S. Privacy Shield framework principles to the personal information it received while it participated in the Privacy Shield; or

2. protect the information by another means authorized under EU (for the EU-U.S. Privacy Shield framework) or Swiss (for the Swiss-U.S. Privacy Shield framework) law, including by using a binding corporate rule or a contract that fully reflects the requirements of the relevant standard contractual clauses adopted by the European Commission

The FTC split over the Consent Order against NTT Global Data Centers, with Commissioner Rohit Chopra dissenting for these reasons:

  • American businesses that participate in the EU-U.S. Privacy Shield Framework should not have to compete with those that break their privacy promises.
  • The FTC charged a data center company with violating their Privacy Shield commitments, but our proposed settlement does not even attempt to adequately remedy the harm to the market.
  • The evidence in the record raises serious concerns that customers looking to follow the law relied on the company’s representations and may be locked into long-term contracts.
  • A quick settlement with a small firm for an inadvertent mistake may be appropriate, but it is inadequate for a dishonest, large firm violating a core pillar of Privacy Shield.
  • We must consider seeking additional remedies, including rights to renegotiate contracts, disgorgement of ill-gotten revenue and data, and notice and redress for customers.

Chair Joe Simons and Commissioners Noah Joshua Phillips and Christine Wilson argued in their majority statement that

Commissioner Chopra would ask us to reject a settlement that protects consumers and furthers our Privacy Shield goals, to instead continue litigation during an ongoing pandemic. There is no need and doing so would unnecessarily divert resources from other important matters, including investigations of other substantive violations of Privacy Shield. We do not support moving the goalposts in this manner and for this reason vote to accept the settlement, which not just accords with but exceeds the relief the Commission unanimously sought to obtain at the outset of the case.

Despite these and other Privacy Shield enforcement actions, it is likely EU officials will still find US enforcement lacking. The European Data Protection Board (EDPB or Board) released its most recent annual assessment of the Privacy Shield in December 2019 and again found both the agreement itself and its implementation wanting. There was some overlap between the concerns of the EDPB and the European Commission (EC) as detailed in the EC’s recently released third assessment of the Privacy Shield, but the EDPB discusses areas that were either omitted from or downplayed in the EC’s report. The EDPB’s authority with respect to the Privacy Shield is persuasive rather than binding, but it carries weight with the EC, and its concerns as detailed in previous annual reports have pushed the EC to demand changes, including, but not limited to, pressing the Trump Administration to nominate Members to the Privacy and Civil Liberties Oversight Board (PCLOB) and to appoint a new Ombudsperson to handle complaints about how the US Intelligence Community handles the personal data of EU citizens.

In January 2019, in the “EU-U.S. Privacy Shield – Second Annual Joint Review,” the EDPB noted some progress by the US in implementing the EU-U.S. Privacy Shield. However, the EU’s Data Protection Authorities (DPA) and the EDPB took issue with a number of shortcomings in US implementation, many of which had been noted in previous analyses of US efforts to ensure that US companies adhere to the Privacy Shield’s principles. Notably, the EDPB found problems with the assurances provided by the US government regarding the collection and use of personal data by national security and law enforcement agencies. The EDPB also found problems with how the Department of Commerce and FTC are enforcing the Privacy Shield in the US against commercial entities.


Photo by ipse dixit on Unsplash

Further Reading and Other Developments (11 July)


Other Developments

  • The United States District Court for the District of Maine denied a motion by a number of telecommunications trade associations to enjoin enforcement of a new Maine law instituting privacy practices for internet service providers (ISP) in the state that limited information collection and processing. The plaintiffs claimed the 2017 repeal of the Federal Communications Commission’s (FCC) 2016 ISP Privacy Order preempted states from implementing their own privacy rules for ISPs. In its decision, the court denied the plaintiffs’ motion and will proceed to decide the merits of the case.
  • The European Data Protection Board (EDPB) has debuted a “One-Stop-Shop” register “containing decisions taken by national supervisory authorities following the One-Stop-Shop cooperation procedure (Art. 60 GDPR).” The EDPB explained “[u]nder the GDPR, Supervisory Authorities have a duty to cooperate on cases with a cross-border component to ensure a consistent application of the regulation – the so-called one-stop-shop (OSS) mechanism…[and] [u]nder the OSS, the Lead Supervisory Authority (LSA) is in charge of preparing the draft decisions and works together with the concerned SAs to reach consensus.” Hence this new repository will contain the decisions on which EU data protection authorities have cooperated in addressing alleged GDPR violations that reach across the borders of EU nations.
  • The chair of the House Energy and Commerce Committee and three subcommittee chairs wrote Facebook, Google, and Twitter asking that the companies “provide the Committee with monthly reports similar in scope to what you are providing the European Commission regarding your COVID-19 disinformation efforts as they relate to United States users of your platform.” They are also asking that the companies brief them and staff on 22 July on these efforts. Given the Committee’s focus on disinformation, it is quite possible these monthly reports and the briefing could be the basis of more hearings and/or legislation. Chair Frank Pallone, Jr. (D-NJ), Oversight and Investigations Subcommittee Chair Diana DeGette (D-CO), Communications and Technology Subcommittee Chair Mike Doyle (D-PA), and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) signed the letters.
  • Reports indicate the Federal Trade Commission (FTC) and Department of Justice (DOJ) are reviewing the February 2019 $5.7 million settlement between the FTC and TikTok for violating the Children’s Online Privacy Protection Act (COPPA). In May 2020, a number of public advocacy groups filed a complaint with the FTC, asking whether the agency has “complied with the consent decree.” If TikTok has violated the order, it could face huge fines as the FTC and DOJ could seek a range of financial penalties. This seems to be another front in the escalating conflict between the United States and the People’s Republic of China.
  • Tech Inquiry, an organization that “seek[s] to combat abuses in the tech industry through coupling concerned tech workers with relevant members of civil society” revealed “an in-depth analysis of all public US federal (sub)contracting data over the last four and a half years to estimate the rankings of tech companies, both in and out of Silicon Valley, as contractors with the military, law enforcement, and diplomatic arms of the United States.” Tech Inquiry claimed “[o]ur analysis shows a diversity of contracting postures (see Tables 2 and 3), not a systemic divide from Washington. Within a substantial list of namebrand tech companies, only Facebook, Apple, and Twitter look to be staying out of major military and law enforcement contracts.”
  • The United States Secret Service announced the formation of a new Cyber Fraud Task Force (CFTF) which merges “its Electronic Crimes Task Forces (ECTFs) and Financial Crimes Task Forces (FCTFs) into a single unified network.” The rationale given for the merger is “the line between cyber and financial crimes has steadily blurred, to the point today where the two – cyber and financial crimes – cannot be effectively disentangled.”
  • The United States Election Assistance Commission (EAC) held a virtual public hearing, “Lessons Learned from the 2020 Primary Elections” “to discuss the administration of primary elections during the coronavirus pandemic.”
  • The National Council of Statewide Interoperability Coordinators (NCSWIC), a program administered by the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), released its “NCSWIC Strategic Plan and Implementation Guide,” “a stakeholder-driven, multi-jurisdictional, and multi-disciplinary plan to enhance interoperable and emergency communications.” NCSWIC contended “[t]he plan is a critical mid-range (three-year) tool to help NCSWIC and its partners prioritize resources, strengthen governance, identify future investments, and address interoperability gaps.”
  • Access Now is pressing “video conferencing platforms” other than Zoom to issue “regular transparency reports… clarifying exactly how they protect personal user data and enforce policies related to freedom of expression.”

Further Reading

  • “India bans 59 Chinese apps, including TikTok and WeChat, after deadly border clash” – South China Morning Post. In a seeming extension of the military skirmish India and the People’s Republic of China (PRC) engaged in, a number of PRC apps have been banned by the Indian government, raising the question of whether there will be further escalation between the world’s two most populous nations. India is TikTok’s biggest market, with more than 120 million users in the South Asian country, and a range of other apps and platforms also have millions of users there. Most of the smartphones used in India are made by PRC entities. Moreover, if New Delhi joins Washington’s war on Huawei, ZTE, and other PRC companies, the cumulative effect could significantly affect the PRC’s global technological ambitions.
  • “Huawei data flows under fire in German court case” – POLITICO. A former Huawei employee in Germany has sued the company, alleging violations of the General Data Protection Regulation (GDPR) through the company’s use of standard contractual clauses. This person requested the data the company had collected on him and the reasons for doing so. Huawei claimed it had deleted the data. A German court’s decision that Huawei had violated the GDPR is being appealed. However, the case raises some bigger issues, including growing unease within the European Union that People’s Republic of China firms may be illegally transferring and processing EU citizens’ data, and a case before Europe’s highest court in which the legality of standard contractual clauses may be determined as early as this month.
  • “Deutsche Telekom under pressure after reports on Huawei reliance” – POLITICO. A German newspaper reported on confidential documents showing that Deutsche Telekom deepened its relationship with Huawei as the United States’ government was pressuring its allies and other nations to stop using the equipment and services of the company. The German telecommunications company denied the claims, and a number of German officials expressed surprise and dismay, opining that the government of Chancellor Angela Merkel should act more swiftly to implement legislation to secure Germany’s networks.
  • “Inside the Plot to Kill the Open Technology Fund” – Vice. According to critics, the Trump Administration’s remaking of the United States (US) Agency for Global Media (USAGM) is threatening the mission and effectiveness of the Open Technology Fund (OTF), a US government non-profit designed to help dissidents and endangered populations throughout the world. The OTF has funded a number of open technology projects, including the Signal messaging app, but the new USAGM head, Michael Pack, is pressing for closed source technology.
  • “How Police Secretly Took Over a Global Phone Network for Organized Crime” – Vice. European law enforcement agencies penetrated and compromised an encrypted messaging service in Europe, leading to a number of arrests and seizures of drugs. Encrochat had billed itself as completely secure, but hackers with the French government broke into the system and laid bare the details of numerous crimes. And this is only the latest encrypted app marketed to criminals, meaning others will soon step into the void created when Encrochat shut down.
  • “Virus-Tracing Apps Are Rife With Problems. Governments Are Rushing to Fix Them.” – The New York Times. In numerous nations around the world, the rush to design and distribute contact tracing apps to fight COVID-19 has resulted in a host of problems predicted by information technology professionals and privacy, civil liberties and human rights advocates. Some apps collect too much information, many are not secure, and some do not seem to perform their intended tasks. Moreover, without mass adoption, the utility of an app is questionable at best. Some countries have sought to improve and perfect their apps in response to criticism, but others are continuing to use and even mandate their citizens and residents use them.
  • “Hong Kong Security Law Sets Stage for Global Internet Fight” – The New York Times. After the People’s Republic of China (PRC) passed a new law that strips many of the protections Hong Kong enjoyed, technology companies are caught in a bind, for now Hong Kong may well start demanding they hand over data on people living in Hong Kong or employees could face jail time. Moreover, the data demands made of companies like Google or Facebook could pertain to people anywhere in the world. Companies that comply with Beijing’s wishes would likely face turbulence in Washington and vice versa. TikTok said it would withdraw from Hong Kong altogether.


Image by Gino Crescoli from Pixabay

EDPB Opines Encryption Ban Would Endanger A Nation’s Compliance with GDPR

As the US and others call on technology companies to develop the means to crack encrypted communications, an EU entity argues any nation with such a law would likely not meet the GDPR’s requirements.


In a response to a letter from a Member of the European Parliament, the European Data Protection Board (EDPB) articulated its view that any nation that implements an “encryption ban” would endanger its compliance with the General Data Protection Regulation (GDPR), possibly resulting in companies domiciled in those countries not being able to transfer and process the personal data of EU citizens. However, as always, it bears noting that the EDPB’s view may not carry the day with the European Commission, Parliament, and courts.

The EDPB’s letter comes amidst another push by the Trump Administration, Republican allies in Congress, and other nations to have technology companies develop workarounds or backdoors for their end-to-end encrypted devices, apps, and systems. The proponents of this change claim online child sexual predators, terrorists, and other criminals are using products and services like WhatsApp, Telegram, and iPhones to defeat legitimate, targeted government surveillance and enforcement. They reason that unless technology companies abandon their unnecessarily absolutist position and work towards a technological solution, the number of bad actors communicating in ways that cannot be broken (aka “going dark”) will increase, allowing for greater crime and wrongdoing.

On the other side of the issue, technology companies, civil liberties and privacy experts, and computer scientists argue that any weakening of or backdoors to encryption will eventually be stolen and exposed, making it easier for criminals to hack, steal, and exfiltrate. They assert the internet and digital age are built on secure communications and threatening this central feature would wreak havoc beyond the crimes the US and other governments are seeking to prevent.

The EDPB stated

Any ban on encryption or provisions weakening encryption would undermine the GDPR obligations on the concerned controllers and processors for an effective implementation of both data protection principles and the appropriate technical and organisational measures. Similar considerations apply to transfers to controllers or processors in any third countries adopting such bans or provisions. Security measures are therefore specifically mentioned among the elements the European Commission must take into account when assessing the adequacy of the level of protection in a third country. In the absence of such a decision, transfers are subject to appropriate safeguards or may be based on derogations; in any case the security of the personal data has to be ensured at all times.

The EDPB opined “that any encryption ban would seriously undermine compliance with the GDPR.” The EDPB continued, “[m]ore specifically, whatever the instrument used, it would represent a major obstacle in recognising a level of protection essentially equivalent to that ensured by the applicable data protection law in the EU, and would seriously question the ability of the concerned controllers and processors to comply with the security obligation of the regulation.”
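
To give a loose sense of the kind of “technical and organisational measures” at issue, below is a minimal sketch, using only Python’s standard library, of one such measure: pseudonymising a direct identifier with a keyed hash before data leaves the exporter. The function name, field names, and key-handling setup are hypothetical illustrations, not anything prescribed by the GDPR or the EDPB.

```python
import hashlib
import hmac
import secrets

# Hypothetical setup: a secret key held only by the EU data exporter.
key = secrets.token_bytes(32)

def pseudonymize(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with an HMAC-SHA256 keyed hash.

    Without the key, the recipient (or a third-country authority
    compelling access to the transferred data) cannot reverse the
    pseudonym back to the original identifier.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Illustrative record: the direct identifier is replaced before transfer.
record = {"email": "data.subject@example.eu", "purchase": "book"}
safe_record = {
    "user": pseudonymize(record["email"], key),
    "purchase": record["purchase"],
}
```

The same identifier always maps to the same pseudonym under a given key, so the exporter can still link records, while the transferred data on its own no longer reveals the identifier. This is only one measure among many, and, as the EDPB’s reasoning suggests, such measures lose their force if the law of the receiving country mandates that they be weakened.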

The EDPB’s view is being articulated at a time when, as noted, a number of nations led by the United States (US) continue to press technology companies to allow them access to communications, apps, platforms, and devices that are encrypted. Last year, the US, United Kingdom, Australia, New Zealand, and Canada (the so-called Five Eyes nations) met, and in one of the resulting communiques the ministers asserted that

We are concerned where companies deliberately design their systems in a way that precludes any form of access to content, even in cases of the most serious crimes. This approach puts citizens and society at risk by severely eroding a company’s ability to identify and respond to the most harmful illegal content, such as child sexual exploitation and abuse, terrorist and extremist material and foreign adversaries’ attempts to undermine democratic values and institutions, as well as law enforcement agencies’ ability to investigate serious crime.

The five nations contended that “[t]ech companies should include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can obtain access to data in a readable and usable format.” The Five Eyes also claimed that “[t]hose companies should also embed the safety of their users in their system designs, enabling them to take action against illegal content…[and] [a]s part of this, companies and Governments must work together to ensure that the implications of changes to their services are well understood and that those changes do not compromise public safety.”

The Five Eyes applauded “approaches like Mark Zuckerberg’s public commitment to consulting Governments on Facebook’s recent proposals to apply end-to-end encryption to its messaging services…[and] [t]hese engagements must be substantive and genuinely influence design decisions.”

The Five Eyes added:

We share concerns raised internationally, inside and outside of government, about the impact these changes could have on protecting our most vulnerable citizens, including children, from harm. More broadly, we call for detailed engagement between governments, tech companies, and other stakeholders to examine how proposals of this type can be implemented without negatively impacting user safety, while protecting cyber security and user privacy, including the privacy of victims.

In October 2019, in an open letter to Facebook CEO Mark Zuckerberg, US Attorney General William P. Barr, United Kingdom Home Secretary Priti Patel, Australia’s Minister for Home Affairs Peter Dutton, and then acting US Homeland Security Secretary Kevin McAleenan asked “that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services without ensuring that there is no reduction to user safety and without including a means for lawful access to the content of communications to protect our citizens.” In Facebook’s December 2019 response, Facebook Vice President and WhatsApp Head Will Cathcart and Facebook Vice President and Messenger Head Stan Chudnovsky stated “[c]ybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere…[and] [t]he ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.”

However, one of the Five Eyes nations has already taken legislative action to force technology companies and individuals to cooperate with law enforcement investigations in ways that could threaten encryption. In December 2018, Australia enacted the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). As the Office of Australia’s Information Commissioner (OAIC) wrote of TOLA, “[t]he powers permitted under the Act have the potential to significantly weaken important privacy rights and protections under the Privacy Act…[and] [t]he encryption technology that can obscure criminal communications and pose a threat to national security is the same technology used by ordinary citizens to exercise their legitimate rights to privacy.”

In a related development, this week, Australia’s Independent National Security Legislation Monitor (INSLM) issued its report on TOLA. The Parliamentary Joint Committee on Intelligence and Security had requested that the INSLM review the statute, and the INSLM engaged in a lengthy review, including input from the public. As explained in the report’s preface, the “INSLM independently reviews the operation, effectiveness and implications of national security and counter-terrorism laws; and considers whether the laws contain appropriate protections for individual rights, remain proportionate to terrorism or national security threats, and remain necessary.”

INSLM claimed:

In this report I reject the notion that there is a binary choice that must be made between the effectiveness of agencies’ surveillance powers in the digital age on the one hand and the security of the internet on the other. Rather, I conclude that what is necessary is a law which allows agencies to meet technological challenges, such as those caused by encryption, but in a proportionate way and with proper rights protection. Essentially this can be done by updating traditional safeguards to meet those same technological challenges – notably, those who are trusted to authorise intrusive search and surveillance powers must be able to understand the technological context in which those powers operate, and their consequences. If, but only if, the key recommendations I set out in this report in this regard are adopted, TOLA will be such a law.

INSLM stated “[t]he essential effects of TOLA are as follows:

a. Schedule 1 gives police and intelligence agencies new powers to agree or require significant industry assistance from communications providers.

b. Schedules 2, 3 and 4 update existing powers and, in some cases, extend them to new agencies.

c. Schedule 5 gives the Australian Security Intelligence Organisation (ASIO) significant new powers to seek and receive both voluntary and compulsory assistance.”

INSLM found:

  • In relation to Schedule 1, for the reasons set out in greater detail in the report, Technical Assistance Notices (TANs) and Technical Capability Notices (TCNs) should be authorised by a body which is independent of the issuing agency or government. These are powers designed to compel a Designated Communications Provider (DCP) to reveal private information or data of its customers, and therefore the usual practice of independent authorisation should apply.
  • I am satisfied that the computer access warrant and associated powers conferred by Schedule 2 are both necessary and proportionate, subject to some amendments.
  • I am generally satisfied that the powers conferred by Schedules 3 and 4 are both necessary and proportionate, but there are some matters that should be addressed and further monitored.
  • I have concluded that Schedule 5 should be amended to limit its breadth and clarify its scope.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by OpenClipart-Vectors from Pixabay

European Commission Releases Its First Review of the GDPR

While emphasizing the positive developments, the EC calls for more work to help nations and DPAs better and more uniformly enforce the law.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

The European Commission submitted its two-year review of the General Data Protection Regulation (GDPR), which took effect across the European Union in May 2018. This review was required two years after the new cross-border data protection structure took effect, and the GDPR further requires reviews every four years thereafter. It bears noting that the EC opted to exceed its statutory mandate in the report by covering more than international transfers of EU personal data and how well nations and data protection authorities are using coordination and cooperation mechanisms to ensure uniform, effective enforcement across the EU.

Overall, the EC touts what it frames as successes of the GDPR and calls for EU member states, data protection authorities (DPA), and the European Data Protection Board (EDPB or Board) to address and resolve a host of ongoing issues that make enforcement of and compliance with the GDPR more difficult. For example, the EC flags the resources and independence EU nations are providing their DPAs as a major issue, as many of the regulatory bodies lack the funding, technical capability, and personnel to fully regulate the data rights and obligations enshrined in the GDPR. Another issue the EC discusses at some length is the differing national data protection laws, many of which conflict with or do not fully implement the GDPR.

In terms of a top-line summary, the EC claimed:

  • The general view is that, two years after it started to apply, the GDPR has successfully met its objectives of strengthening the protection of the individual’s right to personal data protection and guaranteeing the free flow of personal data within the EU. However, a number of areas for future improvement have also been identified.
  • Like most stakeholders and data protection authorities, the Commission is of the view that it would be premature at this stage to draw definite conclusions regarding the application of the GDPR.
  • It is likely that most of the issues identified by Member States and stakeholders will benefit from more experience in applying the GDPR in the coming years.
  • Nevertheless, this report highlights the challenges encountered so far in applying the GDPR and sets out possible ways to address them.
  • Notwithstanding its focus is on the two issues highlighted in Article 97(2) of the GDPR, namely international transfers and the cooperation and consistency mechanisms, this evaluation and review takes a broader approach to also address issues which have been raised by various actors during the last two years.

Among its other findings, the EC asserted:

  • However, developing a truly common European data protection culture between data protection authorities is still an on-going process. Data protection authorities have not yet made full use of the tools the GDPR provides, such as joint operations that could lead to joint investigations. At times, finding a common approach meant moving to the lowest common denominator and as a result, opportunities to foster more harmonisation were missed.
  • Stakeholders generally welcome the guidelines from the Board and request additional ones on key concepts of the GDPR, but also point to inconsistencies between the national guidance and the Board guidelines. They underline the need for more practical advice, in particular more concrete examples, and the need for data protection authorities to be equipped with the necessary human, technical and financial resources to effectively carry out their tasks.

The EC called for greater funding and resources for DPAs to enforce the GDPR, especially in Ireland and Luxembourg, which serve as the EU headquarters for a number of large technology companies:

Data protection authorities play an essential role in ensuring that the GDPR is enforced at national level and that the cooperation and consistency mechanisms within the Board functions effectively, including in particular the one-stop-shop mechanism for cross-border cases. Member States are therefore called upon to provide them with adequate resources as required by the GDPR.

The EC wrapped up the GDPR review by drawing a roadmap of sorts for future actions:

Based on this evaluation of the application of the GDPR since May 2018, the actions listed below have been identified as necessary to support its application. The Commission will monitor their implementation also in view of the forthcoming evaluation report in 2024.

The EC offered the following as ongoing or future actions to more fully realize implementation and enforcement of the GDPR to be undertaken by EU states, EU DPAs, the EC, the EDPB, and stakeholders in the EU and elsewhere:

Implementing and complementing the legal framework

Member States should

  • complete the alignment of their sectoral laws to the GDPR;
  • consider limiting the use of specification clauses which might create fragmentation and jeopardise the free flow of data within the EU;
  • assess whether national law implementing the GDPR is in all circumstances within the margins provided for Member State legislation.

The Commission will

  • pursue bilateral exchanges with Member States on the compliance of national laws with the GDPR, including on the independence and resources of national data protection authorities; make use of all the tools at its disposal, including infringement procedures, to ensure that Member States comply with the GDPR;
  • support further exchanges of views and national practices between Member States on topics which are subject to further specification at national level so as to reduce the level of fragmentation of the single market, such as processing of personal data relating to health and research, or which are subject to balancing with other rights such as the freedom of expression;
  • support a consistent application of the data protection framework in relation to new technologies to support innovation and technological developments;
  • use the GDPR Member States Expert Group (established during the transitory phase before the GDPR became applicable) to facilitate discussions and sharing of experience between Member States and with the Commission;
  • explore whether, in the light of further experience and relevant case-law, proposing possible future targeted amendments to certain provisions of the GDPR might be appropriate, in particular regarding records of processing by SMEs that do not have the processing of personal data as their core business (low risk), and the possible harmonisation of the age of children consent in relation to information society services.

Making the new governance system deliver its full potential

The Board and data protection authorities are invited to

  • develop efficient arrangements between data protection authorities regarding the functioning of the cooperation and consistency mechanisms, including on procedural aspects, building on the expertise of its members and by strengthening the involvement of its secretariat;
  • support harmonisation in applying and enforcing the GDPR using all means at its disposal, including by further clarifying key concepts of the GDPR, and ensuring that national guidance is fully in line with guidelines adopted by the Board;
  • encourage the use of all tools provided for in the GDPR to ensure that it is applied consistently;
  • step up cooperation among data protection authorities, for instance by conducting joint investigations.

The Commission will

  • continue to closely monitor the effective and full independence of national data protection authorities;
  • encourage cooperation between regulators (in particular in fields such as competition, electronic communications, security of network and information systems and consumer policy);
  • support the reflection within the Board on the procedures applied by the national data protection authorities in order to improve the cooperation on the cross-border cases.

Member States shall

  • allocate resources to data protection authorities that are sufficient for them to perform their tasks.

Supporting stakeholders

The Board and data protection authorities are invited to

  • adopt further guidelines which are practical, easily understandable, and which provide clear answers and avoid ambiguities on issues related to the application of the GDPR, for example on processing children’s data and data subject rights, including the exercise of the right of access and the right to erasure, consulting stakeholders in the process;
  • review the guidelines when further clarifications are necessary in the light of experience and developments including in the case law of the Court of Justice;
  • develop practical tools, such as harmonised forms for data breaches and simplified records of processing activities, to help low-risk SMEs meeting their obligations.

The Commission will

  • provide standard contractual clauses both for international transfers and the controller/processor-relationship;
  • provide for tools clarifying/supporting the application of data protection rules to children;
  • in line with the Data Strategy, explore practical means to facilitate increased use of the right to portability by individuals, such as by giving them more control over who can access and use machine-generated data;
  • support standardisation/certification in particular on cybersecurity aspects through the cooperation between the European Union Agency for Cybersecurity (ENISA), the data protection authorities and the Board;
  • when appropriate, make use of its right to request the Board to prepare guidelines and opinions on specific issues of importance to stakeholders;
  • when necessary provide guidance, while fully respecting the role of the Board;
  • support the activities of data protection authorities that facilitate implementation of GDPR obligations by SMEs, through financial support, especially for practical guidance and digital tools that can be replicated in other Member States.

Encouraging innovation

The Commission will

  • monitor the application of the GDPR to new technologies, also taking into account of possible future initiatives in the field of artificial intelligence and under the Data Strategy;
  • encourage, including through financial support, the drafting of EU codes of conduct in the area of health and research;
  • closely follow the development and the use of apps in the context of the COVID-19 pandemic.

The Board is invited to

  • issue guidelines on the application of the GDPR in the area of scientific research, artificial intelligence, blockchain, and possible other technological developments;
  • review the guidelines when further clarifications are necessary in the light of technological development.

Further developing the toolkit for data transfers

The Commission will

  • pursue adequacy dialogues with interested third countries, in line with the strategy set out in its 2017 Communication ‘Exchanging and Protecting Personal Data in a Globalised World‘, including where possible by covering data transfers to criminal law enforcement authorities (under the Data Protection Law Enforcement Directive) and other public authorities; this includes finalisation of the adequacy process with the Republic of Korea as soon as possible;
  • finalise the ongoing evaluation of the existing adequacy decisions and report to the European Parliament and the Council;
  • finalise the work on the modernisation of the standard contractual clauses, with a view to updating them in light of the GDPR, covering all relevant transfer scenarios and better reflecting modern business practices.

The Board is invited to

  • further clarify the interplay between the rules on international data transfers (Chapter V) with the GDPR’s territorial scope of application (Article 3);
  • ensure effective enforcement against operators established in third countries falling within the GDPR’s territorial scope of application, including as regards the appointment of a representative where applicable (Article 27);
  • streamline the assessment and eventual approval of binding corporate rules with a view to speed up the process;
  • complete the work on the architecture, procedures and assessment criteria for codes of conduct and certification mechanisms as tools for data transfers.

Promoting convergence and developing international cooperation

The Commission will

  • support ongoing reform processes in third countries on new or modernised data protection rules by sharing experience and best practices;
  • engage with African partners to promote regulatory convergence and support capacity-building of supervisory authorities as part of the digital chapter of the new EU-Africa partnership;
  • assess how cooperation between private operators and law enforcement authorities could be facilitated, including by negotiating bilateral and multilateral frameworks for data transfers in the context of access by foreign criminal law enforcement authorities to electronic evidence, to avoid conflicts of law while ensuring appropriate data protection safeguards;
  • engage with international and regional organisations such as the OECD, ASEAN or the G20 to promote trusted data flows based on high data protection standards, including in the context of the Data Flow with Trust initiative;
  • set up a ‘Data Protection Academy’ to facilitate and support exchanges between European and international regulators;
  • promote international enforcement cooperation between supervisory authorities, including through the negotiation of cooperation and mutual assistance agreements.

EC staff released a working document more detailed than the EC’s report and broader than the mandate in Article 97 of the GDPR:

Although its focus is on the two issues highlighted in Article 97(2) of the GDPR, namely international transfers and the cooperation and consistency mechanisms, this evaluation takes a broader approach in order to address issues which have been raised by various actors during the last two years.

EC staff highlighted the number and types of enforcement actions, taking care to stress their deterrent effect, perhaps in part to counter criticism that the fines levied have often been a fraction of the statutory ceiling. Of course, this sort of argument is hard to dispute, for how does one prove or disprove a negative (i.e., all the GDPR violations that were averted because regulated entities feared being punished in a fashion similar to entities subject to enforcement actions)? EC staff asserted:

The GDPR establishes independent data protection authorities and provides them with harmonised and strengthened enforcement powers. Since the GDPR applies, those authorities have been using a wide range of corrective powers provided for in the GDPR, such as administrative fines (22 EU/EEA authorities), warnings and reprimands (23), orders to comply with data subject’s requests (26), orders to bring processing operations into compliance with the GDPR (27), and orders to rectify, erase or restrict processing (17). Around half of the data protection authorities (13) have imposed temporary or definitive limitations on processing, including bans. This demonstrates a conscious use of all corrective measures provided for in the GDPR; the data protection authorities did not shy away from imposing administrative fines in addition to or instead of other corrective measures, depending on the circumstances of individual cases.

Image by Biljana Jovanovic from Pixabay