Further Reading, Other Development, and Coming Events (20 and 21 January 2021)

Further Reading

  • “Amazon’s Ring Neighbors app exposed users’ precise locations and home addresses” By Zack Whittaker — TechCrunch. Once again, Amazon’s home security platform has suffered problems, with users’ data being exposed or less than fully protected.
  • “Harassment of Chinese dissidents was warning signal on disinformation” By Shawna Chen and Bethany Allen-Ebrahimian — Axios. An example of how malicious online activity can spill into the real world: a number of Chinese dissidents were set upon by protestors.
  • “How Social Media’s Obsession with Scale Supercharged Disinformation” By Joan Donovan — Harvard Business Review. Companies like Facebook and Twitter emphasized scale over safety in trying to grow as quickly as possible. This led to a proliferation of fake accounts and proved fertile ground for the seeds of misinformation.
  • “The Moderation War Is Coming to Spotify, Substack, and Clubhouse” By Alex Kantrowitz — OneZero. The same issues with objectionable and abusive content plaguing Twitter, Facebook, YouTube, and others will almost certainly become an issue for the newer platforms, and in fact already are.
  • “Mexican president mounts campaign against social media bans” By Mark Stevenson — The Associated Press. Mexico’s leftist president, Andrés Manuel López Obrador, is vowing to lead international efforts to stop social media companies from censoring what he considers free speech. Whether this materializes into something substantial is not clear.
  • “As Trump Clashes With Big Tech, China’s Censored Internet Takes His Side” By Li Yuan — The New York Times. The government in Beijing is framing social media platforms’ banning of former President Donald Trump after the attempted insurrection as proof that there is no untrammeled freedom of speech. This position helps bolster the oppressive policing of online content the People’s Republic of China (PRC) wages against its citizens. Quite separately, many Chinese people (or what appear to be actual people) are questioning what is often deemed the censoring of Trump in the United States (U.S.), a nation ostensibly committed to free speech. There is also widespread misunderstanding about the First Amendment right of social media platforms not to host content with which they disagree and their power to make such determinations without fear that the U.S. government will punish them, as is often the case in the PRC.
  • “Trump admin slams China’s Huawei, halting shipments from Intel, others – sources” By Karen Freifeld and Alexandra Alper — Reuters. On its way out of the proverbial door, the Trump Administration delivered parting shots at Huawei and the People’s Republic of China by revoking one license and denying others to sell the PRC tech giant semiconductors. The companies affected, including Intel, could appeal, and an estimated $400 million worth of applications for similar licenses remain pending at the Department of Commerce and are now the domain of the new regime in Washington. It is too early to discern whether the Biden Administration will stand by, modify, or reverse these actions and broader Trump Administration policy towards the PRC.
  • “Behind a Secret Deal Between Google and Facebook” By Daisuke Wakabayashi and Tiffany Hsu — The New York Times. The newspaper got its hands on an unredacted copy of the antitrust suit Texas Attorney General Ken Paxton and other attorneys general filed against Google, and it contains details on the deal Facebook and Google allegedly struck to divide the online advertising world. Not only did Facebook ditch an effort launched by publishers to defeat Google’s overwhelming advantages in online advertising bidding, it joined Google’s rival effort in exchange for a guarantee that it would win a specified number of bids and get more time to bid on ads. Google and Facebook naturally deny any wrongdoing.
  • “Biden and Trump Voters Were Exposed to Radically Different Coverage of the Capitol Riot on Facebook” By Colin Lecher and Jon Keegan — The Markup. Using a browser tool that the organization pays Facebook users to install, The Markup can track the type of material those users see in their feeds. Facebook’s algorithm fed people material about the 6 January 2021 attempted insurrection based on their political views. Many have pointed out that this very dynamic creates filter bubbles that poison democracy and public discourse.
  • “Banning Trump won’t fix social media: 10 ideas to rebuild our broken internet – by experts” By Julia Carrie Wong — The Guardian. There are some fascinating proposals in this piece that could help address the problems of social media.
  • “Misinformation dropped dramatically the week after Twitter banned Trump and some allies” By Elizabeth Dwoskin and Craig Timberg — The Washington Post. Research showed that lies, misinformation, and disinformation about election fraud dropped by three-quarters after former President Donald Trump was banned from Twitter and other platforms. Other research showed that a small group of conservatives was responsible for up to 20% of misinformation about this and other conspiracies.
  • “This Was WhatsApp’s Plan All Along” By Shoshana Wodinsky — Gizmodo. This piece does a great job of breaking down into plain English the proposed changes to WhatsApp’s terms of service that so enraged users that competitors Signal and Telegram have seen record-breaking downloads. Basically, it is all about reaping advertising dollars for Facebook through businesses and third-party partners using user data from business-related communications. Incidentally, WhatsApp has delayed the changes until March because of the pushback.
  • “Brussels eclipsed as EU countries roll out their own tech rules” By Laura Kayali and Mark Scott — Politico EU. The European Union (EU) had a hard enough task in trying to reach final language on a Digital Services Act and a Digital Markets Act without nations like France, Germany, and Poland picking and choosing text from the draft bills and enacting it into law. Brussels is not happy with this trend.

Other Developments

  • Federal Trade Commission (FTC) Chair Joseph J. Simons announced his resignation from the FTC, effective 29 January 2021, in keeping with tradition and past practice. This resignation clears the way for President Joe Biden to name a new FTC chair, and with FTC Commissioner Rohit Chopra nominated to head the Consumer Financial Protection Bureau (CFPB), Biden will get to nominate two Democratic FTC Commissioners, tipping the political balance of the FTC and likely ushering in a period of more regulation of the technology sector.
    • Simons also announced the resignation of senior staff: General Counsel Alden F. Abbott; Bureau of Competition Director Ian Conner; Bureau of Competition Deputy Directors Gail Levine and Daniel Francis; Bureau of Consumer Protection Director Andrew Smith; Bureau of Economics Director Andrew Sweeting; Office of Public Affairs Director Cathy MacFarlane; and Office of Policy Planning Director Bilal Sayyed.
  • In a speech last week, before he was sworn in, President Joe Biden announced his $1.9 trillion American Rescue Plan, and according to a summary, Biden will ask Congress to provide $10 billion for a handful of government-facing programs to improve federal technology (a quick tally of the line items follows the quoted list below). Notably, Biden “is calling on Congress to launch the most ambitious effort ever to modernize and secure federal IT and networks.” Biden is proposing to dramatically increase funding for a revolving fund that allows agencies to borrow money to update their technology and pay it back later. Moreover, Biden is looking to push more money to a program that aids officials at agencies who oversee technology development and procurement.
    • Biden stated “[t]o remediate the SolarWinds breach and boost U.S. defenses, including of the COVID-19 vaccine process, President-elect Biden is calling on Congress to:
      • Expand and improve the Technology Modernization Fund. A $9 billion investment will help the U.S. launch major new IT and cybersecurity shared services at the Cyber Security and Information Security Agency (CISA) and the General Services Administration and complete modernization projects at federal agencies. In addition, the president-elect is calling on Congress to change the fund’s reimbursement structure in order to fund more innovative and impactful projects.
      • Surge cybersecurity technology and engineering expert hiring. Providing the Information Technology Oversight and Reform fund with $200 million will allow for the rapid hiring of hundreds of experts to support the federal Chief Information Security Officer and U.S. Digital Service.
      • Build shared, secure services to drive transformational projects. Investing $300 million in no-year funding for Technology Transformation Services in the General Services Administration will drive secure IT projects forward without the need of reimbursement from agencies.
      • Improving security monitoring and incident response activities. An additional $690M for CISA will bolster cybersecurity across federal civilian networks, and support the piloting of new shared security and cloud computing services.
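    • Taken together, those line items come to slightly more than the plan’s headline figure. A minimal tally, using only the amounts quoted above (the dictionary keys are shorthand labels, not official program names):

```python
# Tally of the IT and cybersecurity line items quoted above, in millions of dollars.
line_items = {
    "Technology Modernization Fund": 9_000,
    "Information Technology Oversight and Reform fund": 200,
    "Technology Transformation Services": 300,
    "CISA security monitoring and incident response": 690,
}

total_millions = sum(line_items.values())
print(f"Total: ${total_millions:,} million (~${total_millions / 1_000:.2f} billion)")
# Total: $10,190 million (~$10.19 billion)
```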
  • The United States (U.S.) Department of Commerce issued an interim final rule pursuant to an executive order (EO) issued by former President Donald Trump to secure the U.S. information and communications supply chain. This rule will undoubtedly be reviewed by the Biden Administration and may be withdrawn or modified depending on the fate of the EO on which the rule relies.
    • In the interim final rule, Commerce explained:
      • These regulations create the processes and procedures that the Secretary of Commerce will use to identify, assess, and address certain transactions, including classes of transactions, between U.S. persons and foreign persons that involve information and communications technology or services designed, developed, manufactured, or supplied, by persons owned by, controlled by, or subject to the jurisdiction or direction of a foreign adversary; and pose an undue or unacceptable risk. While this interim final rule will become effective on March 22, 2021, the Department of Commerce continues to welcome public input and is thus seeking additional public comment. Once any additional comments have been evaluated, the Department is committed to issuing a final rule.
      • On November 27, 2019, the Department of Commerce (Department) published a proposed rule to implement the terms of the Executive Order. (84 FR 65316). The proposed rule set forth processes for (1) how the Secretary would evaluate and assess transactions involving ICTS to determine whether they pose an undue risk of sabotage to or subversion of the ICTS supply chain, or an unacceptable risk to the national security of the United States or the security and safety of U.S. persons; (2) how the Secretary would notify parties to transactions under review of the Secretary’s decision regarding the ICTS Transaction, including whether the Secretary would prohibit or mitigate the transaction; and (3) how parties to transactions reviewed by the Secretary could comment on the Secretary’s preliminary decisions. The proposed rule also provided that the Secretary could act without complying with the proposed procedures where required by national security. Finally, the Secretary would establish penalties for violations of mitigation agreements, the regulations, or the Executive Order.
      • In addition to seeking general public comment, the Department requested comments from the public on five specific questions: (1) Whether the Secretary should consider categorical exclusions or whether there are classes of persons whose use of ICTS cannot violate the Executive Order; (2) whether there are categories of uses or of risks that are always capable of being reliably and adequately mitigated; (3) how the Secretary should monitor and enforce any mitigation agreements applied to a transaction; (4) how the terms, “transaction,” “dealing in,” and “use of” should be clarified in the rule; and (5) whether the Department should add record-keeping requirements for information related to transactions.
      • The list of “foreign adversaries” consists of the following foreign governments and non-government persons: The People’s Republic of China, including the Hong Kong Special Administrative Region (China); the Republic of Cuba (Cuba); the Islamic Republic of Iran (Iran); the Democratic People’s Republic of Korea (North Korea); the Russian Federation (Russia); and Venezuelan politician Nicolás Maduro (Maduro Regime).
  • The Federal Trade Commission (FTC) adjusted its civil penalty amounts for inflation, including a boost to the per-violation penalty that virtually all the privacy bills introduced in the last Congress would have allowed the agency to wield against first-time violators. The penalty for certain unfair and deceptive acts or practices increased from $43,280 to $43,792 per violation; a quick sketch of the implied adjustment follows.
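    • As a rough illustration only, the short sketch below derives the cost-of-living multiplier implied by those two figures and shows what the per-violation cap could mean in aggregate; the variable names and the 1,000-violation scenario are hypothetical, not drawn from the FTC’s announcement.

```python
# Derive the inflation multiplier implied by the two published amounts. The actual
# FTC/OMB cost-of-living multiplier and rounding rules may differ slightly.
OLD_MAX_PENALTY = 43_280   # per-violation maximum before the 2021 adjustment
NEW_MAX_PENALTY = 43_792   # per-violation maximum after the 2021 adjustment

implied_multiplier = NEW_MAX_PENALTY / OLD_MAX_PENALTY
print(f"Implied cost-of-living multiplier: {implied_multiplier:.5f}")  # ~1.01183

# Hypothetical exposure for a first-time violator under a bill that lets the FTC
# seek this penalty for each violation.
violations = 1_000
print(f"Maximum exposure for {violations:,} violations: ${violations * NEW_MAX_PENALTY:,}")
```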
  • The United States (U.S.) Department of State stood up its new Bureau of Cyberspace Security and Emerging Technologies (CSET) as it has long planned. At the beginning of the Trump Administration, the Department of State dismantled the Cyber Coordinator Office and gave its cybersecurity portfolio to the Bureau of Economic Affairs, which displeased Congressional stakeholders. In 2019, the department notified Congress of its plan to establish CSET. The department asserted:
    • The need to reorganize and resource America’s cyberspace and emerging technology security diplomacy through the creation of CSET is critical, as the challenges to U.S. national security presented by China, Russia, Iran, North Korea, and other cyber and emerging technology competitors and adversaries have only increased since the Department notified Congress in June 2019 of its intent to create CSET.
    • The CSET bureau will lead U.S. government diplomatic efforts on a wide range of international cyberspace security and emerging technology policy issues that affect U.S. foreign policy and national security, including securing cyberspace and critical technologies, reducing the likelihood of cyber conflict, and prevailing in strategic cyber competition.  The Secretary’s decision to establish CSET will permit the Department to posture itself appropriately and engage as effectively as possible with partners and allies on these pressing national security concerns.
    • The Congressional Members of the Cyberspace Solarium Commission made clear their disapproval of the decision. Senators Angus King (I-ME) and Ben Sasse (R-NE) and Representatives Mike Gallagher (R-WI) and Jim Langevin (D-RI) said:
      • In our report, we emphasize the need for a greater emphasis on international cyber policy at State. However, unlike the bipartisan Cyber Diplomacy Act, the State Department’s proposed Bureau will reinforce existing silos and […] hinder the development of a holistic strategy to promote cyberspace stability on the international stage. We urge President-elect Biden to pause this reorganization when he takes office in two weeks and work with Congress to enact meaningful reform to protect our country in cyberspace.
  • The Australian Cyber Security Centre (ACSC) released the Risk Identification Guidance “developed to assist organisations in identifying risks associated with their use of suppliers, manufacturers, distributors and retailers (i.e. businesses that constitute their cyber supply chain)” and the Risk Management Guidance because “[c]yber supply chain risk management can be achieved by identifying the cyber supply chain, understanding cyber supply chain risk, setting cyber security expectations, auditing for compliance, and monitoring and improving cyber supply chain security practices.”
  • The United Kingdom’s Surveillance Camera Commissioner (SCC) issued “best practice guidance, ‘Facing the Camera’, to all police forces in England and Wales.” The SCC explained that “[t]he provisions of this document only apply to the use of facial recognition technology and the inherent processing of images by the police where such use is integral to a surveillance camera system being operated in ‘live time’ or ‘near real time’ operational scenarios.” Last summer, a British appeals court overturned a lower court decision that had found a police force’s use of facial recognition technology in a pilot program utilizing live footage to be legal. The appeals court found the South Wales Police Force’s use of this technology violated “the right to respect for private life under Article 8 of the European Convention on Human Rights, data protection legislation, and the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010.” The SCC stated:
    • The SCC considers surveillance to be an intrusive investigatory power where it is conducted by the police which impacts upon those fundamental rights and freedoms of people, as set out by the European Convention of Human Rights (ECHR) and the Human Rights Act 1998. In the context of surveillance camera systems which make use of facial recognition technology, the extent of state intrusion in such matters is significantly increased by the capabilities of algorithms which are in essence, integral to the surveillance conduct seeking to harvest information, private information, metadata, data, personal data, intelligence and evidence. Each of the aforementioned are bound by laws and rules which ought to be separately and jointly considered and applied in a manner which is demonstrably lawful and ethical and engenders public trust and confidence.
    • Whenever the police seek to use technology in pursuit of a legitimate aim, the key question arises as to whether the degree of intrusion which is caused to the fundamental freedoms of citizens by the police surveillance conduct using surveillance algorithms (biometric or otherwise) is necessary in a democratic society when considered alongside the legality and proportionality of their endeavours and intent. The type of equipment/technology/modality which they choose to use to that end (e.g. LFR, ANPR, thermal imaging, gait analysis, movement sensors etc), the manner in which such technological means are deployed, (such as using static cameras at various locations, used with body worn cameras or other mobile means), and whether such technology is used overtly alongside or networked with other surveillance technologies, are all factors which may significantly influence the depth of intrusion caused by police conduct upon citizen’s rights.
  • The Senate confirmed the nomination of Avril Haines to be the new Director of National Intelligence by an 89-10 vote after Senator Tom Cotton (R-AR) removed his hold on her nomination. However, Senator Josh Hawley (R-MO) placed a hold on the nomination of Alejandro Mayorkas to be the next Secretary of Homeland Security and explained his action this way:
    • On Day 1 of his administration, President-elect Biden has said he plans to unveil an amnesty plan for 11 million immigrants in this nation illegally. This comes at a time when millions of American citizens remain out of work and a new migrant caravan has been attempting to reach the United States. Mr. Mayorkas has not adequately explained how he will enforce federal law and secure the southern border given President-elect Biden’s promise to roll back major enforcement and security measures. Just today, he declined to say he would enforce the laws Congress has already passed to secure the border wall system. Given this, I cannot consent to skip the standard vetting process and fast-track this nomination when so many questions remain unanswered.
  • Former Trump White House Cyber Coordinator Rob Joyce will replace Anne Neuberger as the National Security Agency’s (NSA) Director of Cybersecurity; Neuberger has been named the Biden White House’s Deputy National Security Advisor for Cyber and Emerging Technology. Neuberger’s portfolio at the NSA included “lead[ing] NSA’s cybersecurity mission, including emerging technology areas like quantum-resistant cryptography,” and presumably Joyce will assume the same responsibilities. Joyce was pushed out of the White House when former National Security Advisor John Bolton restructured the NSC in 2018, a shakeup that also forced out his boss, former Homeland Security Advisor Tom Bossert. At the National Security Council, Neuberger will work to coordinate cybersecurity and emerging technology policy across agencies and funnel policy options up to the full NSC and ultimately the President, work that will presumably include coordinating with Joyce.
  • The Supreme Court of the United States (SCOTUS) heard oral arguments on whether the Federal Trade Commission (FTC) Act gives the agency the power to seek monetary damages and restitution alongside permanent injunctions under Section 13(b). In AMG Capital Management, LLC v. FTC, the parties opposing the FTC argue the plain language of the statute does not allow the agency to seek restitution and monetary damages under this specific section of the FTC Act, while the agency argues long-accepted past practice and Congressional intent do, in fact, allow this relief to be sought when the FTC moves to punish violators of Section 5. The FTC is working on a separate track to get a fix from Congress, which could rewrite the FTC Act to make clear this sort of relief is legal. However, some stakeholders in the debate over privacy legislation may be using the case as leverage.
    • In October 2020, the FTC wrote the House and Senate committees with jurisdiction over the agency, asking for language to resolve the litigation over its power to seek and obtain restitution for victims of those who have violated Section 5 of the FTC Act and disgorgement of ill-gotten gains. The FTC is also asking that Congress clarify that the agency may act against violators even if their conduct has stopped, as it has done for more than four decades. Two federal appeals courts have ruled in ways that limit the FTC’s long-used powers, and the Supreme Court of the United States is now set to rule on these issues later this year. The FTC claims, however, that defendants are playing for time in the hopes that the FTC’s authority to seek and receive monetary penalties will ultimately be limited by the nation’s highest court. Judging by language tucked into a privacy bill introduced by the former chair of one of the committees, Congress may be willing to act soon.
    • The FTC asked the House Energy and Commerce and Senate Commerce, Science, and Transportation Committees “to take quick action to amend Section 13(b) [of the FTC Act i.e. 15 U.S.C. § 53(b)] to make clear that the Commission can bring actions in federal court under Section 13(b) even if conduct is no longer ongoing or impending when the suit is filed and can obtain monetary relief, including restitution and disgorgement, if successful.” The agency asserted “[w]ithout congressional action, the Commission’s ability to use Section 13(b) to provide refunds to consumer victims and to enjoin illegal activity is severely threatened.” All five FTC Commissioners signed the letter.
    • The FTC explained that adverse rulings by two federal appeals courts are constraining the agency from seeking relief for victims and punishment for violators of the FTC Act in federal courts below those two specific courts, but elsewhere defendants are either asking courts for a similar ruling or using delaying tactics in the hopes the Supreme Court upholds the two federal appeals courts:
      • …[C]ourts of appeals in the Third and Seventh Circuits have recently ruled that the agency cannot obtain any monetary relief under Section 13(b). Although review in the Supreme Court is pending, these lower court decisions are already inhibiting our ability to obtain monetary relief under 13(b). Not only do these decisions already prevent us from obtaining redress for consumers in the circuits where they issued, prospective defendants are routinely invoking them in refusing to settle cases with agreed-upon redress payments.
      • Moreover, defendants in our law enforcement actions pending in other circuits are seeking to expand the rulings to those circuits and taking steps to delay litigation in anticipation of a potential Supreme Court ruling that would allow them to escape liability for any monetary relief caused by their unlawful conduct. This is a significant impediment to the agency’s effectiveness, its ability to provide redress to consumer victims, and its ability to prevent entities who violate the law from profiting from their wrongdoing.
  • The United Kingdom’s Information Commissioner’s Office (ICO) issued guidance for British entities that may be affected by the massive SolarWinds hack that has compromised many key systems in the United States. The ICO advised the following (a minimal sketch of the version check and the 72-hour reporting deadline follows the quoted guidance):
    • Organisations should immediately check whether they are using a version of the software that has been compromised. These are versions 2019.4 HF 5, 2020.2 with no hotfix installed, and 2020.2 HF 1.
    • Organisations must also determine if the personal data they hold has been affected by the cyber-attack. If a reportable personal data breach is found, UK data controllers are required to inform the ICO within 72 hours of discovering the breach. Reports can be submitted online or organisations can call the ICO’s personal data breach helpline for advice on 0303 123 1113, option 2.
    • Organisations subject to the NIS Regulation will also need to determine if this incident has led to a “substantial impact on the provision” of its digital services and report to the ICO.
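    • As a rough illustration of the first two steps in that guidance, the sketch below checks an installed SolarWinds Orion version string against the compromised builds the ICO named and computes the 72-hour reporting deadline; the function names, the exact version-string format, and the example timestamp are assumptions, not part of the ICO guidance.

```python
from datetime import datetime, timedelta

# Versions the ICO identified as compromised; "2020.2" stands in for
# "2020.2 with no hotfix installed". Exact-string matching is an assumption
# about how an organisation records its Orion build.
COMPROMISED_VERSIONS = {"2019.4 HF 5", "2020.2", "2020.2 HF 1"}

def is_compromised(installed_version: str) -> bool:
    """Return True if the installed Orion build matches a compromised version."""
    return installed_version.strip() in COMPROMISED_VERSIONS

def ico_reporting_deadline(breach_discovered: datetime) -> datetime:
    """UK controllers must report a reportable personal data breach to the ICO
    within 72 hours of discovering it."""
    return breach_discovered + timedelta(hours=72)

if __name__ == "__main__":
    print(is_compromised("2020.2 HF 1"))                         # True
    print(ico_reporting_deadline(datetime(2020, 12, 14, 9, 0)))  # 2020-12-17 09:00:00
```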
  • Europol announced the takedown of “the world’s largest illegal marketplace on the dark web” in an operation coordinated by the following nations: “Germany, Australia, Denmark, Moldova, Ukraine, the United Kingdom (the National Crime Agency), and the USA (DEA, FBI, and IRS).” Europol added:
    • The Central Criminal Investigation Department in the German city of Oldenburg arrested an Australian citizen who is the alleged operator of DarkMarket near the German-Danish border over the weekend. The investigation, which was led by the cybercrime unit of the Koblenz Public Prosecutor’s Office, allowed officers to locate and close the marketplace, switch off the servers and seize the criminal infrastructure – more than 20 servers in Moldova and Ukraine supported by the German Federal Criminal Police office (BKA). The stored data will give investigators new leads to further investigate moderators, sellers, and buyers. 
  • The Enforcement Bureau (Bureau) of the Federal Communications Commission (FCC) issued an enforcement advisory intended to remind people that using amateur and personal radios to commit crimes is itself a criminal offense that could warrant prosecution. The advisory was issued because the FCC says it is aware of discussions about using these radio services as alternatives to social media platforms, which have been cracking down on extremist material since the attempted insurrection at the United States Capitol on 6 January. The Bureau stated:
    • The Bureau has become aware of discussions on social media platforms suggesting that certain radio services regulated by the Commission may be an alternative to social media platforms for groups to communicate and coordinate future activities.  The Bureau recognizes that these services can be used for a wide range of permitted purposes, including speech that is protected under the First Amendment of the U.S. Constitution.  Amateur and Personal Radio Services, however, may not be used to commit or facilitate crimes. 
    • Specifically, the Bureau reminds amateur licensees that they are prohibited from transmitting “communications intended to facilitate a criminal act” or “messages encoded for the purpose of obscuring their meaning.” Likewise, individuals operating radios in the Personal Radio Services, a category that includes Citizens Band radios, Family Radio Service walkie-talkies, and General Mobile Radio Service, are prohibited from using those radios “in connection with any activity which is against Federal, State or local law.” Individuals using radios in the Amateur or Personal Radio Services in this manner may be subject to severe penalties, including significant fines, seizure of the offending equipment, and, in some cases, criminal prosecution.
  • The European Data Protection Board (EDPB) issued its “Strategy for 2021-2023” in order “[t]o be effective in confronting the main challenges ahead.” The EDPB cautioned:
    • This Strategy does not provide an exhaustive overview of the work of the EDPB in the years to come. Rather it sets out the four main pillars of our strategic objectives, as well as a set of key actions to help achieve those objectives. The EDPB will implement this Strategy within its Work Program, and will report on the progress achieved in relation to each Pillar as part of its annual reports.
    • The EDPB listed and explained the four pillars of its strategy:
      • PILLAR 1: ADVANCING HARMONISATION AND FACILITATING COMPLIANCE. The EDPB will continue to strive for a maximum degree of consistency in the application of data protection rules and limit fragmentation among Member States. In addition to providing practical, easily understandable and accessible guidance, the EDPB will develop and promote tools that help to implement data protection into practice, taking into account practical experiences of different stakeholders on the ground.
      • PILLAR 2: SUPPORTING EFFECTIVE ENFORCEMENT AND EFFICIENT COOPERATION BETWEEN NATIONAL SUPERVISORY AUTHORITIES. The EDPB is fully committed to support cooperation between all national supervisory authorities that work together to enforce European data protection law. We will streamline internal processes, combine expertise and promote enhanced coordination. We intend not only to ensure a more efficient functioning of the cooperation and consistency mechanisms, but also to strive for the development of a genuine EU-wide enforcement culture among supervisory authorities.
      • PILLAR 3: A FUNDAMENTAL RIGHTS APPROACH TO NEW TECHNOLOGIES. The protection of personal data helps to ensure that technology, new business models and society develop in accordance with our values, such as human dignity, autonomy and liberty. The EDPB will continuously monitor new and emerging technologies and their potential impact on the fundamental rights and daily lives of individuals. Data protection should work for all people, particularly in the face of processing activities presenting the greatest risks to individuals’ rights and freedoms (e.g. to prevent discrimination). We will help to shape Europe’s digital future in line with our common values and rules. We will continue to work with other regulators and policymakers to promote regulatory coherence and enhanced protection for individuals.
      • PILLAR 4: THE GLOBAL DIMENSION. The EDPB is determined to set and promote high EU and global standards for international data transfers to third countries in the private and the public sector, including in the law enforcement sector. We will reinforce our engagement with the international community to promote EU data protection as a global model and to ensure effective protection of personal data beyond EU borders.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) revealed that all but one of the videoconferencing platforms to which it and other data protection authorities (DPAs) sent a July 2020 open letter, urging them to “adopt principles to guide them in addressing some key privacy risks,” have responded. The ICO explained:
    • Microsoft, Cisco, Zoom and Google replied to the open letter. The joint signatories thank these companies for engaging on this important matter and for acknowledging and responding to the concerns raised. In their responses the companies highlighted various privacy and security best practices, measures, and tools that they advise are implemented or built-in to their video teleconferencing services.
    • The information provided by these companies is encouraging. It is a constructive foundation for further discussion on elements of the responses that the joint signatories feel would benefit from more clarity and additional supporting information.
    • The ICO stated:
      • The joint signatories have not received a response to the open letter from Houseparty. They strongly encourage Houseparty to engage with them and respond to the open letter to address the concerns raised.
  • The European Union Agency for Cybersecurity (ENISA) “launched a public consultation, which runs until 7 February 2021, on its draft of the candidate European Union Cybersecurity Certification Scheme on Cloud Services (EUCS)…[that] aims to further improve the Union’s internal market conditions for cloud services by enhancing and streamlining the services’ cybersecurity guarantees.” ENISA stated:
    • There are challenges to the certification of cloud services, such as a diverse set of market players, complex systems and a constantly evolving landscape of cloud services, as well as the existence of different schemes in Member States. The draft EUCS candidate scheme tackles these challenges by calling for cybersecurity best practices across three levels of assurance and by allowing for a transition from current national schemes in the EU. The draft EUCS candidate scheme is a horizontal and technological scheme that intends to provide cybersecurity assurance throughout the cloud supply chain, and form a sound basis for sectoral schemes.
    • More specifically, the draft EUCS candidate scheme:
      • Is a voluntary scheme;
      • The scheme’s certificates will be applicable across the EU Member States;
      • Is applicable for all kinds of cloud services – from infrastructure to applications;
      • Boosts trust in cloud services by defining a reference set of security requirements;
      • Covers three assurance levels: ‘Basic’, ‘Substantial’ and ‘High’;
      • Proposes a new approach inspired by existing national schemes and international standards;
      • Defines a transition path from national schemes in the EU;
      • Grants a three-year certification that can be renewed;
      • Includes transparency requirements such as the location of data processing and storage.

Coming Events

  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on the nomination of Gina Raimondo to be the Secretary of Commerce on 26 January.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Peggy und Marco Lachmann-Anke from Pixabay

UK and EU Defer Decision On Data Flows

Whether there will be an adequacy decision allowing the free flow of personal data under the GDPR from the EU to the recently departed UK is a question that has been punted. And, its recent status as an EU member state notwithstanding, the UK might not get an adequacy decision.

In settling many aspects of the United Kingdom’s (UK) exit from the European Union (EU), negotiators did not reach agreement on whether the EU would permit the personal data of EU persons to continue flowing to the UK under the easiest means possible. Instead, the EU and UK agreed to let the status quo continue until an adequacy decision is made or six months lapse. Data flows between the UK and EU were valued at more than £100 billion in 2017, according to British estimates, with the majority of this trade flowing from the UK to the EU.

Under the General Data Protection Regulation (GDPR), the personal data of EU persons can be transferred to other nations for most purposes once the European Commission (EC) has found that the other nation provides protections essentially equivalent to those granted in the EU. Of course, this has been an ongoing issue with data flows to the United States (U.S.), as two agreements (Safe Harbor and Privacy Shield) and their EC adequacy decisions were ruled illegal, in large part because, according to the EU’s highest court, U.S. law does not provide EU persons with the same rights they have in the EU. Most recently, this occurred in 2020 when the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the EU-United States Privacy Shield (aka Schrems II). It bears noting that transfers of personal data may occur through other means under the GDPR that may prove more resource intensive: standard data protection clauses (SCC), binding corporate rules (BCR), and others.

Nevertheless, an adequacy decision is seen as the most desirable means of transfer, and the question of whether the UK’s laws are sufficient has lingered over the Brexit discussions, with some claiming that the nation’s membership in the Five Eyes surveillance alliance with the U.S. and others could disqualify it. Given the range of thorny issues the UK and EU punted (e.g. how to handle the border between Northern Ireland and Ireland), it is not surprising that the GDPR and data flows were also punted.

The UK-EU Trade and Cooperation Agreement (TCA) spells out the terms of the data flow agreement and, as noted, in the short term the status quo will continue, with data flows to the UK being treated as if it were still part of the EU. This state will persist until the EC reaches an adequacy decision or for four months, with another two months of the status quo possible in the absence of an adequacy decision so long as neither the UK nor the EU objects. Moreover, these provisions are operative only so long as the UK keeps its GDPR-compliant data protection law (i.e. the UK Data Protection Act 2018) in place and does not exercise specified “designated powers” without the EU’s agreement. The UK has also deemed EU, European Economic Area (EEA), and European Free Trade Association (EFTA) nations to be adequate for purposes of data transfers from the UK on a transitional basis.

Specifically, the TCA provides:

For the duration of the specified period, transmission of personal data from the Union to the United Kingdom shall not be considered as transfer to a third country under Union law, provided that the data protection legislation of the United Kingdom on 31 December 2020, as it is saved and incorporated into United Kingdom law by the European Union (Withdrawal) Act 2018 and as modified by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 (“the applicable data protection regime”), applies and provided that the United Kingdom does not exercise the designated powers without the agreement of the Union within the Partnership Council.

The UK also agreed to notify the EU if it “enters into a new instrument which can be relied on to transfer personal data to a third country under Article 46(2)(a) of the UK GDPR or section 75(1)(a) of the UK Data Protection Act 2018 during the specified period.” However, if the EU were to object, it appears from the terms of the TCA that all the EU could do is force the UK “to discuss the relevant object.” And yet, should the UK sign a treaty allowing personal data to flow to a nation the EU deems inadequate, this could obviously adversely affect the UK’s prospects of getting an adequacy decision.

Not surprisingly, the agreement also addresses the continued flow of personal data as part of criminal investigations and law enforcement matters, but not national security matters. These matters fall outside the scope of the GDPR and would be largely unaffected by an adequacy decision or the lack of one. A British government summary states that the TCA

provide[s] for law enforcement and judicial cooperation between the UK, the EU and its Member States in relation to the prevention, investigation, detection and prosecution of criminal offences and the prevention of and fight against money laundering and financing of terrorism.

The text of the TCA makes clear that national security matters vis-à-vis data flows and information sharing are not covered:

This Part only applies to law enforcement and judicial cooperation in criminal matters taking place exclusively between the United Kingdom, on the one side, and the Union and the Member States, on the other side. It does not apply to situations arising between the Member States, or between Member States and Union institutions, bodies, offices and agencies, nor does it apply to the activities of authorities with responsibilities for safeguarding national security when acting in that field.

The TCA also affirms:

  • The cooperation provided for in this Part is based on the Parties’ long-standing commitment to ensuring a high level of protection of personal data.
  • To reflect that high level of protection, the Parties shall ensure that personal data processed under this Part is subject to effective safeguards in the Parties’ respective data protection regimes…

The United Kingdom’s data protection authority (DPA), the Information Commissioner’s Office (ICO), issued an explanation of how British law enforcement entities should act in light of the TCA. Regarding law enforcement-related data transfers from the EU to the UK, the ICO explained to British entities:

  • We are now a ‘third country’ for EU data protection purposes. If you receive personal data from a law enforcement partner in the EU, this means the sender will need to comply with the transfer provisions under their national data protection law (which are likely to be similar to those in Part 3 of the DPA 2018).
  • This means the EU sender needs to make sure other appropriate safeguards are in place – probably through a contract or other binding legal instrument, or by making their own assessment of appropriate safeguards. The sender can take into account the protection provided by the DPA 2018 itself when making this assessment.
  • If you receive personal data from other types of organisations in the EU or EEA who are subject to the GDPR, the sender will need to comply with the transfer provisions of the UK GDPR. You may want to consider putting standard contractual clauses (SCCs) in place to ensure adequate safeguards in these cases. We have produced an interactive tool to help you use the SCCs.

The ICO explained for transfers from the UK to the EU (but not the EEA):

  • There is a transitional adequacy decision in place to cover transfers to EU member states and Gibraltar. This will not extend to EEA countries outside the EU, where you should continue to consider other safeguards.
  • This means you can continue to send personal data from the UK to your law enforcement partners in the EU, as long as you can show the transfer is necessary for law enforcement purposes. You can also transfer personal data to non-law enforcement bodies in the EU if you can meet some additional conditions, but you will need to notify the ICO.

Turning back to an adequacy decision and commercial transfers of personal data from the EU to the UK, in what may well be a preview of a world in which there is no adequacy decision between the UK and EU, the European Data Protection Board (EDPB) issued an “information note” in mid-December that spells out how the GDPR would be applied:

  • In the absence of an adequacy decision applicable to the UK as per Article 45 GDPR, such transfers will require appropriate safeguards (e.g., standard data protection clauses, binding corporate rules, codes of conduct…), as well as enforceable data subject rights and effective legal remedies for data subjects, in accordance with Article 46 GDPR.
  • Subject to specific conditions, it may still be possible to transfer personal data to the UK based on a derogation listed in Article 49 GDPR. However, Article 49 GDPR has an exceptional nature and the derogations it contains must be interpreted restrictively and mainly relate to processing activities that are occasional and non-repetitive.
  • Moreover, where personal data are transferred to the UK on the basis of Article 46 GDPR safeguards, supplementary measures might be necessary to bring the level of protection of the data transferred up to the EU standard of essential equivalence, in accordance with the Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data.

Regarding commercial data transfers, the ICO issued a statement urging British entities to start setting up “alternative transfer mechanisms” to ensure data continues to flow from the EU to the UK:

  • The Government has announced that the Treaty agreed with the EU will allow personal data to flow freely from the EU (and EEA) to the UK, until adequacy decisions have been adopted, for no more than six months.
  • This will enable businesses and public bodies across all sectors to continue to freely receive data from the EU (and EEA), including law enforcement agencies.
  • As a sensible precaution, before and during this period, the ICO recommends that businesses work with EU and EEA organisations who transfer personal data to them, to put in place alternative transfer mechanisms, to safeguard against any interruption to the free flow of EU to UK personal data.

However, even though these more restrictive means of transferring personal data to the UK exist, there will likely be legal challenges. It bears noting that, in light of Schrems II, EU DPAs are likely to apply a much higher level of scrutiny to SCCs, and challenges to the legality of using SCCs to transfer personal data to the U.S. have already commenced. It seems certain the legality of using SCCs to transfer data to the UK would be challenged as well.

However, returning to the preliminary issue of whether the EC will give the UK an adequacy decision, there may be a number of obstacles to a finding that the UK’s data protection and surveillance laws are indeed adequate under EU law[1]. Firstly, the UK’s surveillance practices may prove difficult for the EC to stomach in light of a recent set of CJEU rulings. In 2020, the CJEU handed down a pair of rulings (here and here) on the extent to which European Union (EU) nations may engage in bulk, indiscriminate collection of two types of data related to electronic communications. The CJEU found that while EU member nations may conduct these activities to combat crime or national security threats during periods limited by necessity and subject to oversight, nations may not generally require the providers of electronic communications to store and provide indiscriminate location data and traffic data in response to an actual or prospective national security danger. The CJEU combined three cases, arising from the UK, France, and Belgium, into two rulings to elucidate the reach of the Privacy and Electronic Communications Directive in relation to foundational EU laws.

The UK is, of course, one of the U.S.’s staunchest allies and partners when it comes to government surveillance of electronic communications. On this point, the CJEU summarized the beginning of the case out of the UK:

  • At the beginning of 2015, the existence of practices for the acquisition and use of bulk communications data by the various security and intelligence agencies of the United Kingdom, namely GCHQ, MI5 and MI6, was made public, including in a report by the Intelligence and Security Committee of Parliament (United Kingdom). On 5 June 2015, Privacy International, a non-governmental organisation, brought an action before the Investigatory Powers Tribunal (United Kingdom) against the Secretary of State for Foreign and Commonwealth Affairs, the Secretary of State for the Home Department and those security and intelligence agencies, challenging the lawfulness of those practices.

Secondly, the government of Prime Minister Boris Johnson may aspire to change data laws in ways the EU does not. In media accounts, unnamed EC officials were critical of the UK’s 2020 “National Data Strategy,” particularly references to “legal barriers (real and perceived)” to accessing data that “must be addressed.”

Thirdly, it may become a matter of politics. The EU has incentives to make the UK’s exit from the EU difficult in order to dissuade other nations from following the same path. Moreover, having previously been the second largest economy in the EU as measured by GDP, the UK may prove a formidable economic competitor, lending more weight to the view that the EU may not want to help the UK’s businesses compete with the EU’s.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by succo from Pixabay


[1] European Union Parliament, “The EU-UK relationship beyond Brexit: options for Police Cooperation and Judicial Cooperation in Criminal Matters,” Page 8: Although the UK legal framework is currently broadly in line with the EU legal framework and the UK is a signatory to the European Convention on Human Rights (ECHR), there are substantial questions over whether the Data Protection Act fully incorporates the data protection elements required by the Charter of Fundamental Rights, concerning the use of the national security exemption from the GDPR used by the UK, the retention of data and bulk powers granted to its security services, and over its onward transfer of this data to third country security partners such as the ‘Five Eyes’ partners (Britain, the USA, Australia, New Zealand and Canada).

EU Regulators Settle Dispute Over Proper Punishment of Twitter For Breach

The EDPB uses its GDPR powers to manage a dispute between DPAs.

The European Data Protection Board (EDPB) concluded its first use of powers granted under the General Data Protection Regulation (GDPR) (Regulation (EU) 2016/679 of the European Parliament and of the Council) to resolve a dispute among EU regulators on how to apply the GDPR in punishing a violator. In this case, the EDPB had to referee how Twitter should be punished for a data breach arising from a bug affecting users of its Android app. Ireland’s Data Protection Commission (DPC) and several concerned supervisory authorities (CSAs) disagreed about how Twitter should be fined for the GDPR breach, and so a previously unused article of the GDPR was triggered that put the EDPB in charge of resolving the dispute. The EDPB considered the objections raised by the other EU agencies and found that the DPC needed to recalculate its proposed fine, which had been set at a maximum of $300,000 out of a possible $69.2 million. Thereafter, the DPC revised its decision and concluded that “an administrative fine of €450,000 on Twitter” is “an effective, proportionate and dissuasive measure.”

The DPC issued a revised decision incorporating the EDPB’s ruling in the case, which arose from a glitch that changed users’ protected tweets to unprotected ones. Twitter users may protect their tweets, meaning only certain people, usually just followers, can see this content. However, a bug in Twitter’s Android app thwarted users’ choice to protect their tweets, as the DPC explained:

The bug that resulted in this data breach meant that, if a user operating an Android device changed the email address associated with that Twitter account, their tweets became unprotected and consequently were accessible to the wider public without the user’s knowledge.

The DPC said this breach occurred between September 2017 and January 2019, affecting 88,726 EU and European Economic Area (EEA) users, and on 8 January 2019, Twitter alerted the DPC, triggering an investigation. Twitter revealed:

On 26 December 2018, we received a bug report through our bug bounty program that if a Twitter user with a protected account, using Twitter for Android, changed their email address the bug would result in their account being unprotected.

Article 33(1) of the GDPR requires breaches to be reported to a DPA within 72 hours in most cases:

In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay.

However, Twitter conceded, by way of explaining why it had not reported the breach within the 72-hour window:

The severity of the issue – and that it was reportable – was not appreciated until 3 January 2019 at which point Twitter’s incident response process was put into action.

Additionally, Article 33(5) would become relevant during the DPC investigation:

The controller shall document any personal data breaches, comprising the facts relating to the personal data breach, its effects and the remedial action taken. That documentation shall enable the supervisory authority to verify compliance with this Article.

Consequently, Twitter, as the controller, had a responsibility to report the breach to the DPC within 72 hours of becoming aware of it, subject to a range of exceptions, and to document all the relevant facts about the data breach.
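To make the arithmetic of the 72-hour window concrete, the minimal sketch below applies Article 33(1) to the dates reported in this case (awareness on 3 January 2019, notification on 8 January 2019). It is an illustration of the deadline calculation only, not a reproduction of the DPC’s legal analysis, and the midnight timestamps are an assumption since times of day are not reported.

```python
from datetime import datetime, timedelta

def article_33_deadline(awareness: datetime) -> datetime:
    """Article 33(1) GDPR: notify the supervisory authority without undue delay
    and, where feasible, within 72 hours of becoming aware of the breach."""
    return awareness + timedelta(hours=72)

# Dates from the Twitter/TIC case as described above.
aware = datetime(2019, 1, 3)
notified = datetime(2019, 1, 8)

deadline = article_33_deadline(aware)
print(f"Deadline: {deadline:%d %B %Y}")            # 06 January 2019
print(f"Notified late by: {notified - deadline}")  # 2 days, 0:00:00
```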

Shortly thereafter, the DPC, acting as the lead supervisory authority (LSA), investigated, reached its proposed decision in late April, and shared it with the concerned supervisory authorities. And this is where the need for the EDPB to step in began.

Irish Data Protection Commissioner Helen Dixon explained the scope of the subsequent investigation:

  1. Whether Twitter International Company (TIC) complied with its obligations, in accordance with Article 33(1) GDPR, to notify the Commission of the Breach without undue delay and, where feasible, not later than 72 hours after having become aware of it; and
  2. Whether TIC complied with its obligation under Article 33(5) to document the Breach.

Dixon found that TIC did not comply with Article 33(1) and found unpersuasive TIC’s main claim that, because its processor under EU law, Twitter, Inc., did not alert TIC in a timely fashion, its failure to meet the 72-hour window should be excused. Moreover, Dixon found TIC did not meet its Article 33(5) obligation to document the breach in a way that would allow its compliance with Article 33 to be verified. However, the size of the fine became the issue necessitating the EDPB’s intervention, because the Austrian Supervisory Authority (Österreichische Datenschutzbehörde), the German Supervisory Authority (Der Hamburgische Beauftragte für Datenschutz und Informationsfreiheit) and the Italian Supervisory Authority (Garante per la protezione dei dati personali) made “relevant and reasoned” objections.

Per the GDPR, the EDPB intervened. Article 60 of the GDPR provides that if a CSA “expresses a relevant and reasoned objection to the draft decision [of the LSA], the lead supervisory authority shall, if it does not follow the relevant and reasoned objection or is of the opinion that the objection is not relevant or reasoned, submit the matter to the consistency mechanism.” Article 65 also provides that where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned,” the EDPB must step in and work towards a final binding decision. This process was installed so that enforcement of the GDPR would be uniform throughout the EU and to forestall the possibility that one DPA or a small group of DPAs would construe the data protection regime in ways contrary to its intention. As it is, there have already been allegations that some DPAs have been ineffective or lenient towards alleged offenders.

In its mid-November statement, the EDPB said it “adopted by a 2/3 majority of its members its first dispute resolution decision on the basis of Art. 65 GDPR.” The EDPB stated:

The Irish supervisory authority (SA) issued the draft decision following an own-volition inquiry and investigations into Twitter International Company, after the company notified the Irish SA of a personal data breach on 8 January 2019. In May 2020, the Irish SA shared its draft decision with the concerned supervisory authorities (CSAs) in accordance with Art. 60 (3) GDPR. The CSAs then had four weeks to submit their relevant and reasoned objections (RROs.) Among others, the CSAs issued RROs on the infringements of the GDPR identified by the lead supervisory authority (LSA), the role of Twitter International Company as the (sole) data controller, and the quantification of the proposed fine. 

It appears from the EDPB’s statement that other DPAs/SAs had objected to the size of the fine (which, for these infringements, can be as high as 2% of annual worldwide turnover), to the infringements of the GDPR the DPC identified, and to Twitter’s culpability, namely whether TIC was the sole controller of the personal data or whether other entities might also have been held responsible as controllers.

According to the DPC, the EDPB ultimately decided that

…the [DPC] is required to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC, and to amend its Draft Decision by increasing the level of the fine in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.

Dixon went back and worked through the breach and TIC’s compliance again. She stressed that the GDPR infringements were largely separate and apart from the substance of the breach itself, which is why the proposed administrative fine was low. Nonetheless, Dixon reexamined the evidence in light of the EDPB’s decision and concluded in relevant part:

  • I therefore consider that the nature of the obligations arising under Article 33(1) and Article 33(5) are such that, compliance is central to the overall functioning of the supervision and enforcement regime performed by supervisory authorities in relation to both the specific issue of personal data breaches but also the identification and assessment of wider issues of non-compliance by controllers. As such, non-compliance with these obligations has serious consequences in that it risks undermining the effective exercise by supervisory authorities of their functions under the GDPR. With regard to the nature of the specific infringements in these circumstances, it is clear, having regard to the foregoing, that in the circumstances of this case, the delayed notification under Article 33(1) inevitably delayed the Commission’s assessment of the Breach. With regard to Article 33(5), the deficiencies in the “documenting” of the Breach by TIC impacted on the Commission’s overall efficient assessment of the Breach, necessitating the raising of multiple queries concerning the facts and sequencing surrounding the notification of the Breach.
  • Accordingly, having regard to the potential for damage to data subjects caused by the delayed notification to the Commission (which I have set out above in the context of Article 83(2)(a)), the corollary of this is that any category of personal data could have been affected by the delayed notification. Whilst, as stated above, there was no direct evidence of damage, at the same time, it cannot be definitively said that there was no damage to data subjects or no affected categories of personal data.

Dixon also recalculated the fine, which she noted was capped at the higher of €10 million or 2% of annual worldwide turnover, after once again turning aside TIC’s argument that it is independent of Twitter, Inc. for purposes of determining the fine. Twitter’s worldwide revenue was $3.46 billion in 2019, meaning a maximum penalty of $69.2 million, and Dixon determined the appropriate administrative fine would be about $500,000. Dixon explained:

Having regard to all of the foregoing, and, in particular, having had due regard to all of the factors which I am required to consider under Articles 83(2)(a) to (k), as applicable, and in the interests of effectiveness, proportionality and deterrence, and in light of the re-assessment of the elements I have implemented and documented above in accordance with the EDPB Decision, I have decided to impose an administrative fine of $500,000, which equates (in my estimation for this purpose) to €450,000. In deciding to impose a fine in this amount, I have had regard to the previous range of the fine, set out in the Draft Decision (of $150,000 – $300,000), and to the binding direction in the EDPB Decision, at paragraph 207 thereof, that the level of the fine should be increased “..in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.”
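To make the arithmetic concrete, here is a minimal, purely illustrative sketch of the Article 83(4) ceiling and where the final fine falls within it. The turnover figure is the one quoted above; the USD/EUR rate is simply the rate implied by Dixon’s $500,000 ≈ €450,000 estimate, and the conversions are assumptions for illustration rather than part of the decision itself:

```python
# Illustrative only: Article 83(4) GDPR caps fines for Article 33 infringements
# at the higher of EUR 10 million or 2% of total worldwide annual turnover.
# The turnover figure is the one quoted in the decision; the USD/EUR rate is
# simply the rate implied by Dixon's $500,000 ~= EUR 450,000 estimate.

EUR_10_MILLION = 10_000_000
USD_PER_EUR = 500_000 / 450_000            # ~1.11, implied by the decision

turnover_usd = 3_460_000_000               # Twitter's 2019 worldwide revenue
turnover_eur = turnover_usd / USD_PER_EUR

cap_eur = max(EUR_10_MILLION, 0.02 * turnover_eur)
fine_eur = 450_000                         # the fine Dixon ultimately imposed

print(f"Ceiling: EUR {cap_eur:,.0f} (about ${0.02 * turnover_usd:,.0f})")
print(f"Fine:    EUR {fine_eur:,} ({fine_eur / cap_eur:.1%} of the ceiling)")
```

On these figures, the €450,000 fine comes to well under 1% of a ceiling of roughly €62 million, which helps explain why the objecting CSAs and the EDPB viewed the originally proposed range of $150,000 to $300,000 as insufficiently dissuasive.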

In its Article 65 decision, the EDPB judged the various objections to the DPC’s proposed decision against Article 4(24) of the GDPR:

‘relevant and reasoned objection’ means an objection to a draft decision as to whether there is an infringement of this Regulation, or whether envisaged action in relation to the controller or processor complies with this Regulation, which clearly demonstrates the significance of the risks posed by the draft decision as regards the fundamental rights and freedoms of data subjects and, where applicable, the free flow of personal data within the Union;

The EDPB ultimately decided “the fine proposed in the Draft Decision is too low and therefore does not fulfil its purpose as a corrective measure, in particular it does not meet the requirements of Article 83(1) GDPR of being effective, dissuasive and proportionate.” The EDPB directed the DPC “to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC so as to ensure it is appropriate to the facts of the case.” However, the EDPB turned aside a number of other objections raised by EU DPAs as failing to meet the standard of review in Article 4(24):

  • the competence of the LSA;
  • the qualification of the roles of TIC and Twitter, Inc., respectively;
  • the infringements of the GDPR identified by the LSA;
  • the existence of possible additional (or alternative) infringements of the GDPR;
  • the lack of a reprimand.

At the same time, the EDPB stressed:

Regarding the objections deemed not to meet the requirements stipulated by Art 4(24) GDPR, the EDPB does not take any position on the merit of any substantial issues raised by these objections. The EDPB reiterates that its current decision is without any prejudice to any assessments the EDPB may be called upon to make in other cases, including with the same parties, taking into account the contents of the relevant draft decision and the objections raised by the CSAs.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by papagnoc from Pixabay

Privacy Shield Hearing

The focus was on how the U.S. and EU can reach agreement on an arrangement that will not be struck down by the EU’s highest court.

Last week, the Senate Commerce, Science, and Transportation Committee held a hearing on the now invalidated European Union (EU)-United States (U.S.) Privacy Shield, a mechanism that allowed companies to transfer the personal data of EU residents to the U.S. The EU’s highest court struck down the adequacy decision that underpinned the system on the grounds that U.S. surveillance activities and the lack of redress violated EU law. This is the second time in the past decade the EU’s top court has invalidated a transfer arrangement, the first being the Safe Harbor system. Given the estimated billions, or even trillions, of dollars in value realized from data flows between the EU and U.S., there is keen interest on both sides of the Atlantic in finding a legal path forward. However, absent significant curtailment of U.S. surveillance and/or a significant expansion of the means by which EU nationals could have violations of their rights rectified, it would appear a third agreement may not withstand the inevitable legal challenges. Moreover, there are questions as to the legality of other transfer tools in light of the Court of Justice of the European Union’s (CJEU) decision in the case known as Schrems II, and some Standard Contractual Clauses (SCC) and Binding Corporate Rules (BCR) may soon be found invalid, too.

Consequently, a legislative fix, or some portion thereof, could be attached to federal privacy legislation, and the striking down of Privacy Shield may provide additional impetus to Congress and the next Administration to reach a deal on privacy. Moreover, the lapsed reauthorization of some Foreign Intelligence Surveillance Act (FISA) authorities may be another legislative opportunity for the U.S. to craft an approach amenable to the EU in order to obtain either an adequacy decision or a successor agreement to Privacy Shield.

Chair Roger Wicker (R-MS) approached the issue from the perspective of international trade and the economic benefits accruing to businesses on both sides of the Atlantic. His opening remarks pertained less to the privacy and surveillance aspects of the CJEU’s ruling. Wicker appeared to argue that the EU misunderstands U.S. redress rights, which he considers more than adequate, and that the U.S. surveillance regime is similar to those of some EU nations; one wonders whether the CJEU would be inclined to agree. Nonetheless, Wicker expressed hope that the EU and U.S. can reach “a durable and lasting data transfer framework…that provides meaningful data protections to consumers, sustains the free flow of information across the Atlantic, and encourages continued economic and strategic partnership with our European allies – a tall order but an essential order.” He worried about the effect of the CJEU’s ruling on SCCs. Wicker made the case that the EU and U.S. share democratic values and hinted that the ongoing talks in the committee to reach a federal data privacy law might include augmented redress rights that could satisfy the CJEU.

Ranking Member Maria Cantwell (D-WA) spoke very broadly about a range of issues related to data transfers and privacy. She stressed the importance of data flows in the context of the larger trade relationship. Cantwell also stressed the shared values between the U.S. and the EU and her hope that the two entities work “together on these very important national concerns, trade and technology, so that we can continue to improve economic opportunities and avoid moves towards protectionism.” She also called for federal privacy legislation but hinted that states should still be able to regulate privacy, suggesting her commitment to a federal law that acts as a floor rather than a ceiling for state laws. Cantwell also asserted that bulk surveillance, of the sort the National Security Agency (NSA) has engaged in, may simply not be legal under EU law.

Deputy Assistant Secretary of Commerce for Services James Sullivan blurred the issues presented by Schrems II much as Cantwell did. The CJEU’s decision, which focused on U.S. surveillance practices and the lack of meaningful recourse in the U.S. for EU residents whose rights are violated, was recast as a call for like-minded nations to unite against authoritarian nations. Sullivan distinguished between U.S. surveillance and the surveillance conducted by the People’s Republic of China (PRC) (without naming the nation) and other regimes, as if this should satisfy the EU as to the legality and propriety of U.S. treatment of EU personal data. Sullivan stated:

  • The Schrems II decision has created enormous uncertainties for U.S. companies and the transatlantic economy at a particularly precarious time. Immediately upon issuance of the ruling, the 5,400 Privacy Shield participants and their business partners in the EU could no longer rely on the Framework as a lawful basis for transferring personal data from Europe to the United States. Because neither the Court nor European data protection authorities provided for any enforcement grace period, Privacy Shield companies were left with three choices: (1) risk facing potentially huge fines (of up to 4 percent of total global turnover in the preceding year) for violating GDPR, (2) withdraw from the European market, or (3) switch right away to another more expensive data transfer mechanism.
  • Unfortunately, because of the Court’s ruling in the Privacy Shield context that U.S. laws relating to government access to data do not confer adequate protections for EU personal data, the use of other mechanisms like SCCs and BCRs to transfer EU personal data to the United States is now in question as well.
  • The objective of any potential agreement between the United States and the European Commission to address Schrems II is to restore the continuity of transatlantic data flows and the Framework’s privacy protections by negotiating targeted enhancements to Privacy Shield that address the Court’s concerns in Schrems II. Any such enhancements must respect the U.S. Government’s security responsibilities to our citizens and allies.
  • To be clear, we expect that any enhancements to the Privacy Shield Framework would also cover transfers under all other EU-approved data transfer mechanisms like SCCs and BCRs as well.
  • The Schrems II decision has underscored the need for a broader discussion among likeminded democracies on the issue of government access to data. Especially as a result of the extensive U.S. surveillance reforms since 2015, the United States affords privacy protections relating to national security data access that are equivalent to or greater than those provided by many other democracies in Europe and elsewhere.
  • To minimize future disruptions to data transfers, we have engaged with the European Union and other democratic nations in a multilateral discussion to develop principles based on common practices for addressing how best to reconcile law enforcement and national security needs for data with protection of individual rights.
  • It is our view that democracies should come together to articulate shared principles regarding government access to personal data—to help make clear the distinction between democratic societies that respect civil liberties and the rule of law and authoritarian governments that engage in the unbridled collection of personal data to surveil, manipulate, and control their citizens and other individuals without regard to personal privacy and human rights. Such principles would allow us to work with like-minded partners in preserving and promoting a free and open Internet enabled by the seamless flow of data.

Federal Trade Commission (FTC) Commissioner Noah Joshua Phillips stressed he was speaking in a personal capacity and not for the FTC. He extolled the virtues of the “free and open” internet model in the U.S., with the double implication that it is superior both to the models of nations like the PRC and Russia and to the EU model. Phillips seemed to be advocating for persuading the EU to accept that the U.S.’s privacy regime and civil liberties protections are stronger than any other nation’s. He also made the case, like other witnesses, that U.S. data privacy and protection regulation is more similar to the EU’s than to those of the PRC, Russia, and others. Phillips also sought to blur the issues and recast Privacy Shield in the context of the global struggle between democracies and authoritarian regimes. Phillips asserted:

  • First, we need to find a path forward after Schrems II, to permit transfers between the U.S. and EU. I want to recognize the efforts of U.S. and EU negotiators to find a replacement for Privacy Shield. While no doubt challenging, I have confidence in the good faith and commitment of public servants like Jim Sullivan, with whom I have the honor of appearing today, and our partners across the Atlantic. I have every hope and expectation that protecting cross-border data flows will be a priority for the incoming Administration, and I ask for your help in ensuring it is.
  • Second, we must actively engage with nations evaluating their approach to digital governance, something we at the FTC have done, to share and promote the benefits of a free and open Internet. There is an active conversation ongoing internationally, and at every opportunity—whether in public forums or via private assistance—we must ensure our voice and view is heard.
  • Third, we should be vocal in our defense of American values and policies. While we as Americans always look to improve our laws—and I commend the members of this committee on their important work on privacy legislation and other critical matters—we do not need to apologize to the world. When it comes to civil liberties or the enforcement of privacy laws, we are second to none. Indeed, in my view, the overall U.S. privacy framework—especially with the additional protections built into Privacy Shield—should certainly qualify as adequate under EU standards.
  • Fourth, as European leaders call to strengthen ties with the U.S., we should prioritize making our regimes compatible for the free flow of data. This extends to the data governance regimes of like-minded countries outside of Europe as well. Different nations will have different rules, but relatively minor differences need not impede mutually-beneficial commerce. We need not and should not purport to aim for a single, identical system of data governance. And we should remind our allies, and remind ourselves, that far more unites liberal democracies than divides us.
  • Fifth and finally, if we must draw lines, those lines should be drawn between allies with shared values—the U.S., Europe, Japan, Australia, and others—and those, like China and Russia, that offer a starkly different vision. I am certainly encouraged when I hear recognition of this distinction from Europe. European Data Protection Supervisor Wojciech Wiewiórowski recently noted that the U.S. is much closer to Europe than is China and that he has a preference for data being processed by countries that share values with Europe. Some here in the U.S. are even proposing agreements to solidify the relationships among technologically advanced democracies, an idea worth exploring in more detail

Washington University Professor of Law Neil Richards stressed that the Schrems II decision spells out how the U.S. could achieve adequacy: reforming surveillance and providing meaningful redress for alleged privacy violations. Consequently, FISA would need to be rewritten and narrowed, and EU residents would need a means of seeking relief beyond the current Ombudsman system, possibly a statutory right to sue. Moreover, he asserted that strong data protection and privacy laws are needed and that some of the bills introduced in this Congress could serve that purpose. Richards asserted:

In sum, the Schrems litigation is a creature of distrust, and while it has created problems for American law and commerce, it has also created a great opportunity. That opportunity lies before this Committee –the chance to regain American leadership in global privacy and data protection by passing a comprehensive law that provides appropriate safeguards, enforceable rights, and effective legal remedies for consumers. I believe that the way forward can not only safeguard the ability to share personal data across the Atlantic, but it can do so in a way that builds trust between the United States and our European trading partners and between American companies and their American and European customers. I believe that there is a way forward, but it requires us to recognize that strong, clear, trust-building rules are not hostile to business interest, that we need to push past the failed system of “notice and choice,” that we need to preserve effective consumer remedies and state-level regulatory innovation, and seriously consider a duty of loyalty. In that direction, I believe, lies not just consumer protection, but international cooperation and economic prosperity.

Georgia Tech Professor Peter Swire explained that the current circumstances make the next Congress the best opportunity in memory to enact privacy legislation because of the need for a Privacy Shield replacement, the passage of the new California Privacy Rights Act (Proposition 24), and the Biden Administration’s likely support for such legislation. Swire made the following points:

  1. The European Data Protection Board in November issued draft guidance with an extremely strict interpretation of how to implement the Schrems II case.
  2. The decision in Schrems II is based on EU constitutional law. There are varying current interpretations in Europe of what is required by Schrems II, but constitutional requirements may restrict the range of options available to EU and U.S. policymakers.
  3. Strict EU rules about data transfers, such as the draft EDPB guidance, would appear to result in strict data localization, creating numerous major issues for EU- and U.S.-based businesses, as well as affecting many online activities of EU individuals.
  4. Along with concerns about lack of individual redress, the CJEU found that the EU Commission had not established that U.S. surveillance was “proportionate” in its scope and operation. Appendix 2 to this testimony seeks to contribute to an informed judgment on proportionality, by cataloguing developments in U.S. surveillance safeguards since the Commission’s issuance of its Privacy Shield decision in 2016.
  5. Negotiating an EU/U.S. adequacy agreement is important in the short term.
  6. A short-run agreement would assist in creating a better overall long-run agreement or agreements.
  7. As the U.S. considers its own possible legal reforms in the aftermath of Schrems II, it is prudent and a normal part of negotiations to seek to understand where the other party – the EU – may have flexibility to reform its own laws.
  8. Issues related to Schrems II have largely been bipartisan in the U.S., with substantial continuity across the Obama and Trump administrations, and expected as well for a Biden administration.
  9. Passing comprehensive privacy legislation would help considerably in EU/U.S. negotiations.
  10. This Congress may have a unique opportunity to enact comprehensive commercial privacy legislation for the United States.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Dooffy Design from Pixabay

Further Reading, Other Developments, and Coming Events (18 November)

Further Reading

  • Trump fires top DHS official who refuted his claims that the election was rigged” By Ellen Nakashima and Nick Miroff — The Washington Post. As rumored, President Donald Trump has decapitated the United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA). Director Christopher Krebs was fired via Twitter, after he had endorsed a letter by 59 experts on election security who said there was no fraud in the election. Trump tweeted: “The recent statement by Chris Krebs on the security of the 2020 Election was highly inaccurate, in that there were massive improprieties and fraud — including dead people voting, Poll Watchers not allowed into polling locations, ‘glitches’ in the voting machines which changed votes from Trump to Biden, late voting, and many more. Therefore, effective immediately, Chris Krebs has been terminated as Director of the Cybersecurity and Infrastructure Security Agency.” Of course, the statement CISA cosigned and issued last week asserting there was no evidence of fraud or wrongdoing in the election probably did not help his prospects. Additionally, CISA Deputy Director Matthew Travis was essentially forced out when he was informed the normal succession plan would be ignored and he would not become the acting head of CISA. A CISA senior civil servant, Brandon Wales, will helm the agency on an acting basis. Last week, CISA’s Assistant Director for Cybersecurity Bryan Ware was forced out.
  • NSA Spied On Denmark As It Chose Its Future Fighter Aircraft: Report” By Thomas Newdick — The Drive. A Danish media outlet is claiming the United States’ (U.S.) National Security Agency (NSA) spied on Denmark’s Ministry of Finance, the Ministry of Foreign Affairs, and the defense firm Terma in order to help Lockheed Martin’s bid to sell F-35 Joint Strike Fighters to Denmark. Eurofighter GmbH and Saab were offering their Typhoon and Gripen fighters to replace Denmark’s F-16s. Reportedly, the NSA used an existing arrangement with Denmark to obtain information from a program allowing the NSA access to fiber optic cables in the country. It is likely Denmark did not have such surveillance in mind when it struck this agreement with the U.S. Two whistleblower reports have been filed with the Forsvarets Efterretningstjeneste (FE), Denmark’s Defense Intelligence Service, and there are allegations that the U.S. surveillance was illegal. However, the surveillance appears not to have influenced the Danish government, which opted for the F-35. Earlier this year, there were allegations the FE was improperly sharing Danish cables containing information on Danish citizens.
  • Facebook Knows That Adding Labels To Trump’s False Claims Does Little To Stop Their Spread” By Craig Silverman and Ryan Mac — BuzzFeed News. These reporters must know half of Facebook’s staff because they always see what is going on internally with the company. In this latest scoop, they say they have seen internal numbers showing that labeling President Donald Trump’s false posts has done little to slow their spread. In fact, labeling may only slow their spread by 8%. This outcome is contrary to a practice Facebook employed in 2017 under which fact checkers would label untrue posts as false, which reduced their virality by 80%.
  • Apple Halves Its App Store Fee for the Smaller Companies” By Jack Nicas — The New York Times. The holiday spirit must already be afoot in Cupertino, California, for small app developers will now only pay Apple 15% of in-app purchases for the privilege of being in the App Store. Of course, this decision has nothing to do with the antitrust pressure the company is facing in the European Union and United States (U.S.) and will have very little impact on its bottom line since app developers with less than $1 million in revenue (i.e., those entitled to a reduction) account for 2% of App Store revenue. It does give Apple’s leadership and executives some great talking points when pressed by antitrust investigators, legislators, and the media.
  • Inside the behind-the-scenes fight to convince Joe Biden about Silicon Valley” By Theodore Schleifer — recode. The jockeying among factions in the Democratic party and other stakeholders is fierce and will only grow fiercer when it comes to who will serve where in a Biden Administration. Silicon Valley and those who would reform tech are fighting to get people amenable to their policy goals placed in the new Administration. President-elect Joe Biden and his campaign were ambiguous on many tech policy issues and retain flexibility, which has been helped further by the appointment of people respected in both camps, like new White House Chief of Staff Ron Klain.
  • Group of 165 Google critics calls for swift EU antitrust action – letter” By Foo Yun Chee — Reuters. A wide-ranging group of companies and industry associations are urging the European Union to investigate and punish what they see as Google’s anti-competitive dominance of online search, especially the One Box that now appears at the top of search results and points people to Google sites and products.

Other Developments

  • The European Union (EU) announced a revision of its export control process for allowing the export of dual-use items, including cyber-surveillance tools. The European Commission (EC) asserted “[t]hanks to the new Regulation, the EU can now effectively protect its interests and values and, in particular, address the risk of violations of human rights associated with trade in cyber-surveillance technologies without prior agreement at multilateral level…[and] also enhances the EU’s capacity to control trade flows in sensitive new and emerging technologies.” The EC explained “[t]he new Regulation includes many of the Commission proposals for a comprehensive “system upgrade”, and will make the existing EU Export control system more effective by:
    • introducing a novel ‘human security’ dimension so the EU can respond to the challenges posed by emerging dual-use technologies – especially cyber-surveillance technologies – that pose a risk to national and international security, including protecting human rights;
    • updating key notions and definitions (e.g. definition of an “exporter” to apply to natural persons and researchers involved in dual-use technology transfers);
    • simplifying and harmonising licensing procedures and allowing the Commission to amend – by ‘simplified’ procedure, i.e. delegated act – the list of items or destinations subject to specific forms of control, thereby making the export control system more agile and able to evolve and adjust to circumstances;
    • enhancing information-exchange between licensing authorities and the Commission with a view to increasing transparency of licensing decisions;
    • coordination of, and support for, robust enforcement of controls, including enhancing secure electronic information-exchange between licensing and enforcement agencies;
    • developing an EU capacity-building and training programme for Member States’ licensing and enforcement authorities;
    • outreach to industry and transparency with stakeholders, developing a structured relationship with the private sector through specific consultations of stakeholders by the relevant Commission group of Member-State experts, and;
    • setting up a dialogue with third countries and seeking a level playing field at global level.
    • The European Parliament contended:
      • The reviewed rules, agreed by Parliament and Council negotiators, govern the export of so-called dual use goods, software and technology – for example, high-performance computers, drones and certain chemicals – with civilian applications that might be repurposed to be used in ways which violate human rights.
      • The current update, made necessary by technological developments and growing security risks, includes new criteria to grant or reject export licenses for certain items.
      • The Parliament added its negotiators
        • got agreement on setting up an EU-wide regime to control cyber-surveillance items that are not listed as dual-use items in international regimes, in the interest of protecting human rights and political freedoms;
        • strengthened member states’ public reporting obligations on export controls, so far patchy, to make the cyber-surveillance sector in particular more transparent;
        • increased the importance of human rights as licensing criterion; and
        • agreed on rules to swiftly include emerging technologies in the regulation.
  • The United States House of Representatives passed three technology bills by voice vote yesterday. Two of these bills would address, in different ways, the United States’ (U.S.) efforts to make up ground on the People’s Republic of China in the race to roll out 5G networks. It is not clear whether the Senate will take up these bills and send them to the White House before year’s end, though it is possible given how discrete the bills are in scope. The House Energy and Commerce Committee provided these summaries:
    • The “Utilizing Strategic Allied (USA) Telecommunications Act of 2020” (H.R.6624) creates a new grant program through the National Telecommunications and Information Administration (NTIA) to promote technology that enhances supply chain security and market competitiveness in wireless communications networks.
      • One of the bill’s sponsors, House Energy and Commerce Committee Chair Frank Pallone Jr (D-NJ) stated:
        • Earlier this year, the House passed, and the President signed, my Secure and Trusted Communications Networks Act to create a program to fund the replacement of suspect network equipment. Suspect equipment, including that produced by Huawei and ZTE, could allow foreign adversaries to surveil Americans at home or, worse, disrupt our communications systems.
        • While we are still pushing for Congress to appropriate funds to that end, it is important to recognize that my legislation was only half the battle, even when it is funded. We also need to create and foster competition for trusted network equipment that uses open interfaces so that the United States is not beholden to a market for network equipment that is becoming less competitive. This bill before us today, the Utilizing Strategic Allied Telecommunications Act, or the USA Telecommunications Act, does just that.
        • The bipartisan legislation creates a grant program and authorizes $750 million in funding for the National Telecommunications and Information Administration to help promote and deploy Open Radio Access Network technologies that can spur that type of competition. We must support alternatives to companies like Huawei and ZTE…
    • The “Spectrum IT Modernization Act of 2020” (H.R.7310) requires NTIA – in consultation with the Policy and Plans Steering Group – to submit to Congress a report on its plans to modernize agency information technology systems relating to managing the use of federal spectrum. 
      • A sponsor of the bill, House Energy and Commerce Committee Ranking Member Greg Walden (R-OR) explained:
        • H.R. 7310 would require NTIA to establish a process to upgrade their spectrum management infrastructure for the 21st century. The bill would direct the policy coordination arm of NTIA to submit a plan to Congress as to how they will standardize the data collection across agencies and then directs agencies with Federal spectrum assignments from NTIA to issue an implementation plan to interoperate with NTIA’s plan.
        • This is a good-government bill–it really is–and with continued support and oversight from Congress, we can continue the United States’ leadership in making Federal spectrum available for flexible use by the private sector.
    • The “Reliable Emergency Alert Distribution Improvement (READI) Act of 2020” (H.R.6096) amends the Warning, Alert, and Response Network Act to classify emergency alerts from the Federal Emergency Management Agency as a type of alert that commercial mobile service providers may not allow subscribers to block from their devices. The bill also directs the Federal Communications Commission (FCC) to adopt regulations to facilitate coordination with State Emergency Communications Committees in developing and modernizing State Emergency Alert System plans. Finally, the READI Act directs the FCC to examine the feasibility of modernizing the Emergency Alert System by expanding alert distribution to the internet and streaming services.  
  • The same privacy activists that brought the suits that resulted in the striking down of the Safe Harbor and Privacy Shield agreements have filed complaints in Spain and Germany alleging Apple has violated the European Union’s (EU) e-Privacy Directive and national laws in each country through its use of the IDFA (Apple’s Identifier for Advertisers). Because the General Data Protection Regulation (GDPR) is not the grounds for the complaints, each nation could act without needing to consult other EU nations. Moreover, a similar system used by Google is also being investigated for possible violations. The group none of your business (noyb) asserted:
    • IDFA – the cookie in every iPhone user’s pocket. Each iPhone runs on Apple’s iOS operating system. By default, iOS automatically generates a unique “IDFA” (short for Identifier for Advertisers) for each iPhone. Just like a license plate this unique string of numbers and characters allows Apple and other third parties to identify users across applications and even connect online and mobile behaviour (“cross device tracking”).
    • Tracking without user consent. Apple’s operating system creates the IDFA without user’s knowledge or consent. After its creation, Apple and third parties (e.g. applications providers and advertisers) can access the IDFA to track users’ behaviour, elaborate consumption preferences and provide personalised advertising. Such tracking is strictly regulated by the EU “Cookie Law” (Article 5(3) of the e-Privacy Directive) and requires the users’ informed and unambiguous consent.
    • Insufficient “improvement” on third-party access. Recently Apple announced plans for future changes to the IDFA system. These changes seem to restrict the use of the IDFA for third parties (but not for Apple itself). Just like when an app requests access to the camera or microphone, the plans foresee a new dialog that asks the user if an app should be able to access the IDFA. However, the initial storage of the IDFA and Apple’s use of it will still be done without the users’ consent and therefore in breach of EU law. It is unclear when and if these changes will be implemented by the company.
    • No need for EU cooperation. As the complaint is based on Article 5(3) of the e-Privacy Directive and not the GDPR, the Spanish and German authorities can directly fine Apple, without the need for cooperation among EU Data Protection Authorities as under GDPR.
  • The Federal Trade Commission (FTC) Chair made remarks at an antitrust conference on how antitrust law should view “an acquisition of a nascent competitive threat by a monopolist when there is reason to think that the state of competition today may not tell the whole story.” Chair Joseph Simons’ views are timely for a number of reasons, particularly the extent to which large technology firms have sought and bought smaller, newer companies. Obviously, the acquisitions of WhatsApp and Instagram by Facebook and of YouTube and AdSense by Google come to mind as the sorts of acquisitions United States (U.S.) regulators approved, possibly without much thought given to what a future market may look like for competition if the larger, dominant company is allowed to proceed. Simons suggested regulators and courts would be wise to give this aspect of antitrust much more thought, which could theoretically inform the approach the Biden Department of Justice and FTC take. Simons stated:
    • And if firms are looking to the future, then antitrust enforcers should too. We must be willing and able to recognize that harm to competition might not be obvious from looking at the marketplace as it stands. If we confine ourselves to examining a static picture of the market at the moment we investigate a practice or transaction, without regard to the dynamic business realities at work, then we risk forfeiting the benefits of competition that could arise in the future to challenge the dominant firm, even when this future competition is to some extent uncertain.
    • Simons asserted:
      • A merger or acquisition can of course constitute anticompetitive conduct for purposes of Section 2 [of the Sherman Act]
      • From a competition perspective, a monopolist can “squash” a nascent competitor by buying it, not just by targeting it with anticompetitive actions as Microsoft did. In fact, from the monopolist’s perspective, it may be easier and more effective to buy the nascent threat (even if only to keep it out of the hands of others) than to target it with other types of anticompetitive conduct.
      • A central issue in potential competition cases is the nature and strength of evidence that the parties will become actual competitors in the future. Some cases have applied Section 7 [of the Clayton Act] narrowly in this context: too narrowly, I think, given that the purpose of Section 7 is to prohibit acquisitions that “may” substantially lessen competition or “tend” to create a monopoly.
    • Simons concluded:
      • But uncertainty has always been a feature of the competitive process, even in markets that appear to be simple or traditional, and dealing with uncertainty is all in a day’s work for an antitrust enforcer. I have referred to the Microsoft case repeatedly today, so, in closing, let me remind everyone that there was some uncertainty about the future in Microsoft as well. The court, in holding that the plaintiff does not and should not bear the burden of “reconstruct[ing] a product’s hypothetical development,” observed that the defendant should appropriately be “made to suffer the uncertain consequences of its own undesirable conduct.” The same holds when the monopolist has simply chosen to acquire the threat.
  • The National Institute of Standards and Technology’s (NIST) National Initiative for Cybersecurity Education (NICE) revised the Workforce Framework for Cybersecurity (NICE Framework) that “improves communications about how to identify, recruit, develop, and retain cybersecurity talent – offering a common, consistent lexicon that categorizes and describes cybersecurity work.” NIST explained:
    • The NICE Framework assists organizations with managing cybersecurity risks by providing a way to discuss the work and learners associated with cybersecurity. These cybersecurity risks are an important input into enterprise risk decisions as described in NIST Interagency Report 8286, Integrating Cybersecurity and Enterprise Risk Management (ERM).
    • NIST stated “[r]evisions to the NICE Framework (NIST Special Publication 800-181) provide:
      • A streamlined set of “building blocks” comprised of Task, Knowledge, and Skill Statements;
      • The introduction of Competencies as a mechanism for organizations to assess learners; and
      • A reference to artifacts, such as Work Roles and Knowledge Skills and Abilities statements, that will live outside of the publication to enable a more fluid update process.
  • A center-left think tank published a report on how the United States (U.S.) and likeminded nations can better fight cybercrime. In the report, addressed to President-elect Joe Biden and Vice President-elect Kamala Harris, Third Way presented the results of a “multiyear effort to define concrete steps to improve the government’s ability to tackle the scourge of cybercrime by better identifying unlawful perpetrators and imposing meaningful consequences on them and those behind their actions.” In “A Roadmap to Strengthen US Cyber Enforcement: Where Do We Go From Here?,” Third Way made a list of detailed recommendations on how the Biden Administration could better fight cybercrime, but in the cover letter to the report, there was a high-level summary of these recommendations:
    • In this roadmap, we identify the challenges the US government faces in investigating and prosecuting these crimes and advancing the level of international cooperation necessary to do so. Cyberattackers take great pains to hide their identity, using sophisticated tools that require technical investigative and forensic expertise to attribute the attacks. The attacks are often done at scale, where perpetrators prey on multiple victims across many jurisdictions and countries, requiring coordination across criminal justice agencies. The skills necessary to investigate these crimes are in high demand in the private sector, making it difficult to retain qualified personnel. A number of diplomatic barriers make cross-border cooperation difficult, a challenge often exacerbated by blurred lines between state and non-state actors in perpetrating these crimes.
    • This roadmap recommends actions that your administration can take to develop a comprehensive strategy to reduce cybercrime and minimize its impact on the American people by identifying the perpetrators and imposing meaningful consequences on them. We propose you make clear at the outset to the American public and global partners that cyber enforcement will be a top priority for your administration. In reinstating a White House cybersecurity position, we have extensive recommendations on how that position should address cybercrime. And, to make policy from an intelligence baseline, we believe you should request a National Intelligence Estimate on the linkages between cybercrime and nation-state cyber actors to understand the scope of the problem.
    • Our law enforcement working group has detailed recommendations to improve and modernize law enforcement’s ability to track and respond to cybercrime. And our global cooperation working group has detailed recommendations on creating a cohesive international cyber engagement strategy; assessing and improving the capacity of foreign partners on cybercrime; and improving the process for cross-border data requests that are critical to solving these crimes. We believe that with these recommendations, you can make substantial strides in bringing cybercriminals to justice and deterring future cybercriminals from victimizing Americans.

Coming Events

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Further Reading, Other Developments, and Coming Events (17 November)

Further Reading

  • How the U.S. Military Buys Location Data from Ordinary Apps” By Joseph Cox — Vice’s Motherboard. This article confirms the entirely foreseeable: the Department of Defense and its contractors are obtaining and using personal information from smartphones all over the world. Given this practice is common among United States (U.S.) law enforcement agencies, it is little surprise the U.S. military is doing the same. Perhaps the fact that the U.S. is doing this has been one of the animating forces behind the Trump Administration’s moves against applications from the People’s Republic of China (PRC)?
  • Regulators! Stand Back: Under a Biden administration, Big Tech is set for a field day” By Lizzie O’Shea — The Baffler. This piece argues that a Biden Administration may be little more than a return to the Obama Administration’s favorable view of, and largely laissez-faire regulatory approach toward, big technology companies. At least one expert worries the next administration may do enough on addressing big tech to appear to be doing something but not nearly enough to change the current market and societal dynamics.
  • Cheating-detection companies made millions during the pandemic. Now students are fighting back.” By Drew Harwell — The Washington Post. There are scores of problems with online testing platforms, including weak or easily compromised data security and privacy safeguards. Many students report getting flagged for stretching, looking off-screen, and even needing to go to the restroom. However, the companies in the market are in growth-mode and seem unresponsive to such criticisms.
  • Zuckerberg defends not suspending ex-Trump aide Bannon from Facebook: recording” By Katie Paul — Reuters. On an internal company call, Facebook CEO Mark Zuckerberg defended the platform’s decision not to deactivate former White House advisor Steve Bannon’s account after he “metaphorically” advocated for the beheadings of Federal Bureau of Investigation Director Christopher Wray and National Institute of Allergy and Infectious Diseases (NIAID) Director Anthony Fauci. Zuckerberg also reassured employees that a Biden Administration would not necessarily be entirely adversarial to Facebook.
  • How Trump uses Twitter to distract the media – new research” By Ullrich Ecker, Michael Jetter, and Stephan Lewandowsky — The Conversation. Research backs up the assertion that President Donald Trump has tweeted bizarre non-sequiturs to distract from what he perceived to be negative stories, and it worked because the media reported on the tweets almost every time. Trump is not the only politician or leader using this strategy.
  • Bumble Vulnerabilities Put Facebook Likes, Locations And Pictures Of 95 Million Daters At Risk” By Thomas Brewster — Forbes. Users of the dating app Bumble were at risk due to weak security that white-hat researchers easily circumvented. Worse still, it took the company months to address and fix these vulnerabilities after being informed.

Other Developments

  • A number of United States (U.S.) election security stakeholders issued a statement carefully and tactfully refuting the claims of President Donald Trump and other Republicans that President-elect Joe Biden won the election only because of massive fraud. These officials declared “[t]he November 3rd election was the most secure in American history” and “[t]here is no evidence that any voting system deleted or lost votes, changed votes, or was in any way compromised.”
    • The officials seemed to flatly contradict Trump and others:
      • While we know there are many unfounded claims and opportunities for misinformation about the process of our elections, we can assure you we have the utmost confidence in the security and integrity of our elections, and you should too.
    • The members of Election Infrastructure Government Coordinating Council (GCC) Executive Committee – Cybersecurity and Infrastructure Security Agency (CISA) Assistant Director Bob Kolasky, U.S. Election Assistance Commission Chair Benjamin Hovland, National Association of Secretaries of State (NASS) President Maggie Toulouse Oliver, National Association of State Election Directors (NASED) President Lori Augino, and Escambia County (Florida) Supervisor of Elections David Stafford – and the members of the Election Infrastructure Sector Coordinating Council (SCC) – Chair Brian Hancock (Unisyn Voting Solutions), Vice Chair Sam Derheimer (Hart InterCivic), Chris Wlaschin (Election Systems & Software), Ericka Haas (Electronic Registration Information Center), and Maria Bianchi (Democracy Works) issued the statement.
  • President Donald Trump signed an executive order that would bar from United States (U.S.) securities markets those companies from the People’s Republic of China (PRC) connected to the PRC’s “military-industrial complex.” This order would take effect on 11 January 2021 and seeks, as a matter of national security, to cut off access to U.S. capital for these PRC companies because “the PRC exploits United States investors to finance the development and modernization of its military.” Consequently, Trump declared a national emergency with respect to the PRC’s behavior, which puts at the Administration’s disposal a host of powers to deny funds and access to the objects of such an order. It remains to be seen whether the Biden Administration will rescind or keep in place this executive order when it takes office, ten days after the order takes effect. Nevertheless, Trump asserted:
    • that the PRC is increasingly exploiting United States capital to resource and to enable the development and modernization of its military, intelligence, and other security apparatuses, which continues to allow the PRC to directly threaten the United States homeland and United States forces overseas, including by developing and deploying weapons of mass destruction, advanced conventional weapons, and malicious cyber-enabled actions against the United States and its people.
  • Microsoft revealed it has “detected cyberattacks from three nation-state actors targeting seven prominent companies directly involved in researching vaccines and treatments for Covid-19.” Microsoft attributed these attacks to Russian and North Korean hackers and tied the announcement to the company’s advocacy at the Paris Peace Forum, where the United States (U.S.) multinational reiterated its calls for “the world’s leaders to affirm that international law protects health care facilities and to take action to enforce the law.” Microsoft sought to position its cyber efforts within larger diplomatic efforts to define the norms of cyberspace and to bring cyber action into the body of international law. The company asserted:
    • In recent months, we’ve detected cyberattacks from three nation-state actors targeting seven prominent companies directly involved in researching vaccines and treatments for Covid-19. The targets include leading pharmaceutical companies and vaccine researchers in Canada, France, India, South Korea and the United States. The attacks came from Strontium, an actor originating from Russia, and two actors originating from North Korea that we call Zinc and Cerium.
    • Among the targets, the majority are vaccine makers that have Covid-19 vaccines in various stages of clinical trials. One is a clinical research organization involved in trials, and one has developed a Covid-19 test. Multiple organizations targeted have contracts with or investments from government agencies from various democratic countries for Covid-19 related work.
    • Strontium continues to use password spray and brute force login attempts to steal login credentials. These are attacks that aim to break into people’s accounts using thousands or millions of rapid attempts. Zinc has primarily used spear-phishing lures for credential theft, sending messages with fabricated job descriptions pretending to be recruiters. Cerium engaged in spear-phishing email lures using Covid-19 themes while masquerading as World Health Organization representatives. The majority of these attacks were blocked by security protections built into our products. We’ve notified all organizations targeted, and where attacks have been successful, we’ve offered help.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) announced a £1.25 million fine of Ticketmaster UK for failing “to put appropriate security measures in place to prevent a cyber-attack on a chat-bot installed on its online payment page” in violation of the General Data Protection Regulation (GDPR). The ICO explained:
    • The breach began in February 2018 when Monzo Bank customers reported fraudulent transactions. The Commonwealth Bank of Australia, Barclaycard, Mastercard and American Express all reported suggestions of fraud to Ticketmaster. But the company failed to identify the problem.
    • In total, it took Ticketmaster nine weeks from being alerted to possible fraud to monitoring the network traffic through its online payment page.
    • The ICO’s investigation found that Ticketmaster’s decision to include the chat-bot, hosted by a third party, on its online payment page allowed an attacker access to customers’ financial details.
    • Although the breach began in February 2018, the penalty only relates to the breach from 25 May 2018, when new rules under the GDPR came into effect. The chat-bot was completely removed from Ticketmaster UK Limited’s website on 23 June 2018.
    • The ICO added:
      • The data breach, which included names, payment card numbers, expiry dates and CVV numbers, potentially affected 9.4 million of Ticketmaster’s customers across Europe including 1.5 million in the UK.
      • Investigators found that, as a result of the breach, 60,000 payment cards belonging to Barclays Bank customers had been subjected to known fraud. Another 6,000 cards were replaced by Monzo Bank after it suspected fraudulent use.
      • The ICO found that Ticketmaster failed to:
        • Assess the risks of using a chat-bot on its payment page
        • Identify and implement appropriate security measures to negate the risks
        • Identify the source of suggested fraudulent activity in a timely manner
  • The Office of the Comptroller of the Currency, the Board of Governors of the Federal Reserve System, and the Federal Deposit Insurance Corporation issued an interagency paper titled “Sound Practices to Strengthen Operational Resilience.” The agencies stated the paper “generally describes standards for operational resilience set forth in the agencies’ existing rules and guidance for domestic banking organizations that have average total consolidated assets greater than or equal to (1) $250 billion or (2) $100 billion and have $75 billion or more in average cross-jurisdictional activity, average weighted short-term wholesale funding, average nonbank assets, or average off-balance-sheet exposure.” The agencies explained the paper also:
    • promotes a principles-based approach for effective governance, robust scenario analysis, secure and resilient information systems, and thorough surveillance and reporting.
    • includes an appendix focused on sound practices for managing cyber risk.
    • In the appendix, while the agencies stressed they could not “endorse the use of any particular tool,” they did state:
      • To manage cyber risk and assess cybersecurity preparedness of its critical operations, core business lines and other operations, services, and functions firms may choose to use standardized tools that are aligned with common industry standards and best practices. Some of the tools that firms can choose from include the Federal Financial Institutions Examination Council (FFIEC) Cybersecurity Assessment Tool, the National Institute of Standards and Technology Cybersecurity Framework (NIST), the Center for Internet Security Critical Security Controls, and the Financial Services Sector Coordinating Council Cybersecurity Profile.
  • A class action was filed in the United Kingdom (UK) against Facebook over the Cambridge Analytica scandal. Facebook You Owe Us announced its legal action “for the illegal use of one million users’ data” in England and Wales. The campaign claimed:
    • Group legal actions like Facebook You Owe Us will pave the way for consumers in the UK to gain redress and compensation for the persistent mass misuse of personal data by the world’s largest companies.  
    • Facebook has exhibited a pattern of unethical behaviour including allegations of election interference and failing to remove fake news. The Information Commissioners Office noted when issuing a £500,000 fine against Facebook for the Cambridge Analytica data breach that “protection of personal information and personal privacy is of fundamental importance, not only for the rights of individuals, but also as we now know, for the preservation of a strong democracy.” Facebook You Owe Us aims to fight back by holding the company to account for failing to protect Facebook users’ personal data and showing that Facebook is not above the law.  
    • The launch of Facebook You Owe Us follows Google You Owe Us’ victory in the Court of Appeal. The Google You Owe Us case has been appealed by Google and is now scheduled to be heard before the Supreme Court in April 2021. If successful, the case will demonstrate that personal data is of value to individuals and that companies cannot simply take it and profit from it illegally. Both cases are led by James Oldnall at Milberg London LLP, with Richard Lloyd, the former executive director of Which?. 

Coming Events

  • On 18 November, the Senate Homeland Security and Governmental Affairs Committee’s Regulatory Affairs and Federal Management Subcommittee will hold a hearing on how to modernize telework in light of lessons learned during the COVID-19 pandemic.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.
  • On 27 November, the European Data Protection Board “is organising a remote stakeholder workshop on the topic of Legitimate Interest.” The EDPB explained “[p]laces will be allocated on a first come, first served basis, depending on availability.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

EDPB Publishes Schrems II Recommendations; European Commission Issues Draft SCC Revisions

The EU takes steps to respond to the CJEU’s striking down of the EU-US Privacy Shield by augmenting SCCs and other transfer mechanisms.

The European Data Protection Board (EDPB) published recommendations for entities exporting and importing the personal data of European Union (EU) residents in light of the court decision striking down the adequacy decision that allowed transfers to the United States (U.S.). The EDPB noted that alternate mechanisms like standard contractual clauses (SCC) may still be used for transfers to nations without adequate protections of EU rights provided that supplementary measures are used. Notably, the EDPB said supplementary measures will be needed for any transfer mechanism, including Binding Corporate Rules (BCR), used with nations that do not guarantee the same level of rights as the EU. While the EDPB’s recommendations will undoubtedly prove persuasive with the Supervisory Authorities (SA), each SA will ultimately assess whether the mechanisms and supplementary measures used by entities comport with the General Data Protection Regulation (GDPR) and the EU’s Charter of Fundamental Rights.

In a summary of its decision Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems, Case C-311/18 (Schrems II), the Court of Justice of the European Union (CJEU) explained:

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

Ultimately, the CJEU found the U.S. lacks the requisite safeguards needed under EU law, and so the general means of transferring the data of EU citizens from the EU to the U.S. was essentially struck down. This marked the second time in the last five years such an agreement had been found to violate EU law. However, the CJEU left open the question of whether SCCs may permit the continued exporting of EU personal data into the U.S. for companies like Facebook, Google, and many, many others. Consequently, there has been no small amount of interpreting and questioning of whether this may be a way for the trans-Atlantic data flow worth billions, perhaps even trillions, of dollars to continue. And yet, the CJEU seemed clear that additional measures would likely be necessary. Indeed, the CJEU asserted “[c]ontrollers and processors should be encouraged to provide additional safeguards via contractual commitments that supplement standard protection clauses” and “[i]n so far as those standard data protection clauses cannot, having regard to their very nature, provide guarantees beyond a contractual obligation to ensure compliance with the level of protection required under EU law, they may require, depending on the prevailing position in a particular third country, the adoption of supplementary measures by the controller in order to ensure compliance with that level of protection.”

In “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data,” the EDPB explained the genesis and rationale for the document:

  • The GDPR or the [CJEU] do not define or specify the “additional safeguards”, “additional measures” or “supplementary measures” to the safeguards of the transfer tools listed under Article 46.2 of the GDPR that controllers and processors may adopt to ensure compliance with the level of protection required under EU law in a particular third country.
  • The EDPB has decided, on its own initiative, to examine this question and to provide controllers and processors, acting as exporters, with recommendations on the process they may follow to identify and adopt supplementary measures. These recommendations aim at providing a methodology for the exporters to determine whether and which additional measures would need to be put in place for their transfers. It is the primary responsibility of exporters to ensure that the data transferred is afforded in the third country of a level of protection essentially equivalent to that guaranteed within the EU. With these recommendations, the EDPB seeks to encourage consistent application of the GDPR and the Court’s ruling, pursuant to the EDPB’s mandate

Broadly speaking, whether SCCs and supplemental measures will pass muster under the GDPR will be determined on a case-by-case basis. The EDPB did not offer much in the way of bright line rules. Indeed, it will be up to SAs to determine if transfers to nations like the U.S. are possible under the GDPR, meaning these recommendations may shed more light on this central question without deciding it. One wonders, as a practical matter, if the SAs will have the capacity, resources, and will to police SCCs to ensure the GDPR and Charter are being met.

Nonetheless, the EDPB stressed the principle of accountability, under which controllers that export personal data must ensure that, whatever mechanism and supplementary measures govern a transfer, the data receive the same protection they would in the EU. The EDPB made the point that EU protections travel with the data, and should EU personal data make its way to a country where appropriate protection is not possible, the transfer violates the GDPR. Moreover, these recommendations pertain to transfers of EU data by both public and private exporters to private sector entities outside the EU.

These recommendations work like a decision tree, with exporters needing to ask themselves a series of questions to determine whether they must use supplementary measures. This may prove a resource-intensive process, for exporters will need to map all transfers (i.e., know exactly where the data are going). The exporter must also understand the laws and practices of the third nation so that, where possible, it can put appropriate measures in place to meet the EU’s data protection standards.
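
To make that series of questions concrete, here is a minimal Python sketch of the assessment flow described above: map the transfer, check which transfer tool is used, assess the destination country’s law, and fall back to supplementary measures or suspension. The data model, step names, and outputs are assumptions made for this illustration, not the EDPB’s own checklist.

```python
# Illustrative sketch of the exporter assessment process described above.
# The data model and step names are assumptions for illustration only; they
# are not the EDPB's wording and are not a legal checklist.

from dataclasses import dataclass


@dataclass
class Transfer:
    description: str                        # e.g., "HR data to a U.S. cloud provider"
    destination: str                        # third country receiving the data
    tool: str                               # "adequacy", "SCC", "BCR", "derogation", ...
    destination_law_adequate: bool          # exporter's assessment of local law and practice
    supplementary_measures_available: bool  # can added measures close the gap?


def assess_transfer(t: Transfer) -> str:
    """Return a rough disposition for a transfer the exporter has already mapped."""
    # Step 1 is the mapping itself: the exporter must know the transfer exists.
    if t.tool == "adequacy":
        return "proceed: the destination benefits from an adequacy decision"
    if t.tool == "derogation":
        return "proceed narrowly: rely on an Article 49 derogation, if it truly applies"
    # Steps 2 and 3: an Article 46 tool (SCC/BCR) plus an assessment of local law.
    if t.destination_law_adequate:
        return "proceed: the transfer tool alone provides essentially equivalent protection"
    # Step 4: can supplementary measures close the gap?
    if t.supplementary_measures_available:
        return "proceed with supplementary measures and document the assessment"
    # Otherwise the logic points to suspending or ending the transfer.
    return "suspend or terminate the transfer; notify the competent SA if continuing anyway"


if __name__ == "__main__":
    example = Transfer(
        description="support tickets mirrored to a U.S. sub-processor",
        destination="United States",
        tool="SCC",
        destination_law_adequate=False,
        supplementary_measures_available=True,
    )
    print(assess_transfer(example))
```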

Reading between the lines leads one to conclude that data exporters may not send personal data to the U.S. because its federal surveillance regime is not “necessary and proportionate,” at least from the EU’s view. The U.S. lacks judicial redress in cases where a U.S. national, let alone a foreign national, objects to the sweeping surveillance. The U.S. also has neither a national data protection law nor a dedicated data protection authority. These hints also seem to convey the EDPB’s view on the sorts of legal reforms needed in the U.S. before an adequacy decision would pass muster with the CJEU.

The EDPB said it was still evaluating how Schrems II affects the use of BCR and ad hoc contractual clauses, two of the other alternate means of transferring EU personal data in the absence of an adequacy agreement.

Nevertheless, in an annex, the EDPB provided examples of supplementary measures that may be used depending on the circumstances, such as “flawlessly implemented” encryption and pseudonymizing data. However, the EDPB discusses these in the context of different scenarios and calls for conditions beyond those two measures alone. Moreover, the EDPB rules out two scenarios categorically as inadequate: “Transfer to cloud services providers or other processors which require access to data in the clear” and “Remote access to data for business purposes.”
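
As one purely illustrative example of what a supplementary measure such as pseudonymization might look like in practice, the sketch below replaces direct identifiers with a keyed hash before export, on the assumption that the key stays with the exporter in the EU. The field names and keying scheme are assumptions for this sketch only; the EDPB ties any such measure to the specific transfer scenario and to additional conditions.

```python
# Minimal sketch of pseudonymizing records before transfer, assuming the
# exporter keeps the secret key (and so the ability to re-identify) in the EU.
# Field names and the keying scheme are illustrative assumptions only.

import hashlib
import hmac

# In practice this key would live in an EU-controlled key management system,
# never alongside the exported data.
EXPORTER_KEY = b"replace-with-a-key-held-only-in-the-EU"


def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(EXPORTER_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()


def prepare_for_export(record: dict) -> dict:
    """Tokenize direct identifiers and keep only the fields needed abroad."""
    return {
        "user_token": pseudonymize(record["email"]),  # stable join key without the email
        "country": record["country"],
        "purchase_total": record["purchase_total"],
        # name, email, and street address are deliberately not exported
    }


if __name__ == "__main__":
    row = {"email": "alice@example.com", "name": "Alice", "country": "IE",
           "address": "1 Main St", "purchase_total": 42.50}
    print(prepare_for_export(row))
```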

The EDPB also issued an update to guidance published after the first lawsuit brought by Maximilian Schrems resulted in the striking down of the Safe Harbor transfer agreement. The forerunner to the EDPB, the Working Party 29, had drafted and released the European Essential Guarantees, and so, in light of Schrems II, the EDPB updated and published “Recommendations 02/2020 on the European Essential Guarantees for surveillance measures” “to provide elements to examine, whether surveillance measures allowing access to personal data by public authorities in a third country, being national security agencies or law enforcement authorities, can be regarded as a justifiable interference or not” with fundamental EU rights and protections. As the EDPB explains, these recommendations are intended to help data controllers and exporters determine whether other nations have protections and processes in place equivalent to those of the EU vis-à-vis their surveillance programs. The EDPB stressed that these are the essential guarantees, and other features and processes may be needed for a determination of lawfulness under EU law.

The EDPB formulated the four European Essential Guarantees:

A. Processing should be based on clear, precise and accessible rules

B. Necessity and proportionality with regard to the legitimate objectives pursued need to be demonstrated

C. An independent oversight mechanism should exist

D. Effective remedies need to be available to the individual

The European Commission (EC) has also released for comment a draft revision of the SCCs for transfers of personal data to countries outside the EU. The EC is accepting comments and input until 10 December. It may be no accident that the EDPB and EC more or less acted in unison to address the practical and statutory changes necessary to effectuate the CJEU’s striking down of the EU-US Privacy Shield. Whatever the case, the EC released draft legislative language and, in an Annex, actual contract language for use by controllers and processors in the form of modules designed to be used in a variety of common circumstances (e.g., transfers by controllers to other controllers or by a controller to a processor). However, as the EDPB did, the EC stressed that the SCCs form a floor, and controllers, processors, and other parties are free to add language so long as it does not contradict the SCCs or prejudice the rights they protect.

In the implementing decision, the EC asserted:

the standard contractual clauses needed to be updated in light of new requirements in Regulation (EU) 2016/679. Moreover, since the adoption of these decisions, important developments have taken place in the digital economy, with the widespread use of new and more complex processing operations often involving multiple data importers and exporters, long and complex processing chains as well as evolving business relationships. This calls for a modernisation of the standard contractual clauses to better reflect those realities, by covering additional processing and transfer situations and to use a more flexible approach, for example with respect to the number of parties able to join the contract.

The EC continued:

The standard contractual clauses set out in the Annex to this Decision may be used by a controller or a processor in order to provide appropriate safeguards within the meaning of Article 46(1) of Regulation (EU) 2016/679 for the transfer of personal data to a processor or a controller established in a third country. This also includes the transfer of personal data by a controller or processor not established in the Union, to the extent that the processing is subject to Regulation (EU) 2016/679 pursuant to Article 3(2) thereof, because it relates to the offering of goods or services to data subjects in the Union or the monitoring of their behaviour as far as their behaviour takes place within the Union.

The EC explained the design and intent of the SCC language in the Annex:

  • The standard contractual clauses set out in the Annex to this Decision combine general clauses with a modular approach to cater for various transfer scenarios and the complexity of modern processing chains. In addition to the general clauses, controllers and processors should select the module applicable to their situation, which makes it possible to tailor their obligations under the standard contractual clauses to their corresponding role and responsibilities in relation to the data processing at issue. It should be possible for more than two parties to adhere to the standard contractual clauses. Moreover, additional controllers and processors should be allowed to accede to the standard contractual clauses as data exporters or importers throughout the life cycle of the contract of which those clauses form a part.
  • These Clauses set out appropriate safeguards, including enforceable data subject rights and effective legal remedies, pursuant to Article 46(1), and Article 46 (2)(c) of Regulation (EU) 2016/679 and, with respect to data transfers from controllers to processors and/or processors to processors, standard contractual clauses pursuant to Article 28(7) of Regulation (EU) 2016/679, provided they are not modified, except to add or update information in the Annexes. This does not prevent the Parties from including the standard contractual clauses laid down in this Clauses in a wider contract, and to add other clauses or additional safeguards provided that they do not contradict, directly or indirectly, the standard contractual clauses or prejudice the fundamental rights or freedoms of data subjects. These Clauses are without prejudice to obligations to which the data exporter is subject by virtue of the Regulation (EU) 2016/679
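
To illustrate the modular structure the EC describes (general clauses that apply to all parties, a module selected according to the parties’ roles, and the ability of additional exporters and importers to accede over the life of the contract), the following is a rough, purely illustrative Python model. The module labels reflect the roles named in the quoted passages, but the class names, clause names, and fields are assumptions for this sketch, not the draft’s actual text.

```python
# Purely illustrative model of the modular approach described above: general
# clauses that bind every party, a module chosen according to the parties'
# roles, and the ability of additional exporters and importers to accede over
# the life of the contract. All names and fields here are assumptions.

from dataclasses import dataclass, field
from enum import Enum


class Module(Enum):
    CONTROLLER_TO_CONTROLLER = "controller to controller"
    CONTROLLER_TO_PROCESSOR = "controller to processor"
    PROCESSOR_TO_PROCESSOR = "processor to processor"


@dataclass
class Party:
    name: str
    role: str  # "exporter" or "importer"


@dataclass
class SccContract:
    module: Module
    general_clauses: tuple = ("purpose limitation", "data subject rights",
                              "redress", "liability")         # illustrative labels
    annex_details: dict = field(default_factory=dict)          # e.g., data categories
    parties: list = field(default_factory=list)

    def accede(self, party: Party) -> None:
        """Additional controllers or processors may join the clauses later on."""
        self.parties.append(party)


if __name__ == "__main__":
    contract = SccContract(module=Module.CONTROLLER_TO_PROCESSOR,
                           annex_details={"data_categories": ["contact data"]})
    contract.accede(Party("EU retailer", "exporter"))
    contract.accede(Party("U.S. cloud host", "importer"))
    contract.accede(Party("U.S. sub-processor", "importer"))  # joins after signing
    print(contract.module.value, [p.name for p in contract.parties])
```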

In October, the Trump Administration released a crib sheet it hopes U.S. multinationals can use to argue to SAs that SCCs, BCRs, and U.S. law satisfy the CJEU’s ruling that struck down the EU-U.S. Privacy Shield. And, the Trump Administration is basically arguing, sure, we spy, but most EU citizens’ data is not surveilled, and EU governments themselves often share in the proceeds of the surveillance we conduct. Moreover, there are plenty of safeguards and means of redress in the U.S. system because, you know, we say so. It is unlikely this analysis will be very persuasive in the EU, especially since these broad arguments do not address the criticisms the EU has had under Privacy Shield about U.S. surveillance and privacy rights, nor the basis for the CJEU’s ruling.

Earlier this month, the European Data Protection Supervisor (EDPS) published a strategy detailing how EU agencies and bodies should comply with the CJEU ruling that struck down the EU-US Privacy Shield and threw into question the compliance of SCCs with EU law and the GDPR. The EDPS has already started working with EU institutions, bodies, offices and agencies (EUIs) on the process of determining whether their transfers of the personal data of people in the EU to the U.S. meet the CJEU’s judgment.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Anthony Beck from Pexels

EDPB Concludes First Use of Powers To Resolve Differences Between DPAs in Twitter Enforcement Action

The EDPB announces but does not release its decision on the dispute between SAs in the EU over the appropriate punishment for Twitter’s data breaches.

The European Data Protection Board (EDPB) has used its powers under the General Data Protection Regulation (GDPR) for the first time to resolve a dispute between data protection authorities (DPA) in the European Union (EU) over an enforcement action. Unidentified DPAs had objected to the proposed action Ireland’s Data Protection Commission (DPC) had circulated, obligating the EDPB to utilize its Article 65 powers to craft a resolution to the disputed part of the action. The enforcement concerned 2018 and 2019 Twitter data breaches. Now, the DPC has a month to craft a decision on the basis of the EDPB decision unless the DPC challenges the decision in the Court of Justice of the European Union (CJEU).

The DPC submitted its draft decision on the Twitter breach to other DPAs in May in accordance with Article 60 of the GDPR. The DPC stated “[t]he draft decision focusses on whether Twitter International Company has complied with Articles 33(1) and 33(5) of the GDPR” (i.e., the provisions pertaining to data breach notification and documentation). The DPC further explained:

  • This draft decision is one of a number of significant developments in DPC inquiries into “big tech” companies this week. Deputy Commissioner Graham Doyle has confirmed that: “In addition to submitting this draft decision to other EU supervisory authorities, we have this week sent a preliminary draft decision to WhatsApp Ireland Limited for their final submissions which will be taken in to account by the DPC before preparing a draft decision in that matter also for Article 60 purposes.  The inquiry into WhatsApp Ireland examines its compliance with Articles 12 to 14 of the GDPR in terms of transparency including in relation to transparency around what information is shared with Facebook.“
  • The DPC has also completed the investigation phase of a complaint-based inquiry which focuses on Facebook Ireland’s obligations to establish a lawful basis for personal data processing. This inquiry is now in the decision-making phase at the DPC.

In its statement this week, the EDPB said it “adopted by a 2/3 majority of its members its first dispute resolution decision on the basis of Art. 65 GDPR.” The EDPB stated:

The Irish supervisory authority (SA) issued the draft decision following an own-volition inquiry and investigations into Twitter International Company, after the company notified the Irish SA of a personal data breach on 8 January 2019. In May 2020, the Irish SA shared its draft decision with the concerned supervisory authorities (CSAs) in accordance with Art. 60 (3) GDPR. The CSAs then had four weeks to submit their relevant and reasoned objections (RROs.) Among others, the CSAs issued RROs on the infringements of the GDPR identified by the lead supervisory authority (LSA), the role of Twitter International Company as the (sole) data controller, and the quantification of the proposed fine. 

It appears from the EDPB’s statement that other DPAs/SAs had objected to the size of the fine (which can be as high as 4% of a company’s global annual turnover), to how Twitter violated the GDPR, and to Twitter’s culpability based on whether it was the sole controller of the personal data or whether other controllers might also be held responsible.

The EDPB asserted:

The Irish SA shall adopt its final decision on the basis of the EDPB decision, which will be addressed to the controller, without undue delay and at the latest one month after the EDPB has notified its decision. The LSA and CSAs shall notify the EDPB of the date the final decision was notified to the controller. Following this notification, the EDPB will publish its decision on its website.

The EDPB also published FAQs on the Article 65 procedure.

More recently, the EDPB issued draft guidance construing a key provision in the GDPR designed to guide and coordinate investigations that cross borders in the European Union (EU). An LSA is supposed to consider the “relevant and reasoned objections” that CSAs submit to its draft decisions. If an LSA rejects such feedback, the matter gets kicked over to the EDPB. However, since this has only happened once, the EDPB thought it appropriate to define the term so all EU DPAs would understand what objections are relevant and reasoned.

The EDPB explained that the guidance “aims at establishing a common understanding of the notion of the terms “relevant and reasoned”, including what should be considered when assessing whether an objection “clearly demonstrates the significance of the risks posed by the draft decision.” The EDPB stated “[t]he unfamiliarity surrounding “what constitutes relevant and reasoned objection” has the potential to create misunderstandings and inconsistent applications by the supervisory authorities, the EU legislator (sic) suggested that the EDPB should issue guidelines on this concept (end of Recital 124 GDPR).”

Article 60 of the GDPR provides if a CSA “expresses a relevant and reasoned objection to the draft decision [of the LSA], the lead supervisory authority shall, if it does not follow the relevant and reasoned objection or is of the opinion that the objection is not relevant or reasoned, submit the matter to the consistency mechanism.” Article 65 also provides that where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned,” then the EDPB must step in and work towards a final binding decision. This process was installed so that enforcement of the GDPR would be uniform throughout the EU and to forestall the possibility that one DPA or a small group of DPAs would construe the data protection regime in ways contrary to its intention. As it is, there have already been allegations that some DPAs have been ineffective or lenient towards alleged offenders.
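
A rough sketch of that branching logic, offered only as an illustration (the function name and return strings are assumptions, not the GDPR’s terms of art):

```python
# Rough sketch of the cooperation and consistency flow described above. The
# function name and return strings are illustrative assumptions only.

def resolve_draft_decision(lsa_follows_objection: bool) -> str:
    """What happens after a CSA raises a relevant and reasoned objection."""
    if lsa_follows_objection:
        # The LSA amends its draft and recirculates it under Article 60.
        return "LSA revises the draft decision and recirculates it to the CSAs"
    # Whether the LSA rejects the objection as not relevant or reasoned, or
    # simply declines to follow it, the matter goes to the consistency mechanism.
    return ("EDPB adopts a binding Article 65 decision; "
            "the LSA then issues the final decision within a month")


if __name__ == "__main__":
    # In the Twitter case, the DPC did not follow the objections raised by the
    # CSAs, which is why the EDPB had to adopt its first Article 65 decision.
    print(resolve_draft_decision(lsa_follows_objection=False))
```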

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by ElisaRiva from Pixabay

Schrems II Guidance

The agency that oversees data protection at EU agencies and bodies has laid out its view on how they should comply with the GDPR now that the EU-US Privacy Shield has been struck down.

The European Data Protection Supervisor (EDPS) has published a strategy detailing how European Union (EU) agencies and bodies should comply with the Court of Justice of the European Union’s (CJEU) ruling that struck down the EU-United States (U.S.) Privacy Shield (aka Schrems II) and threw into question the compliance of Standard Contractual Clauses (SCC) with EU law and the General Data Protection Regulation (GDPR). The EDPS has already started working with EU institutions, bodies, offices and agencies (EUIs) on the process of determining whether their transfers of the personal data of people in the EU to the U.S. meet the CJEU’s judgment.

The EDPS makes clear most of the transfers by EUIs to the U.S. are on account of using U.S. information and communications technology (ICT) products and services, meaning U.S. multinationals like Microsoft, Google, and others. The EDPS has proposed a strategy that would first identify risks and then move to address them. It bears stressing that this strategy applies only to EUIs and not private sector controllers, but it is likely the European Data Protection Board (EDPB) and EU DPAs will take notice of the EDPS’ strategy on how to comply with Schrems II. However, the EDPS acknowledges that it is obliged to follow the EDPB’s lead and vows to change its strategy upon issuance of EDPB guidance on Schrems II and SCC. And yet, the EDPS explained that EUIs will need to report back on how they are implementing the steps in the strategy, particularly on those ongoing transfers to countries like the U.S. that have inadequate data protection laws, those transfers that have been suspended, and any transfers being conducted per derogations in the GDPR. On the basis of this feedback, the EDPS will “establish long-term compliance” in 2021.

It seems a bit backwards for the EDPS to task the EUIs with determining which transfers under SCCs may proceed under the GDPR; it might be more efficient for the EDPS to take on this job directly and rule on the ICT services and providers, permitting all EUIs to understand which comply with EU law and which do not. However, the EDPS is exploring the possibility of determining the sufficiency of data protection in other nations, most likely first and foremost the U.S., and then working with EU stakeholders to coordinate compliance with the CJEU’s ruling and the GDPR.

The EDPS claimed the CJEU “clarified the roles and responsibilities of controllers, recipients of data outside of the European Economic Area (EEA) (data importers) and supervisory authorities…[and] ruled the following:

  • The Court invalidated the Privacy Shield adequacy Decision and confirmed that the SCCs were valid providing that they include effective mechanisms to ensure compliance in practice with the “essentially equivalent” level of protection guaranteed within the EU by the General Data Protection Regulation (GDPR). Transfers of personal data pursuant to the SCCs are suspended or prohibited in the event of a breach of such clauses, or in case it is impossible to honour them.
  • The SCCs for transfers may then require, depending on the prevailing position of a particular third country, the adoption of supplementary measures by the controller in order to ensure compliance with the level of protection guaranteed within the EU.
  • In order to continue these data transfers, the Court stresses that before transferring personal data to a third country, it is the data exporters’ and data importers’ responsibility to assess whether the legislation of the third country of destination enables the data importer to comply with the guarantees provided through the transfer tools in place. If this is not the case, it is also the exporter and the importer’s duty to assess whether they can implement supplementary measures to ensure an essentially equivalent level of protection as provided by EU law. Should data exporters, after taking into account the circumstances of the transfer and possible supplementary measures, conclude that appropriate safeguards cannot be ensured, they are required to suspend or terminate the transfer of personal data. In case the exporter intends nevertheless to continue the transfer of personal data, they must notify their competent SA.
  • The competent supervisory authority is required to suspend or prohibit a transfer of personal data to a third country pursuant to the SCCs if, when considering the circumstances of that transfer, those clauses are not or cannot be complied with in the third country of destination and the protection of the data transferred under EU law cannot be ensured by other means.

The EDPS explained:

The EDPS’ report on the 2017 survey entitled, Measuring compliance with data protection rules in EU institutions, provides evidence that there has been a significant rise in the number of transfers related to the core business of EUIs in recent years. This number is even higher now, due to the increased use of ICT services and social media. The EDPS’ own-initiative investigation into the use of Microsoft products and services by EUIs and subsequent recommendations in that regard confirms the importance to ensure a level of protection that is essentially equivalent as the one guaranteed within the EU, as provided by relevant data protection laws, to be interpreted in accordance with the EU Charter. In this context, the EDPS has already flagged a number of linked issues concerning sub-processors, data location, international transfers and the risk of unlawful disclosure of data – issues that the EUIs were unable to control and ensure proper safeguards to protect data that left the EU/EEA. The issues we raised in our investigation report are consistent with the concerns expressed in the Court’s Judgment, which we are assessing in relation to any processor agreed to by EUIs.

Regarding data flows to the U.S. quite possibly in violation of the GDPR and Schrems II, the EDPS stated:

  • Moreover, a majority of data flows to processors most probably happen because EUIs use service providers that are either based in the U.S. or that use sub-processors based in the U.S., in particular for ICT services, which fall under the scope of U.S. surveillance laws. Such companies have primarily relied on the Privacy Shield adequacy Decision to transfer personal data to the U.S. and the use of SCCs as a secondary measure.
    • Therefore, the present Strategy emphasizes the priority to address transfers of data by EUIs or on their behalf in the context of controller to processor contracts and/or processor to sub-processor contracts, in particular towards the United States.

The EDPS is calling for “a twofold approach as the most appropriate:

(1) Identify urgent compliance and/or enforcement actions through a risk based approach for transfers towards the U.S. presenting high risks for data subjects and in parallel

(2) provide guidance and pursue mid-term case-by-case EDPS compliance and or enforcement actions for all transfers towards the U.S. or other third countries.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Pete Linforth from Pixabay

EDPB Data Protection By Design and Default Guidance

The EU’s arbiter of the GDPR explains what it considers data protection by design and by default that complies with the GDPR.

The European Data Protection Board (EDPB or Board) issued “Guidelines 4/2019 on Article 25 Data Protection by Design and by Default Version 2.0,” which is “general guidance on the obligation of Data Protection by Design and by Default (DPbDD) set forth in Article 25 in the [General Data Protection Regulation] GDPR.” The EDPB’s Guidance follows guidance issued by at least three European Union (EU) data protection authorities (DPA) on data protection by design and by default. However, given the resource-constrained nature of most EU DPAs, it is not clear how the data processing systems of controllers will be policed to ensure DPbDD. Presumably failings and violations would be turned up during investigations launched on other grounds.

Article 25 requires, in relevant part:

  • [T]he controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.
  • The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed.

The EDPB pointed to the data protection and privacy by design guidance released by three EU DPAs.

The EDPB stated:

Data protection by design and data protection by default are complementary concepts, which mutually reinforce each other. Data subjects will benefit more from data protection by default if data protection by design is concurrently implemented – and vice versa.

The Board sought to explain its view on how controllers can meet these obligations under Article 25. The EDPB asserted:

The core obligation is the implementation of appropriate measures and necessary safeguards that provide effective implementation of the data protection principles and, consequentially, data subjects’ rights and freedoms by design and by default. Article 25 prescribes both design and default elements that should be taken into account. (emphasis in the original.)

Again and again throughout the Guidance, the EDPB stresses that “effective implementation” is the key, suggesting that processes and systems that appear compliant on the surface will not necessarily be found compliant should a controller be investigated.

Unlike the American approach to data protection, under the GDPR the size and resources of a controller have no bearing on its compliance obligations with respect to DPbDD. The EDPB stated:

DPbDD is a requirement for all controllers, including small businesses and multinational companies alike. That being the case, the complexity of implementing DPbDD may vary based on the individual processing operation. Regardless of the size however, in all cases, positive benefits for controller and data subject can be achieved by implementing DPbDD.

Moreover, the GDPR’s Article 25 requirements regarding DPbDD apply both to processing yet to be designed and to processing systems that pre-date the GDPR:

The requirement described in Article 25 is for controllers to have data protection designed into the processing of personal data and as a default setting and this applies throughout the processing lifecycle. DPbDD is also a requirement for processing systems pre-existing before the GDPR entered into force. Controllers must have the processing consistently updated in line with the GDPR.

What’s more, the EDPB asserted “[c]ontrollers shall implement DPbDD before processing, and also continually at the time of processing, by regularly reviewing the effectiveness of the chosen measures and safeguards…[and] DPbDD also applies to existing systems that are processing personal data.”

The Board contextualized DPbDD in the GDPR and the EU’s human rights:

  • In line with Article 25(1) the controller shall implement appropriate technical and organisational measures which are designed to implement the data protection principles and to integrate the necessary safeguards into the processing in order to meet the requirements and protect the rights and freedoms of data subjects. Both appropriate measures and necessary safeguards are meant to serve the same purpose of protecting the rights of data subjects and ensuring that the protection of their personal data is built into the processing.
  • The controller should choose and be accountable for implementing default processing settings and options in a way that only processing that is strictly necessary to achieve the set, lawful purpose is carried out by default. Here, controllers should rely on their assessment of the necessity of the processing with regards to the legal grounds of Article 6(1). This means that by default, the controller shall not collect more data than is necessary, they shall not process the data collected more than is necessary for their purposes, nor shall they store the data for longer than necessary. The basic requirement is that data protection is built into the processing by default.
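
To make the default-setting point above concrete, here is a minimal, purely illustrative Python sketch of a sign-up handler that, by default, processes only the fields needed for its stated purpose, keeps optional processing switched off, and sets a retention limit. The field names, settings, and retention figure are assumptions for this sketch, not EDPB requirements.

```python
# Minimal sketch of data protection by default for a hypothetical sign-up
# handler: by default only fields strictly necessary for the stated purpose
# are kept, a retention limit is set, and optional processing stays off until
# the user makes an explicit choice. All names here are illustrative.

from typing import Optional

NECESSARY_FIELDS = {"email", "password_hash"}       # needed to create an account
OPTIONAL_FIELDS = {"phone", "birthday", "location"}

DEFAULT_SETTINGS = {
    "analytics_profiling": False,  # off by default; not needed for the purpose
    "marketing_emails": False,     # off by default; requires a separate choice
    "retention_days": 365,         # do not store the data longer than necessary
}


def create_account(submitted: dict, user_choices: Optional[dict] = None) -> dict:
    """Keep only what the sign-up purpose requires; the defaults are the strictest."""
    record = {k: v for k, v in submitted.items() if k in NECESSARY_FIELDS}
    settings = dict(DEFAULT_SETTINGS)
    # Optional data and optional processing require an explicit, recorded choice.
    for key, value in (user_choices or {}).items():
        if key in OPTIONAL_FIELDS and key in submitted and value:
            record[key] = submitted[key]
        elif key in settings:
            settings[key] = value
    return {"record": record, "settings": settings}


if __name__ == "__main__":
    # No choices supplied, so the optional fields are simply discarded.
    print(create_account({"email": "a@example.com", "password_hash": "xxxx",
                          "phone": "555-0100", "birthday": "1990-01-01"}))
```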

The EDPB explained:

In all stages of design of the processing activities, including procurement, tenders, outsourcing, development, support, maintenance, testing, storage, deletion, etc., the controller should take into account and consider the various elements of DPbDD which will be illustrated by examples in this chapter in the context of implementation of the principles.

The EDPB asserted the Guidance may also be of use to others with responsibilities under the GDPR: “Other actors, such as processors and producers of products, services and applications (henceforth “producers”), who are not directly addressed in Article 25, may also find these Guidelines useful in creating GDPR compliant products and services that enable controllers to fulfil their data protection obligations.” Moreover, a controller will be held accountable for the DPbDD of its processors and sub-processors.

Nonetheless, the Board made recommendations to processors:

  • Although not directly addressed in Article 25, processors and producers are also recognized as key enablers for DPbDD, they should be aware that controllers are required to only process personal data with systems and technologies that have built-in data protection.
  • When processing on behalf of controllers, or providing solutions to controllers, processors and producers should use their expertise to build trust and guide their customers, including SMEs, in designing /procuring solutions that embed data protection into the processing. This means in turn that the design of products and services should facilitate controllers’ needs.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

“Privacy” by Afsal CMK is licensed under CC BY 4.0