Further Reading, Other Developments, and Coming Events (18 February 2021)

Further Reading

  • “Google, Microsoft, Qualcomm Protest Nvidia’s Acquisition of Arm Ltd.” By David McLaughlin, Ian King, and Dina Bass — Bloomberg. Major United States (U.S.) tech multinationals are telling the U.S. government that Nvidia’s proposed purchase of Arm will hurt competition in the semiconductor market, an interesting position for an industry renowned for being acquisition hungry. The British firm, Arm, is a key player in the semiconductor business that licenses its designs to all comers, and the fear articulated by firms like Qualcomm, Microsoft, and Google is that Nvidia will cut supply and increase prices once it controls Arm. According to one report, Arm’s chip architecture is in something like 95% of the world’s smartphones and 95% of the chips designed in the People’s Republic of China (PRC). The deal has to clear U.S., British, EU, and PRC regulators. In the U.S., the Federal Trade Commission (FTC) has reportedly made very large document requests, which indicates its interest in digging into the deal and suggests the possibility it may come out against the acquisition. The FTC may also be waiting to read the mood in Washington, as there is renewed, bipartisan concern about antitrust and competition and about the semiconductor industry. Finally, acting FTC Chair Rebecca Kelly Slaughter has come out against a lax approach to so-called vertical mergers such as the proposed Nvidia-Arm deal, which may well be the ultimate position of a Democratic FTC.
  • “Are Private Messaging Apps the Next Misinformation Hot Spot?” By Brian X. Chen and Kevin Roose — The New York Times. The conclusion these two tech writers reach is that, on balance, private messaging apps like Signal and Telegram are better for society than not. Moreover, they reason it is better to have extremists migrate from platforms like Facebook to ones where it is much harder to spread their views and proselytize.
  • “Amazon Has Transformed the Geography of Wealth and Power” By Vauhini Vara — The Atlantic. A harrowing view of the rise of Amazon cast against the decline of the middle class and the middle of the United States (U.S.). Correlation is not causation, of course, but the company has sped the decline of a number of industries and arguably a number of cities.
  • “Zuckerberg responds to Apple’s privacy policies: ‘We need to inflict pain’” By Samuel Axon — Ars Technica. Relations between the companies have worsened as their CEOs have taken personal shots at each other in public and private, culminating in Apple’s change to its iOS requiring users to agree to being tracked by apps across the internet, which is Facebook’s bread and butter. Expect things to get worse, as both Tim Cook and Mark Zuckerberg think augmented reality or mixed reality are the next major frontiers in tech, suggesting the competition may intensify.
  • “Inside the Making of Facebook’s Supreme Court” By Kate Klonick — The New Yorker. A very immersive piece on the genesis and design of the Facebook Oversight Board, originally conceived of as a supreme court for content moderation. However, not all content moderation decisions can be referred to the Board; in fact, only when Facebook decides to take down content does a person have a right to appeal. Otherwise, one must depend on the company’s beneficence. So, for example, if Facebook decided to leave up content that is racist toward Muslims, a Facebook user could not appeal the decision. Additionally, Board decisions are not precedential, which, in plain English, means that if the Board decides a takedown of, say, Nazi propaganda comports with Facebook’s rules, the company would not be obligated to take down similar Nazi content thereafter. This latter wrinkle will ultimately serve to limit the power of the Board. The piece quotes critics, including many involved with the design and establishment of the Board, who see the final form as being little more than a fig leaf for public relations.

Other Developments

  • The Department of Health and Human Services (HHS) was taken to task by a federal appeals court in a blunt opinion decrying the agency’s failure to articulate even the most basic rationale for a multimillion-dollar fine of a major Houston hospital for its data security and data privacy violations. HHS’ Office for Civil Rights (OCR) had levied a $4.348 million fine on the University of Texas M.D. Anderson Cancer Center (M.D. Anderson) for violations of the regulations promulgated pursuant to the “Health Insurance Portability and Accountability Act of 1996” (P.L. 104–191) and the “Health Information Technology for Economic and Clinical Health Act” (HITECH Act) (P.L. 111-5) governing the security and privacy of certain classes of health information. M.D. Anderson appealed the decision, losing at each stage, until it reached the United States Court of Appeals for the Fifth Circuit (Fifth Circuit). In its ruling, the Fifth Circuit held that OCR’s “decision was arbitrary, capricious, and contrary to law.” The Fifth Circuit vacated the penalty and sent the matter back to HHS for further consideration.
    • In its opinion, the Fifth Circuit explained the facts:
      • First, back in 2012, an M.D. Anderson faculty member’s laptop was stolen. The laptop was not encrypted or password-protected but contained “electronic protected health information (ePHI) for 29,021 individuals.” Second, also in 2012, an M.D. Anderson trainee lost an unencrypted USB thumb drive during her evening commute. That thumb drive contained ePHI for over 2,000 individuals. Finally, in 2013, a visiting researcher at M.D. Anderson misplaced another unencrypted USB thumb drive, this time containing ePHI for nearly 3,600 individuals.
      • M.D. Anderson disclosed these incidents to HHS. Then HHS determined that M.D. Anderson had violated two federal regulations. HHS promulgated both of those regulations under the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) and the Health Information Technology for Economic and Clinical Health Act of 2009 (the “HITECH Act”). The first regulation requires entities covered by HIPAA and the HITECH Act to “[i]mplement a mechanism to encrypt” ePHI or adopt some other “reasonable and appropriate” method to limit access to patient data. 45 C.F.R. §§ 164.312(a)(2)(iv), 164.306(d) (the “Encryption Rule”). The second regulation prohibits the unpermitted disclosure of protected health information. Id. § 164.502(a) (the “Disclosure Rule”).
      • HHS also determined that M.D. Anderson had “reasonable cause” to know that it had violated the rules. 42 U.S.C. § 1320d-5(a)(1)(B) (setting out the “reasonable cause” culpability standard). So, in a purported exercise of its power under 42 U.S.C. § 1320d-5 (HIPAA’s enforcement provision), HHS assessed daily penalties of $1,348,000 for the Encryption Rule violations, $1,500,000 for the 2012 Disclosure Rule violations, and $1,500,000 for the 2013 Disclosure Rule violations. In total, HHS imposed a civil monetary penalty (“CMP” or “penalty”) of $4,348,000.
      • M.D. Anderson unsuccessfully worked its way through two levels of administrative appeals. Then it petitioned our court for review. See 42 U.S.C. § 1320a-7a(e) (authorizing judicial review). After M.D. Anderson filed its petition, the Government conceded that it could not defend its penalty and asked us to reduce it by a factor of 10 to $450,000.
  • The Australian Senate Standing Committee for the Scrutiny of Bills has weighed in on both the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 and the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, two major legislative proposals put forth in December 2020. This committee plays a special role in legislating in the Senate, for it must “scrutinise each bill introduced into the Parliament as to whether the bills, by express words or otherwise:
    • (i)  trespass unduly on personal rights and liberties;
    • (ii)  make rights, liberties or obligations unduly dependent upon insufficiently defined administrative powers;
    • (iii)  make rights, liberties or obligations unduly dependent upon non-reviewable decisions;
    • (iv)  inappropriately delegate legislative powers; or
    • (v)  insufficiently subject the exercise of legislative power to parliamentary scrutiny.
    • Regarding the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 (see here for analysis), the committee explained:
      • The bill seeks to amend the Surveillance Devices Act 2004 (SD Act), the Crimes Act 1914 (Crimes Act) and associated legislation to introduce three new types of warrants available to the Australian Federal Police (AFP) and the Australian Criminal Intelligence Commission (ACIC) for investigating and disrupting online crime. These are:
        • data disruption warrants, which enable the AFP and the ACIC to modify, add, copy or delete data for the purposes of frustrating the commission of serious offences online;
        • network activity warrants, which permit access to devices and networks used by suspected criminal networks; and
        • account takeover warrants, which provide the AFP and the ACIC with the ability to take control of a person’s online account for the purposes of gathering evidence to further a criminal investigation.
    • The committee flagged concerns about the bill in these categories:
      • Authorisation of coercive powers
        • Issuing authority
        • Time period for warrants
        • Mandatory considerations
        • Broad scope of offences
      • Use of coercive powers without a warrant
        • Emergency authorisations
      • Innocent third parties
        • Access to third party computers, communications in transit and account-based data
        • Compelling third parties to provide information
        • Broad definition of ‘criminal network of individuals’
      • Use of information obtained through warrant processes
        • Prohibitions on use
        • Storage and destruction of records
      • Presumption of innocence—certificate constitutes prima facie evidence
      • Reversal of evidential burden of proof
      • Broad delegation of administrative powers
        • Appropriate authorising officers of the ACIC
    • The committee asked for the following feedback from the government on the bill:
      • The committee requests the minister’s detailed advice as to:
        • why it is considered necessary and appropriate to enable law enforcement officers to disrupt or access data or take over an online account without a warrant in certain emergency situations (noting the coercive and intrusive nature of these powers and the ability to seek a warrant via the telephone, fax or email);
        • the appropriateness of retaining information obtained under an emergency authorisation that is subsequently not approved by a judge or AAT member;
        • and the appropriateness of enabling law enforcement agencies to act to conceal any thing done under a warrant after the warrant has ceased to be in force, and whether the bill could be amended to provide a process for obtaining a separate concealment of access warrant if the original warrant has ceased to be in force.
      • The committee requests the minister’s detailed advice as to:
        • the effect of Schedules 1-3 on the privacy rights of third parties and a detailed justification for the intrusion on those rights, in particular:
        • why proposed sections 27KE and 27KP do not specifically require the judge or nominated AAT member to consider the privacy implications for third parties of authorising access to a third party computer or communication in transit;
        • why the requirement that an issuing authority be satisfied that an assistance order is justifiable and proportionate, having regard to the offences to which it would relate, only applies to an assistance order with respect to data disruption warrants, and not to all warrants; and
        • whether the breadth of the definitions of ‘electronically linked group of individuals’ and ‘criminal network of individuals’ can be narrowed to reduce the potential for intrusion on the privacy rights of innocent third parties.
      • The committee requests the minister’s detailed advice as to:
        • whether all of the exceptions to the restrictions on the use, recording or disclosure of protected information obtained under the warrants are appropriate and whether any exceptions are drafted in broader terms than is strictly necessary; and
        • why the bill does not require review of the continued need for the retention of records or reports comprising protected information on a more regular basis than a period of five years.
      • As the explanatory materials do not adequately address these issues, the committee requests the minister’s detailed advice as to:
        • why it is considered necessary and appropriate to provide for evidentiary certificates to be issued in connection with a data disruption warrant or emergency authorisation, a network access warrant, or an account takeover warrant;
        • the circumstances in which it is intended that evidentiary certificates would be issued, including the nature of any relevant proceedings; and
        • the impact that issuing evidentiary certificates may have on individuals’ rights and liberties, including on the ability of individuals to challenge the lawfulness of actions taken by law enforcement agencies.
      • As the explanatory materials do not address this issue, the committee requests the minister’s advice as to why it is proposed to use offence-specific defences (which reverse the evidential burden of proof) in this instance. The committee’s consideration of the appropriateness of a provision which reverses the burden of proof is assisted if it explicitly addresses relevant principles as set out in the Guide to Framing Commonwealth Offences.
      • The committee requests the minister’s advice as to why it is considered necessary to allow for executive level members of staff of the ACIC to be ‘appropriate authorising officers’, in particular with reference to the committee’s scrutiny concerns in relation to the use of coercive powers without judicial authorisation under an emergency authorisation.
    • Regarding the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, the committee asserted the bill “seeks to establish a mandatory code of conduct to support the sustainability of the Australian news media sector by addressing bargaining power imbalances between digital platforms and Australian news businesses.” The committee requested less input on this bill:
      • The committee requests the Treasurer’s advice as to why it is considered necessary and appropriate to leave the determination of which digital platforms must participate in the News Media and Digital Platforms Mandatory Bargaining Code to delegated legislation.
      • If it is considered appropriate to leave this matter to delegated legislation, the committee requests the Treasurer’s advice as to whether the bill can be amended to require the positive approval of each House of the Parliament before determinations made under proposed section 52E come into effect.
  • The European Data Protection Board (EDPB) issued a statement “on new draft provisions of the second additional protocol to the Council of Europe Convention on Cybercrime (Budapest Convention),” the second time it has weighed in on the rewrite of “the first international treaty on crimes committed via the Internet and other computer networks, dealing particularly with infringements of copyright, computer-related fraud, child pornography and violations of network security.” The EDPB took issue with the process of meeting and drafting new provisions:
    • Following up on the publication of new draft provisions of the second additional protocol to the Budapest Convention, the EDPB therefore, once again, wishes to provide an expert and constructive contribution with a view to ensure that data protection considerations are duly taken into account in the overall drafting process of the additional protocol, considering that the meetings dedicated to the preparation of the additional protocol are being held in closed sessions and that the direct involvement of data protection authorities in the drafting process has not been foreseen in the T-CY Terms of Reference.
    • The EDPB offered itself again as a resource and key stakeholder that needs to be involved with the effort:
      • In November 2019, the EDPB also published its latest contribution to the consultation on a draft second additional protocol, indicating that it remained available for further contributions and called for an early and more proactive involvement of data protection authorities in the preparation of these specific provisions, in order to ensure an optimal understanding and consideration of data protection safeguards (emphasis in the original).
    • The EDPB further asserted:
      • The EDPB remains fully aware that situations where judicial and law enforcement authorities are faced with a “cross-border situation” with regards to access to personal data as part of their investigations can be a challenging reality and recognises the legitimate objective of enhancing international cooperation on cybercrime and access to information. In parallel, the EDPB reiterates that the protection of personal data and legal certainty must be guaranteed, thus contributing to the objective of establishing sustainable arrangements for the sharing of personal data with third countries for law enforcement purposes, which are fully compatible with the EU Treaties and the Charter of Fundamental Rights of the EU. The EDPB furthermore considers it essential to frame the preparation of the additional protocol within the framework of the Council of Europe core values and principles, and in particular human rights and the rule of law.
  • The European Commission (EC) published a statement on how artificial intelligence (AI) “can transform Europe’s health sector.” The EC sketched out legislation it hopes to introduce soon on regulating AI in the European Union (EU). The EC asserted:
    • A high-standard health system, rich health data and a strong research and innovation ecosystem are Europe’s key assets that can help transform its health sector and make the EU a global leader in health-related artificial intelligence applications. 
    • The use of artificial intelligence (AI) applications in healthcare is increasing rapidly.
    • Before the COVID-19 pandemic, challenges linked to our ageing populations and shortages of healthcare professionals were already driving up the adoption of AI technologies in healthcare. 
    • The pandemic has all but accelerated this trend. Real-time contact tracing apps are just one example of the many AI applications used to monitor the spread of the virus and to reinforce the public health response to it.
    • AI and robotics are also key for the development and manufacturing of new vaccines against COVID-19.
    • The European Commission is currently preparing a comprehensive package of measures to address issues posed by the introduction of AI, including a European legal framework for AI to address fundamental rights and safety risks specific to the AI systems, as well as rules on liability related to new technologies.
  • House Energy and Commerce Committee Chair Frank Pallone, Jr. (D-NJ) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) wrote to Apple CEO Tim Cook “urging review and improvement of Apple’s new App Privacy labels in light of recent reports suggesting they are often misleading or inaccurate.” Pallone and Schakowsky are working from a Washington Post article, in which the paper’s tech columnist learned that Apple’s purported ratings system to inform consumers about the privacy practices of apps is largely illusory and possibly illegally deceptive (a sketch of the sort of traffic logging the columnist relied on appears after this item). Pallone and Schakowsky asserted:
    • According to recent reports, App Privacy labels can be highly misleading or blatantly false. Using software that logs data transmitted to trackers, a reporter discovered that approximately one third of evaluated apps that said they did not collect data had inaccurate labels. For example, a travel app labeled as collecting no data was sending identifiers and other data to a massive search engine and social media company, an app-analytics company, and even a Russian Internet company. A ‘slime simulator’ rated for ages 4 and older had a ‘Data Not Collected’ label, even though the app shares identifying information with major tech companies and shared data about the phone’s battery level, storage, general location, and volume level with a video game software development company.
    • Simplifying and enhancing privacy disclosures is a laudable goal, but consumer trust in privacy labeling approaches may be undermined if Apple’s App Privacy labels disseminate false and misleading information. Without meaningful, accurate information, Apple’s tool of illumination and transparency may become a source of consumer confusion and harm. False and misleading privacy labels can dupe privacy-conscious consumers into downloading data-intensive apps, ultimately eroding the credibility and integrity of the labels. A privacy label without credibility and integrity also may dull the competitive forces encouraging app developers to improve their data practices.
    • A privacy label is no protection if it is false. We urge Apple to improve the validity of its App Privacy labels to ensure consumers are provided meaningful information about their apps’ data practices and that consumers are not harmed by these potentially deceptive practices.
    • Pallone and Schakowsky stated “[t]o better understand Apple’s practices with respect to the privacy labels, we request that you provide written response to the following questions by February 23, 2021:
      • 1. Apple has stated that it conducts routine and ongoing audits of the information provided by developers and works with developers to correct any inaccuracies.
        • a. Please detail the process by which Apple audits the privacy information provided by app developers. Please explain how frequently audits are conducted, the criteria by which Apple selects which apps to audit, and the methods for verifying the accuracy of the privacy information provided by apps.
        • b. How many apps have been audited since the implementation of the App Privacy label? Of those, how many were found to have provided inaccurate or misleading information? 
      • 2. Does Apple ensure that App Privacy labels are corrected upon the discovery of inaccuracies or misleading information? If not, why not? For each app that has been found to have provided inaccurate or misleading information, how quickly was that label corrected?
      • 3. Please detail Apple’s enforcement policies when an app fails to provide accurate privacy information for the App Privacy label.
      • 4. Does Apple require more in-depth privacy disclosures and conduct more stringent oversight of apps targeted to children under the age of 13? If not, why not? If so, please describe the additional disclosures required and the oversight actions employed for these apps.
      • 5. Providing clear and easily comprehendible privacy information at the point of sale is certainly valuable, but privacy policies are not static. Does Apple notify users when one of their app’s privacy labels has materially changed? If not, why not? If so, how are users notified of such changes?
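    • For context on the methodology behind the reporting the letter cites, the columnist’s approach of logging data transmitted to trackers can be approximated with an intercepting proxy. Below is a minimal, hypothetical sketch using mitmproxy; the tracker domain list is illustrative, not the one used in the reporting.

```python
# Hypothetical mitmproxy addon approximating the traffic logging described
# above: route a test phone's traffic through the proxy and log every
# outbound request host, flagging hosts on an illustrative tracker list.
# Run with: mitmdump -s log_trackers.py
from mitmproxy import http

TRACKER_DOMAINS = {"graph.facebook.com", "app-measurement.com"}  # illustrative

def request(flow: http.HTTPFlow) -> None:
    host = flow.request.pretty_host
    is_tracker = any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS)
    tag = "TRACKER" if is_tracker else "other"
    print(f"[{tag}] {flow.request.method} {host}{flow.request.path}")
```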
  • The United Kingdom’s Department for Digital, Culture, Media & Sport (DCMS) “published its draft rules of the road for governing the future use of digital identities…[and] [i]t is part of plans to make it quicker and easier for people to verify themselves using modern technology and create a process as trusted as using passports or bank statements” according to its press release. The DCMS wants feedback by 11 March 2021 on the draft trust framework. The DCMS stated:
    • Digital identity products allow people to prove who they are, where they live or how old they are. They are set to revolutionise transactions such as buying a house, when people are often required to prove their identity multiple times to a bank, conveyancer or estate agent, and buying age-restricted goods online or in person.
    • The new ‘trust framework’ lays out the draft rules of the road organisations should follow. It includes the principles, policies, procedures and standards governing the use of digital identity to allow for the sharing of information to check people’s identities or personal details, such as a user’s address or age, in a trusted and consistent way. This will enable interoperability and increase public confidence.
    • The framework, once finalised, is expected to be brought into law. It has specific standards and requirements for organisations which provide or use digital identity services including:
      • Having a data management policy which explains how they create, obtain, disclose, protect, and delete data;
      • Following industry standards and best practice for information security and encryption;
      • Telling the user if any changes, for example an update to their address, have been made to their digital identity;
      • Where appropriate, having a detailed account recovery process and notifying users if organisations suspect someone has fraudulently accessed their account or used their digital identity;
      • Following guidance on how to choose secure authenticators for their service.
  • The European Commission (EC) “opened infringement procedures against 24 Member States for failing to enact new EU telecom rules.”
    • The EC asserted:
      • The European Electronic Communications Code modernises the European regulatory framework for electronic communications, to enhance consumers’ choices and rights, for example by ensuring clearer contracts, quality of services, and competitive markets. The Code also ensures higher standards of communication services, including more efficient and accessible emergency communications. Furthermore, it allows operators to benefit from rules incentivising investments in very-high capacity networks, as well as from enhanced regulatory predictability, leading to more innovative digital services and infrastructures.
      • The European Electronic Communications Code that brings the regulatory framework governing the European telecom sector up to date with the new challenges came into force in December 2018, and Member States have had two years to implement its rules. It is a central piece of legislation to achieve Europe’s Gigabit society and ensure full participation of all EU citizens in the digital economy and society.

Coming Events

  • On 18 February, the House Financial Services Committee will hold a hearing titled “Game Stopped? Who Wins and Loses When Short Sellers, Social Media, and Retail Investors Collide” with Reddit Co-Founder and Chief Executive Officer Steve Huffman testifying along with other witnesses.
  • The U.S.-China Economic and Security Review Commission will hold a hearing titled “Deterring PRC Aggression Toward Taiwan” on 18 February.
  • On 24 February, the House Energy and Commerce Committee’s Communications and Technology Subcommittee will hold a hearing titled “Fanning the Flames: Disinformation and Extremism in the Media.”
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Estúdio Bloom on Unsplash

Further Reading, Other Developments, and Coming Events (16 February 2021)

Further Reading

  • “India cuts internet around New Delhi as protesting farmers clash with police” By Esha Mitra and Julia Hollingsworth — CNN; “Twitter Temporarily Blocked Accounts Critical Of The Indian Government” By Pranav Dixit — BuzzFeed News. Prime Minister Narendra Modi’s government again shut down the internet as a way of managing unrest or discontent with government policies. The parties out of power have registered their opposition, but the majority seems intent on using this tactic time and again. One advocacy organization named India as the nation with the most shutdowns in 2019, by far. The government in New Delhi also pressed Twitter to take down tweets and accounts critical of the proposed changes in agricultural law. Twitter complied per its own policies and Indian law and then later restored the accounts and tweets.
  • “Lacking a Lifeline: How a federal effort to help low-income Americans pay their phone bills failed amid the pandemic” By Tony Romm — The Washington Post. An excellent overview of this Federal Communications Commission (FCC) program and its shortcomings. The Trump-era FCC blunted and undid Obama-era FCC reforms designed to make the eligibility of potential users easier to discern, among other changes. At the end of the day, many enrollees are left with a fixed number of minutes for phone calls and 4GB of data a month, or roughly what my daughter often uses in a day.
  • “She exposed tech’s impact on people of color. Now, she’s on Biden’s team.” By Emily Birnbaum — Protocol. The new Deputy Director for Science and Society in the Office of Science and Technology Policy (OSTP) is a former academic and researcher who often focused her studies on the intersection of race and technology, usually how the latter failed minorities. This is part of the Biden Administration’s fulfillment of its campaign pledges to establish a more inclusive White House. It remains to be seen how the administration will balance the views of those critical of big technology with those hailing from big technology, as a number of former high-ranking employees have already joined or are rumored to be joining the Biden team.
  • “Vaccine scheduling sites are terrible. Can a new plan help Chicago fix them?” By Issie Lapowsky — Protocol. Unsurprisingly, many jurisdictions across the country have problematic interfaces for signing up for vaccination against COVID-19. It sounds reminiscent of the problems that plagued the rollout of the Obamacare exchanges, in that potentially well-thought-out policy was marred by a barely-thought-out public face.
  • “Google launches News Showcase in Australia in sign of compromise over media code” By Josh Taylor — The Guardian; “Cracks in media code opposition as Microsoft outflanks Google and Facebook” By Lisa Visentin — The Sydney Morning Herald. Both Google and Canberra seem to be softening their positions as the company signed up a number of major media outlets for its News Showcase, a feature that will be made available in Australia that will compensate the news organizations at an undisclosed level. However, a few major players, Nine, News Corp., and the Australian Broadcasting Corporation, have not joined, with Nine saying it will not. Google’s de-escalation of rhetoric and tactics will likely allow Prime Minister Scott Morrison’s government to relax the proposed legislation that would mandate Google and Facebook compensate Australian news media (i.e., the News Media and Digital Platforms Mandatory Bargaining Code). Microsoft’s theoretical entrance into the Australian market through Bing, if Google and Facebook actually leave or limit their presence, seems to argue against the latter two companies’ position that the new code is unworkable. It is not clear if Microsoft is acting earnestly or floating a possible scenario in order that the other companies be cast in a bad light. In any event, critics of the platforms say the fight is not about the technical feasibility of compensating news media but rather about establishing a precedent of paying for content the platforms now get essentially for free. Other content creators and entities could start demanding payment, too. An interesting tidbit from the second article: Canada may soon join Australia and the European Union in enacting legislation requiring Big Tech to pay media companies for using their content (i.e., “a more equitable digital regulatory framework across platforms and news media” according to a minister).

Other Developments

  • The Maryland legislature overrode Governor Larry Hogan’s (R) veto, making the “Taxation – Tobacco Tax, Sales and Use Tax, and Digital Advertising Gross Revenues Tax” (HB0732) the first tax on digital advertising enacted in the United States. The measure imposes a tax on digital advertising in the state and may fall outside a federal bar on certain taxes on internet services. Now that the veto has been overridden, there will inevitably be challenges, and quite likely a push in Congress to enact a federal law preempting such digital taxes. Additionally, the primary sponsor of the legislation has introduced another bill barring companies from passing along the costs of the tax to Maryland businesses and consumers. (A rough sketch of how the tiered tax would be computed appears at the end of this item.)
    • In a bill analysis, the legislature asserted about HB0732:
      • The bill imposes a tax on the annual gross revenues of a person derived from digital advertising services in the State. The bill provides for the filing of the tax returns and making tax payments. The part of the annual gross revenues of a person derived from digital advertising services in the State are to be determined using an apportionment fraction based on the annual gross revenues of a person derived from digital advertising services in the State and the annual gross revenues of a person derived from digital advertising services in the United States. The Comptroller must adopt regulations that determine the state from which revenues from digital advertising services are derived.
      • The digital advertising gross revenues tax is imposed at the following rates:
        • 2.5% of the assessable base for a person with global annual gross revenues of $100.0 million through $1.0 billion;
        • 5% of the assessable base for a person with global annual gross revenues of $1.0 billion through $5.0 billion;
        • 7.5% of the assessable base for a person with global annual gross revenues of $5.0 billion through $15.0 billion; and
        • 10% of the assessable base for a person with global annual gross revenues exceeding $15.0 billion.
    • In his analysis, Maryland’s Attorney General explained:
      • House Bill 732 would enact a new “digital advertising gross revenues tax.” The tax would be “imposed on annual gross revenues of a person derived from digital advertising services in the State.” Digital advertising services are defined in the bill to include “advertisement services on a digital interface, including advertisements in the form of banner advertising, search engine advertising, interstitial advertising, and other comparable advertising services.” The annual gross revenues derived from digital advertising services is set out in a formula in the bill.
      • Attorney General Brian Frosh conceded there will be legal challenges to the new Maryland tax: there are “three grounds on which there is some risk that a reviewing court would find that the tax is unconstitutional: (1) preemption under the federal Internet Tax Freedom Act; (2) the Commerce Clause; and, (3) the First Amendment.”
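    • To make the rate schedule above concrete, here is a hedged sketch in Python of how the tiered computation might run. The function is illustrative only; the statute leaves the apportionment of the Maryland-derived assessable base to Comptroller regulations, so this sketch simply takes that base as an input.

```python
# Illustrative sketch of HB0732's tiered digital advertising tax, using the
# rate schedule from the bill analysis quoted above. How the Maryland-derived
# assessable base is apportioned is left to Comptroller regulations, so this
# sketch takes it as a given input.
def md_digital_ad_tax(md_assessable_base: float, global_revenue: float) -> float:
    if global_revenue < 100_000_000:
        return 0.0  # below the $100 million global revenue threshold
    if global_revenue <= 1_000_000_000:
        rate = 0.025
    elif global_revenue <= 5_000_000_000:
        rate = 0.05
    elif global_revenue <= 15_000_000_000:
        rate = 0.075
    else:
        rate = 0.10
    return rate * md_assessable_base

# e.g., $50 million of Maryland-derived ad revenue, $20 billion global revenue:
print(md_digital_ad_tax(50_000_000, 20_000_000_000))  # 5000000.0 (10% tier)
```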
  • Democratic Members introduced the “Secure Data and Privacy for Contact Tracing Act” (H.R.778/S.199) in both the House and Senate, legislation that “would provide grants to states that choose to use technology as part of contact tracing efforts for COVID-19 if they agree to adopt strong privacy protections for users” per their press release; a sketch of the kind of privacy-preserving design these principles contemplate follows the list below. Representatives Jackie Speier (D-CA) and Debbie Dingell (D-MI) introduced the House bill and Senators Brian Schatz (D-HI) and Tammy Baldwin (D-WI) the Senate version. Speier, Dingell, Schatz, and Baldwin contended “[t]he Secure Data and Privacy for Contact Tracing Act provides grant funding for states to responsibly develop digital contact tracing technologies consistent with the following key privacy protections:
    • Digital contact tracing tech must be strictly voluntary and provide clear information on intended use.
    • Data requested must be minimized and proportionate to what is required to achieve contact tracing objectives.
    • Data must be deleted after contact tracing processing is complete, or at the end of the declaration of emergency.
    • States must develop a plan for how their digital contact tracing technology complements more traditional contact tracing efforts and describe efforts to ensure their technology will be interoperable with other states. 
    • States must establish procedures for independent security assessments of digital contact tracing infrastructure and remediate vulnerabilities. 
    • Information gathered must be used strictly for public health functions authorized by the state and cannot be used for punitive measures, such as criminal prosecution or immigration enforcement.
    • Digital contact tracing tech must have robust detection capabilities consistent with CDC guidance on exposure. 
    • Digital contact tracing technology must ensure anonymity, allowing only authorized public health authorities or other authorized parties to have access to personally identifiable information.
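    • The bill sets principles rather than a protocol, but the anonymity and data minimization requirements above point toward decentralized designs like the Google/Apple exposure notification framework. The following is a loose, hypothetical sketch of that style of design, not the bill’s mandated technology; all names and parameters are illustrative.

```python
# Hypothetical sketch of an anonymity-preserving exposure notification
# scheme of the kind the bill's principles contemplate (loosely modeled on
# decentralized designs; not any official protocol).
import os
import hashlib

def daily_key() -> bytes:
    # Fresh random key each day, generated on-device, never tied to identity
    return os.urandom(16)

def rolling_id(key: bytes, interval: int) -> bytes:
    # The identifier a phone broadcasts rotates every interval (e.g., 10
    # minutes) and cannot be linked across intervals without the daily key
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

def exposed(heard_ids: set, positive_keys: list, intervals=range(144)) -> bool:
    # Matching happens on-device against keys voluntarily published by users
    # who tested positive; no central server learns locations or contacts
    return any(rolling_id(k, i) in heard_ids
               for k in positive_keys for i in intervals)
```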
  • The chair and ranking member of the Senate Intelligence Committee wrote the heads of the agencies leading the response to the Russian hack of the United States (U.S.) government and private sector entities through SolarWinds, taking them to task for their thus far cloistered, siloed approach. In an unusually blunt letter, Chair Mark Warner (D-VA) and Ranking Member Marco Rubio (R-FL) asked the agencies to name a leader for the response initiated when former President Donald Trump invoked the process established in Presidential Policy Directive-41, because “[t]he federal government’s response so far has lacked the leadership and coordination warranted by a significant cyber event, and we have little confidence that we are on the shortest path to recovery.” Warner and Rubio directed this request to Director of National Intelligence Avril Haines, National Security Agency and Cyber Command head General Paul Nakasone, Federal Bureau of Investigation (FBI) Director Christopher Wray, and Cybersecurity and Infrastructure Security Agency (CISA) Acting Director Brandon Wales. Warner and Rubio further asserted:
    • The briefings we have received convey a disjointed and disorganized response to confronting the breach. Taking a federated rather than a unified approach means that critical tasks that are outside the central roles of your respective agencies are likely to fall through the cracks. The threat our country still faces from this incident needs clear leadership to develop and guide a unified strategy for recovery, in particular a leader who has the authority to coordinate the response, set priorities, and direct resources to where they are needed. The handling of this incident is too critical for us to continue operating the way we have been.
  • Huawei filed suit against the Federal Communications Commission’s (FCC) decision to “designate Huawei, as well as its parents, affiliates, and subsidiaries, as companies posing a national security threat to the integrity of our nation’s communications networks and the communications supply chain” through “In the Matter of Protecting Against National Security Threats to the Communications Supply Chain Through FCC Programs – Huawei Designation.” In the petition filed with the United States Court of Appeals for the Fifth Circuit, Huawei said it is “seek[ing] review of the Final Designation Order on the grounds that it exceeds the FCC’s statutory authority; violates federal law and the Constitution; is arbitrary, capricious, and an abuse of discretion, and not supported by substantial evidence, within the meaning of the Administrative Procedure Act, 5 U.S.C. § 701 et seq.; was adopted through a process that failed to provide Petitioners with the procedural protections afforded by the Constitution and the Administrative Procedure Act; and is otherwise contrary to law.”
  • According to unnamed sources, the Biden Administration has decided to postpone indefinitely the Trump Administration’s effort to force ByteDance to sell TikTok as required by a Trump Administration executive order. Last September, it appeared that Oracle and Walmart had reached a deal in principle with ByteDance that quickly raised more questions than it settled (see here for more details and analysis). There are reports of ByteDance working with the Committee on Foreign Investment in the United States (CFIUS), the inter-agency review group that ordered ByteDance to spin off TikTok. TikTok and CFIUS are reportedly talking about what an acceptable divestment would look like, but of course, under recently implemented measures, the People’s Republic of China (PRC) would also have to sign off. Nonetheless, White House Press Secretary Jen Psaki remarked at a press conference “[t]here is a rigorous CFIUS process that is ongoing.”
  • The Biden Administration has asked two federal appeals courts to pause lawsuits brought to stop the United States (U.S.) government from enforcing the Trump Administration executive order banning TikTok from the United States (see here for more analysis).
    • In the status report filed with the United States Court of Appeals for the District of Columbia Circuit, TikTok and the Department of Justice (DOJ) explained:
      • Defendants’ counsel informed Plaintiffs’ counsel regarding the following developments: As the Biden Administration has taken office, the Department of Commerce has begun a review of certain recently issued agency actions, including the Secretary’s prohibitions regarding the TikTok mobile application at issue in this case. In relation to those prohibitions, the Department plans to conduct an evaluation of the underlying record justifying those prohibitions. The government will then be better positioned to determine whether the national security threat described in the President’s August 6, 2020 Executive Order, and the regulatory purpose of protecting the security of Americans and their data, continue to warrant the identified prohibitions. The Department of Commerce remains committed to a robust defense of national security as well as ensuring the viability of our economy and preserving individual rights and data privacy.
    • In its unopposed motion, the DOJ asked the United States Court of Appeals for the Third Circuit “hold this case in abeyance, with status reports due at 60-day intervals.” The DOJ used exactly the same language as in the filing in the D.C. Circuit.
  • The Trump Administration’s President’s Council of Advisors on Science and Technology (PCAST) issued a report at the tail end of the administration, “Industries of the Future Institutes: A New Model for American Science and Technology Leadership,” that “follows up on a recommendation from PCAST’s report, released June 30, 2020, involving the formation of a new type of multi-sector research and development organization: Industries of the Future Institutes (IotFIs)…[and] provides a framework to inform the design of IotFIs and thus should be used as preliminary guidance by funders and as a starting point for discussion among those considering participation.”
    • PCAST “propose[d] a revolutionary new paradigm for multi-sector collaboration—Industries of the Future Institutes (IotFIs)—to address some of the greatest societal challenges of our time and to ensure American science and technology (S&T) leadership for decades to come.” PCAST stated “[b]y driving research and development (R&D) at the intersection of two or more IotF areas, these Institutes not only will advance knowledge in the individual IotF topics, but they also will spur new research questions and domains of inquiry at their confluence.” PCAST added:
      • By engaging multiple disciplines and each sector of the U.S. R&D ecosystem—all within the same agile organizational framework—IotFIs will span the spectrum from discovery research to the development of new products and services at scale. Flexible intellectual property terms will incentivize participation of all sectors, and reduced administrative and regulatory burdens will optimize researcher time for creativity and productivity while maintaining appropriate safety, transparency, integrity, and accountability. IotFIs also will serve as a proving ground for new, creative approaches to organizational structure and function; broadening participation; workforce development; science, technology, engineering, and math education; and methods for engaging all sectors of the American research ecosystem. Ultimately, the fruits of IotFIs will sustain American global leadership in S&T, improve quality of life, and help ensure national and economic security for the future.
  • Per the European Commission’s (EC) request, the European Data Protection Board (EDPB) issued clarifications on the consistent application of the General Data Protection Regulation (GDPR) with a focus on health research. The EDPB explained:
    • The following response of the EDPB to the questions of the European Commission should be considered as a first attempt to take away some of the misunderstandings and misinterpretations as to the application of the GDPR to the domain of scientific health research. Generally speaking, most of these questions call for more time for in-depth analysis and/or a search for examples and best practices and can as yet not be completely answered.
    • In its guidelines (currently in preparation and due in 2021) on the processing of personal data for scientific research purposes, the EDPB will elaborate further on these issues while also aiming to provide a more comprehensive interpretation of the various provisions in the GDPR that are relevant for the processing of personal data for scientific research purposes.
    • This will also entail a clarification of the extent and scope of the ‘special derogatory regime’ for the processing of personal data for scientific research purposes in the GDPR. It is important that this regime is not perceived as to imply a general exemption to all requirements in the GDPR in case of processing data for scientific research purposes. It should be taken into account that this regime only aims to provide for exceptions to specific requirements in specific situations and that the use of such exceptions is made dependent on ‘additional safeguards’ (Article 89(1) GDPR) to be in place.
  • The Government Accountability Office (GAO) has assessed how well the Federal Communications Commission (FCC) has rolled out and implemented its Lifeline National Verifier (referred to as Verifier by the GAO) to aid low income people in accessing telecommunications benefits. The Verifier was established in 2016 to address claims that allowing telecommunications carriers to make eligibility determinations for participation in the program to help people obtain lower cost communications had led to waste, fraud, and abuse. House Energy and Commerce Committee Chair Frank Pallone Jr. (D-NJ), Communications and Technology Subcommittee Chair Mike Doyle (D-PA), and six Democratic colleagues on the committee asked the GAO “to review FCC’s implementation of the Verifier.” The GAO explained “[t]his report examines (1) the status of the Verifier; (2) the extent to which FCC coordinated with state and federal stakeholders, educated consumers, and facilitated involvement of tribal stakeholders; and (3) the extent to which the Verifier is meeting its goals.” The GAO concluded:
    • The Lifeline program is an important tool that helps low-income Americans afford vital voice and broadband services. In creating the Lifeline National Verifier, FCC sought to facilitate eligible Americans’ access to Lifeline support while protecting the program from waste, fraud, and abuse. Although USAC, under FCC’s oversight, has made progress to implement the Verifier, many eligible consumers are unaware of it and may be unable to use it. Additionally, tribal governments and organizations do not have the information they need from FCC to effectively assist residents of tribal lands in using the Verifier to enroll in Lifeline, even though Lifeline support is critical to increasing access to affordable telecommunications services on tribal lands. Without FCC developing a plan to educate consumers about the Verifier and empowering tribal governments to assist residents of tribal lands with the Verifier, eligible consumers, especially those on tribal lands, will continue to lack awareness of the Verifier and the ability to use it.
    • Further, without measures and information to assess progress toward some of its goals, FCC lacks information it needs to refine and improve the Verifier. While it is too soon to determine if the Verifier is protecting against fraud, FCC has measures in place to monitor fraud moving forward. However, FCC lacks measures to track the Verifier’s progress toward the intent of its second goal of delivering value to Lifeline consumers. FCC also lacks information to help it assess and improve its efforts to meet the third goal of improving the consumer experience. Additionally, consumers may experience challenges with the Verifier’s online application, such as difficulty identifying the Verifier as a government service, and may be uncomfortable providing sensitive information to a website that does not use a “.gov” domain. Unless FCC identifies and addresses challenges with the Verifier’s manual review process and its online application, it will be limited in its ability to improve the consumer experience. As a result, some eligible consumers may abandon their applications and go without the support they need to access crucial telecommunications services. Given that a majority of Lifeline subscribers live in states without state database connections and therefore must undergo manual review more frequently, ensuring that challenges with the manual review process are resolved is particularly important.
    • The GAO recommended:
      • The Chairman of FCC should develop and implement a plan to educate eligible consumers about the Lifeline program and Verifier requirements that aligns with key practices for consumer education planning. (Recommendation 1)
      • The Chairman of FCC should provide tribal organizations with targeted information and tools, such as access to the Verifier, that equip them to assist residents of tribal lands with their Verifier applications. (Recommendation 2)
      • The Chairman of FCC should identify and use performance measures to track the Verifier’s progress in delivering value to consumers. (Recommendation 3)
      • The Chairman of FCC should ensure that it has quality information on consumers’ experience with the Verifier’s manual review process, and should use that information to improve the consumer experience to meet the Verifier’s goals. (Recommendation 4)
      • The Chairman of FCC should ensure that the Verifier’s online application and support website align with characteristics for leading federal website design, including that they are accurate, clear, understandable, easy to use, and contain a mechanism for users to provide feedback. (Recommendation 5)
      • The Chairman of FCC should convert the Verifier’s online application, checklifeline.org, to a “.gov” domain. (Recommendation 6)

Coming Events

  • The House Appropriations Committee’s Financial Services and General Government Subcommittee will hold an oversight hearing on the Election Assistance Commission (EAC) on 16 February with EAC Chair Benjamin Hovland.
  • On 17 February, the House Energy and Commerce Committee’s Communications and Technology Subcommittee will hold a hearing titled “Connecting America: Broadband Solutions to Pandemic Problems” with these witnesses:
    • Free Press Action Vice President of Policy and General Counsel Matthew F. Wood
    • Topeka Public Schools Superintendent Dr. Tiffany Anderson
    • Communications Workers of America President Christopher M. Shelton
    • Wireless Infrastructure Association President and CEO Jonathan Adelstein
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 18 February, the House Financial Services Committee will hold a hearing titled “Game Stopped? Who Wins and Loses When Short Sellers, Social Media, and Retail Investors Collide” with Reddit Co-Founder and Chief Executive Officer Steve Huffman testifying along with other witnesses.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Zachary Peterson on Unsplash

A Revised ePrivacy Regulation Proposed

A key stakeholder has proposed changes to the EC’s four-year-old proposed ePrivacy Regulation, moving matters to the European Parliament. 

The Council of the European Union (Council) has released a long-awaited compromise draft of the ePrivacy Regulation, a rewrite of the European Union’s existing rules on the privacy of electronic communications. This new law is intended to complement the General Data Protection Regulation (GDPR). This is an important but preliminary development, and now the Council will begin negotiations with the European Parliament to arrive at final ePrivacy Regulation language. The European Commission (EC) presented its ePrivacy Regulation proposal in January 2017, but lobbying in Brussels has been fierce, and the last four years have been spent haggling over the final Regulation. As a regulation, the ePrivacy Regulation, like the GDPR, would become law throughout the EU without member states needing to enact implementing legislation, as they must for directives, although the draft leaves nations some leeway to legislate further.

In its press release, the Council asserted:

Today, member states agreed on a negotiating mandate for revised rules on the protection of privacy and confidentiality in the use of electronic communications services. These updated ‘ePrivacy’ rules will define cases in which service providers are allowed to process electronic communications data or have access to data stored on end-users’ devices. Today’s agreement allows the Portuguese presidency to start talks with the European Parliament on the final text (emphasis in the original).

The Council continued:

An update to the existing ePrivacy directive of 2002 is needed to cater for new technological and market developments, such as the current widespread use of Voice over IP, web-based email and messaging services, and the emergence of new techniques for tracking users’ online behaviour.

The new ePrivacy Regulation would repeal Directive 2002/58/EC (the Directive on privacy and electronic communications) and enact new text to address a number of changes in electronic communications and services since the current regime was enacted. The ePrivacy Regulation was intended to take effect alongside the GDPR, but this did not come to pass given the competing interests among EU nations.

As for the text itself, a few threshold matters are worth highlighting. First, the ePrivacy Regulation would apply to both natural and legal persons (i.e., actual people in the EU and EU entities such as businesses.) Second, the ePrivacy Regulation does not impinge on the national security and defense data processing activities EU member states may undertake. Third, it applies to telecommunications providers and communications platforms. Fourth, the new regime would govern the processing of electronic communications data or the personal data of EU residents in specified circumstances regardless of where the processing occurs (e.g., Google processing EU communications in Egypt) and even if the processor is not established in the EU (e.g., a Taiwanese data broker processing certain communications of EU people or businesses.) Fifth, the ePrivacy Regulation sets up a tiered penalty system just like the GDPR’s, with a lesser class of violations exposing the violator to a fine of up to €10 million or 2% of the entity’s worldwide turnover, and more serious violations facing liability of up to €20 million or 4% of worldwide turnover. Sixth, the European Data Protection Board (EDPB) would be given the task “to contribute to the consistent application of Chapters I and II and III of this Regulation” (i.e., the operative portions of the ePrivacy regime.)
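
To make the tiered penalty structure concrete, here is a minimal sketch of the fine ceilings in Python. The function name is mine, and the assumption that the higher of the fixed amount and the turnover percentage applies is carried over from the GDPR’s fine formula rather than spelled out in the draft summarized here.

```python
def max_fine_eur(worldwide_turnover: float, serious: bool) -> float:
    """Upper bound of a fine under the draft's two penalty tiers.

    Mirrors the GDPR's structure: the ceiling is the greater of a fixed
    amount and a percentage of worldwide annual turnover. Reading the
    two figures as "whichever is higher" is an assumption borrowed from
    the GDPR's text.
    """
    fixed, pct = (20_000_000, 0.04) if serious else (10_000_000, 0.02)
    return max(fixed, pct * worldwide_turnover)

# A firm with €2 billion in worldwide turnover committing a serious violation:
print(max_fine_eur(2_000_000_000, serious=True))  # 80000000.0, i.e., €80 million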

In terms of the policy backdrop, the ePrivacy Regulation makes clear:

  • Article 7 of the Charter of Fundamental Rights of the European Union (“the Charter”) protects the fundamental right of everyone to the respect for private and family life, home and communications. Respect for the confidentiality of one’s communications is an essential dimension of this right, applying both to natural and legal persons. Confidentiality of electronic communications ensures that information exchanged between parties and the external elements of such communication, including when the information has been sent, from where, to whom, is not to be revealed to anyone other than to the parties involved in a communication. The principle of confidentiality should apply to current and future means of communication, including calls, internet access, instant messaging applications, e-mail, internet phone calls and personal messaging provided through social media.
  • The content of electronic communications may reveal highly sensitive information about the natural persons involved in the communication, from personal experiences and emotions to medical conditions, sexual preferences and political views, the disclosure of which could result in personal and social harm, economic loss or embarrassment. Similarly, metadata derived from electronic communications may also reveal very sensitive and personal information. These metadata includes the numbers called, the websites visited, geographical location, the time, date and duration when an individual made a call etc., allowing precise conclusions to be drawn regarding the private lives of the persons involved in the electronic communication, such as their social relationships, their habits and activities of everyday life, their interests, tastes etc.

The Council intends the ePrivacy Regulation to work in concert with the GDPR, specifying that where the former is silent on an issue, the latter shall control:

Regulation (EU) 2016/679 regulates the protection of personal data. This Regulation protects in addition the respect for private life and communications. The provisions of this Regulation particularise and complement the general rules on the protection of personal data laid down in Regulation (EU) 2016/679. This Regulation therefore does not lower the level of protection enjoyed by natural persons under Regulation (EU) 2016/679. The provisions particularise Regulation (EU) 2016/679 as regards personal data by translating its principles into specific rules. If no specific rules are established in this Regulation, Regulation (EU) 2016/679 should apply to any processing of data that qualify as personal data. The provisions complement Regulation (EU) 2016/679 by setting forth rules regarding subject matters that are not within the scope of Regulation (EU) 2016/679, such as the protection of the rights of end-users who are legal persons.

Article 1 states the purpose of the ePrivacy Regulation:

This Regulation lays down rules regarding the protection of the fundamental rights and freedoms of legal persons in the provision and use of the electronic communications services, and in particular their rights to respect of communications.

The ePrivacy Regulation will apply to:

  • the processing of electronic communications content and of electronic communications metadata carried out in connection with the provision and the use of electronic communications services;
  • end-users’ terminal equipment information;
  • the offering of a publicly available directory of end-users of electronic communications services; and
  • the sending of direct marketing communications to end-users.

Electronic communications data (a term encompassing both content and metadata) must generally be kept confidential (subject to exceptions), but it may be processed under the following circumstances:

  • it is “necessary to provide an electronic communication service”;
  • it is necessary to maintain or restore the security of electronic communications networks and services, or detect technical faults, errors, security risks or attacks on electronic communications networks and services;
  • it is necessary to detect or prevent security risks or attacks on end-users’ terminal equipment;
  • it is necessary for compliance with a legal obligation to which the provider is subject laid down by Union or Member State law, which respects the essence of the fundamental rights and freedoms and is a necessary and proportionate measure in a democratic society to safeguard the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the safeguarding against and the prevention of threats to public security.

Electronic communications metadata may be processed in a number of scenarios without consent, including for maintaining networks and services, for the fulfillment of a contract to which the end-user is a party, or “it is necessary in order to protect the vital interest of a natural person.” Such processing of metadata may also be part of scientific or historical research and related purposes subject to additional requirements. And, of course, a person or entity could consent to such processing for one or more specified purposes.

There is a subsequent section that seems to contemplate other possible “compatible” processing of metadata without a person or entity’s consent and outside an EU or member state law. The regulation lists a number of considerations the provider must take into account in making this determination, such as the link between the reasons why the data were first collected and the intended additional processing, the context of the data collection, the nature of the metadata, the possible consequences to the end-user of further processing, and the use of safeguards such as encryption or pseudonymization. However, there are strict limits on how the processing may take place. If the information can be anonymized for processing, it must be; otherwise, it must be made anonymous or erased after processing. Metadata must be processed in a pseudonymized fashion and cannot be used to determine the nature or characteristics of the user or to build a user profile. Finally, metadata collected and processed under this provision of the ePrivacy Regulation cannot be shared with third parties unless it is made anonymous.

And so, it appears providers may engage in additional processing to which, say, a Spanish resident has not consented, so long as these conditions are met. However, the regulation does not spell out what sorts of situations these may be, leaving the issue to EU courts. Given the lengthy negotiations over the ePrivacy Regulation, this may be one of the places the parties decided to leave open-ended.
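
Because the distinction between pseudonymization and anonymization does real work in this provision, a minimal sketch may help; the record fields, key handling, and function below are hypothetical illustrations, not anything the draft prescribes.

```python
import hashlib
import hmac

# Placeholder key; in practice it would be stored separately and rotated.
SECRET_KEY = b"store-separately-and-rotate"

def pseudonymize(identifier: str) -> str:
    # A keyed hash (HMAC) lets records about the same caller be linked
    # for processing without exposing the raw identifier.
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Hypothetical call-metadata record; field names are illustrative only.
record = {"caller": "+32470000000", "duration_s": 184, "cell_id": "BRU-0042"}
record["caller"] = pseudonymize(record["caller"])
print(record)

# While SECRET_KEY exists, this is only pseudonymization: whoever holds
# the key can re-link the records. Destroying the key (and any other
# linkable fields) is what pushes the data toward the anonymity the
# draft requires before metadata may be shared with third parties.
```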

Moreover, providers are to erase or anonymize electronic communications content and metadata when there is no longer a need for processing or for providing an electronic communications service subject to exceptions in the latter instance.

There is a broad bar on using people’s or entities’ devices or equipment for processing, and on collecting information from them, subject to enumerated exceptions: the processing is necessary to provide the service, the person or entity consents, the purpose is measuring the audience, maintaining or restoring the security of the devices or service, or providing a software update. As with metadata processing, there is also language that would seem to allow processing in this context apart from consent and EU or member state law, so long as the provider lives within the same types of limits.

When a person connects her device to a network or another device, collection of information is forbidden unless it is needed to establish or maintain a connection, a user provides consent, it is needed to provide a requested service, or “it is necessary for the purpose of statistical purposes that is limited in time and space to the extent necessary for this purpose.”

EU member states may abridge some of these rights through legislation “where such a restriction respects the essence of the fundamental rights and freedoms and is a necessary, appropriate and proportionate measure in a democratic society to safeguard one or more of the general public interests referred to in Article 23(1)” of the GDPR, namely

  • public security;
  • the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security;
  • other important objectives of general public interest of the Union or of a Member State, in particular an important economic or financial interest of the Union or of a Member State, including monetary, budgetary and taxation matters, public health and social security;
  • the protection of the data subject or the rights and freedoms of others;
  • the enforcement of civil law claims.

There are further provisions on EU people and entities turning off caller identification and blocking or allowing unsolicited calls and communications.

The regulatory structure will be similar to the one in effect under the GDPR, with each member nation having a supervisory authority or authorities in place to monitor compliance with the new regulation and take action if necessary. Likewise, the EDPB would have significant powers in the oversight and implementation of the ePrivacy Regulation, but short of those provided under the GDPR; notably, it would lack the authority to referee and adjudicate disputes over enforcement between nations. There is language directing all authorities to work cooperatively across borders, but that is it.

As mentioned, violators of the ePrivacy Regulation would face stiff fines just as under the GDPR with the more severe penalty tier being reserved for “[i]nfringements of the principle of confidentiality of communications, permitted processing of electronic communications data, time limits for erasure pursuant to Articles 5, 6, and 7.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Guillaume Meurice from Pexels

Further Reading, Other Developments, and Coming Events (9 February 2021)

Further Reading

  • “Why Intel’s troubles should concern us all” By Ina Fried — Axios. One of the last major American semi-conductor manufacturers is struggling to keep up with rivals, and this could be very bad for United States (U.S.) national security. Biden Administration officials have made noises signifying they understand, but we will see what action, if any, is taken. A provision in the FY 2021 National Defense Authorization Act (NDAA) could help, but it requires the Appropriations Committees to provide the funding to maintain and stimulate semi-conductor manufacturing in the U.S.
  • “Companies and foreign countries vying for your DNA” By Jon Wertheim — CBS News. This piece is a frightening view of the waterfront in the high-tech world of genealogy, which is serving as a front of sorts to collect huge DNA data sets pharmaceutical companies and others will pay billions of dollars for. There are also concerns about investors from the People’s Republic of China (PRC) in light of the country’s ambition to lead the way into biotechnologies.
  • “Brazil’s government plans 5G network separate from private market – document” By Lisandra Paraguassu — Reuters. It appears that with former President Donald Trump having left office, plans in Brasilia to ban or sideline Huawei have left, too. Now the right-wing government is planning a government 5G network in Brazil’s capital, subject to high security standards that may rule out Huawei, while leaving the rest of the nation’s 5G rollout to companies such as Huawei, a state of affairs Brazilian telcos might like considering that an estimated 50% of existing infrastructure is Huawei’s.
  • “An AI saw a cropped photo of AOC. It autocompleted her wearing a bikini.” By Karen Hao — MIT Technology Review. Unsupervised learning is a newer means by which algorithms are trained. Normally, algorithms are fed labeled information; with respect to images, researchers feed them an image along with its label. But unsupervised learning algorithms are let loose on the internet to learn, so it should not be surprising that the toxicity of online life is absorbed. Consequently, an autocomplete function given a headshot of a man puts him in a suit, whereas the headshot of a woman will be “completed” with a low-cut top or a bikini.
  • “How the US Lost to Hackers” By Nicole Perlroth — The New York Times. This piece makes the point that the United States’ (U.S.) relentless focus on offensive cyber operations is now costing the nation as Russian, Chinese, Iranian, and other hackers are pillaging U.S. systems and assets. Defensive capabilities were always a stepchild, and this has left the U.S. vulnerable. A paradigm shift is needed across the U.S. because a number of other nations are every bit as good as the U.S. is.

Other Developments

  • Maryland may be on the verge of enacting the first tax in the United States (U.S.) on digital advertising. The Democratic majorities in the state Senate and House of Delegates seem poised to override the Maryland governor’s veto. The “Taxation – Tobacco Tax, Sales and Use Tax, and Digital Advertising Gross Revenues Tax” (HB0732) would impose a tax on digital advertising in the state and may be outside a federal bar on certain taxes on internet services. However, if the veto is overridden, there will inevitably be challenges, and quite likely a push in Congress to enact a federal law preempting such digital taxes. Additionally, the primary sponsor of the legislation has introduced another bill barring companies from passing along the costs of the tax to Maryland businesses and consumers. (A sketch of the tiered tax computation follows this item.)
    • In a bill analysis, the legislature asserted about HB0732:
      • The bill imposes a tax on the annual gross revenues of a person derived from digital advertising services in the State. The bill provides for the filing of the tax returns and making tax payments. The part of the annual gross revenues of a person derived from digital advertising services in the State are to be determined using an apportionment fraction based on the annual gross revenues of a person derived from digital advertising services in the State and the annual gross revenues of a person derived from digital advertising services in the United States. The Comptroller must adopt regulations that determine the state from which revenues from digital advertising services are derived.
      • The digital advertising gross revenues tax is imposed at the following rates:
        • 2.5% of the assessable base for a person with global annual gross revenues of $100.0 million through $1.0 billion;
        • 5% of the assessable base for a person with global annual gross revenues of $1.0 billion through $5.0 billion;
        • 7.5% of the assessable base for a person with global annual gross revenues of $5.0 billion through $15.0 billion; and
        • 10% of the assessable base for a person with global annual gross revenues exceeding $15.0 billion.
    • In his analysis, Maryland’s Attorney General explained:
      • House Bill 732 would enact a new “digital advertising gross revenues tax.” The tax would be “imposed on annual gross revenues of a person derived from digital advertising services in the State.” Digital advertising services are defined in the bill to include “advertisement services on a digital interface, including advertisements in the form of banner advertising, search engine advertising, interstitial advertising, and other comparable advertising services.” The annual gross revenues derived from digital advertising services is set out in a formula in the bill.
      • Attorney General Brian Frosh conceded there will be legal challenges to the new Maryland tax: there are “three grounds on which there is some risk that a reviewing court would find that the tax is unconstitutional: (1) preemption under the federal Internet Tax Freedom Act; (2) the Commerce Clause; and, (3) the First Amendment.”
    • Governor Larry Hogan (R) vetoed the bill in May along with others, asserting:
      • These misguided bills would raise taxes and fees on Marylanders at a time when many are already out of work and financially struggling. With our state in the midst of a global pandemic and economic crash, and just beginning on our road to recovery, it would be unconscionable to raise taxes and fees now. To do so would further add to the very heavy burden that our citizens are already facing.
    • As mentioned, a follow on bill has been introduced to ensure the digital advertising tax will not result in higher costs for Maryland businesses and residents. The “Digital Advertising Gross Revenues Tax – Exemption and Restriction” (SB0787) provides:
      • A person who derives gross revenues from digital advertising services in the state may not directly pass on the cost of the tax imposed under this section to a customer who purchases the digital advertising services by means of a separate fee, surcharge, or line-item.
      • However, the news media would be exempted from the digital advertising tax in this bill.
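
As flagged above, here is a minimal sketch of how HB0732’s tiers might operate, based solely on the bill analysis quoted earlier. The function name is mine, and because the quoted revenue bands overlap at their boundaries (e.g., $1.0 billion appears in two tiers), treating each upper bound as inclusive is an assumption.

```python
def md_digital_ad_tax(global_revenue: float, assessable_base: float) -> float:
    """Sketch of HB0732's tiered digital advertising gross revenues tax.

    assessable_base is the Maryland-apportioned share of digital ad
    revenue (state-derived revenue over U.S.-derived revenue, per the
    bill analysis); the rate is selected by global annual gross revenues.
    """
    if global_revenue < 100_000_000:
        return 0.0  # below the $100 million threshold, the tax does not apply
    elif global_revenue <= 1_000_000_000:
        rate = 0.025
    elif global_revenue <= 5_000_000_000:
        rate = 0.05
    elif global_revenue <= 15_000_000_000:
        rate = 0.075
    else:
        rate = 0.10
    return rate * assessable_base
```

Note that the rate turns on global revenues while the tax applies only to the Maryland-apportioned base, which is how the bill reaches the largest platforms without purporting to tax their worldwide receipts.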
  • The chair and subcommittee chairs of the House Energy and Commerce Committee wrote Facebook, Twitter, and Google “as part of their ongoing investigation into tech companies’ handling of the COVID-19 pandemic in response to reports that COVID-19 vaccine misinformation is escalating on their platforms” per the press release. Chair Frank Pallone, Jr. (D-NJ), Health Subcommittee Chair Anna G. Eshoo (D-CA), Oversight and Investigations Subcommittee Chair Diana DeGette (D-CO), Communications and Technology Subcommittee Chair Mike Doyle (D-PA), and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) noted the letters “are a follow-up to letters they sent to the same companies in July, expressing deep concern regarding the rampant rise of COVID-19 disinformation more generally.” They argued:
    • These COVID-19 vaccines and others in development present hope in turning the deadly tide of the last year and can be a powerful tool in our efforts to contain the pandemic—but only if the public has confidence in them. Thus, it is imperative that [Facebook, Twitter, and Google] stop[] the spread of false or misleading information about coronavirus vaccines on its platform. False and misleading information is dangerous, and if relied on by the public to make critical health choices, it could result in the loss of human life.
    • They requested the following:
      • Details of all actions the companies have taken to limit false or misleading COVID-19 vaccine misinformation or disinformation on their platforms;
      • Descriptions of all policy changes the companies have implemented to stop the spread of false or misleading COVID-19 vaccine misinformation, and how the companies are measuring the effectiveness of each such policy change;
      • Whether the companies have used information labels or other types of notifications to alert users about COVID-19 vaccine misinformation or disinformation, and if so, the date(s) they first began implementing labels or notifications and how the companies are measuring their effectiveness;
      • Details about the five most common targeted advertisements that appear alongside COVID-19 vaccine misinformation or disinformation on the platforms;
      • Details on the companies’ COVID-19 vaccine misinformation and disinformation enforcement efforts; and
      • Whether the companies have coordinated any actions or activities with other online platforms related to COVID-19 vaccine misinformation or disinformation.
  • Graphika released a report on fake social media activity that seems to be advocating for Huawei and against the Belgian government’s proposed ban of the Chinese company in its 5G networks. Graphika asserted the following:
    • A cluster of inauthentic accounts on Twitter amplified, and sometimes created, articles that attacked the Belgian government’s recent plans to limit the access of “high-risk” suppliers to its 5G network. The plans are reportedly designed to limit the influence of Chinese firms, notably Huawei and ZTE. 
    • The operation appears to have been limited to Twitter, and it did not gain substantial traction: other than a systematic amplification by the real accounts of Huawei executives in Western Europe, its main amplification came from bots with zero followers. 
    • As so often in recent influence operations, the accounts used profile pictures created by artificial intelligence. 
    • There is insufficient forensic evidence to prove conclusively who was running the fake accounts, or who sponsored the operation.
  • One of the dueling groups convened at the United Nations (UN) to address information and communications technologies (ICTs) issues and problems has issued a draft report and related materials. The group backed by the Russian Federation, People’s Republic of China (PRC), and other nations, the Open-Ended Working Group (OEWG), has issued its Zero Draft, which details its discussions, findings, and recommendations. The OEWG is working alongside the United States-led Group of Governmental Experts on Advancing responsible State behaviour in cyberspace in the context of international security, which is expected to finish its work in May 2021. By way of background on the two processes:
    • In a 2018 U.N. press release, it was explained that two resolutions would create groups “aimed at shaping norm-setting guidelines for States to ensure responsible conduct in cyberspace:”
      • the draft resolution “Developments in the field of information and telecommunications in the context of international security” (document A/C.1/73/L.27.Rev.1), tabled by the Russian Federation.  By the text, the Assembly would decide to convene in 2019 an open-ended working group acting on a consensus basis to further develop the rules, norms and principles of responsible behaviour of States.
      • the draft resolution “Advancing Responsible State Behaviour in Cyberspace in the Context of International Security” (document A/C.1/73/L.37), tabled by the United States…[that] would request the Secretary-General, with the assistance of a group of governmental experts to be established in 2019, to continue to study possible cooperative measures to address existing and potential threats in the sphere of information security, including norms, rules and principles of responsible behaviour of States.
      • The U.N. noted that “[s]everal speakers pointed out that language in [the Russian proposal] departed from previous year’s versions and included excerpts from the Group of Governmental Experts reports in a manner that distorted their meaning and transformed the draft resolution.” The U.N. also acknowledged that “some delegates said [the U.S. proposal] called for the establishment of a new group of governmental experts, with the same mandate as the previous ones and the same selectivity in terms of its composition.” The U.N. added that “[m]ore broadly, while some delegates regretted to note that two separate, yet similar draft resolutions were tabled, others highlighted a need for bold, swift action to prevent cyberattacks and malicious online behaviour.”
    • In the 2018 resolution offered by Russia, an OEWG was convened “with a view to making the United Nations negotiation process on security in the use of information and communications technologies more democratic, inclusive and transparent…and to further develop the rules, norms and principles of responsible behaviour of States” from previous UN-sponsored efforts. The OEWG was further tasked with examining “the ways for their implementation; if necessary, to introduce changes to them or elaborate additional rules of behaviour; to study the possibility of establishing regular institutional dialogue with broad participation under the auspices of the United Nations; and to continue to study, with a view to promoting common understandings, existing and potential threats in the sphere of information security and possible cooperative measures to address them and how international law applies to the use of information and communications technologies by States, as well as confidence-building measures and capacity-building and the concepts.” The OEWG is charged with submitting “a report on the results of the study to the General Assembly at its seventy-fifth session, and to provide the possibility of holding, from within voluntary contributions, intersessional consultative meetings with the interested parties, namely business, non-governmental organizations and academia, to share views on the issues within the group’s mandate.”
  • The United States (U.S.) Department of Justice (DOJ) “announced a coordinated international law enforcement action to disrupt a sophisticated form of ransomware known as NetWalker.” The DOJ asserted:
    • NetWalker ransomware has impacted numerous victims, including companies, municipalities, hospitals, law enforcement, emergency services, school districts, colleges, and universities. Attacks have specifically targeted the healthcare sector during the COVID-19 pandemic, taking advantage of the global crisis to extort victims.
    • The NetWalker action includes charges against a Canadian national in relation to NetWalker ransomware attacks in which tens of millions of dollars were allegedly obtained, the seizure of approximately $454,530.19 in cryptocurrency from ransom payments, and the disablement of a dark web hidden resource used to communicate with NetWalker ransomware victims.
    • According to the affidavit, once a victim’s computer network is compromised and data is encrypted, actors that deploy NetWalker deliver a file, or ransom note, to the victim. Using Tor, a computer network designed to facilitate anonymous communication over the internet, the victim is then provided with the amount of ransom demanded and instructions for payment.
    • Actors that deploy NetWalker commonly gain unauthorized access to a victim’s computer network days or weeks prior to the delivery of the ransom note. During this time, they surreptitiously elevate their privileges within the network while spreading the ransomware from workstation to workstation. They then send the ransom note only once they are satisfied that they have sufficiently infiltrated the victim’s network to extort payment, according to the affidavit.
    • According to an indictment unsealed today, Sebastien Vachon-Desjardins of Gatineau, a Canadian national, was charged in the Middle District of Florida. Vachon-Desjardins is alleged to have obtained at least over $27.6 million as a result of the offenses charged in the indictment.
    • The Justice Department further announced that on Jan. 10, law enforcement seized approximately $454,530.19 in cryptocurrency, which was comprised of ransom payments made by victims of three separate NetWalker ransomware attacks.
    • This week, authorities in Bulgaria also seized a dark web hidden resource used by NetWalker ransomware affiliates to provide payment instructions and communicate with victims. Visitors to the resource will now find a seizure banner that notifies them that it has been seized by law enforcement authorities.
  • The European Data Protection Board (EDPB) has issued guidance to European Union (EU) member states that governs transfers of personal data under Directive (EU) 2016/680 (the Law Enforcement Directive aka the LED.) This guidance flows, in significant part, from Schrems II, the case that struck down the adequacy decision on which the United States-EU Privacy Shield relied. The EDPB noted:
    • The LED “lay[s] down the specific rules with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.”
    • The LED determines the grounds allowing the transfer of personal data to a third country or an international organisation in this context. One of the grounds for such transfer is the decision by the European Commission that the third country or international organisation in question ensures an adequate level of protection.
    • As specified by the CJEU, while the level of protection in the third country must be essentially equivalent to that guaranteed in the EU, ‘the means to which that third country has recourse, in this connection, for the purpose of such a level of protection may differ from those employed within the European Union’ but ‘those means must nevertheless prove, in practice, effective’. The adequacy standard therefore does not require mirroring the EU legislation point by point, but establishing the essential core requirements of that legislation.
  • Canada’s federal and provincial privacy officials asserted in a statement “that [Clearview AI] violated federal and provincial privacy laws.” Clearview AI is an American firm that assembled much of its database by scraping photos from public-facing websites, a practice that has left many privacy stakeholders uncomfortable. In a sense these findings are moot, for in summer 2020, shortly after this investigation was launched, Clearview AI announced it would no longer offer its facial recognition technology in Canada. However, a separate federal investigation of whether the Royal Canadian Mounted Police’s use of Clearview AI’s services violated Canadian law is ongoing. The Office of the Privacy Commissioner of Canada, the Commission d’accès à l’information du Québec, the Office of the Information and Privacy Commissioner for British Columbia and the Office of the Information and Privacy Commissioner of Alberta claimed:
    • Clearview AI’s technology allowed law enforcement and commercial organizations to match photographs of unknown people against the company’s databank of more than 3 billion images, including of Canadians and children, for investigation purposes. Commissioners found that this creates the risk of significant harm to individuals, the vast majority of whom have never been and will never be implicated in a crime.
    • The investigation found that Clearview had collected highly sensitive biometric information without the knowledge or consent of individuals. Furthermore, Clearview collected, used and disclosed Canadians’ personal information for inappropriate purposes, which cannot be rendered appropriate via consent.
    • When presented with the investigative findings, Clearview argued that:
      • Canadian privacy laws do not apply to its activities because the company does not have a “real and substantial connection” to Canada;
      • Consent was not required because the information was publicly available;
      • Individuals who placed or permitted their images to be placed on websites that were scraped did not have substantial privacy concerns justifying an infringement of the company’s freedom of expression;
      • Given the significant potential benefit of Clearview’s services to law enforcement and national security and the fact that significant harm is unlikely to occur for individuals, the balancing of privacy rights and Clearview’s business needs favoured the company’s entirely appropriate purposes; and
      • Clearview cannot be held responsible for offering services to law enforcement or any other entity that subsequently makes an error in its assessment of the person being investigated.
    • Commissioners rejected these arguments. They were particularly concerned that the organization did not recognize that the mass collection of biometric information from billions of people, without express consent, violated the reasonable expectation of privacy of individuals and that the company was of the view that its business interests outweighed privacy rights.
    • On the applicability of Canadian laws, they noted that Clearview collected the images of Canadians and actively marketed its services to law enforcement agencies in Canada. The RCMP became a paying customer and a total of 48 accounts were created for law enforcement and other organizations across the country.
    • The investigation also noted the potential risks to individuals whose images were captured and included in Clearview’s biometric database.  These potential harms include the risk of misidentification and exposure to potential data breaches.

Coming Events

  • On 10 February, the House Homeland Security Committee will hold a hearing titled “Homeland Cybersecurity: Assessing Cyber Threats and Building Resilience” with these witnesses:
    • Mr. Chris Krebs, Former Director, Cybersecurity and Infrastructure Security Agency, U.S. Department of Homeland Security
    • Ms. Sue Gordon, Former Principal Deputy Director of National Intelligence, Office of the Director of National Intelligence
    • Mr. Michael Daniel, President & CEO, Cyber Threat Alliance
    • Mr. Dmitri Alperovitch, Executive Chairman, Silverado Policy Accelerator
  • The House Judiciary Committee’s Antitrust, Commercial, and Administrative Law Subcommittee will hold a hearing titled “Justice Restored: Ending Forced Arbitration and Protecting Fundamental Rights.”
  • The Federal Communications Commission’s (FCC) acting Chair Jessica Rosenworcel will hold a virtual Roundtable on the Emergency Broadband Benefit Program on 12 February, “a new program that would enable eligible households to receive a discount on the cost of broadband service and certain connected devices during the COVID-19 pandemic.” The FCC also noted “[i]n the Consolidated Appropriations Act of 2021, Congress appropriated $3.2 billion” for the program.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Ranjat M from Pixabay

EDPB Asks For Input On Guidance Documents and Offers Input on EC Initiative

The EU’s meta-DPA further articulates portions of the GDPR.

The European Data Protection Board (EDPB) recently issued guidance documents agreed upon at its mid-December 2020 plenary meeting. The entity charged with articulating the General Data Protection Regulation (GDPR) published two documents on aspects of the European Union’s data protection regime: one offering case studies of how controllers and processors should handle data breaches, and another on the limits on how far data protection rights may be restricted temporarily under EU or member state law. The EDPB also articulated its view on how the GDPR and the recently enacted payment directive interrelate. And the EDPB shared its view on the EU’s plan to revamp its anti-money laundering laws, which may well inform how the bloc moves forward, especially with respect to the data protection aspects of the issue.

The EDPB issued a draft “Guidelines 01/2021 on Examples regarding Data Breach Notification” for consultation that would ultimately complement the general data breach guidance under the GDPR that the EDPB’s predecessor adopted in 2017 and revised in 2018. The EDPB’s draft guidance provides data breach cases that it thinks are timely and relevant. In particular, the EDPB explained:

  • The GDPR introduces the requirement for a personal data breach to be notified to the competent national supervisory authority (hereinafter “SA”) and, in certain cases, to communicate the breach to the individuals whose personal data have been affected by the breach (Articles 33 and 34).
  • The Article 29 Working Party already produced a general guidance on data breach notification in October 2017, analysing the relevant Sections of the GDPR (Guidelines on Personal data breach notification under Regulation 2016/679, WP 250) (hereinafter “Guidelines WP250”). However, due to its nature and timing, this guideline did not address all practical issues in sufficient detail. Therefore, the need has arisen for a practice-oriented, case-based guidance that utilizes the experiences gained by SAs since the GDPR is applicable.
  • This document is intended to complement the Guidelines WP 250 and it reflects the common experiences of the SAs of the EEA since the GDPR became applicable. Its aim is to help data controllers in deciding how to handle data breaches and what factors to consider during risk assessment.
  • Though the cases presented below are fictitious, they are based on typical cases from the SA’s collective experience with data breach notifications. The analyses offered relate explicitly to the cases under scrutiny, but with the goal to provide assistance for data controllers in assessing their own data breaches. Any modification in the circumstances of the cases described below may result in different or more significant levels of risk, thus requiring different or additional measures. These guidelines structure the cases according to certain categories of breaches (e.g. ransomware attacks). Certain mitigating measures are called for in each case when dealing with a certain category of breaches. These measures are not necessarily repeated in each case analysis belonging to the same category of breaches. For the cases belonging to the same category only the differences are laid out. Therefore, the reader should read all cases relevant to the relevant category of a breach to identify and distinguish all the correct measures to be taken.

The consultation ends on 2 March 2021.
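
For readers who want the Articles 33 and 34 decision logic the case studies apply gathered in one place, here is a minimal sketch; the three-level risk verdict is my simplification of the case-by-case assessments the guidelines walk through.

```python
def breach_obligations(risk_level: str) -> list[str]:
    """Sketch of the GDPR breach-notification decision.

    risk_level is a simplified verdict of the controller's own risk
    assessment: "unlikely", "risk", or "high".
    """
    # Every breach must be documented internally (Article 33(5)).
    obligations = ["document the breach internally (Art. 33(5))"]
    if risk_level in ("risk", "high"):
        # Unless the breach is unlikely to result in a risk, the supervisory
        # authority must be notified within 72 hours (Article 33).
        obligations.append("notify the supervisory authority within 72 hours (Art. 33)")
    if risk_level == "high":
        # A likely high risk also triggers communication to data subjects (Article 34).
        obligations.append("communicate the breach to affected data subjects (Art. 34)")
    return obligations

print(breach_obligations("high"))
```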

The EDPB also issued for consultation “Guidelines 10/2020 on restrictions under Article 23 GDPR,” which aims to elucidate a provision of the GDPR that allows EU member states to restrict some of the data protection rights under certain, limited circumstances. The EDPB explained:

This document seeks to provide guidance as to the application of Article 23 GDPR. These Guidelines provide a thorough analysis of the criteria to apply restrictions, the assessments that need to be observed, how data subjects can exercise their rights once the restriction is lifted and the consequences for infringements of Article 23 GDPR.

The EDPB explained elsewhere in the guidance:

Article 23 GDPR allows under specific conditions, a national or Union legislator to restrict, by way of a legislative measure, the scope of the obligations and rights provided for in Articles 12 to 22 and Article 34, as well as Article 5 GDPR in so far as its provisions correspond to the rights and obligations provided for in Articles 12 to 22, when such a restriction respects the essence of the fundamental rights and freedoms and is a necessary and proportionate measure in a democratic society to safeguard, inter alia, important objectives of general public interest of the Union or of a Member State.

The EDPB summarized the requirements under Article 23:

  • [Article 23] is entitled ‘restrictions’ and it provides that, under Union or Member State law, the application of certain provisions of the Regulation, mainly relating to the rights of the data subjects and controllers’ obligations, may be restricted in the situations therein listed. Restrictions should be seen as exceptions to the general rule of allowing the exercise of rights and observing the obligations enshrined in the GDPR. As such, restrictions should be interpreted narrowly, only be applied in specifically provided circumstances and only when certain conditions are met.
  • Even in exceptional situations, the protection of personal data cannot be restricted in its entirety. It must be upheld in all emergency measures, as per Article 23 GDPR thus contributing to the respect of the overarching values of democracy, rule of law and fundamental rights on which the Union is founded: any measure taken by Member States shall respect the general principles of law, the essence of the fundamental rights and freedoms and shall not be irreversible and data controllers and processors shall continue to comply with data protection rules.
  • In all cases, where Union or Member State law allows restrictions to data subjects’ rights or to the obligations of the controllers (including joint controllers) and processors, it should be noted that the accountability principle, as laid down in Art. 5(2) GDPR, is still applicable. This means that the controller is responsible for, and shall be able to demonstrate to the data subjects his or her compliance with the EU data protection framework, including the principles relating to the processing of their data.
  • When the EU or national legislator lays down restrictions based on Art. 23 GDPR, it shall ensure that it meets the requirements set out in Art. 52(1) of Charter, and in particular conduct a proportionality assessment so that restrictions are limited to what is strictly necessary.

The EDPB issued the second version of the “Guidelines 06/2020 on the interplay of the Second Payment Services Directive and the GDPR” and explained its background and purpose:

The second Payment Services Directive (hereinafter “PSD2”) has introduced a number of novelties in the payment services field. While it creates new opportunities for consumers and enhances transparency in such field, the application of the PSD2 raises certain questions and concerns in respect of the need that the data subjects remain in full control of their personal data. The GDPR applies to the processing of personal data including processing activities carried out in the context of payment services as defined by the PSD2. Thus, controllers acting in the field covered by the PSD2 must always ensure compliance with the requirements of the GDPR, including the principles of data protection set out in Article 5 of the GDPR, as well as the relevant provisions of the ePrivacy Directive. While the PSD2 and the Regulatory Technical Standards for strong customer authentication and common and secure open standards of communication (hereinafter “RTS”) contain certain provisions relating to data protection and security, uncertainty has arisen about the interpretation of these provisions as well as the interplay between the general data protection framework and the PSD2.

The EDPB continued:

  • On July 5 2018, the EDPB issued a letter regarding the PSD2, in which the EDPB provided clarifications on questions concerning the protection of personal data in relation to the PSD2, in particular on the processing of personal data of non-contracting parties (so called ‘silent party data’) by account information service providers (hereinafter “AISPs”) and payment initiation service providers (hereinafter “PISPs”), the procedures with regard to giving and withdrawing consent, the RTS and the cooperation between account servicing payment services providers (hereinafter “ASPSPs”) in relation to security measures. Whereas the preparatory work of these guidelines involved the collection of inputs from stakeholders, both in writing and at a stakeholder event, in order to identify the most pressing challenges.
  • These guidelines aim to provide further guidance on data protection aspects in the context of the PSD2, in particular on the relationship between relevant provisions of the GDPR and the PSD2. The main focus of these guidelines is on the processing of personal data by AISPs and PISPs. As such, this document addresses conditions for granting access to payment account information by ASPSPs and for the processing of personal data by PISPs and AISPs, including the requirements and safeguards in relation to the processing of personal data by PISPs and AISPs for purposes other than the initial purposes for which the data have been collected, especially when they have been collected in the context of the provision of an account information service. This document also addresses different notions of explicit consent under the PSD2 and the GDPR, the processing of ‘silent party data’, the processing of special categories of personal data by PISPs and AISPs, the application of the main data protection principles set forth by the GDPR, including data minimisation, transparency, accountability and security measures. The PSD2 involves cross-functional responsibilities in the fields of, inter alia, consumer protection and competition law. Considerations regarding these fields of law are beyond the scope of these guidelines.

Moreover, the EDPB weighed in on the “Action plan for a comprehensive Union policy on preventing money laundering and terrorism financing,” which the EDPB characterized thusly:

  • According to the Action Plan, the Commission aims to present new legislative proposals in the first quarter of 2021, inter alia, establishing a single rulebook on these topics (i.e. a Regulation or a more detailed revised Directive), ensuring EU level supervision (either by granting new powers to an existing EU Agency or by establishing a new dedicated body), and creating a support and coordination mechanism for Financial Intelligence Units.
  • The applicable anti-money laundering measures include very broad and far-reaching obligations on financial services providers and other obliged entities to identify and know their customers, to monitor transactions undertaken using their services, and to report any suspicious transactions. Furthermore, the legislation stipulates long retention periods. These measures cover the entire European financial services industry, and therefore affect, in a comprehensive manner, all persons using financial services, each time that they use these services.

The EDPB called on the European Commission (EC) to keep data protection in mind when drafting AML legislation:

  • The EDPB, and before it the Article 29 Working Party, has repeatedly noted the privacy and data protection challenges related to these measures in the past. The upcoming update to the legislation is an opportunity to address the interplay between the protection of privacy and personal data and the anti-money laundering measures, as well as their concrete application on the ground.
  • In this context, the EDPB stresses that the intended update to the anti-money laundering framework shall not be undertaken without a review of the relationship between the anti-money laundering measures and the rights to privacy and data protection. In this discussion, relevance and accuracy of the data collected plays a paramount role. The EDPB is indeed convinced that a closer articulation between the two sets of rules would benefit both the protection of personal data and the efficiency of the AML framework. In this respect, the EDPB would like to reiterate the need for a clear legal basis for the processing of personal data and stating the purposes and the limits of such processing, in line with Article 5(1) GDPR, in particular regarding information sharing and international transfers of data, as noted by the EDPS in its opinion on the European Commission’s action plan for a comprehensive Union policy on preventing money laundering and terrorism financing.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by fotografierende from Pexels

EDPB and EDPS Issue Opinions On EC’s Draft SCCs

The EU’s two bloc-wide data protection entities weighed in on the EC’s proposed changes to SCCs, meant to satisfy the Schrems II ruling.

The European Union’s (EU) data protection authorities have rendered their joint opinions on the European Commission’s (EC) draft revisions of the Standard Contractual Clauses (SCC) permissible under the General Data Protection Regulation (GDPR). At present, SCCs are the primary means by which companies transfer the personal data of EU residents to nations without adequacy decisions for processing, especially the United States (U.S.). Since the adequacy decision on the U.S. was struck down, companies have been left only with SCCs, and there are efforts afoot to have the EU’s top court strike down SCCs governing the transfer of personal data to the U.S. on account of what critics call inadequate redress and protection from U.S. surveillance.

Before I turn to the European Data Protection Board (EDPB) and European Data Protection Supervisor’s (EDPS) joint opinions, some background would be helpful. In mid-2020, in a much-anticipated decision, the EU’s top court struck down the adequacy decision underpinning the U.S.-EU Privacy Shield agreement. Under the GDPR, the easiest way for a controller to transfer the personal data of EU residents for processing outside the EU is through such a decision, which essentially says the laws of the other nation provide rights basically equivalent to the EU’s. The U.S. is the EU’s biggest trading partner with respect to these data flows, with companies like Facebook and Google generating billions, maybe even trillions, of dollars in economic activity. Consequently, both Washington and Brussels have many reasons to favor the easiest route to making data flows happen. However, the forerunner to Privacy Shield (i.e., Safe Harbor) was also struck down, largely because of the inadequacy of U.S. privacy rights and mass surveillance; the U.S. made some changes, but these, too, proved inadequate, and litigation brought by Austrian activist and privacy advocate Maximillian Schrems against Facebook finally made its way to the Court of Justice of the European Union (CJEU).

In a summary of its decision Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems, Case C-311/18 (Schrems II), the CJEU explained:

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

Ultimately, the CJEU found the U.S. lacks the requisite safeguards needed under EU law, and so the general means of transferring the data of EU citizens from the EU to the U.S. was essentially struck down. This marked the second time in the last five years such an agreement had been found to violate EU law. However, the CJEU left open the question of whether SCCs may permit the continued exporting of EU personal data into the U.S. for companies like Facebook, Google, and many, many others. Consequently, there has been no small amount of interpreting and questioning of whether this may be a way for the trans-Atlantic data flow to continue. And yet, the CJEU seemed clear that additional measures would likely be necessary. Indeed, the CJEU asserted “[c]ontrollers and processors should be encouraged to provide additional safeguards via contractual commitments that supplement standard protection clauses” and “[i]n so far as those standard data protection clauses cannot, having regard to their very nature, provide guarantees beyond a contractual obligation to ensure compliance with the level of protection required under EU law, they may require, depending on the prevailing position in a particular third country, the adoption of supplementary measures by the controller in order to ensure compliance with that level of protection.”
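
The CJEU’s summary amounts to a decision chain for exporters, which the following minimal sketch tries to capture; the function and its boolean inputs are my simplification, with the article numbers drawn from the GDPR’s transfer chapter.

```python
def transfer_route(adequacy_decision: bool,
                   appropriate_safeguards: bool,
                   derogation_applies: bool) -> str:
    """Sketch of the GDPR Chapter V hierarchy the CJEU summarized."""
    if adequacy_decision:
        return "transfer permitted under an Art. 45 adequacy decision"
    if appropriate_safeguards:
        # Post-Schrems II, SCCs may need supplementary measures to qualify.
        return "transfer permitted under Art. 46 safeguards (e.g., SCCs)"
    if derogation_applies:
        return "transfer permitted only under a narrow Art. 49 derogation"
    return "transfer not permitted"

# The post-Schrems II state of U.S. transfers: no adequacy decision, so
# everything turns on whether SCCs plus supplementary measures hold up.
print(transfer_route(False, True, False))
```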

Thereafter the EC stepped into the breach to shore up SCCs and protect them from the same fate as Privacy Shield, for it seems like only a matter of time before the legality of SCCs is challenged. In mid-November 2020, the EC released for comment a draft revision of SCCs for transfers of personal data to countries outside the EU, with input due by 10 December. The EC had last revised EU law on SCCs in 2010, some years before the GDPR came into force. The EC released draft legislative language and, in an Annex, actual contract language for use by controllers and processors in the form of modules designed to be used in a variety of common circumstances (e.g., transfers by controllers to other controllers or by a controller to a processor.) However, the EC stressed that SCCs form a floor, and controllers, processors, and other parties are free to add language so long as it does not contradict or denigrate the rights protected by SCCs.

In the implementing decision, the EC asserted:

the standard contractual clauses needed to be updated in light of new requirements in Regulation (EU) 2016/679. Moreover, since the adoption of these decisions, important developments have taken place in the digital economy, with the widespread use of new and more complex processing operations often involving multiple data importers and exporters, long and complex processing chains as well as evolving business relationships. This calls for a modernisation of the standard contractual clauses to better reflect those realities, by covering additional processing and transfer situations and to use a more flexible approach, for example with respect to the number of parties able to join the contract.

The EC continued:

The standard contractual clauses set out in the Annex to this Decision may be used by a controller or a processor in order to provide appropriate safeguards within the meaning of Article 46(1) of Regulation (EU) 2016/679 for the transfer of personal data to a processor or a controller established in a third country. This also includes the transfer of personal data by a controller or processor not established in the Union, to the extent that the processing is subject to Regulation (EU) 2016/679 pursuant to Article 3(2) thereof, because it relates to the offering of goods or services to data subjects in the Union or the monitoring of their behaviour as far as their behaviour takes place within the Union.

The EC explained the design and intent of the SCC language in the Annex:

  • The standard contractual clauses set out in the Annex to this Decision combine general clauses with a modular approach to cater for various transfer scenarios and the complexity of modern processing chains. In addition to the general clauses, controllers and processors should select the module applicable to their situation, which makes it possible to tailor their obligations under the standard contractual clauses to their corresponding role and responsibilities in relation to the data processing at issue. It should be possible for more than two parties to adhere to the standard contractual clauses. Moreover, additional controllers and processors should be allowed to accede to the standard contractual clauses as data exporters or importers throughout the life cycle of the contract of which those clauses form a part.
  • These Clauses set out appropriate safeguards, including enforceable data subject rights and effective legal remedies, pursuant to Article 46(1), and Article 46 (2)(c) of Regulation (EU) 2016/679 and, with respect to data transfers from controllers to processors and/or processors to processors, standard contractual clauses pursuant to Article 28(7) of Regulation (EU) 2016/679, provided they are not modified, except to add or update information in the Annexes. This does not prevent the Parties from including the standard contractual clauses laid down in this Clauses in a wider contract, and to add other clauses or additional safeguards provided that they do not contradict, directly or indirectly, the standard contractual clauses or prejudice the fundamental rights or freedoms of data subjects. These Clauses are without prejudice to obligations to which the data exporter is subject by virtue of the Regulation (EU) 2016/679

On the same day the EC released its SCC proposals, the EDPB issued guidance documents, which was surely not coincidental. In “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data,” the EDPB explained the genesis and rationale for the document:

  • The GDPR or the [CJEU] do not define or specify the “additional safeguards”, “additional measures” or “supplementary measures” to the safeguards of the transfer tools listed under Article 46.2 of the GDPR that controllers and processors may adopt to ensure compliance with the level of protection required under EU law in a particular third country.
  • The EDPB has decided, on its own initiative, to examine this question and to provide controllers and processors, acting as exporters, with recommendations on the process they may follow to identify and adopt supplementary measures. These recommendations aim at providing a methodology for the exporters to determine whether and which additional measures would need to be put in place for their transfers. It is the primary responsibility of exporters to ensure that the data transferred is afforded in the third country of a level of protection essentially equivalent to that guaranteed within the EU. With these recommendations, the EDPB seeks to encourage consistent application of the GDPR and the Court’s ruling, pursuant to the EDPB’s mandate

Broadly speaking, whether SCCs and supplemental measures will pass muster under the GDPR will be determined on a case-by-case basis. The EDPB did not offer much in the way of bright-line rules. Indeed, it will be up to supervisory authorities (SAs) to determine whether transfers to nations like the U.S. are possible under the GDPR, meaning these recommendations may shed more light on this central question without deciding it. One wonders, as a practical matter, whether the SAs will have the capacity, resources, and will to police SCCs to ensure the GDPR and the Charter of Fundamental Rights are being honored.

Nonetheless, the EDPB stressed the principle of accountability, under which controllers that export personal data must ensure that, whatever mechanism and supplemental measures govern a data transfer, the data receive the same protection they would in the EU. The EDPB made the point that EU protections travel with the data, and should EU personal data make its way to a country where appropriate protection is not possible, the transfer violates the GDPR. Moreover, these recommendations pertain to both public and private transfers of EU data to private sector entities outside the EU.

These recommendations work like a decision tree, with exporters needing to ask themselves a series of questions to determine whether they must use supplemental measures. This may prove a resource-intensive process, for exporters will need to map all transfers (i.e., know exactly where the data are going). The exporter must also understand the laws and practices of the third nation in order to put appropriate measures in place, if that is possible, to meet the EU’s data protection standards. A simplified sketch of this assessment follows.
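To make the flow concrete, here is a minimal sketch of the roadmap as a decision function. The country lists, the single supplementary measure, and the outcomes are hypothetical stand-ins for legal conclusions; the actual analysis is legal rather than programmatic, and the EDPB’s six steps carry far more nuance than any code can.

```python
# A simplified, illustrative walk through the EDPB roadmap for one mapped
# transfer. Nothing here is an actual adequacy determination.
ADEQUATE = {"Japan", "New Zealand", "Israel"}   # assumed adequacy decisions
PROBLEMATIC_LAW = {"US"}                        # assumed to impinge on the tool

def assess_transfer(destination, tool=None, supplementary_measures=()):
    # Step 1 happens before this function: map the transfer (know the destination).
    # Step 2: an adequacy decision (Article 45) ends the inquiry.
    if destination in ADEQUATE:
        return "OK: rely on the adequacy decision"
    # Otherwise an Article 46 tool (SCCs, BCRs, ...) is required.
    if tool is None:
        return "STOP: no Article 46 transfer tool in place"
    # Step 3: assess whether third-country law or practice undermines the tool.
    if destination not in PROBLEMATIC_LAW:
        return f"OK: {tool} alone suffices"
    # Step 4: supplementary measures (e.g., strong encryption) may close the gap.
    if "encryption" in supplementary_measures:
        return f"OK: {tool} plus supplementary measures; re-evaluate (step 6)"
    return "STOP: suspend the transfer or consult the supervisory authority"

print(assess_transfer("Japan"))
print(assess_transfer("US", tool="SCCs"))
print(assess_transfer("US", tool="SCCs", supplementary_measures=("encryption",)))
```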

Reading between the lines leads one to conclude that data exporters may not send personal data to the U.S., for its federal surveillance regime is not “necessary and proportionate,” at least from the EU’s view. The U.S. lacks judicial redress in the event a U.S. national, let alone a foreign national, objects to the sweeping surveillance. The U.S. also has neither a national data protection law nor a dedicated data protection authority. These hints also seem to convey the EDPB’s view on the sorts of legal reforms needed in the U.S. before an adequacy decision would pass muster with the CJEU.

The EDPB said it was still evaluating how Schrems II affects the use of binding corporate rules (BCR) and ad hoc contractual clauses, two other means of transferring EU personal data in the absence of an adequacy decision.

Nevertheless, in an annex, the EDPB provided examples of supplementary measures that may be used, depending on the circumstances, such as “flawlessly implemented” encryption and pseudonymization of data (a toy illustration of pseudonymization follows below). However, the EDPB discusses these in the context of different scenarios and calls for more conditions than just those two. Moreover, the EDPB categorically rules out two scenarios as inadequate: “Transfer to cloud services providers or other processors which require access to data in the clear” and “Remote access to data for business purposes.”
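To give a flavor of one such measure, here is a toy sketch of pseudonymization before export: direct identifiers are replaced with keyed hashes, and the key stays with the exporter in the EU. The field names and key handling are assumptions for illustration only; the EDPB’s conditions for effective pseudonymization go well beyond this.

```python
# Toy pseudonymization: replace direct identifiers with keyed hashes so the
# importer never sees them, while the exporter (holding the key) could still
# link records. Illustrative only; not a compliance recipe.
import hashlib
import hmac

SECRET_KEY = b"hypothetical-key-held-only-in-the-eu"

def pseudonymize(record, identifier_fields=("name", "email")):
    out = dict(record)
    for field in identifier_fields:
        if field in out:
            digest = hmac.new(SECRET_KEY, out[field].encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]   # stable pseudonym per value
    return out

print(pseudonymize({"name": "Jane Doe", "email": "jane@example.eu", "plan": "basic"}))
```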

The EDPB also issued an update to guidance published after the first lawsuit brought by Maximilian Schrems resulted in the striking down of the Safe Harbor transfer agreement. The forerunner to the EDPB, the Article 29 Working Party, had drafted and released the European Essential Guarantees, and so, in light of Schrems II, the EDPB updated and published “Recommendations 02/2020 on the European Essential Guarantees for surveillance measures” “to provide elements to examine, whether surveillance measures allowing access to personal data by public authorities in a third country, being national security agencies or law enforcement authorities, can be regarded as a justifiable interference or not” with fundamental EU rights and protections. As the EDPB explains, these recommendations are intended to help data controllers and exporters determine whether other nations have protections and processes in place equivalent to those of the EU vis-à-vis their surveillance programs. The EDPB stressed that these are the essential guarantees, and other features and processes may be needed for a determination of lawfulness under EU law.

The EDPB formulated the four European Essential Guarantees:

A. Processing should be based on clear, precise and accessible rules

B. Necessity and proportionality with regard to the legitimate objectives pursued need to be demonstrated

C. An independent oversight mechanism should exist

D. Effective remedies need to be available to the individual

The new joint opinions of the EDPB and EDPS fit into this process because the EC asked the two bodies to weigh in on its drafts, as noted at the beginning of one of the opinions:

On 12 November 2020, the European Commission requested a joint opinion of the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) on the basis of Article 42(1), (2) of Regulation (EU) 2018/1725 (EU DPR) on these two sets of draft standard contractual clauses and the respective implementing acts.

Consequently, the EDPB and EDPS issued two joint opinions, one on each set of draft SCCs.

In Joint Opinion 1/2021, the two bodies explained:

The EDPB and the EDPS are of the opinion that clauses which merely restate the provisions of Article 28(3) and (4) GDPR and Article 29 (3) and (4) EUDPR are inadequate to constitute standard contractual clauses. The Board and EDPS have therefore decided to analyse the document in its entirety, including the appendices. In the opinion of the Board and the EDPS, a contract under Article 28 GDPR or Article 29 EUDPR should further stipulate and clarify how the provisions will be fulfilled. It is in this light that the Draft SCCs submitted to the Board and EDPS for opinion are analysed.

The EDPB and EDPS go on to ask the EC to better clarify the difference between the decision governing transfers between controllers and processors, which are meant to happen only inside the EU, and the decision governing transfers to third countries. They asked for clarity on the scope of the language. The EDPB and EDPS also asked that the EC expand the intra-EU SCC decision to include those nations that have been found adequate (e.g., Israel, Japan, New Zealand, and others).

The EDPB and EDPS did find much to like, however:

  • Adopted standard contractual clauses constitute a set of guarantees to be used as is, as they are intended to protect data subjects and mitigate specific risks associated with the fundamental principles of data protection.
  • The EDPB and the EDPS welcome in general the adoption of standard contractual clauses as a strong accountability tool that facilitates compliance by controllers and processors to their obligations under the GDPR and the EUDPR.
  • The EDPB already issued opinions on standard contractual clauses prepared by the Danish Supervisory Authority and the Slovenian Supervisory Authority.
  • To ensure a coherent approach to personal data protection throughout the Union, the EDPB and the EDPS strongly welcome the envisaged adoption of SCCs having an EU-wide effect by the Commission.
  • The same set of SCCs will indeed apply irrespective of whether this relationship involves private entities, public authorities of the Member States or EU institutions or bodies. These EU-wide SCCs will ensure further harmonisation and legal certainty.
  • The EDPB and the EDPS also welcome the fact that the same set of SCCs should apply in respect of the relationship between controllers and processors subject to GDPR and EUDPR respectively.

In Joint Opinion 2/2021, the EDPB and EDPS stated:

The Draft SCCs combine general clauses with a modular approach to cater for various transfer scenarios. In addition to the general clauses, controllers and processors should select the module applicable to their situation among the four following modules:

  • Module One: transfer controller to controller;
  • Module Two: transfer controller to processor;
  • Module Three: transfer processor to processor;
  • Module Four: transfer processor to controller.
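Read as a simple lookup, the modular design maps the parties’ roles to a module, as in the hypothetical sketch below; the draft decision itself, not the labels used here, governs which module actually applies.

```python
# Hypothetical lookup reflecting the draft SCCs' four modules: the exporter's
# and importer's roles select the module. Illustrative only.
MODULES = {
    ("controller", "controller"): "Module One",
    ("controller", "processor"): "Module Two",
    ("processor", "processor"): "Module Three",
    ("processor", "controller"): "Module Four",
}

def pick_module(exporter_role, importer_role):
    return MODULES[(exporter_role.lower(), importer_role.lower())]

print(pick_module("controller", "processor"))   # -> Module Two
```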

Again, the EDPB and EDPS wanted greater clarity on the language in this decision, especially regarding SCCs governing EU institutions subject not to the GDPR but to Regulation (EU) 2018/1725 (the EUDPR). In general, the EDPB and EDPS offered this comment on the draft SCCs:

The EDPB and the EDPS welcome the introduction of specific modules for each transfer scenarios. However, the EDPB and the EDPS note that it is not clear whether one set of the SCCs can include several modules in practice to address different situations, or whether this should amount to the signing of several sets of the SCCs. In order to achieve maximum readability and easiness in the practical application of the SCCs, the EDPB and the EDPS suggest that the European Commission provides additional guidance (e.g. in the form of flowcharts, publication of Frequently Asked Questions (FAQs), etc.). In particular, it should be made clear that the combination of different modules in a single set of SCCs cannot lead to the blurring of roles and responsibilities among the parties.


Further Reading, Other Developments, and Coming Events (20 and 21 January 2021)

Further Reading

  • Amazon’s Ring Neighbors app exposed users’ precise locations and home addresses” By Zack Whittaker — Tech Crunch. Once again, Amazon’s home security platform has suffered problems, with user data exposed or inadequately protected.
  • Harassment of Chinese dissidents was warning signal on disinformation” By Shawna Chen and Bethany Allen-Ebrahimian — Axios. An example of how malicious online activity can spill into the real world: a number of Chinese dissidents were set upon by protestors.
  • How Social Media’s Obsession with Scale Supercharged Disinformation” By Joan Donovan — Harvard Business Review. Companies like Facebook and Twitter emphasized scale over safety in trying to grow as quickly as possible. This led to a proliferation of fake accounts and proved fertile ground for the seeds of misinformation.
  • The Moderation War Is Coming to Spotify, Substack, and Clubhouse” By Alex Kantrowitz — OneZero. The same issues with objectionable and abusive content plaguing Twitter, Facebook, YouTube and others will almost certainly become an issue for the newer platforms, and in fact already are.
  • Mexican president mounts campaign against social media bans” By Mark Stevenson — The Associated Press. Mexico’s leftist president, Andrés Manuel López Obrador, is vowing to lead international efforts to stop social media companies from censoring what he considers free speech. Whether this materializes into something substantial is not clear.
  • As Trump Clashes With Big Tech, China’s Censored Internet Takes His Side” By Li Yuan — The New York Times. The government in Beijing is framing the banning of former President Donald Trump by social media platforms after the attempted insurrection as proof there is no untrammeled freedom of speech. This position helps bolster the oppressive policing of online content the People’s Republic of China (PRC) wages against its citizens. And quite separately, many Chinese people (or what appear to be actual people) are questioning what is often deemed the censoring of Trump in the United States (U.S.), a nation ostensibly committed to free speech. There is also widespread misunderstanding about the First Amendment rights of social media platforms not to host content with which they disagree and the power of platforms to make such determinations without fear that the U.S. government will punish them, as is often the case in the PRC.
  • Trump admin slams China’s Huawei, halting shipments from Intel, others – sources” By Karen Freifeld and Alexandra Alper — Reuters. On its way out of the proverbial door, the Trump Administration delivered parting shots to Huawei and the People’s Republic of China by revoking one license and denying others to sell the PRC tech giant semiconductors. Whether the Biden Administration will reverse or stand by these actions remains to be seen. The companies, including Intel, could appeal. Additionally, there are an estimated $400 million worth of applications for similar licenses pending at the Department of Commerce that are now the domain of the new regime in Washington. It is too early to discern how the Biden Administration will maintain or modify Trump Administration policy towards the PRC.
  • Behind a Secret Deal Between Google and Facebook” By Daisuke Wakabayashi and Tiffany Hsu — The New York Times. The newspaper got its hands on an unredacted copy of the antitrust suit Texas Attorney General Ken Paxton and other attorneys general filed against Google, and it has details on the deal Facebook and Google allegedly struck to divide the online advertising world. Not only did Facebook ditch an effort launched by publishers to defeat Google’s overwhelming advantages in online advertising bidding, it joined Google’s rival effort with a guarantee that it would win a specified number of bids and more time to bid on ads. Google and Facebook naturally deny any wrongdoing.
  • Biden and Trump Voters Were Exposed to Radically Different Coverage of the Capitol Riot on Facebook” By Colin Lecher and Jon Keegan — The Markup. Using a browser tool the organization pays Facebook users to install, The Markup can track the type of material they see in their feeds. Facebook’s algorithm fed people material about the 6 January 2021 attempted insurrection based on their political views. Many have pointed out that this very dynamic creates filter bubbles that poison democracy and public discourse.
  • Banning Trump won’t fix social media: 10 ideas to rebuild our broken internet – by experts” By Julia Carrie Wong — The Guardian. There are some fascinating proposals in this piece that could help address the problems of social media.
  • Misinformation dropped dramatically the week after Twitter banned Trump and some allies” By Elizabeth Dwoskin and Craig Timberg — The Washington Post. Research showed that lies, misinformation, and disinformation about election fraud dropped by three-quarters after former President Donald Trump was banned from Twitter and other platforms. Other research showed that a small group of conservatives were responsible for up to 20% of misinformation on this and other conspiracies.
  • This Was WhatsApp’s Plan All Along” By Shoshana Wodinsky — Gizmodo. This piece does a great job of breaking down into plain English the proposed changes to terms of service on WhatsApp that so enraged users that competitors Signal and Telegram have seen record-breaking downloads. Basically, it is all about reaping advertising dollars for Facebook through businesses and third-party partners using user data from business-related communications. Incidentally, WhatsApp has delayed changes until March because of the pushback.
  • Brussels eclipsed as EU countries roll out their own tech rules” By Laura Kayali and Mark Scott — Politico EU. The European Union (EU) had a hard enough task in trying to reach final language on a Digital Services Act and Digital Markets Act without nations like France, Germany, Poland, and others picking and choosing text from draft bills and enacting them into law. Brussels is not happy with this trend.

Other Developments

  • Federal Trade Commission (FTC) Chair Joseph J. Simons announced his resignation from the FTC effective on 29 January 2021 in keeping with tradition and past practice. This resignation clears the way for President Joe Biden to name the chair of the FTC, and along with FTC Commissioner Rohit Chopra’s nomination to head the Consumer Financial Protection Bureau (CFPB), the incoming President will get to nominate two Democratic FTC Commissioners, tipping the political balance of the FTC and likely ushering in a period of more regulation of the technology sector.
    • Simons also announced the resignation of senior staff: General Counsel Alden F. Abbott; Bureau of Competition Director Ian Conner; Bureau of Competition Deputy Directors Gail Levine and Daniel Francis; Bureau of Consumer Protection Director Andrew Smith; Bureau of Economics Director Andrew Sweeting; Office of Public Affairs Director Cathy MacFarlane; and Office of Policy Planning Director Bilal Sayyed.
  • In a speech last week before he was sworn in, President Joe Biden announced his $1.9 trillion American Rescue Plan, and according to a summary, Biden will ask Congress to provide $10 billion for a handful of government-facing programs to improve technology. Notably, Biden “is calling on Congress to launch the most ambitious effort ever to modernize and secure federal IT and networks.” Biden is proposing to dramatically increase funding for a fund that would allow agencies to borrow and then pay back funds to update their technology. Moreover, Biden is looking to push more money to a program to aid officials at agencies who oversee technology development and procurement.
    • Biden stated “[t]o remediate the SolarWinds breach and boost U.S. defenses, including of the COVID-19 vaccine process, President-elect Biden is calling on Congress to:
      • Expand and improve the Technology Modernization Fund. A $9 billion investment will help the U.S. launch major new IT and cybersecurity shared services at the Cybersecurity and Infrastructure Security Agency (CISA) and the General Services Administration and complete modernization projects at federal agencies. In addition, the president-elect is calling on Congress to change the fund’s reimbursement structure in order to fund more innovative and impactful projects.
      • Surge cybersecurity technology and engineering expert hiring​. Providing the Information Technology Oversight and Reform fund with $200 million will allow for the rapid hiring of hundreds of experts to support the federal Chief Information Security Officer and U.S. Digital Service.
      • Build shared, secure services to drive transformational projects. Investing $300 million in no-year funding for Technology Transformation Services in the General Services Administration will drive secure IT projects forward without the need of reimbursement from agencies.
      • Improving security monitoring and incident response activities. An additional $690M for CISA will bolster cybersecurity across federal civilian networks, and support the piloting of new shared security and cloud computing services.
  • The United States (U.S.) Department of Commerce issued an interim final rule pursuant to an executive order (EO) issued by former President Donald Trump to secure the U.S. information and communications supply chain. This rule will undoubtedly be reviewed by the Biden Administration and may be withdrawn or modified depending on the fate of the EO on which the rule relies.
    • In the interim final rule, Commerce explained:
      • These regulations create the processes and procedures that the Secretary of Commerce will use to identify, assess, and address certain transactions, including classes of transactions, between U.S. persons and foreign persons that involve information and communications technology or services designed, developed, manufactured, or supplied, by persons owned by, controlled by, or subject to the jurisdiction or direction of a foreign adversary; and pose an undue or unacceptable risk. While this interim final rule will become effective on March 22, 2021, the Department of Commerce continues to welcome public input and is thus seeking additional public comment. Once any additional comments have been evaluated, the Department is committed to issuing a final rule.
      • On November 27, 2019, the Department of Commerce (Department) published a proposed rule to implement the terms of the Executive Order. (84 FR 65316). The proposed rule set forth processes for (1) how the Secretary would evaluate and assess transactions involving ICTS to determine whether they pose an undue risk of sabotage to or subversion of the ICTS supply chain, or an unacceptable risk to the national security of the United States or the security and safety of U.S. persons; (2) how the Secretary would notify parties to transactions under review of the Secretary’s decision regarding the ICTS Transaction, including whether the Secretary would prohibit or mitigate the transaction; and (3) how parties to transactions reviewed by the Secretary could comment on the Secretary’s preliminary decisions. The proposed rule also provided that the Secretary could act without complying with the proposed procedures where required by national security. Finally, the Secretary would establish penalties for violations of mitigation agreements, the regulations, or the Executive Order.
      • In addition to seeking general public comment, the Department requested comments from the public on five specific questions: (1) Whether the Secretary should consider categorical exclusions or whether there are classes of persons whose use of ICTS cannot violate the Executive Order; (2) whether there are categories of uses or of risks that are always capable of being reliably and adequately mitigated; (3) how the Secretary should monitor and enforce any mitigation agreements applied to a transaction; (4) how the terms, “transaction,” “dealing in,” and “use of” should be clarified in the rule; and (5) whether the Department should add record-keeping requirements for information related to transactions.
      • The list of “foreign adversaries” consists of the following foreign governments and non-government persons: The People’s Republic of China, including the Hong Kong Special Administrative Region (China); the Republic of Cuba (Cuba); the Islamic Republic of Iran (Iran); the Democratic People’s Republic of Korea (North Korea); the Russian Federation (Russia); and Venezuelan politician Nicolás Maduro (Maduro Regime).
  • The Federal Trade Commission (FTC) adjusted its penalty amounts for inflation, including a boost to the per-violation penalty that virtually all the privacy bills introduced in the last Congress would allow the agency to wield against first-time violators. The penalty for certain unfair and deceptive acts or practices was increased from $43,280 to $43,792.
  • The United States (U.S.) Department of State stood up its new Bureau of Cyberspace Security and Emerging Technologies (CSET) as it has long planned. At the beginning of the Trump Administration, the Department of State dismantled the Cyber Coordinator Office and gave its cybersecurity portfolio to the Bureau of Economic Affairs, which displeased Congressional stakeholders. In 2019, the department notified Congress of its plan to establish CSET. The department asserted:
    • The need to reorganize and resource America’s cyberspace and emerging technology security diplomacy through the creation of CSET is critical, as the challenges to U.S. national security presented by China, Russia, Iran, North Korea, and other cyber and emerging technology competitors and adversaries have only increased since the Department notified Congress in June 2019 of its intent to create CSET.
    • The CSET bureau will lead U.S. government diplomatic efforts on a wide range of international cyberspace security and emerging technology policy issues that affect U.S. foreign policy and national security, including securing cyberspace and critical technologies, reducing the likelihood of cyber conflict, and prevailing in strategic cyber competition.  The Secretary’s decision to establish CSET will permit the Department to posture itself appropriately and engage as effectively as possible with partners and allies on these pressing national security concerns.
    • The Congressional Members of the Cyberspace Solarium Commission made clear their disapproval of the decision. Senators Angus King (I-ME) and Ben Sasse (R-NE) and Representatives Mike Gallagher (R-WI) and Jim Langevin (D-RI) said:
      • In our report, we emphasize the need for a greater emphasis on international cyber policy at State. However, unlike the bipartisan Cyber Diplomacy Act, the State Department’s proposed Bureau will reinforce existing silos and […] hinder the development of a holistic strategy to promote cyberspace stability on the international stage. We urge President-elect Biden to pause this reorganization when he takes office in two weeks and work with Congress to enact meaningful reform to protect our country in cyberspace.
  • The Australian Cyber Security Centre (ACSC) issued the Risk Identification Guidance “developed to assist organisations in identifying risks associated with their use of suppliers, manufacturers, distributors and retailers (i.e. businesses that constitute their cyber supply chain)” and the Risk Management Guidance because “[c]yber supply chain risk management can be achieved by identifying the cyber supply chain, understanding cyber supply chain risk, setting cyber security expectations, auditing for compliance, and monitoring and improving cyber supply chain security practices.”
  • The United Kingdom’s Surveillance Camera Commissioner (SCC) issued “best practice guidance, ‘Facing the Camera’, to all police forces in England and Wales.” The SCC explained that “[t]he provisions of this document only apply to the use of facial recognition technology and the inherent processing of images by the police where such use is integral to a surveillance camera system being operated in ‘live time’ or ‘near real time’ operational scenarios.” Last summer, a British appeals court overturned a decision that had found a police force’s use of facial recognition technology in a pilot program utilizing live footage to be legal. The appeals court found the use of this technology by the South Wales Police Force a violation of “the right to respect for private life under Article 8 of the European Convention on Human Rights, data protection legislation, and the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010.” The SCC stated:
    • The SCC considers surveillance to be an intrusive investigatory power where it is conducted by the police which impacts upon those fundamental rights and freedoms of people, as set out by the European Convention of Human Rights (ECHR) and the Human Rights Act 1998. In the context of surveillance camera systems which make use of facial recognition technology, the extent of state intrusion in such matters is significantly increased by the capabilities of algorithms which are in essence, integral to the surveillance conduct seeking to harvest information, private information, metadata, data, personal data, intelligence and evidence. Each of the aforementioned are bound by laws and rules which ought to be separately and jointly considered and applied in a manner which is demonstrably lawful and ethical and engenders public trust and confidence.
    • Whenever the police seek to use technology in pursuit of a legitimate aim, the key question arises as to whether the degree of intrusion which is caused to the fundamental freedoms of citizens by the police surveillance conduct using surveillance algorithms (biometric or otherwise) is necessary in a democratic society when considered alongside the legality and proportionality of their endeavours and intent. The type of equipment/technology/modality which they choose to use to that end (e.g. LFR, ANPR, thermal imaging, gait analysis, movement sensors etc), the manner in which such technological means are deployed, (such as using static cameras at various locations, used with body worn cameras or other mobile means), and whether such technology is used overtly alongside or networked with other surveillance technologies, are all factors which may significantly influence the depth of intrusion caused by police conduct upon citizen’s rights.
  • The Senate confirmed the nomination of Avril Haines to be the new Director of National Intelligence by an 89-10 vote after Senator Tom Cotton (R-AR) removed his hold on her nomination. However, Senator Josh Hawley (R-MO) placed a hold on the nomination of Alejandro Mayorkas to be the next Secretary of Homeland Security and explained his action this way:
    • On Day 1 of his administration, President-elect Biden has said he plans to unveil an amnesty plan for 11 million immigrants in this nation illegally. This comes at a time when millions of American citizens remain out of work and a new migrant caravan has been attempting to reach the United States. Mr. Mayorkas has not adequately explained how he will enforce federal law and secure the southern border given President-elect Biden’s promise to roll back major enforcement and security measures. Just today, he declined to say he would enforce the laws Congress has already passed to secure the border wall system. Given this, I cannot consent to skip the standard vetting process and fast-track this nomination when so many questions remain unanswered.
  • Former Trump White House Cyber Coordinator Rob Joyce will replace the National Security Agency’s (NSA) Director of Cybersecurity Anne Neuberger, who has been named the Biden White House’s Deputy National Security Advisor for Cyber and Emerging Technology. Neuberger’s portfolio at the NSA included “lead[ing] NSA’s cybersecurity mission, including emerging technology areas like quantum-resistant cryptography,” and presumably Joyce will assume the same responsibilities. Joyce was purged when former National Security Advisor John Bolton restructured the National Security Council (NSC) in 2018, a move that also forced out Joyce’s boss, former Homeland Security Advisor Tom Bossert. At the NSC, Neuberger will work to coordinate cybersecurity and emerging technology policy across agencies and funnel policy options up to the full NSC and ultimately the President, work that will presumably include coordinating with Joyce.
  • The Supreme Court of the United States (SCOTUS) heard oral arguments on whether the Federal Trade Commission (FTC) Act gives the agency the power to seek monetary damages and restitution alongside permanent injunctions under Section 13(b). In AMG Capital Management, LLC v. FTC, the parties opposing the FTC argue the plain language of the statute does not allow restitution and monetary damages to be sought under this specific section of the FTC Act, while the agency argues long-accepted past practice and Congressional intent do, in fact, allow this relief to be sought when the FTC is seeking to punish violators of Section 5. The FTC is working a separate track to get a fix from Congress, which could rewrite the FTC Act to make clear this sort of relief is legal. However, some stakeholders in the debate over privacy legislation may be using the case as leverage.
    • In October 2020, the FTC wrote the House and Senate committees with jurisdiction over the agency, asking for language to resolve the litigation over the power to seek and obtain restitution for victims of those who have violated Section 5 of the FTC Act and disgorgement of ill-gotten gains. The FTC is also asking that Congress clarify that the agency may act against violators even if their conduct has stopped as it has for more than four decades. Two federal appeals courts have ruled in ways that have limited the FTC’s long used powers, and now the Supreme Court of the United States is set to rule on these issues sometime next year. The FTC is claiming, however, that defendants are playing for time in the hopes that the FTC’s authority to seek and receive monetary penalties will ultimately be limited by the United States (U.S.) highest court. Judging by language tucked into a privacy bill introduced by the former chair of one of the committees, Congress may be willing to act soon.
    • The FTC asked the House Energy and Commerce and Senate Commerce, Science, and Transportation Committees “to take quick action to amend Section 13(b) [of the FTC Act i.e. 15 U.S.C. § 53(b)] to make clear that the Commission can bring actions in federal court under Section 13(b) even if conduct is no longer ongoing or impending when the suit is filed and can obtain monetary relief, including restitution and disgorgement, if successful.” The agency asserted “[w]ithout congressional action, the Commission’s ability to use Section 13(b) to provide refunds to consumer victims and to enjoin illegal activity is severely threatened.” All five FTC Commissioners signed the letter.
    • The FTC explained that adverse rulings by two federal appeals courts are constraining the agency from seeking relief for victims and punishment for violators of the FTC Act in federal courts below those two specific courts, but elsewhere defendants are either asking courts for a similar ruling or using delaying tactics in the hopes the Supreme Court upholds the two federal appeals courts:
      • …[C]ourts of appeals in the Third and Seventh Circuits have recently ruled that the agency cannot obtain any monetary relief under Section 13(b). Although review in the Supreme Court is pending, these lower court decisions are already inhibiting our ability to obtain monetary relief under 13(b). Not only do these decisions already prevent us from obtaining redress for consumers in the circuits where they issued, prospective defendants are routinely invoking them in refusing to settle cases with agreed-upon redress payments.
      • Moreover, defendants in our law enforcement actions pending in other circuits are seeking to expand the rulings to those circuits and taking steps to delay litigation in anticipation of a potential Supreme Court ruling that would allow them to escape liability for any monetary relief caused by their unlawful conduct. This is a significant impediment to the agency’s effectiveness, its ability to provide redress to consumer victims, and its ability to prevent entities who violate the law from profiting from their wrongdoing.
  • The United Kingdom’s Information Commissioner’s Office (ICO) issued guidance for British entities that may be affected by the massive SolarWinds hack that has compromised many key systems in the United States. The ICO advised:
    • Organisations should immediately check whether they are using a version of the software that has been compromised. These are versions 2019.4 HF 5, 2020.2 with no hotfix installed, and 2020.2 HF 1.
    • Organisations must also determine if the personal data they hold has been affected by the cyber-attack. If a reportable personal data breach is found, UK data controllers are required to inform the ICO within 72 hours of discovering the breach. Reports can be submitted online or organisations can call the ICO’s personal data breach helpline for advice on 0303 123 1113, option 2.
    • Organisations subject to the NIS Regulation will also need to determine if this incident has led to a “substantial impact on the provision” of their digital services and report to the ICO.
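As a first triage step, an organisation might compare its installed Orion build against the versions the ICO lists, as in the hypothetical sketch below; real incident response should follow SolarWinds’ and government advisories rather than a simple string match.

```python
# Toy check against the compromised builds the ICO lists: 2019.4 HF 5,
# 2020.2 with no hotfix (represented here as plain "2020.2"), and 2020.2 HF 1.
# Version strings are illustrative; consult vendor advisories for real triage.
COMPROMISED = {"2019.4 HF 5", "2020.2", "2020.2 HF 1"}

def is_compromised(installed_version):
    return installed_version.strip() in COMPROMISED

for version in ("2019.4 HF 5", "2020.2 HF 2"):
    if is_compromised(version):
        print(f"Orion {version}: affected - assess data and reporting duties")
    else:
        print(f"Orion {version}: not on the ICO's list")
```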
  • Europol announced the takedown of “the world’s largest illegal marketplace on the dark web” in an operation coordinated by the following nations: “Germany, Australia, Denmark, Moldova, Ukraine, the United Kingdom (the National Crime Agency), and the USA (DEA, FBI, and IRS).” Europol added:
    • The Central Criminal Investigation Department in the German city of Oldenburg arrested an Australian citizen who is the alleged operator of DarkMarket near the German-Danish border over the weekend. The investigation, which was led by the cybercrime unit of the Koblenz Public Prosecutor’s Office, allowed officers to locate and close the marketplace, switch off the servers and seize the criminal infrastructure – more than 20 servers in Moldova and Ukraine supported by the German Federal Criminal Police office (BKA). The stored data will give investigators new leads to further investigate moderators, sellers, and buyers. 
  • The Enforcement Bureau (Bureau) of the Federal Communications Commission (FCC) issued an enforcement advisory intended to remind people that using amateur and personal radios to commit crimes is itself a criminal offense that could warrant prosecution. The advisory was issued because the FCC claims to be aware of discussions about how these means of communication may be preferable to social media platforms, which have been cracking down on extremist material since the attempted insurrection at the United States Capitol on 6 January. The Bureau stated:
    • The Bureau has become aware of discussions on social media platforms suggesting that certain radio services regulated by the Commission may be an alternative to social media platforms for groups to communicate and coordinate future activities.  The Bureau recognizes that these services can be used for a wide range of permitted purposes, including speech that is protected under the First Amendment of the U.S. Constitution.  Amateur and Personal Radio Services, however, may not be used to commit or facilitate crimes. 
    • Specifically, the Bureau reminds amateur licensees that they are prohibited from transmitting “communications intended to facilitate a criminal act” or “messages encoded for the purpose of obscuring their meaning.” Likewise, individuals operating radios in the Personal Radio Services, a category that includes Citizens Band radios, Family Radio Service walkie-talkies, and General Mobile Radio Service, are prohibited from using those radios “in connection with any activity which is against Federal, State or local law.” Individuals using radios in the Amateur or Personal Radio Services in this manner may be subject to severe penalties, including significant fines, seizure of the offending equipment, and, in some cases, criminal prosecution.
  • The European Data Protection Board (EDPB) issued its “Strategy for 2021-2023” in order “[t]o be effective in confronting the main challenges ahead.” The EDPB cautioned:
    • This Strategy does not provide an exhaustive overview of the work of the EDPB in the years to come. Rather it sets out the four main pillars of our strategic objectives, as well as a set of key actions to help achieve those objectives. The EDPB will implement this Strategy within its Work Program, and will report on the progress achieved in relation to each Pillar as part of its annual reports.
    • The EDPB listed and explained the four pillars of its strategy:
      • PILLAR 1: ADVANCING HARMONISATION AND FACILITATING COMPLIANCE. The EDPB will continue to strive for a maximum degree of consistency in the application of data protection rules and limit fragmentation among Member States. In addition to providing practical, easily understandable and accessible guidance, the EDPB will develop and promote tools that help to implement data protection into practice, taking into account practical experiences of different stakeholders on the ground.
      • PILLAR 2: SUPPORTING EFFECTIVE ENFORCEMENT AND EFFICIENT COOPERATION BETWEEN NATIONAL SUPERVISORY AUTHORITIES. The EDPB is fully committed to support cooperation between all national supervisory authorities that work together to enforce European data protection law. We will streamline internal processes, combine expertise and promote enhanced coordination. We intend not only to ensure a more efficient functioning of the cooperation and consistency mechanisms, but also to strive for the development of a genuine EU-wide enforcement culture among supervisory authorities.
      • PILLAR 3: A FUNDAMENTAL RIGHTS APPROACH TO NEW TECHNOLOGIES. The protection of personal data helps to ensure that technology, new business models and society develop in accordance with our values, such as human dignity, autonomy and liberty. The EDPB will continuously monitor new and emerging technologies and their potential impact on the fundamental rights and daily lives of individuals. Data protection should work for all people, particularly in the face of processing activities presenting the greatest risks to individuals’ rights and freedoms (e.g. to prevent discrimination). We will help to shape Europe’s digital future in line with our common values and rules. We will continue to work with other regulators and policymakers to promote regulatory coherence and enhanced protection for individuals.
      • PILLAR 4: THE GLOBAL DIMENSION. The EDPB is determined to set and promote high EU and global standards for international data transfers to third countries in the private and the public sector, including in the law enforcement sector. We will reinforce our engagement with the international community to promote EU data protection as a global model and to ensure effective protection of personal data beyond EU borders.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) revealed that all but one of the videoconferencing platforms responded to the July 2020 letter it and other data protection authorities (DPAs) sent urging them to “adopt principles to guide them in addressing some key privacy risks.” The ICO explained:
    • Microsoft, Cisco, Zoom and Google replied to the open letter. The joint signatories thank these companies for engaging on this important matter and for acknowledging and responding to the concerns raised. In their responses the companies highlighted various privacy and security best practices, measures, and tools that they advise are implemented or built-in to their video teleconferencing services.
    • The information provided by these companies is encouraging. It is a constructive foundation for further discussion on elements of the responses that the joint signatories feel would benefit from more clarity and additional supporting information.
    • The ICO stated:
      • The joint signatories have not received a response to the open letter from Houseparty. They strongly encourage Houseparty to engage with them and respond to the open letter to address the concerns raised.
  • The European Union Agency for Cybersecurity (ENISA) “launched a public consultation, which runs until 7 February 2021, on its draft of the candidate European Union Cybersecurity Certification Scheme on Cloud Services (EUCS)…[that] aims to further improve the Union’s internal market conditions for cloud services by enhancing and streamlining the services’ cybersecurity guarantees.” ENISA stated:
    • There are challenges to the certification of cloud services, such as a diverse set of market players, complex systems and a constantly evolving landscape of cloud services, as well as the existence of different schemes in Member States. The draft EUCS candidate scheme tackles these challenges by calling for cybersecurity best practices across three levels of assurance and by allowing for a transition from current national schemes in the EU. The draft EUCS candidate scheme is a horizontal and technological scheme that intends to provide cybersecurity assurance throughout the cloud supply chain, and form a sound basis for sectoral schemes.
    • More specifically, the draft EUCS candidate scheme:
      • Is a voluntary scheme;
      • The scheme’s certificates will be applicable across the EU Member States;
      • Is applicable for all kinds of cloud services – from infrastructure to applications;
      • Boosts trust in cloud services by defining a reference set of security requirements;
      • Covers three assurance levels: ‘Basic’, ‘Substantial’ and ‘High’;
      • Proposes a new approach inspired by existing national schemes and international standards;
      • Defines a transition path from national schemes in the EU;
      • Grants a three-year certification that can be renewed;
      • Includes transparency requirements such as the location of data processing and storage.

Coming Events

  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on the nomination of Gina Raimondo to be the Secretary of Commerce on 26 January.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.


UK and EU Defer Decision On Data Flows

The question of whether there will be an adequacy decision allowing the free flow of personal data under the GDPR from the EU to the recently departed UK has been punted. And, its recent status as an EU member notwithstanding, the UK might not get an adequacy decision.

In reaching agreement on many aspects of the United Kingdom’s (UK) exit from the European Union (EU), negotiators did not decide whether the EU would permit the personal data of EU persons to continue flowing to the UK under the easiest means possible. Instead, the EU and UK agreed to let the status quo continue until an adequacy decision is made or six months lapse. Data flows between the UK and EU were valued at more than £100 billion in 2017, according to British estimates, with the majority of this trade moving from the UK to the EU.

Under the General Data Protection Regulation (GDPR), the personal data of EU persons can be transferred to other nations for most purposes once the European Commission (EC) has found the other nation provides protections equivalent to those granted in the EU. Of course, this has been an ongoing issue with data flows to the United States (U.S.), as two agreements (Safe Harbor and Privacy Shield) and their EC adequacy decisions were ruled illegal, in large part because, according to the EU’s highest court, U.S. law does not provide EU persons with the same rights they have in the EU. Most recently, this occurred in 2020 when the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the EU-United States Privacy Shield (aka Schrems II). It bears note that transfers of personal data may occur through other means under the GDPR that may prove more resource intensive: standard data protection clauses (SCC), binding corporate rules (BCR), and others.

Nevertheless, an adequacy decision is seen as the most desirable means of transfer, and the question of whether the UK’s laws are sufficient has lingered over the Brexit discussions, with some claiming that the nation’s membership in the Five Eyes surveillance alliance with the U.S. and others may disqualify it. Given the range of thorny issues the UK and EU punted (e.g., how to handle the border between Northern Ireland and Ireland), it is not surprising that the GDPR and data flows were also punted.

The UK-EU Trade and Cooperation Agreement (TCA) spells out the terms of the data flow agreement and, as noted, in the short term the status quo will continue, with data flows to the UK treated as if the UK were still part of the EU. This state will persist until the EC reaches an adequacy decision or for four months, with another two months of the status quo possible in the absence of an adequacy decision so long as neither the UK nor the EU objects. Moreover, these provisions are operative only so long as the UK keeps its GDPR-compliant data protection law (i.e., the UK Data Protection Act 2018) in place and does not exercise specified “designated powers” without the EU’s agreement. The UK has also deemed EU, European Economic Area (EEA), and European Free Trade Association (EFTA) nations adequate for purposes of data transfers from the UK on a transitional basis.

Specifically, the TCA provides:

For the duration of the specified period, transmission of personal data from the Union to the United Kingdom shall not be considered as transfer to a third country under Union law, provided that the data protection legislation of the United Kingdom on 31 December 2020, as it is saved and incorporated into United Kingdom law by the European Union (Withdrawal) Act 2018 and as modified by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 (“the applicable data protection regime”), applies and provided that the United Kingdom does not exercise the designated powers without the agreement of the Union within the Partnership Council.

The UK also agreed to notify the EU if it “enters into a new instrument which can be relied on to transfer personal data to a third country under Article 46(2)(a) of the UK GDPR or section 75(1)(a) of the UK Data Protection Act 2018 during the specified period.” However, if the EU were to object, it appears from the terms of the TCA that all the EU could do is force the UK “to discuss the relevant object.” And yet, should the UK sign a treaty allowing personal data to flow to a nation the EU deems inadequate, this could obviously adversely affect the UK’s prospects of getting an adequacy decision.

Not surprisingly, the agreement also pertains to the continued flow of personal data as part of criminal investigations and law enforcement matters, but not national security matters. Moreover, these matters fall outside the scope of the GDPR and would be largely unaffected by an adequacy decision or the lack of one. A British government summary states that the TCA

provide[s] for law enforcement and judicial cooperation between the UK, the EU and its Member States in relation to the prevention, investigation, detection and prosecution of criminal offences and the prevention of and fight against money laundering and financing of terrorism.

The text of the TCA makes clear that national security matters vis-à-vis data flows and information sharing are not covered:

This Part only applies to law enforcement and judicial cooperation in criminal matters taking place exclusively between the United Kingdom, on the one side, and the Union and the Member States, on the other side. It does not apply to situations arising between the Member States, or between Member States and Union institutions, bodies, offices and agencies, nor does it apply to the activities of authorities with responsibilities for safeguarding national security when acting in that field.

The TCA also affirms:

  • The cooperation provided for in this Part is based on the Parties’ long-standing commitment to ensuring a high level of protection of personal data.
  • To reflect that high level of protection, the Parties shall ensure that personal data processed under this Part is subject to effective safeguards in the Parties’ respective data protection regimes…

The United Kingdom’s data protection authority (DPA), the Information Commissioner’s Office (ICO), issued an explanation of how British law enforcement entities should act in light of the TCA. Regarding law enforcement-related data transfers to the UK, the ICO explained to British entities:

  • We are now a ‘third country’ for EU data protection purposes. If you receive personal data from a law enforcement partner in the EU, this means the sender will need to comply with the transfer provisions under their national data protection law (which are likely to be similar to those in Part 3 of the DPA 2018).
  • This means the EU sender needs to make sure other appropriate safeguards are in place – probably through a contract or other binding legal instrument, or by making their own assessment of appropriate safeguards. The sender can take into account the protection provided by the DPA 2018 itself when making this assessment.
  • If you receive personal data from other types of organisations in the EU or EEA who are subject to the GDPR, the sender will need to comply with the transfer provisions of the UK GDPR. You may want to consider putting standard contractual clauses (SCCs) in place to ensure adequate safeguards in these cases. We have produced an interactive tool to help you use the SCCs.

For transfers from the UK to the EU (but not to the wider EEA), the ICO explained:

  • There is a transitional adequacy decision in place to cover transfers to EU member states and Gibraltar. This will not extend to EEA countries outside the EU, where you should continue to consider other safeguards.
  • This means you can continue to send personal data from the UK to your law enforcement partners in the EU, as long as you can show the transfer is necessary for law enforcement purposes. You can also transfer personal data to non-law enforcement bodies in the EU if you can meet some additional conditions, but you will need to notify the ICO.

Turning back to an adequacy decision and commercial transfers of personal data from the EU to the UK, in what may well be a preview of a world in which there is no adequacy decision between the UK and EU, the European Data Protection Board (EDPB) issued an “information note” in mid-December that spells out how the GDPR would be applied:

  • In the absence of an adequacy decision applicable to the UK as per Article 45 GDPR, such transfers will require appropriate safeguards (e.g., standard data protection clauses, binding corporate rules, codes of conduct…), as well as enforceable data subject rights and effective legal remedies for data subjects, in accordance with Article 46 GDPR.
  • Subject to specific conditions, it may still be possible to transfer personal data to the UK based on a derogation listed in Article 49 GDPR. However, Article 49 GDPR has an exceptional nature and the derogations it contains must be interpreted restrictively and mainly relate to processing activities that are occasional and non-repetitive.
  • Moreover, where personal data are transferred to the UK on the basis of Article 46 GDPR safeguards, supplementary measures might be necessary to bring the level of protection of the data transferred up to the EU standard of essential equivalence, in accordance with the Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data.

Regarding commercial data transfers, the ICO issued a statement urging British entities to start setting up “alternative transfer mechanisms” to ensure data continues to flow from the EU to UK:

  • The Government has announced that the Treaty agreed with the EU will allow personal data to flow freely from the EU (and EEA) to the UK, until adequacy decisions have been adopted, for no more than six months.
  • This will enable businesses and public bodies across all sectors to continue to freely receive data from the EU (and EEA), including law enforcement agencies.
  • As a sensible precaution, before and during this period, the ICO recommends that businesses work with EU and EEA organisations who transfer personal data to them, to put in place alternative transfer mechanisms, to safeguard against any interruption to the free flow of EU to UK personal data.

However, even though these more restrictive means of transferring personal data to the UK exist, there will likely be legal challenges. It bears noting that, in light of Schrems II, EU DPAs are likely to apply a much higher level of scrutiny to SCCs, and challenges to the legality of using SCCs to transfer personal data to the U.S. have already commenced. It seems all but certain the legality of using SCCs to transfer data to the UK would be challenged as well.

However, returning to the preliminary issue of whether the EC will give the UK an adequacy decision, there may be a number of obstacles to a finding that the UK’s data protection and surveillance laws are indeed adequate under EU law[1]. Firstly, the UK’s surveillance practices may prove difficult for the EC to stomach in light of a recent set of CJEU rulings. In 2020, the CJEU handed down a pair of rulings (here and here) on the extent to which EU nations may engage in bulk, indiscriminate collection of two types of data related to electronic communications. The CJEU found that while EU member nations may conduct these activities to combat crime or national security threats, during periods limited by necessity and subject to oversight, nations may not generally require the providers of electronic communications to store and provide indiscriminate location and traffic data in response to an actual or prospective national security danger. The CJEU combined three cases from the UK, France, and Belgium into two rulings to elucidate the reach of the Privacy and Electronic Communications Directive in relation to foundational EU laws.

The UK is, of course, one of the U.S.’s staunchest allies and partners when it comes to government surveillance of electronic communications. On this point, the CJEU summarized the beginning of the case out of the UK:

  • At the beginning of 2015, the existence of practices for the acquisition and use of bulk communications data by the various security and intelligence agencies of the United Kingdom, namely GCHQ, MI5 and MI6, was made public, including in a report by the Intelligence and Security Committee of Parliament (United Kingdom). On 5 June 2015, Privacy International, a non-governmental organisation, brought an action before the Investigatory Powers Tribunal (United Kingdom) against the Secretary of State for Foreign and Commonwealth Affairs, the Secretary of State for the Home Department and those security and intelligence agencies, challenging the lawfulness of those practices.

Secondly, the government of Prime Minister Boris Johnson may aspire to change data laws in ways the EU does not. In media accounts, unnamed EC officials were critical of the UK’s 2020 “National Data Strategy,” particularly references to “legal barriers (real and perceived)” to accessing data that “must be addressed.”

Thirdly, it may come down to politics. The EU has incentives to make the UK’s exit from the EU difficult in order to dissuade other nations from following the same path. Moreover, having been the second largest economy in the EU as measured by GDP, the UK may prove a formidable economic competitor, lending more weight to the view that the EU may not want to help the UK’s businesses compete with its own.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by succo from Pixabay


[1] European Union Parliament, “The EU-UK relationship beyond Brexit: options for Police Cooperation and Judicial Cooperation in Criminal Matters,” Page 8: Although the UK legal framework is currently broadly in line with the EU legal framework and the UK is a signatory to the European Convention on Human Rights (ECHR), there are substantial questions over whether the Data Protection Act fully incorporates the data protection elements required by the Charter of Fundamental Rights, concerning the use of the national security exemption from the GDPR used by the UK, the retention of data and bulk powers granted to its security services, and over its onward transfer of this data to third country security partners such as the ‘Five Eyes’ partners (Britain, the USA, Australia, New Zealand and Canada).

EU Regulators Settle Dispute Over Proper Punishment of Twitter For Breach

The EDPB uses its GDPR powers to manage a dispute between DPAs.

The European Data Protection Board (EDPB) concluded its first use of powers granted under the General Data Protection Regulation (GDPR) (Regulation (EU) 2016/679 of the European Parliament and of the Council) to resolve a dispute among EU regulators over how to apply the GDPR in punishing a violator. In this case, the EDPB had to referee how Twitter should be punished for a data breach arising from a bug in its Android app. Ireland’s Data Protection Commission (DPC) and unnamed concerned supervisory authorities (CSAs) disagreed about how Twitter should be fined for the GDPR breach, and so a previously unused article of the GDPR was triggered that put the EDPB in charge of resolving the dispute. The EDPB considered the objections raised by other EU agencies and found that the DPC needed to recalculate its proposed fine, which had been set at a maximum of $300,000 out of a possible $69.2 million. Thereafter, the DPC revised its decision, concluding that “an administrative fine of €450,000 on Twitter” is “an effective, proportionate and dissuasive measure.”

The DPC issued a revised decision that incorporates the EDPB’s ruling in the case, which arose from a glitch that changed a person’s protected tweets to unprotected. Twitter users may protect their tweets, meaning only certain people, usually just followers, can see this content. However, as the DPC explained, a bug in Twitter’s Android app thwarted a person’s choice to protect their tweets:

The bug that resulted in this data breach meant that, if a user operating an Android device changed the email address associated with that Twitter account, their tweets became unprotected and consequently were accessible to the wider public without the user’s knowledge.

The DPC said this breach occurred between September 2017 and January 2019, affecting 88,726 EU and European Economic Area (EEA) users, and on 8 January 2019, Twitter alerted the DPC, triggering an investigation. Twitter revealed:

On 26 December 2018, we received a bug report through our bug bounty program that if a Twitter user with a protected account, using Twitter for Android, changed their email address the bug would result in their account being unprotected.
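
The failure mode described here is a classic state-handling bug: an update along one path (changing an email address) silently reset an unrelated privacy setting. Twitter has not published the offending code, so the Python sketch below is entirely hypothetical, with all names invented for the example; it is meant only to illustrate the class of bug at issue.

    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class Account:
        email: str
        protected: bool  # True = tweets visible only to approved followers

    def change_email_buggy(account: Account, new_email: str) -> Account:
        # Hypothetical bug: the update path rebuilds the account record and
        # falls back to a default (unprotected) instead of carrying the
        # user's existing privacy setting forward.
        return Account(email=new_email, protected=False)

    def change_email_fixed(account: Account, new_email: str) -> Account:
        # Correct behavior: change only the field actually being updated.
        return replace(account, email=new_email)

    user = Account(email="old@example.com", protected=True)
    assert change_email_fixed(user, "new@example.com").protected
    assert not change_email_buggy(user, "new@example.com").protected  # silent exposure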

Article 33(1) of the GDPR requires breaches to be reported to a DPA within 72 hours in most cases:

In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay.

However, Twitter conceded it had not reported the breach within the 72-hour window, offering this explanation for the delay:

The severity of the issue – and that it was reportable – was not appreciated until 3 January 2019 at which point Twitter’s incident response process was put into action.
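
Taking the dates in this timeline at face value (awareness of the breach on 3 January 2019, notification to the DPC on 8 January 2019), the Article 33(1) arithmetic is easy to check. The short Python sketch below is purely illustrative, and the helper function is an invention for the example:

    from datetime import datetime, timedelta, timezone

    def notification_deadline(aware_at: datetime) -> datetime:
        # Article 33(1) GDPR: notify the supervisory authority without undue
        # delay and, where feasible, within 72 hours of becoming aware.
        return aware_at + timedelta(hours=72)

    aware = datetime(2019, 1, 3, tzinfo=timezone.utc)     # severity appreciated
    notified = datetime(2019, 1, 8, tzinfo=timezone.utc)  # DPC alerted
    deadline = notification_deadline(aware)
    print(deadline.isoformat())  # 2019-01-06T00:00:00+00:00
    print(notified > deadline)   # True: the notification fell outside the window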

Additionally, Article 33(5) would become relevant during the DPC investigation:

The controller shall document any personal data breaches, comprising the facts relating to the personal data breach, its effects and the remedial action taken. That documentation shall enable the supervisory authority to verify compliance with this Article.

Consequently, Twitter, as the controller, had a responsibility to document all the relevant facts about the data breach and to report the breach within 72 hours of becoming aware of it, subject to a range of exceptions.

Shortly thereafter, the DPC named itself the lead supervisory authority (LSA), investigated, reached a proposed decision in late April, and shared the draft with the other EU supervisory authorities concerned. And this is where the need for the EDPB to step in began.

Irish Data Protection Commissioner Helen Dixon explained the scope of the subsequent investigation:

  1. Whether Twitter International Company (TIC) complied with its obligations, in accordance with Article 33(1) GDPR, to notify the Commission of the Breach without undue delay and, where feasible, not later than 72 hours after having become aware of it; and
  2. Whether TIC complied with its obligation under Article 33(5) to document the Breach.

Dixon found that TIC did not comply with Article 33(1) and found unpersuasive TIC’s main claim that, because Twitter, Inc., its processor under EU law, did not alert TIC in a timely fashion, TIC did not need to meet the 72-hour window. Moreover, Dixon found TIC did not meet its Article 33(5) obligation to document the breach in a way that would allow its compliance with Article 33 to be verified. However, the size of the fine became the issue necessitating the EDPB’s intervention, because the Austrian Supervisory Authority (Österreichische Datenschutzbehörde), the Hamburg Supervisory Authority (Der Hamburgische Beauftragte für Datenschutz und Informationsfreiheit), and the Italian Supervisory Authority (Garante per la protezione dei dati personali) made “relevant and reasoned” objections.

Per the GDPR, the EDPB intervened. Article 60 of the GDPR provides that if a CSA “expresses a relevant and reasoned objection to the draft decision [of the LSA], the lead supervisory authority shall, if it does not follow the relevant and reasoned objection or is of the opinion that the objection is not relevant or reasoned, submit the matter to the consistency mechanism.” Article 65 further provides that where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned,” the EDPB must step in and work towards a final binding decision. This process was designed so that enforcement of the GDPR would be uniform throughout the EU and to forestall the possibility that one DPA or a small group of DPAs would construe the data protection regime in ways contrary to its intent. As it is, there have already been allegations that some DPAs have been ineffective or lenient toward alleged offenders.
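
For readers who prefer the mechanics laid out schematically, the sketch below models the Article 60/65 workflow just described as a simple decision procedure; it is an illustration of the legal process, not any real system:

    def consistency_mechanism(objection_raised: bool, lsa_follows_it: bool) -> str:
        # Illustrative model of the GDPR Article 60/65 dispute workflow.
        if not objection_raised:
            # No relevant and reasoned objection: the LSA's draft stands.
            return "LSA draft decision becomes final"
        if lsa_follows_it:
            # The LSA accepts the objection and revises its draft.
            return "LSA issues a revised draft decision"
        # The LSA rejects or does not follow the objection, so the matter
        # goes to the EDPB for a binding decision, as happened here.
        return "EDPB adopts a binding Article 65 decision"

    print(consistency_mechanism(objection_raised=True, lsa_follows_it=False))
    # -> EDPB adopts a binding Article 65 decision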

In its mid-November statement, the EDPB said it “adopted by a 2/3 majority of its members its first dispute resolution decision on the basis of Art. 65 GDPR.” The EDPB stated:

The Irish supervisory authority (SA) issued the draft decision following an own-volition inquiry and investigations into Twitter International Company, after the company notified the Irish SA of a personal data breach on 8 January 2019. In May 2020, the Irish SA shared its draft decision with the concerned supervisory authorities (CSAs) in accordance with Art. 60 (3) GDPR. The CSAs then had four weeks to submit their relevant and reasoned objections (RROs). Among others, the CSAs issued RROs on the infringements of the GDPR identified by the lead supervisory authority (LSA), the role of Twitter International Company as the (sole) data controller, and the quantification of the proposed fine.

It appears from the EDPB’s statement that other DPAs/SAs had objected to the size of the fine (which can be as high as 2% of annual worldwide revenue), to how Twitter violated the GDPR, and to Twitter’s culpability, namely whether it was the sole controller of the personal data or whether other controllers might also be held responsible.

According to the DPC, the EDPB ultimately decided that

…the [DPC] is required to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC, and to amend its Draft Decision by increasing the level of the fine in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.

Dixon went back and reasoned through the breach and TIC’s compliance. She stressed that the GDPR infringements were largely separate from the substance of the breach itself, which is why the administrative fine was low. Nonetheless, Dixon reexamined the evidence in light of the EDPB’s decision and concluded in relevant part:

  • I therefore consider that the nature of the obligations arising under Article 33(1) and Article 33(5) are such that, compliance is central to the overall functioning of the supervision and enforcement regime performed by supervisory authorities in relation to both the specific issue of personal data breaches but also the identification and assessment of wider issues of non-compliance by controllers. As such, non-compliance with these obligations has serious consequences in that it risks undermining the effective exercise by supervisory authorities of their functions under the GDPR. With regard to the nature of the specific infringements in these circumstances, it is clear, having regard to the foregoing, that in the circumstances of this case, the delayed notification under Article 33(1) inevitably delayed the Commission’s assessment of the Breach. With regard to Article 33(5), the deficiencies in the “documenting” of the Breach by TIC impacted on the Commission’s overall efficient assessment of the Breach, necessitating the raising of multiple queries concerning the facts and sequencing surrounding the notification of the Breach.
  • Accordingly, having regard to the potential for damage to data subjects caused by the delayed notification to the Commission (which I have set out above in the context of Article 83(2)(a)), the corollary of this is that any category of personal data could have been affected by the delayed notification. Whilst, as stated above, there was no direct evidence of damage, at the same time, it cannot be definitively said that there was no damage to data subjects or no affected categories of personal data.

Dixon also recalculated the fine, which she noted was capped at €10 million or 2% of annual worldwide revenue, after once again turning aside TIC’s argument that it was independent of Twitter, Inc. for purposes of determining a fine. Dixon determined the appropriate administrative fine would be about $500,000; Twitter’s worldwide revenue was $3.46 billion in 2019 (meaning a maximum penalty of $69.2 million). Dixon explained:

Having regard to all of the foregoing, and, in particular, having had due regard to all of the factors which I am required to consider under Articles 83(2)(a) to (k), as applicable, and in the interests of effectiveness, proportionality and deterrence, and in light of the re-assessment of the elements I have implemented and documented above in accordance with the EDPB Decision, I have decided to impose an administrative fine of $500,000, which equates (in my estimation for this purpose) to €450,000. In deciding to impose a fine in this amount, I have had regard to the previous range of the fine, set out in the Draft Decision (of $150,000 – $300,000), and to the binding direction in the EDPB Decision, at paragraph 207 thereof, that the level of the fine should be increased “..in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.”
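
The statutory arithmetic Dixon describes can be checked directly. Here is a minimal sketch using only the figures reported above (the 2% cap, $3.46 billion in 2019 revenue, and the $500,000/€450,000 equivalence she estimated):

    # Figures as reported in the DPC's revised decision and above.
    worldwide_revenue_usd = 3.46e9
    cap_pct = 0.02      # upper bound: 2% of annual worldwide revenue
    fine_usd = 500_000  # Dixon's revised fine (~ €450,000)

    max_fine_usd = worldwide_revenue_usd * cap_pct
    print(f"Maximum possible fine: ${max_fine_usd:,.0f}")  # $69,200,000
    print(f"Imposed fine as a share of the cap: {fine_usd / max_fine_usd:.2%}")  # 0.72%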

In its Article 65 decision, the EDPB judged the various objections to the DPC’s proposed decision against Article 4(24) of the GDPR:

‘relevant and reasoned objection’ means an objection to a draft decision as to whether there is an infringement of this Regulation, or whether envisaged action in relation to the controller or processor complies with this Regulation, which clearly demonstrates the significance of the risks posed by the draft decision as regards the fundamental rights and freedoms of data subjects and, where applicable, the free flow of personal data within the Union;

The EDPB ultimately decided “the fine proposed in the Draft Decision is too low and therefore does not fulfil its purpose as a corrective measure, in particular it does not meet the requirements of Article 83(1) GDPR of being effective, dissuasive and proportionate.” The EDPB directed the DPC “to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC so as to ensure it is appropriate to the facts of the case.” However, the EDPB turned aside a number of other objections raised by EU DPAs as failing to meet the standard of review in Article 4(24):

  • the competence of the LSA;
  • the qualification of the roles of TIC and Twitter, Inc., respectively;
  • the infringements of the GDPR identified by the LSA;
  • the existence of possible additional (or alternative) infringements of the GDPR;
  • the lack of a reprimand;

However, the EDPB stressed:

Regarding the objections deemed not to meet the requirements stipulated by Art 4(24) GDPR, the EDPB does not take any position on the merit of any substantial issues raised by these objections. The EDPB reiterates that its current decision is without any prejudice to any assessments the EDPB may be called upon to make in other cases, including with the same parties, taking into account the contents of the relevant draft decision and the objections raised by the CSAs.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by papagnoc from Pixabay

Privacy Shield Hearing

The focus was on how the U.S. and EU can reach agreement on an arrangement that will not be struck down by the EU’s highest court.

Last week, the Senate Commerce, Science, and Transportation Committee held a hearing on the now invalidated European Union (EU)-United States (U.S.) Privacy Shield, a mechanism that allowed companies to transfer the personal data of EU residents to the U.S. The EU’s highest court struck down the adequacy decision that underpinned the system on the basis that U.S. surveillance activities and the lack of redress violated EU law. This is the second time in a decade the EU’s top court has invalidated a transfer arrangement, the first being the Safe Harbor system. Given the estimated billions, or even trillions, of dollars in value realized from data flows between the EU and U.S., there is keen interest on both sides of the Atlantic in finding a legal path forward. However, absent significant curtailment of U.S. surveillance and/or a significant expansion of the means by which EU nationals could have violations of their rights rectified, it would appear a third agreement may not withstand the inevitable legal challenges. Moreover, there are questions as to the legality of other transfer tools in light of the Court of Justice of the European Union’s (CJEU) decision in the case known as Schrems II, and some Standard Contractual Clauses (SCC) and Binding Corporate Rules (BCR) may soon be found in violation of EU law, too.

Consequently, a legislative fix, or some portion of one, could be attached to federal privacy legislation. Hence, the striking down of Privacy Shield may provide additional impetus for Congress and the next Administration to reach a deal on privacy. Moreover, the lapsed reauthorization of some Foreign Intelligence Surveillance Act (FISA) authorities may present another legislative opportunity for the U.S. to craft an approach amenable to the EU in order to obtain either an adequacy decision or a successor agreement to Privacy Shield.

Chair Roger Wicker (R-MS) approached the issue from the perspective of international trade and the economic benefits accruing to businesses on both sides of the Atlantic; his opening remarks dwelled less on the privacy and surveillance aspects of the CJEU’s ruling. Wicker made the case that the EU misunderstands U.S. redress rights, which he considers more than adequate, and that the U.S. surveillance regime is similar to those of some EU nations; one wonders whether the CJEU would agree. Nonetheless, Wicker expressed hope that the EU and U.S. can reach “a durable and lasting data transfer framework…that provides meaningful data protections to consumers, sustains the free flow of information across the Atlantic, and encourages continued economic and strategic partnership with our European allies – a tall order but an essential order.” He worried about the effect of the CJEU’s ruling on SCCs. Wicker noted that the EU and U.S. share democratic values and hinted that the ongoing talks in the committee to reach a federal data privacy law might include augmented redress rights that could satisfy the CJEU.

Ranking Member Maria Cantwell (D-WA) spoke very broadly about a range of issues related to data transfers and privacy. She stressed the importance of data flows in the context of larger trade relations. Cantwell also stressed the shared values between the U.S. and the EU and her hope that the two work “together on these very important national concerns, trade and technology, so that we can continue to improve economic opportunities and avoid moves towards protectionism.” She also called for federal privacy legislation but hinted that states should still be able to regulate privacy, suggesting her commitment to having a federal law serve as a floor for state laws. Cantwell also asserted that bulk surveillance of the sort the National Security Agency (NSA) has engaged in may simply not be legal under EU law.

Deputy Assistant Secretary of Commerce for Services James Sullivan blurred the issues presented by Schrems II much like Cantwell did. The CJEU’s decision that focused on U.S. surveillance practices and the lack of meaningful recourse in the U.S. if an EU resident’s rights were violated was merged into a call for like-minded nations to unite against authoritarian nations. Sullivan distinguished between U.S. surveillance and the surveillance conducted by the People’s Republic of China (without naming the nation) and other regimes as if this should satisfy the EU as to the legality and propriety of U.S. treatment of EU personal data. Sullivan stated:

  • The Schrems II decision has created enormous uncertainties for U.S. companies and the transatlantic economy at a particularly precarious time. Immediately upon issuance of the ruling, the 5,400 Privacy Shield participants and their business partners in the EU could no longer rely on the Framework as a lawful basis for transferring personal data from Europe to the United States. Because neither the Court nor European data protection authorities provided for any enforcement grace period, Privacy Shield companies were left with three choices: (1) risk facing potentially huge fines (of up to 4 percent of total global turnover in the preceding year) for violating GDPR, (2) withdraw from the European market, or (3) switch right away to another more expensive data transfer mechanism.
  • Unfortunately, because of the Court’s ruling in the Privacy Shield context that U.S. laws relating to government access to data do not confer adequate protections for EU personal data, the use of other mechanisms like SCCs and BCRs to transfer EU personal data to the United States is now in question as well.
  • The objective of any potential agreement between the United States and the European Commission to address Schrems II is to restore the continuity of transatlantic data flows and the Framework’s privacy protections by negotiating targeted enhancements to Privacy Shield that address the Court’s concerns in Schrems II. Any such enhancements must respect the U.S. Government’s security responsibilities to our citizens and allies.
  • To be clear, we expect that any enhancements to the Privacy Shield Framework would also cover transfers under all other EU-approved data transfer mechanisms like SCCs and BCRs as well.
  • The Schrems II decision has underscored the need for a broader discussion among likeminded democracies on the issue of government access to data. Especially as a result of the extensive U.S. surveillance reforms since 2015, the United States affords privacy protections relating to national security data access that are equivalent to or greater than those provided by many other democracies in Europe and elsewhere.
  • To minimize future disruptions to data transfers, we have engaged with the European Union and other democratic nations in a multilateral discussion to develop principles based on common practices for addressing how best to reconcile law enforcement and national security needs for data with protection of individual rights.
  • It is our view that democracies should come together to articulate shared principles regarding government access to personal data—to help make clear the distinction between democratic societies that respect civil liberties and the rule of law and authoritarian governments that engage in the unbridled collection of personal data to surveil, manipulate, and control their citizens and other individuals without regard to personal privacy and human rights. Such principles would allow us to work with like-minded partners in preserving and promoting a free and open Internet enabled by the seamless flow of data.

Federal Trade Commission (FTC) Commissioner Noah Joshua Phillips stressed he was speaking in a personal capacity and not for the FTC. He extolled the virtues of the “free and open” internet model in the U.S., with the double implication that it is superior both to the models of nations like the PRC and Russia and to the EU model. Phillips seemed to advocate persuading the EU that the U.S.’s privacy regime and civil liberties protections are stronger than any other nation’s. He also made the case, like other witnesses, that U.S. data privacy and protection regulation is more similar to the EU’s than to that of the PRC, Russia, and others. Phillips also sought to blur the issues and recast Privacy Shield in the context of the global struggle between democracies and authoritarian regimes. Phillips asserted:

  • First, we need to find a path forward after Schrems II, to permit transfers between the U.S. and EU. I want to recognize the efforts of U.S. and EU negotiators to find a replacement for Privacy Shield. While no doubt challenging, I have confidence in the good faith and commitment of public servants like Jim Sullivan, with whom I have the honor of appearing today, and our partners across the Atlantic. I have every hope and expectation that protecting cross-border data flows will be a priority for the incoming Administration, and I ask for your help in ensuring it is.
  • Second, we must actively engage with nations evaluating their approach to digital governance, something we at the FTC have done, to share and promote the benefits of a free and open Internet. There is an active conversation ongoing internationally, and at every opportunity—whether in public forums or via private assistance—we must ensure our voice and view is heard.
  • Third, we should be vocal in our defense of American values and policies. While we as Americans always look to improve our laws—and I commend the members of this committee on their important work on privacy legislation and other critical matters—we do not need to apologize to the world. When it comes to civil liberties or the enforcement of privacy laws, we are second to none. Indeed, in my view, the overall U.S. privacy framework—especially with the additional protections built into Privacy Shield—should certainly qualify as adequate under EU standards.
  • Fourth, as European leaders call to strengthen ties with the U.S., we should prioritize making our regimes compatible for the free flow of data. This extends to the data governance regimes of like-minded countries outside of Europe as well. Different nations will have different rules, but relatively minor differences need not impede mutually-beneficial commerce. We need not and should not purport to aim for a single, identical system of data governance. And we should remind our allies, and remind ourselves, that far more unites liberal democracies than divides us.
  • Fifth and finally, if we must draw lines, those lines should be drawn between allies with shared values—the U.S., Europe, Japan, Australia, and others—and those, like China and Russia, that offer a starkly different vision. I am certainly encouraged when I hear recognition of this distinction from Europe. European Data Protection Supervisor Wojciech Wiewiórowski recently noted that the U.S. is much closer to Europe than is China and that he has a preference for data being processed by countries that share values with Europe. Some here in the U.S. are even proposing agreements to solidify the relationships among technologically advanced democracies, an idea worth exploring in more detail.

Washington University Professor of Law Neil Richards stressed that the Schrems II decision spells out how the U.S. could achieve adequacy: reforming surveillance and providing meaningful redress for alleged privacy violations. Consequently, FISA would need to be rewritten and narrowed, and a means for EU residents to seek relief beyond the current Ombudsman system would be needed, possibly a statutory right to sue. Moreover, he argued that strong data protection and privacy laws are needed and that some of the bills introduced in this Congress could fit the bill. Richards asserted:

In sum, the Schrems litigation is a creature of distrust, and while it has created problems for American law and commerce, it has also created a great opportunity. That opportunity lies before this Committee –the chance to regain American leadership in global privacy and data protection by passing a comprehensive law that provides appropriate safeguards, enforceable rights, and effective legal remedies for consumers. I believe that the way forward can not only safeguard the ability to share personal data across the Atlantic, but it can do so in a way that builds trust between the United States and our European trading partners and between American companies and their American and European customers. I believe that there is a way forward, but it requires us to recognize that strong, clear, trust-building rules are not hostile to business interest, that we need to push past the failed system of “notice and choice,” that we need to preserve effective consumer remedies and state-level regulatory innovation, and seriously consider a duty of loyalty. In that direction, I believe, lies not just consumer protection, but international cooperation and economic prosperity.

Georgia Tech University Professor Peter Swire explained that the current circumstances make the next Congress the best possibility in memory to enact privacy legislation because of the need for a Privacy Shield replacement, passage of the new California Privacy Rights Act (Proposition 24), and the Biden Administration’s likely support for such legislation. Swire made the following points:

  1. The European Data Protection Board in November issued draft guidance with an extremely strict interpretation of how to implement the Schrems II case.
  2. The decision in Schrems II is based on EU constitutional law. There are varying current interpretations in Europe of what is required by Schrems II, but constitutional requirements may restrict the range of options available to EU and U.S. policymakers.
  3. Strict EU rules about data transfers, such as the draft EDPB guidance, would appear to result in strict data localization, creating numerous major issues for EU- and U.S.-based businesses, as well as affecting many online activities of EU individuals.
  4. Along with concerns about lack of individual redress, the CJEU found that the EU Commission had not established that U.S. surveillance was “proportionate” in its scope and operation. Appendix 2 to this testimony seeks to contribute to an informed judgment on proportionality, by cataloguing developments in U.S. surveillance safeguards since the Commission’s issuance of its Privacy Shield decision in 2016.
  5. Negotiating an EU/U.S. adequacy agreement is important in the short term.
  6. A short-run agreement would assist in creating a better overall long-run agreement or agreements.
  7. As the U.S. considers its own possible legal reforms in the aftermath of Schrems II, it is prudent and a normal part of negotiations to seek to understand where the other party – the EU – may have flexibility to reform its own laws.
  8. Issues related to Schrems II have largely been bipartisan in the U.S., with substantial continuity across the Obama and Trump administrations, and expected as well for a Biden administration.
  9. Passing comprehensive privacy legislation would help considerably in EU/U.S. negotiations.
  10. This Congress may have a unique opportunity to enact comprehensive commercial privacy legislation for the United States.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Dooffy Design from Pixabay