Epic Games Asks EC To Investigate Apple

The maker of Fortnite has filed an antitrust complaint against Apple in the EU; a UK tribunal has ruled on Epic Games' action, giving the company a split decision.

Epic Games sought to contest Apple's and Google's decisions to remove its popular multiplayer game, Fortnite, from their application stores last summer after the game maker started offering users the option to pay for in-app purchases outside the tech giants' payment systems. Epic Games has filed suit in the United States (U.S.) and the United Kingdom (UK) against both companies and recently filed a claim with the European Commission (EC), essentially asking that the EC's ongoing investigation of Apple's application store practices be widened to include Epic Games' claims. In Australia, Epic Games has thus far filed suit only against Apple. And yet, at the same time Epic Games was widening the playing field by making its antitrust and anti-competition case before a number of different governments, a UK tribunal ruled against part of Epic Games' suit in that jurisdiction.

First, in a press release, Epic Games announced that it had filed a complaint with the EC, specifically the Directorate-General for Competition, claiming “that through a series of carefully designed anti-competitive restrictions, Apple has not just harmed but completely eliminated competition in app distribution and payment processes.” Epic Games added “Apple uses its control of the iOS ecosystem to benefit itself while blocking competitors and its conduct is an abuse of a dominant position and in breach of EU competition law.”

Epic Games’ complaint may well get consolidated into an existing EC investigation of Apple. In June 2020, the EC announced two antitrust investigations of Apple regarding allegations of unfair and anticompetitive practices with its App Store and Apple Pay. In a press release, the EC announced it “has opened a formal antitrust investigation to assess whether Apple’s conduct in connection with Apple Pay violates EU competition rules…[that] concerns Apple’s terms, conditions and other measures for integrating Apple Pay in merchant apps and websites on iPhones and iPads, Apple’s limitation of access to the Near Field Communication (NFC) functionality (“tap and go”) on iPhones for payments in stores, and alleged refusals of access to Apple Pay.” The EC noted that “[f]ollowing a preliminary investigation, the Commission has concerns that Apple’s terms, conditions, and other measures related to the integration of Apple Pay for the purchase of goods and services on merchant apps and websites on iOS/iPadOS devices may distort competition and reduce choice and innovation.” The EC contended “Apple Pay is the only mobile payment solution that may access the NFC “tap and go” technology embedded on iOS mobile devices for payments in stores.” The EC revealed “[t]he investigation will also focus on alleged restrictions of access to Apple Pay for specific products of rivals on iOS and iPadOS smart mobile devices” and “will investigate the possible impact of Apple’s practices on competition in providing mobile payments solutions.”

In a press release issued the same day, the EC explained it had also “opened formal antitrust investigations to assess whether Apple’s rules for app developers on the distribution of apps via the App Store violate EU competition rules.” The EC said “[t]he investigations concern in particular the mandatory use of Apple’s own proprietary in-app purchase system and restrictions on the ability of developers to inform iPhone and iPad users of alternative cheaper purchasing possibilities outside of apps.” The EC added “[t]he investigations concern the application of these rules to all apps, which compete with Apple’s own apps and services in the European Economic Area (EEA)…[and] [t]he investigations follow-up on separate complaints by Spotify and by an e-book/audiobook distributor on the impact of the App Store rules on competition in music streaming and e-books/audiobooks.”

The EC provided further detail on the scope of its inquiry, stating it “will investigate in particular two restrictions imposed by Apple in its agreements with companies that wish to distribute apps to users of Apple devices:

(i)   The mandatory use of Apple’s own proprietary in-app purchase system “IAP” for the distribution of paid digital content. Apple charges app developers a 30% commission on all subscription fees through IAP.

(ii)  Restrictions on the ability of developers to inform users of alternative purchasing possibilities outside of apps. While Apple allows users to consume content such as music, e-books and audiobooks purchased elsewhere (e.g. on the website of the app developer) also in the app, its rules prevent developers from informing users about such purchasing possibilities, which are usually cheaper.

The EC explained that part of this inquiry originated with allegations leveled by the Swedish music streaming platform Spotify. The EC stated “[o]n 11 March 2019, music streaming provider and competitor of Apple Music, Spotify, filed a complaint about the two rules in Apple’s license agreements with developers and the associated App Store Review Guidelines, and their impact on competition for music streaming services.” Regarding the other part, the EC explained that “[o]n 5 March 2020, an e-book and audiobook distributor, also filed a complaint against Apple, which competes with the complainant through its Apple Books app.” The EC asserted “[t]his complaint raises similar concerns to those under investigation in the Spotify case but with regard to the distribution of e-books and audiobooks.”

As noted in its press release on filing a claim with the EC, Epic Games had previously filed claims against both Apple and Google (and subsidiaries) in the United Kingdom’s Competition Appeal Tribunal (CAT or Tribunal). Epic Games received a split decision from the Tribunal, which found the UK not to be the appropriate forum for the company’s antitrust action against Apple. In the same decision, however, the CAT allowed a narrower set of Epic Games’ claims against Alphabet, Google, and Google’s two Irish subsidiaries to proceed, refusing permission for the remaining claims.

With respect to Epic Games’ claims against Apple, the Tribunal stated

  • The factual and economic evidence, including expert economic evidence, that would have to be given on such issues in the Apple action under UK competition law therefore substantially overlaps with the evidence that will be given in the US proceedings. The additional cost if this action proceeded in the Tribunal in addition to the US proceedings is accordingly, in my view, very significant even allowing for the cost of presenting expert evidence of English law to the US court.
  • I have no doubt that the US Federal court is well able to receive evidence on foreign law. And since the law here is in the same language, and the US is not only a common law country but has a well-developed jurisprudence in antitrust/competition cases, I do not see why a US Federal judge would have difficulties in understanding and applying UK competition law to [Alphabet Inc’s] conduct as regards the UK.
  • In short, in balancing these various factors I consider that the US is an appropriate forum for this dispute. And I am far from persuaded that England (or the UK, since the CAT is a UK tribunal) is clearly or distinctly the more appropriate forum. I reach this conclusion without the need to consider the implications of the exclusive jurisdiction clause in the Apple Developer Program License Agreement (DPLA).

Regarding Epic Games’ claims against Google, the Tribunal found

that England (or the UK) would be clearly or distinctly the appropriate forum for trial of the claims concerning the Restrictive Terms in the Google Play Store Developer Distribution Agreement (DDA) and the removal of Fortnite from the Google Play Store as regards the UK. I have considered whether on that basis England should be regarded as the appropriate forum for all the claims in the action. In my view, that would not here be the right conclusion. The claims as regards the Mobile Application Distribution Agreements (MADA) and the Technical Restrictions are wholly distinct and while there may well be some evidence that applies to all the claims, those two claims raise different and substantial issues and will involve significant additional evidence. Taking that into account, I see no good ground for finding that this is clearly or distinctly the appropriate forum for trial of those claims just because they are based on alleged breach of the same statutory prohibitions and are included in the same claim form.

The Tribunal held:

(a) In the Apple action, the application for permission to serve the proceedings on [Apple Inc.] out of the jurisdiction is refused.

(b) In the Google action, the application for permission to serve the proceedings on [Alphabet Inc] and [Google Inc.] out of the jurisdiction is granted for the claims for breach of the Chapter I and Chapter II prohibitions under the Competition Act 1998 (CA 1998) as regards the alleged “Restrictive Terms” in the DDA and the removal of Fortnite from the Google Play Store, and the injunctions claimed at paras (c), (d) and (h) of the prayer to the Claim Form. Permission is refused as regards the other claims made.

In essence, the Tribunal is sending Epic Games back to the U.S. court in which it has already filed suit against Apple while leaving open the possibility that a narrower lawsuit against Alphabet and Google may move forward. Last fall, a federal court denied Epic Games’ request for a preliminary injunction that would have required Apple to put Fortnite back into the App Store. The judge assigned to the case had signaled this request would likely fail, as Epic Games’ request for a temporary restraining order had also been rejected. A trial is set for May 2021. The United States District Court for the Northern District of California summarized Epic’s motion:

In this motion for preliminary injunction, Epic Games asks the Court to force Apple to reinstate Fortnite to the Apple App Store, despite its acknowledged breach of its licensing agreements and operating guidelines, and to stop Apple from terminating its affiliates’ access to developer tools for other applications, including Unreal Engine, while Epic Games litigates its claims.

The court stated:

Epic Games bears the burden in asking for such extraordinary relief. Given the novelty and the magnitude of the issues, as well as the debate in both the academic community and society at large, the Court is unwilling to tilt the playing field in favor of one party or the other with an early ruling of likelihood of success on the merits. Epic Games has strong arguments regarding Apple’s exclusive distribution through the iOS App Store, and the in-app purchase (“IAP”) system through which Apple takes 30% of certain IAP payments. However, given the limited record, Epic Games has not sufficiently addressed Apple’s counter arguments. The equities, addressed in the temporary restraining order, remain the same.

The court held:

Apple and all persons in active concert or participation with Apple, are preliminarily enjoined from taking adverse action against the Epic Affiliates with respect to restricting, suspending or terminating the Epic Affiliates from the Apple’s Developer Program, on the basis that Epic Games enabled IAP direct processing in Fortnite through means other than the Apple IAP system, or on the basis of the steps Epic Games took to do so. This preliminary injunction shall remain in effect during the pendency of this litigation unless the Epic Affiliates breach: (1) any of their governing agreements with Apple, or (2) the operative App Store guidelines. This preliminary injunction supersedes the prior temporary restraining order. 

In its August 2020 complaint, Epic Games argued that Apple’s practices violate federal and California antitrust and anti-competition laws (a back-of-the-envelope illustration of the fee arithmetic follows the quoted passage). Epic Games asserted:

This case concerns Apple’s use of a series of anti-competitive restraints and monopolistic practices in markets for (i) the distribution of software applications (“apps”) to users of mobile computing devices like smartphones and tablets, and (ii) the processing of consumers’ payments for digital content used within iOS mobile apps (“in-app content”). Apple imposes unreasonable and unlawful restraints to completely monopolize both markets and prevent software developers from reaching the over one billion users of its mobile devices (e.g., iPhone and iPad) unless they go through a single store controlled by Apple, the App Store, where Apple exacts an oppressive 30% tax on the sale of every app. Apple also requires software developers who wish to sell digital in-app content to those consumers to use a single payment processing option offered by Apple, In-App Purchase, which likewise carries a 30% tax.

In contrast, software developers can make their products available to users of an Apple personal computer (e.g., Mac or MacBook) in an open market, through a variety of stores or even through direct downloads from a developer’s website, with a variety of payment options and competitive processing fees that average 3%, a full ten times lower than the exorbitant 30% fees Apple applies to its mobile device in-app purchases.
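
To put the complaint's fee comparison in concrete terms, the back-of-the-envelope calculation below uses a hypothetical $9.99 purchase (the price point is an assumption for illustration, not a figure from the filing) to show what a developer nets under a 30% commission versus the roughly 3% processing fees Epic Games cites:

    # Hypothetical $9.99 sale; the 30% and 3% rates come from Epic Games'
    # complaint, but the price point is an illustrative assumption.
    price = 9.99

    app_store_cut = price * 0.30      # Apple's commission on the sale
    open_market_cut = price * 0.03    # typical payment processing fee

    print(f"Developer nets via the App Store:  ${price - app_store_cut:.2f}")
    print(f"Developer nets via an open market: ${price - open_market_cut:.2f}")
    print(f"Commission-to-fee ratio: {app_store_cut / open_market_cut:.0f}x")

The 10x ratio printed in the last line is simply the complaint's "a full ten times lower" claim restated.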

In its late August denial of Epic Games’ request for a temporary restraining order, the court decided that the plaintiff does not necessarily have an antitrust case strong enough to succeed on the merits, has not demonstrated irreparable harm because the “current predicament appears to be of its own making,” would be unjustly enriched if Fortnite were reinstated to the App Store without having to pay 30% of in-app purchases to Apple, and is not advancing a public interest strong enough to overcome the expectation that private parties will honor their contracts or resolve disputes through normal means.

Epic Games has also filed suit against Apple in Australia. In its November 2020 concise statement of its complaint, Epic Games alleged the following (a rough sketch of the pricing arithmetic appears after the list):

  • Among other things, Apple’s conduct has forced Epic and other app developers to pay Apple monopoly prices (the 30% commission) in connection with all in-app purchases of their in-app content on iOS devices. This has led to harms including increased prices for in-app content by iOS device users in Australia and lost profits for Epic. When Epic introduced Epic Direct Payment, Fortnite users on iOS for the first time had a competitive alternative to Apple’s IAP payment system, which in turn enabled Epic to pass along its cost savings by offering its users a 20% reduction in in-app prices.
  • Apple’s conduct has also denied app developers (such as Epic) and iOS device users their choice of in-app content payment providers and denied app developers and iOS device users the choice of app stores for distribution of apps on iOS devices.
  • Further, Apple’s conduct referred to in paragraph 7 above has harmed Epic through, inter alia, loss of goodwill in respect of both Fortnite, other Epic games on iOS devices, and Epic more broadly. This loss and damage to Epic’s ongoing business and to its reputation and trust with customers is permanent and irreparable.
  • But for Apple’s conduct, like on Apple personal computers, app developers such as Epic would (or would be likely to) distribute its software through other channels. These other channels would cause competition on the basis of (among other things) price, service, and innovation, including by Apple. Epic would also offer users of its software a range of payment processing options. Absent Apple’s conduct, these competing in-app payment processors would cause Apple to compete on the basis of price, service, and innovation. The state of competition should be no different for Apple’s iOS devices.
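
The 20% price cut Epic Games mentions can be sanity-checked with the same illustrative assumptions. Using a hypothetical $9.99 price point (again, not a figure from the concise statement), dropping a 30% commission in favor of roughly 3% payment processing leaves room to charge consumers 20% less while still netting the developer more per sale:

    # Illustrative arithmetic only; the price points are assumptions,
    # while the 20% reduction and 30% commission come from Epic's filings.
    iap_price = 9.99
    direct_price = round(iap_price * 0.80, 2)  # the 20% reduction Epic cites

    net_via_iap = iap_price * 0.70             # after Apple's 30% commission
    net_via_direct = direct_price * 0.97       # after ~3% processing fees

    print(f"Consumer pays ${direct_price:.2f} direct vs ${iap_price:.2f} via IAP")
    print(f"Developer nets ${net_via_direct:.2f} direct vs ${net_via_iap:.2f} via IAP")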

Regarding its suit against Google in the U.S., Epic Games’ action was folded into a number of other suits also alleging Google and its subsidiaries violated antitrust statutes as explained in this status report:

On February 5, 2021, the United States Judicial Panel on Multidistrict Litigation (JPML) issued a Transfer Order (ECF No. 1) creating the centralized action In re Google Play Store Antitrust Litigation, No. 3:21-md-02981-JD (“Play Store MDL”). The Play Store MDL is comprised of (A) one individual action; (B) one consolidated developer class action; (C) one consolidated consumer class action (all three of (A)-(C) have already been pending before this Court); and (D) six new tag along consumer class action cases transferred to this Court from other District Courts.


EDPS Renders Opinions on EC Legislation

EDPS suggests changes to Digital Services Act and Digital Markets Act, the EU’s legislation to remake how the bloc regulates large technology firms.

European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski has issued his opinions on two of the centerpieces of the European Commission’s (EC) digital strategy: the Digital Services Act and the Digital Markets Act. The EDPS is fulfilling his statutory role of reviewing draft legislation and informing the EC of his opinion, especially on any changes needed to ensure the legislation comports with European Union (EU) law. Not surprisingly, the EDPS is calling on the EC to make significant changes, and it is unclear how far the EC will go in redrafting the two major bills to meet the EDPS’ concerns.

In February 2020, the new EC leadership issued a digital strategy, Shaping Europe’s Digital Future, along with two components of this strategy: a “European strategy for data” and a white paper on artificial intelligence. In the digital strategy, the EC explained “[i]n her political guidelines, [new EC] President [Ursula] von der Leyen stressed the need for Europe to lead the transition to a healthy planet and a new digital world.” The EC stated that over the next five years, it “will focus on three key objectives to ensure that digital solutions help Europe to pursue its own way towards a digital transformation that works for the benefit of people through respecting our values…[and] will also put Europe in a position to be a trendsetter in the global debate:

  • Technology that works for people: Development, deployment and uptake of technology that makes a real difference to people’s daily lives. A strong and competitive economy that masters and shapes technology in a way that respects European values.
  • A fair and competitive economy: A frictionless single market, where companies of all sizes and in any sector can compete on equal terms, and can develop, market and use digital technologies, products and services at a scale that boosts their productivity and global competitiveness, and consumers can be confident that their rights are respected.
  • An open, democratic and sustainable society: A trustworthy environment in which citizens are empowered in how they act and interact, and of the data they provide both online and offline. A European way to digital transformation which enhances our democratic values, respects our fundamental rights, and contributes to a sustainable, climate-neutral and resource-efficient economy.

Among the “Key Actions” listed in “Shaping Europe’s Digital Future” was:

  • New and revised rules to deepen the Internal Market for Digital Services, by increasing and harmonising the responsibilities of online platforms and information service providers and reinforce the oversight over platforms’ content policies in the EU. (Q4 2020, as part of the Digital Services Act package).

In December 2020, the EC issued the draft Digital Services Act and Digital Markets Act, stating in its press release that “[t]he two proposals are at the core of the Commission’s ambition to make this Europe’s Digital Decade” (see here for more analysis).

In his press release, the EDPS stated:

  • The EDPS welcomes the proposal for a Digital Services Act that seeks to promote a transparent and safe online environment. In his Opinion, the EDPS recommends additional measures to better protect individuals when it comes to content moderation, online targeted advertising and recommender systems used by online platforms, such as social media and marketplaces.
  • The EDPS highlights that any form of content moderation should take place in accordance with the rule of law. Profiling for the purpose of content moderation should be prohibited unless the online service provider can demonstrate that such measures are strictly necessary to address the systemic risks explicitly identified in the Digital Services Act. Furthermore, the European legislators should consider a ban on online targeted advertising based on pervasive tracking and restrict the categories of data that can be processed for such advertising methods.
  • In his Opinion on the Digital Markets Act, the EDPS welcomes the European Commission’s proposal that seeks to promote fair and open digital markets and the fair processing of personal data by regulating large online platforms acting as gatekeepers.
  • The EDPS highlights the importance of fostering competitive digital markets so that individuals have a bigger choice of online platforms and services that they can use. Giving users better control over their personal data can reinforce contestability in digital markets. Increased interoperability can help to address user lock-in and ultimately create opportunities for services to offer better data protection.
  • To guarantee the successful implementation of the European Commission’s Digital Services Act package, the EDPS calls for a clear legal basis and structure for closer cooperation between the relevant oversight authorities, including data protection authorities, consumer protection authorities and competition authorities.

In its opinion, the EDPS summarized the Digital Markets Act:

The objective of the Proposal is to address at EU level the most salient incidences of unfair practices and weak contestability in relation to so-called “core platform services”. To this end, the Proposal:

  • establishes the conditions under which providers of core platform services should be designated as “gatekeepers” (Chapter II);
  • sets out the practices of gatekeepers that limit contestability and that are unfair, laying down obligations that the designated gatekeepers should comply with, some of which are susceptible to further specification (Chapter III);
  • provides rules for carrying out market investigations (Chapter IV); and
  • contains provisions concerning the implementation and enforcement of this Regulation (Chapter V).

The EDPS made clear its “Opinion contains specific recommendations to help ensure that the Proposal complements the GDPR effectively and increases protection for the fundamental rights and freedoms of the persons concerned and avoids frictions with current data protection rules.” The EDPS summarized its detailed recommendations to the EC on how to improve the Digital Markets Act:

  • In this Opinion the EDPS welcomes the Proposal, as it seeks to promote fair and open markets and the fair processing of personal data. Already in 2014, the EDPS pointed out how competition, consumer protection and data protection law are three inextricably linked policy areas in the context of the online platform economy. The EDPS considers that the relationship between these three areas should be a relationship of complementarity, not a relationship where one area replaces or enters into friction with another.
  • The EDPS highlights in this Opinion those provisions of the Proposal which produce the effect of mutually reinforcing contestability of the market and ultimately also control by the person concerned on her or his personal data. This is the case for instance of Article 5(f), prohibiting the mandatory subscription by the end-users to other core platforms services offered by the gatekeeper; Article 6(1)(b), allowing the end-user to un-install pre-installed software applications on the core platform service; Article 6(1)(e), prohibiting the gatekeeper from restricting the ability of end-users to switch between different software applications and services; and Article 13, laying down the obligation for the gatekeeper to submit to the Commission an independently audited description of any techniques for profiling of consumers that the gatekeeper applies to or across its core platform services.
  • At the same time, the EDPS provides specific recommendations to help ensure that the Proposal complements the GDPR effectively, increasing protection for the fundamental rights and freedoms of the persons concerned, and avoiding frictions with current data protection rules. In this regard, the EDPS recommends in particular specifying in Article 5(a) of the Proposal that the gatekeeper shall provide end-users with a solution of easy and prompt accessibility for consent management; clarifying the scope of the data portability envisaged in Article 6(1)(h) of the Proposal; and rewording Article 6(1)(i) of the Proposal to ensure full consistency with the GDPR; and raising the attention on the need for effective anonymisation and re-identification tests when sharing query, click and view data in relation to free and paid search generated by end users on online search engines of the gatekeeper.
  • Moreover, the EDPS invites the co-legislators to consider introducing minimum interoperability requirements for gatekeepers and to promote the development of technical standards at European level, in accordance with the applicable Union legislation on European standardisation.
  • Finally, building among others on the experience of the Digital Clearinghouse, the EDPS recommends specifying under Article 32(1) that the Digital Markets Advisory Committee shall include representatives of the EDPB, and calls, more broadly, for an institutionalised and structured cooperation between the relevant competent oversight authorities, including data protection authorities. This cooperation should ensure in particular that all relevant information can be exchanged with the relevant authorities so they can fulfil their complementary role, while acting in accordance with their respective institutional mandates.

Regarding the “Digital Services Act,” in his opinion, the EDPS summarized the draft law:

The aim of the Proposal is to ensure the best conditions for the provision of innovative digital services in the internal market, to contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services. To this end, the Proposal:

  • contains provisions on the exemption of liability of providers of intermediary services (Chapter II);
  • sets out “due diligence obligations”, adapted to the type and nature of the intermediary service concerned (Chapter III); and
  • contains provisions concerning the implementation and enforcement of the proposed Regulation (Chapter IV).

The EDPS laid out its position on the legislation and the problems it is designed to solve regarding online content and its harms:

  • The EDPS welcomes the recognition that certain online activities, in particular in the context of online platforms, present increasing risks not only for the fundamental rights of individuals, but for society as a whole. This is all the more evident for providers of very large online platforms and well reflected in the consideration that providers of very large platforms should bear the highest standard of due diligence obligations, proportionate to their societal impact.
  • In his Opinion on online manipulation and personal data, the EDPS identified several risks and harms resulting from how personal data is used to determine the online experience. The Opinion also highlighted how the existing business models behind many online services have contributed to increased political and ideological polarisation, disinformation and manipulation. Similar risks have also been highlighted by the European Data Protection Board (EDPB) in its Guidelines on the targeting of social media users.
  • The EDPS welcomes the recognition that the design of digital services provided by very large online platforms is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. The Proposal also recognises the potential harms resulting from the use of algorithmic systems, in particular as regards their potential for amplifying certain content, including disinformation. The EDPS considers that both phenomena and harms are intrinsically linked to the so-called “attention economy”, with services and applications being designed to maximise attention and engagement in order to gather more data on customers, to better target advertising and increase returns.
  • While the Proposal includes a set of risk mitigation measures, most provisions are designed to promote transparency and accountability, without directly addressing the root cause by way of ex ante rules. To be clear, the EDPS fully supports measures seeking to enhance transparency and accountability of platforms. At the same time, additional measures are warranted to properly address the systemic risks identified above, in particular in relation to very large online platforms. In this regard, the Proposal must be seen also in light of the Proposal for a Digital Markets Act, which includes additional rules to promote fair and open markets and the fair processing of personal data. The present Opinion will also recommend further measures in this regard, including in relation to online advertising.

Again, the EDPS summarized what it liked and did not like in the Digital Services Act (DSA):

  • The EDPS welcomes that the Proposal seeks to complement rather than replace existing protections under Regulation (EU) 2016/679 and Directive 2002/58/EC. That being said, the Proposal will clearly have an impact on processing of personal data. The EDPS considers it necessary to ensure complementarity in the supervision and oversight of online platforms and other providers of hosting services.
  • Certain activities in the context of online platforms present increasing risks not only for the rights of individuals, but for society as a whole. While the Proposal includes a set of risk mitigation measures, additional safeguards are warranted, in particular in relation to content moderation, online advertising and recommender systems.
  • Content moderation should take place in accordance with the rule of law. Given the already endemic monitoring of individuals’ behaviour, particularly in the context of online platforms, the DSA should delineate when efforts to combat “illegal content” legitimise the use of automated means to detect, identify and address illegal content. Profiling for purposes of content moderation should be prohibited unless the provider can demonstrate that such measures are strictly necessary to address the systemic risks explicitly identified by the DSA.
  • Given the multitude of risks associated with online targeted advertising, the EDPS urges the co-legislators to consider additional rules going beyond transparency. Such measures should include a phase-out leading to a prohibition of targeted advertising on the basis of pervasive tracking, as well as restrictions in relation to the categories of data that can be processed for targeting purposes and the categories of data that may be disclosed to advertisers or third parties to enable or facilitate targeted advertising.
  • In accordance with the requirements of data protection by design and by default, recommender systems should by default not be based on profiling. Given their significant impact, the EDPS also recommends additional measures to further promote transparency and user control in relation to recommender systems.
  • More generally, the EDPS recommends introducing minimum interoperability requirements for very large online platforms and to promote the development of technical standards at European level, in accordance with the applicable Union legislation on European standardisation.
  • Having regard to the experience and developments related to the Digital Clearinghouse, the EDPS strongly recommends providing for an explicit and comprehensive legal basis for the cooperation and exchange of relevant information among supervisory authorities, each acting within their respective areas of competence. The Digital Services Act should ensure institutionalised and structured cooperation between the competent oversight authorities, including data protection authorities, consumer protection authorities and competition authorities.


Further Reading, Other Developments, and Coming Events (18 February 2021)

Further Reading

  • “Google, Microsoft, Qualcomm Protest Nvidia’s Acquisition of Arm Ltd.” By David McLaughlin, Ian King, and Dina Bass — Bloomberg. Major United States (U.S.) tech multinationals are telling the U.S. government that Nvidia’s proposed purchase of Arm will hurt competition in the semiconductor market, an interesting position for an industry renowned for being acquisition hungry. The British firm, Arm, is a key player in the semiconductor business that deals with all companies, and the fear articulated by firms like Qualcomm, Microsoft, and Google is that Nvidia will cut supply and increase prices once it controls Arm. According to one report, Arm has made something like 95% of the chip architecture for the world’s smartphones and 95% of the chips made in the People’s Republic of China (PRC). The deal has to clear U.S., British, EU, and PRC regulators. In the U.S., the Federal Trade Commission (FTC) has reportedly made very large document requests, which indicates its interest in digging into the deal and suggests the possibility it may come out against the acquisition. The FTC may also be waiting to read the mood in Washington, as there is renewed, bipartisan concern about antitrust and competition and about the semiconductor industry. Finally, acting FTC Chair Rebecca Kelly Slaughter has come out against a lax approach to so-called vertical mergers such as the proposed Nvidia-Arm deal, which may well be the ultimate position of a Democratic FTC.
  • “Are Private Messaging Apps the Next Misinformation Hot Spot?” By Brian X. Chen and Kevin Roose — The New York Times. The conclusion these two tech writers reach is that, on balance, private messaging apps like Signal and Telegram are better for society than not. Moreover, they reason it is better to have extremists migrate from platforms like Facebook to ones where it is much harder to spread their views and proselytize.
  • “Amazon Has Transformed the Geography of Wealth and Power” By Vauhini Vara — The Atlantic. A harrowing view of the rise of Amazon cast against the decline of the middle class and the middle of the United States (U.S.). Correlation is not causation, of course, but the company has sped the decline of a number of industries and arguably a number of cities.
  • “Zuckerberg responds to Apple’s privacy policies: ‘We need to inflict pain’” By Samuel Axon — Ars Technica. Relations between the companies have worsened as their CEOs have taken personal shots at each other in public and private, culminating in Apple’s change to iOS requiring users to agree to being tracked by apps across the internet, which is Facebook’s bread and butter. Expect things to get worse, as both Tim Cook and Mark Zuckerberg think augmented reality or mixed reality will be the next major frontier in tech, suggesting the competition may intensify.
  • “Inside the Making of Facebook’s Supreme Court” By Kate Klonick — The New Yorker. A very immersive piece on the genesis and design of the Facebook Oversight Board, originally conceived of as a supreme court for content moderation. However, not all content moderation decisions can be referred to the Board; in fact, a person has a right to appeal only when Facebook decides to take down content. Otherwise, one must depend on the company’s beneficence. So, for example, if Facebook decided to leave up content that is racist toward Muslims, a Facebook user could not appeal the decision. Additionally, Board decisions are not precedential, which, in plain English, means that if the Board decides a takedown of, say, Nazi propaganda comports with Facebook’s rules, the company would not be obligated to take down similar Nazi content thereafter. This latter wrinkle will ultimately serve to limit the power of the Board. The piece quotes critics, including many involved with the design and establishment of the Board, who see the final form as little more than a fig leaf for public relations.

Other Developments

  • The Department of Health and Human Services (HHS) was taken to task by a federal appeals court in a blunt opinion decrying the agency’s failure to articulate even the most basic rationale for a multi-million dollar fine of a major Houston hospital for its data security and data privacy violations. HHS’ Office for Civil Rights (OCR) had levied a $4.348 million fine on the University of Texas M.D. Anderson Cancer Center (M.D. Anderson) for violations of the regulations promulgated pursuant to the “Health Insurance Portability and Accountability Act of 1996” (P.L. 104–191) and “Health Information Technology for Economic and Clinical Health Act” (HITECH Act) (P.L. 111-5) governing the security and privacy of certain classes of health information. M.D. Anderson appealed the decision, losing at each stage, until it reached the United States Court of Appeals for the Fifth Circuit (Fifth Circuit). In its ruling, the Fifth Circuit held that OCR’s “decision was arbitrary, capricious, and contrary to law.” The Fifth Circuit vacated the penalty and sent the matter back to HHS for further consideration. (A minimal sketch of the kind of file-level encryption the Encryption Rule contemplates follows this item.)
    • In its opinion, the Fifth Circuit explained the facts:
      • First, back in 2012, an M.D. Anderson faculty member’s laptop was stolen. The laptop was not encrypted or password-protected but contained “electronic protected health information (ePHI) for 29,021 individuals.” Second, also in 2012, an M.D. Anderson trainee lost an unencrypted USB thumb drive during her evening commute. That thumb drive contained ePHI for over 2,000 individuals. Finally, in 2013, a visiting researcher at M.D. Anderson misplaced another unencrypted USB thumb drive, this time containing ePHI for nearly 3,600 individuals.
      • M.D. Anderson disclosed these incidents to HHS. Then HHS determined that M.D. Anderson had violated two federal regulations. HHS promulgated both of those regulations under the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) and the Health Information Technology for Economic and Clinical Health Act of 2009 (the “HITECH Act”). The first regulation requires entities covered by HIPAA and the HITECH Act to “[i]mplement a mechanism to encrypt” ePHI or adopt some other “reasonable and appropriate” method to limit access to patient data. 45 C.F.R. §§ 164.312(a)(2)(iv), 164.306(d) (the “Encryption Rule”). The second regulation prohibits the unpermitted disclosure of protected health information. Id. § 164.502(a) (the “Disclosure Rule”).
      • HHS also determined that M.D. Anderson had “reasonable cause” to know that it had violated the rules. 42 U.S.C. § 1320d-5(a)(1)(B) (setting out the “reasonable cause” culpability standard). So, in a purported exercise of its power under 42 U.S.C. § 1320d-5 (HIPAA’s enforcement provision), HHS assessed daily penalties of $1,348,000 for the Encryption Rule violations, $1,500,000 for the 2012 Disclosure Rule violations, and $1,500,000 for the 2013 Disclosure Rule violations. In total, HHS imposed a civil monetary penalty (“CMP” or “penalty”) of $4,348,000.
      • M.D. Anderson unsuccessfully worked its way through two levels of administrative appeals. Then it petitioned our court for review. See 42 U.S.C. § 1320a-7a(e) (authorizing judicial review). After M.D. Anderson filed its petition, the Government conceded that it could not defend its penalty and asked us to reduce it by a factor of 10 to $450,000.
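
As background on what the Encryption Rule's "mechanism to encrypt" can look like in practice, here is a minimal sketch using Python's cryptography package. It illustrates symmetric at-rest encryption generally; it is not HHS guidance, nor the specific standard at issue in the Fifth Circuit's opinion:

    # Minimal at-rest encryption sketch (illustrative only).
    # Requires: pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in practice, held in a managed key store
    cipher = Fernet(key)

    ephi = b"patient_id,diagnosis\n12345,..."  # stand-in for real ePHI
    token = cipher.encrypt(ephi)  # ciphertext safe to keep on a laptop or drive

    assert cipher.decrypt(token) == ephi  # only the key holder recovers the data

The point of such a mechanism is exactly the scenario in the opinion: a stolen laptop or lost thumb drive holding only ciphertext does not expose readable ePHI.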
  • The Australian Senate Standing Committee for the Scrutiny of Bills has weighed in on both the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 and the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, two major legislative proposals put forth in December 2020. This committee plays a special role in legislating in the Senate, for it must “scrutinise each bill introduced into the Parliament as to whether the bills, by express words or otherwise:
    • (i)  trespass unduly on personal rights and liberties;
    • (ii)  make rights, liberties or obligations unduly dependent upon insufficiently defined administrative powers;
    • (iii)  make rights, liberties or obligations unduly dependent upon non- reviewable decisions;
    • (iv)  inappropriately delegate legislative powers; or
    • (v)  insufficiently subject the exercise of legislative power to parliamentary scrutiny.
    • Regarding the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 (see here for analysis), the committee explained:
      • The bill seeks to amend the Surveillance Devices Act 2004 (SD Act), the Crimes Act 1914 (Crimes Act) and associated legislation to introduce three new types of warrants available to the Australian Federal Police (AFP) and the Australian Criminal Intelligence Commission (ACIC) for investigating and disrupting online crime. These are:
        • data disruption warrants, which enable the AFP and the ACIC to modify, add, copy or delete data for the purposes of frustrating the commission of serious offences online;
        • network activity warrants, which permit access to devices and networks used by suspected criminal networks, and
        • account takeover warrants, which provide the AFP and the ACIC with the ability to take control of a person’s online account for the purposes of gathering evidence to further a criminal investigation.
    • The committee flagged concerns about the bill in these categories:
      • Authorisation of coercive powers
        • Issuing authority
        • Time period for warrants
        • Mandatory considerations
        • Broad scope of offences
      • Use of coercive powers without a warrant
        • Emergency authorisations
      • Innocent third parties
        • Access to third party computers, communications in transit and account-based data
        • Compelling third parties to provide information
        • Broad definition of ‘criminal network of individuals’
      • Use of information obtained through warrant processes
        • Prohibitions on use
        • Storage and destruction of records
      • Presumption of innocence—certificate constitutes prima facie evidence
      • Reversal of evidential burden of proof
      • Broad delegation of administrative powers
        • Appropriate authorising officers of the ACIC
    • The committee asked for the following feedback from the government on the bill:
      • The committee requests the minister’s detailed advice as to:
        • why it is considered necessary and appropriate to enable law enforcement officers to disrupt or access data or takeover an online account without a warrant in certain emergency situations (noting the coercive and intrusive nature of these powers and the ability to seek a warrant via the telephone, fax or email);
        • the appropriateness of retaining information obtained under an emergency authorisation that is subsequently not approved by a judge or AAT member;
        • and the appropriateness of enabling law enforcement agencies to act to conceal any thing done under a warrant after the warrant has ceased to be in force, and whether the bill could be amended to provide a process for obtaining a separate concealment of access warrant if the original warrant has ceased to be in force.
      • The committee requests the minister’s detailed advice as to:
        • the effect of Schedules 1-3 on the privacy rights of third parties and a detailed justification for the intrusion on those rights, in particular:
        • why proposed sections 27KE and 27KP do not specifically require the judge or nominated AAT member to consider the privacy implications for third parties of authorising access to a third party computer or communication in transit;
        • why the requirement that an issuing authority be satisfied that an assistance order is justifiable and proportionate, having regard to the offences to which it would relate, only applies to an assistance order with respect to data disruption warrants, and not to all warrants; and
        • whether the breadth of the definitions of ‘electronically linked group of individuals’ and ‘criminal network of individuals’ can be narrowed to reduce the potential for intrusion on the privacy rights of innocent third parties.
    • The committee requests the minister’s detailed advice as to:
      • whether all of the exceptions to the restrictions on the use, recording or disclosure of protected information obtained under the warrants are appropriate and whether any exceptions are drafted in broader terms than is strictly necessary; and
      • why the bill does not require review of the continued need for the retention of records or reports comprising protected information on a more regular basis than a period of five years.
    • As the explanatory materials do not adequately address these issues, the committee requests the minister’s detailed advice as to:
      • why it is considered necessary and appropriate to provide for evidentiary certificates to be issued in connection with a data disruption warrant or emergency authorisation, a network access warrant, or an account takeover warrant;
      • the circumstances in which it is intended that evidentiary certificates would be issued, including the nature of any relevant proceedings; and
      • the impact that issuing evidentiary certificates may have on individuals’ rights and liberties, including on the ability of individuals to challenge the lawfulness of actions taken by law enforcement agencies.
    • As the explanatory materials do not address this issue, the committee requests the minister’s advice as to why it is proposed to use offence-specific defences (which reverse the evidential burden of proof) in this instance. The committee’s consideration of the appropriateness of a provision which reverses the burden of proof is assisted if it explicitly addresses relevant principles as set out in the Guide to Framing Commonwealth Offences.
    • The committee requests the minister’s advice as to why it is considered necessary to allow for executive level members of staff of the ACIC to be ‘appropriate authorising officers’, in particular with reference to the committee’s scrutiny concerns in relation to the use of coercive powers without judicial authorisation under an emergency authorisation.
    • Regarding the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, the committee asserted the bill “seeks to establish a mandatory code of conduct to support the sustainability of the Australian news media sector by addressing bargaining power imbalances between digital platforms and Australian news businesses.” The committee requested less input on this bill:
      • requests the Treasurer’s advice as to why it is considered necessary and appropriate to leave the determination of which digital platforms must participate in the News Media and Digital Platforms Mandatory Bargaining Code to delegated legislation.
      • If it is considered appropriate to leave this matter to delegated legislation, the committee requests the Treasurer’s advice as to whether the bill can be amended to require the positive approval of each House of the Parliament before determinations made under proposed section 52E come into effect.
  • The European Data Protection Board (EDPB) issued a statement “on new draft provisions of the second additional protocol to the Council of Europe Convention on Cybercrime (Budapest Convention),” the second time it has weighed in on the rewrite of “the first international treaty on crimes committed via the Internet and other computer networks, dealing particularly with infringements of copyright, computer-related fraud, child pornography and violations of network security.” The EDPB took issue with the process of meeting and drafting new provisions:
    • Following up on the publication of new draft provisions of the second additional protocol to the Budapest Convention, the EDPB therefore, once again, wishes to provide an expert and constructive contribution with a view to ensure that data protection considerations are duly taken into account in the overall drafting process of the additional protocol, considering that the meetings dedicated to the preparation of the additional protocol are being held in closed sessions and that the direct involvement of data protection authorities in the drafting process has not been foreseen in the T-CY Terms of Reference.
    • The EDPB offered itself again as a resource and key stakeholder that needs to be involved with the effort:
      • In November 2019, the EDPB also published its latest contribution to the consultation on a draft second additional protocol, indicating that it remained available for further contributions and called for an early and more proactive involvement of data protection authorities in the preparation of these specific provisions, in order to ensure an optimal understanding and consideration of data protections safeguards (emphasis in the original).
    • The EDPB further asserted:
      • The EDPB remains fully aware that situations where judicial and law enforcement authorities are faced with a “cross-border situation” with regards to access to personal data as part of their investigations can be a challenging reality and recognises the legitimate objective of enhancing international cooperation on cybercrime and access to information. In parallel, the EDPB reiterates that the protection of personal data and legal certainty must be guaranteed, thus contributing to the objective of establishing sustainable arrangements for the sharing of personal data with third countries for law enforcement purposes, which are fully compatible with the EU Treaties and the Charter of Fundamental Rights of the EU. The EDPB furthermore considers it essential to frame the preparation of the additional protocol within the framework of the Council of Europe core values and principles, and in particular human rights and the rule of law.
  • The European Commission (EC) published a statement on how artificial intelligence (AI) “can transform Europe’s health sector.” The EC sketched out legislation it hopes to introduce soon on regulating AI in the European Union (EU). The EC asserted:
    • A high-standard health system, rich health data and a strong research and innovation ecosystem are Europe’s key assets that can help transform its health sector and make the EU a global leader in health-related artificial intelligence applications. 
    • The use of artificial intelligence (AI) applications in healthcare is increasing rapidly.
    • Before the COVID-19 pandemic, challenges linked to our ageing populations and shortages of healthcare professionals were already driving up the adoption of AI technologies in healthcare. 
    • The pandemic has only accelerated this trend. Real-time contact tracing apps are just one example of the many AI applications used to monitor the spread of the virus and to reinforce the public health response to it.
    • AI and robotics are also key for the development and manufacturing of new vaccines against COVID-19.
    • The European Commission is currently preparing a comprehensive package of measures to address issues posed by the introduction of AI, including a European legal framework for AI to address fundamental rights and safety risks specific to the AI systems, as well as rules on liability related to new technologies.
  • House Energy and Commerce Committee Chair Frank Pallone, Jr. (D-NJ) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) wrote to Apple CEO Tim Cook “urging review and improvement of Apple’s new App Privacy labels in light of recent reports suggesting they are often misleading or inaccurate.” Pallone and Schakowsky are working from a Washington Post article in which the paper’s tech columnist found that Apple’s purported ratings system to inform consumers about the privacy practices of apps is largely illusory and possibly illegally deceptive (a toy sketch of the label-versus-traffic comparison implied by that reporting follows this item). Pallone and Schakowsky asserted:
    • According to recent reports, App Privacy labels can be highly misleading or blatantly false. Using software that logs data transmitted to trackers, a reporter discovered that approximately one third of evaluated apps that said they did not collect data had inaccurate labels. For example, a travel app labeled as collecting no data was sending identifiers and other data to a massive search engine and social media company, an app-analytics company, and even a Russian Internet company. A ‘slime simulator’ rated for ages 4 and older had a ‘Data Not Collected’ label, even though the app shares identifying information with major tech companies and shared data about the phone’s battery level, storage, general location, and volume level with a video game software development company.
    • Simplifying and enhancing privacy disclosures is a laudable goal, but consumer trust in privacy labeling approaches may be undermined if Apple’s App Privacy labels disseminate false and misleading information. Without meaningful, accurate information, Apple’s tool of illumination and transparency may become a source of consumer confusion and harm. False and misleading privacy labels can dupe privacy-conscious consumers into downloading data-intensive apps, ultimately eroding the credibility and integrity of the labels. A privacy label without credibility and integrity also may dull the competitive forces encouraging app developers to improve their data practices.
    • A privacy label is no protection if it is false. We urge Apple to improve the validity of its App Privacy labels to ensure consumers are provided meaningful information about their apps’ data practices and that consumers are not harmed by these potentially deceptive practices.
    • Pallone and Schakowsky stated “[t]o better understand Apple’s practices with respect to the privacy labels, we request that you provide written response to the following questions by February 23, 2021:
      • 1. Apple has stated that it conducts routine and ongoing audits of the information provided by developers and works with developers to correct any inaccuracies.
        • a. Please detail the process by which Apple audits the privacy information provided by app developers. Please explain how frequently audits are conducted, the criteria by which Apple selects which apps to audit, and the methods for verifying the accuracy of the privacy information provided by apps.
        • b. How many apps have been audited since the implementation of the App Privacy label? Of those, how many were found to have provided inaccurate or misleading information? 
      • 2. Does Apple ensure that App Privacy labels are corrected upon the discovery of inaccuracies or misleading information? If not, why not? For each app that has been found to have provided inaccurate or misleading information, how quickly was that label corrected?
      • 3. Please detail Apple’s enforcement policies when an app fails to provide accurate privacy information for the App Privacy label.
      • 4. Does Apple require more in-depth privacy disclosures and conduct more stringent oversight of apps targeted to children under the age of 13? If not, why not? If so, please describe the additional disclosures required and the oversight actions employed for these apps.
      • 5. Providing clear and easily comprehendible privacy information at the point of sale is certainly valuable, but privacy policies are not static. Does Apple notify users when one of their app’s privacy labels has materially changed? If not, why not? If so, how are users notified of such changes?
  • The United Kingdom’s Department for Digital, Culture, Media & Sport (DCMS) “published its draft rules of the road for governing the future use of digital identities…[and] [i]t is part of plans to make it quicker and easier for people to verify themselves using modern technology and create a process as trusted as using passports or bank statements” according to its press release. The DCMS wants feedback by 11 March 2021 on the draft trust framework. The DCMS stated:
    • Digital identity products allow people to prove who they are, where they live or how old they are. They are set to revolutionise transactions such as buying a house, when people are often required to prove their identity multiple times to a bank, conveyancer or estate agent, and buying age-restricted goods online or in person.
    • The new ‘trust framework’ lays out the draft rules of the road organisations should follow. It includes the principles, policies, procedures and standards governing the use of digital identity to allow for the sharing of information to check people’s identities or personal details, such as a user’s address or age, in a trusted and consistent way. This will enable interoperability and increase public confidence.
    • The framework, once finalised, is expected to be brought into law. It has specific standards and requirements for organisations which provide or use digital identity services including:
      • Having a data management policy which explains how they create, obtain, disclose, protect, and delete data;
      • Following industry standards and best practice for information security and encryption;
      • Telling the user if any changes, for example an update to their address, have been made to their digital identity;
      • Where appropriate, having a detailed account recovery process and notifying users if organisations suspect someone has fraudulently accessed their account or used their digital identity;
      • Following guidance on how to choose secure authenticators for their service.
  • The European Commission (EC) “opened infringement procedures against 24 Member States for failing to enact new EU telecom rules.”
    • The EC asserted:
      • The European Electronic Communications Code modernises the European regulatory framework for electronic communications, to enhance consumers’ choices and rights, for example by ensuring clearer contracts, quality of services, and competitive markets. The Code also ensures higher standards of communication services, including more efficient and accessible emergency communications. Furthermore, it allows operators to benefit from rules incentivising investments in very-high capacity networks, as well as from enhanced regulatory predictability, leading to more innovative digital services and infrastructures.
      • The European Electronic Communications Code that brings the regulatory framework governing the European telecom sector up to date with the new challenges came into force in December 2018, and Member States have had two years to implement its rules. It is a central piece of legislation to achieve Europe’s Gigabit society and ensure full participation of all EU citizens in the digital economy and society.

Coming Events

  • On 18 February, the House Financial Services Committee will hold a hearing titled “Game Stopped? Who Wins and Loses When Short Sellers, Social Media, and Retail Investors Collide” with Reddit Co-Founder and Chief Executive Officer Steve Huffman testifying along with other witnesses.
  • The U.S.-China Economic and Security Review Commission will hold a hearing titled “Deterring PRC Aggression Toward Taiwan” on 18 February.
  • On 24 February, the House Energy and Commerce Committee’s Communications and Technology Subcommittee will hold a hearing titled “Fanning the Flames: Disinformation and Extremism in the Media.”
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Estúdio Bloom on Unsplash

A Revised ePrivacy Regulation Proposed

A key stakeholder has proposed changes to the EC’s four-year-old proposed ePrivacy Regulation, moving matters to the EU Parliament.

The Council of the European Union (Council) has released a long-awaited compromise draft of the ePrivacy Regulation, a rewrite of the European Union’s existing rules on the privacy of electronic communications. This new law is intended to complement the General Data Protection Regulation (GDPR). This is an important but preliminary development, and the Council will now begin negotiations with the European Parliament to arrive at final ePrivacy Regulation language. The European Commission (EC) presented its ePrivacy Regulation proposal in January 2017, but lobbying in Brussels has been fierce, and the last four years have been spent haggling over the final text. As a regulation, the ePrivacy Regulation, like the GDPR, would become law throughout the EU without member states needing to enact implementing legislation, as they must for directives, although the draft leaves nations some leeway to legislate further.

In its press release, the Council asserted:

Today, member states agreed on a negotiating mandate for revised rules on the protection of privacy and confidentiality in the use of electronic communications services. These updated ‘ePrivacy’ rules will define cases in which service providers are allowed to process electronic communications data or have access to data stored on end-users’ devices. Today’s agreement allows the Portuguese presidency to start talks with the European Parliament on the final text (emphasis in the original.)

The Council continued:

An update to the existing ePrivacy directive of 2002 is needed to cater for new technological and market developments, such as the current widespread use of Voice over IP, web-based email and messaging services, and the emergence of new techniques for tracking users’ online behaviour.

The new ePrivacy Regulation would repeal Directive 2002/58/EC (the Directive on privacy and electronic communications) and enact new text to address a number of changes in electronic communications and services since the current regime was enacted. The ePrivacy Regulation was intended to be enacted alongside the GDPR, but this did not come to pass given the competing interests among EU nations.

As for the text itself, a few threshold matters are worth highlighting. First, the ePrivacy Regulation would apply to both natural and legal persons (i.e., actual people in the EU and EU entities such as businesses). Second, the ePrivacy Regulation does not impinge on the national security and defense data processing activities EU member states may undertake. Third, it applies to telecommunications providers and communications platforms. Fourth, the new regime would govern the processing of electronic communications data or the personal data of EU residents in specified circumstances regardless of where the processing occurs (e.g., Google processing EU communications in Egypt) and even if the processor is not established in the EU (e.g., a Taiwanese data broker processing certain communications of EU people or businesses). Fifth, the ePrivacy Regulation sets up a tiered penalty system just like the GDPR’s: a lesser class of violations exposes the violator to a fine of up to €10 million or 2% of the entity’s worldwide turnover, while more serious violations carry liability of up to €20 million or 4% of worldwide turnover. Sixth, the European Data Protection Board (EDPB) would be given the task “to contribute to the consistent application of Chapters I and II and III of this Regulation” (i.e., the operative portions of the ePrivacy regime).
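To make the two penalty tiers concrete, the caps can be computed as a simple maximum. Below is a minimal sketch in Python, assuming the draft follows the GDPR-style “whichever is higher” convention; the helper name and the turnover figure are illustrative, not taken from the draft text.

```python
def epriv_fine_cap(worldwide_turnover_eur: float, serious: bool) -> float:
    """Maximum fine under the draft's two-tier scheme (illustrative;
    assumes the GDPR-style 'whichever is higher' convention)."""
    fixed_cap, pct = (20_000_000, 0.04) if serious else (10_000_000, 0.02)
    return max(fixed_cap, pct * worldwide_turnover_eur)

# For a firm with EUR 2 billion in worldwide turnover:
print(epriv_fine_cap(2_000_000_000, serious=False))  # 40000000.0 (2% exceeds EUR 10M)
print(epriv_fine_cap(2_000_000_000, serious=True))   # 80000000.0 (4% exceeds EUR 20M)
```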

In terms of the policy backdrop, the ePrivacy Regulation makes clear:

  • Article 7 of the Charter of Fundamental Rights of the European Union (“the Charter”) protects the fundamental right of everyone to the respect for private and family life, home and communications. Respect for the confidentiality of one’s communications is an essential dimension of this right, applying both to natural and legal persons. Confidentiality of electronic communications ensures that information exchanged between parties and the external elements of such communication, including when the information has been sent, from where, to whom, is not to be revealed to anyone other than to the parties involved in a communication. The principle of confidentiality should apply to current and future means of communication, including calls, internet access, instant messaging applications, e-mail, internet phone calls and personal messaging provided through social media.
  • The content of electronic communications may reveal highly sensitive information about the natural persons involved in the communication, from personal experiences and emotions to medical conditions, sexual preferences and political views, the disclosure of which could result in personal and social harm, economic loss or embarrassment. Similarly, metadata derived from electronic communications may also reveal very sensitive and personal information. These metadata includes the numbers called, the websites visited, geographical location, the time, date and duration when an individual made a call etc., allowing precise conclusions to be drawn regarding the private lives of the persons involved in the electronic communication, such as their social relationships, their habits and activities of everyday life, their interests, tastes etc.

The Council intends the ePrivacy Regulation to work in concert with the GDPR, specifying that where the former is silent on an issue, the latter shall control:

Regulation (EU) 2016/679 regulates the protection of personal data. This Regulation protects in addition the respect for private life and communications. The provisions of this Regulation particularise and complement the general rules on the protection of personal data laid down in Regulation (EU) 2016/679. This Regulation therefore does not lower the level of protection enjoyed by natural persons under Regulation (EU) 2016/679. The provisions particularise Regulation (EU) 2016/679 as regards personal data by translating its principles into specific rules. If no specific rules are established in this Regulation, Regulation (EU) 2016/679 should apply to any processing of data that qualify as personal data. The provisions complement Regulation (EU) 2016/679 by setting forth rules regarding subject matters that are not within the scope of Regulation (EU) 2016/679, such as the protection of the rights of end-users who are legal persons.

Article 1 states the purpose of the ePrivacy Regulation:

This Regulation lays down rules regarding the protection of the fundamental rights and freedoms of legal persons in the provision and use of the electronic communications services, and in particular their rights to respect of communications.

The ePrivacy Regulation will apply to:

  • the processing of electronic communications content and of electronic communications metadata carried out in connection with the provision and the use of electronic communications services;
  • end-users’ terminal equipment information;
  • the offering of a publicly available directory of end-users of electronic communications services;
  • the sending of direct marketing communications to end-users.

Electronic communications data (a term encompassing both content and metadata) must generally be kept confidential, but it may be processed under the following circumstances (sketched schematically after the list):

  • it is “necessary to provide an electronic communication service”;
  • it is necessary to maintain or restore the security of electronic communications networks and services, or detect technical faults, errors, security risks or attacks on electronic communications networks and services;
  • it is necessary to detect or prevent security risks or attacks on end-users’ terminal equipment;
  • it is necessary for compliance with a legal obligation to which the provider is subject laid down by Union or Member State law, which respects the essence of the fundamental rights and freedoms and is a necessary and proportionate measure in a democratic society to safeguard the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the safeguarding against and the prevention of threats to public security.
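Schematically, this is a default-deny rule with enumerated exceptions: processing is barred unless one of the listed grounds applies. Here is a minimal sketch in Python, with hypothetical flag names that grossly simplify the actual legal tests:

```python
# Hypothetical, simplified encoding of the draft's default-deny rule for
# electronic communications data; each flag stands in for a legal test.
def processing_permitted(
    needed_to_provide_service: bool,
    needed_for_network_security: bool,
    needed_for_device_security: bool,
    required_by_union_or_member_state_law: bool,
) -> bool:
    return (
        needed_to_provide_service
        or needed_for_network_security
        or needed_for_device_security
        or required_by_union_or_member_state_law
    )

# Confidentiality is the default: absent an exception, processing is barred.
print(processing_permitted(False, False, False, False))  # False
```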

Electronic communications metadata may be processed in a number of scenarios without consent, including for maintaining networks and services, for the fulfillment of a contract to which the end-user is a party, or “it is necessary in order to protect the vital interest of a natural person.” Such processing of metadata may also be part of scientific or historical research and related purposes subject to additional requirements. And, of course, a person or entity could consent to such processing for one or more specified purposes.

There is a subsequent section that seems to contemplate other possible “compatible” processing of metadata without a person or entity’s consent and outside an EU or member state law. The regulations list a number of considerations the provider must take into account in making this determination, such as the link between the purposes for which the data were first collected and the intended additional processing, the context of the data collection, the nature of the metadata, the possible consequences to the end-user of further processing, and the use of safeguards such as encryption or pseudonymization. However, there are strict limits on how the processing may take place. If the processing can be carried out on anonymized information, it must be; otherwise, the data must be made anonymous or erased after processing. Metadata must be processed in a pseudonymized fashion and cannot be used to determine the nature or characteristics of end-users or to build user profiles. Finally, metadata collected and processed under this provision of the ePrivacy Regulation cannot be shared with third parties unless it is made anonymous.
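To illustrate the pseudonymization safeguard the draft contemplates, here is a minimal sketch in Python using a keyed hash; the draft does not prescribe any particular technique, and the key handling shown is purely illustrative.

```python
import hashlib
import hmac

# Provider-held secret; without it, pseudonyms cannot be linked back to the
# underlying identifiers (illustrative key management only).
SECRET_KEY = b"provider-held-secret"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g., a phone number) with a stable
    pseudonym so metadata can be analyzed without exposing the end-user."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# The same input always yields the same pseudonym, so aggregate analysis
# (e.g., counting calls per line) still works on pseudonymized metadata.
print(pseudonymize("+32 2 555 0100"))
```

Because the mapping is stable, profiles could in principle still be assembled on pseudonyms, which is why the draft separately bars using pseudonymized metadata to determine end-users’ characteristics or build profiles.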

And so, it appears providers may engage in additional processing that, say, a Spanish resident never consented to, so long as these conditions are met. However, the regulations do not spell out what sorts of situations these may be, leaving the issue to EU courts. Given the lengthy negotiations over the ePrivacy Regulation, this may be one of the places the parties decided to leave open-ended.

Moreover, providers are to erase or anonymize electronic communications content and metadata when there is no longer a need for processing or for providing an electronic communications service, subject to exceptions in the latter instance.

There is a broad bar on using people’s or entities’ devices or equipment for processing and on collecting information from them, subject to enumerated exceptions: the processing is necessary to provide the service, the person or entity consents, the purpose is measuring the audience, it is necessary to maintain or restore the security of the devices or service, or it is needed to provide a software update. As with metadata processing, there is also language that would seem to allow processing in this context aside and apart from consent and EU or member state law so long as the provider lives within the same types of limits.

When a person connects her device to a network or another device, collection of information is forbidden unless it is needed to establish or maintain a connection, a user provides consent, it is needed to provide a requested service, or “it is necessary for the purpose of statistical purposes that is limited in time and space to the extent necessary for this purpose.”

EU member states may abridge some of these rights through legislation “where such a restriction respects the essence of the fundamental rights and freedoms and is a necessary, appropriate and proportionate measure in a democratic society to safeguard one or more of the general public interests referred to in Article 23(1)” of the GDPR, namely

  • public security;
  • the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security;
  • other important objectives of general public interest of the Union or of a Member State, in particular an important economic or financial interest of the Union or of a Member State, including monetary, budgetary and taxation matters, public health and social security;
  • the protection of the data subject or the rights and freedoms of others;
  • the enforcement of civil law claims.

There are further provisions on EU people and entities turning off caller identification and blocking or allowing unsolicited calls and communications.

The regulatory structure will be similar to the one in effect under the GDPR, with each member nation having a supervisory authority or authorities in place to monitor compliance with the new regulation and take action if necessary. Likewise, the EDPB shall have significant powers in the oversight and implementation of the ePrivacy Regulation, but they fall short of those provided under the GDPR; notably absent is the authority to referee and adjudicate enforcement disputes between nations. There is language directing all authorities to work cooperatively across borders, but that is it.

As mentioned, violators of the ePrivacy Regulation would face stiff fines just as under the GDPR with the more severe penalty tier being reserved for “[i]nfringements of the principle of confidentiality of communications, permitted processing of electronic communications data, time limits for erasure pursuant to Articles 5, 6, and 7.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Guillaume Meurice from Pexels

Further Reading, Other Developments, and Coming Events (10 February 2021)

Further Reading

  • “A Hacker Tried to Poison a Florida City’s Water Supply, Officials Say” By Andy Greenberg — WIRED. Given that many water and sewage systems, including their operational systems, are linked to the internet, it is surprising these sorts of incidents do not occur more frequently.
  • “UK regulator to write to WhatsApp over Facebook data sharing” By Alex Hern — The Guardian. The United Kingdom’s (UK) Information Commissioner Elizabeth Denham said her agency, the Information Commissioner’s Office (ICO), will be pressing Facebook to keep the data of its subsidiary, WhatsApp, separate. Now that the UK has exited the European Union, it is no longer bound by the EU’s system, which made Ireland’s Data Protection Commission the lead regulator for Facebook and WhatsApp. And so, WhatsApp’s 2017 commitment not to hand over user data to Facebook until it was compliant with the General Data Protection Regulation (GDPR) falls to the ICO to oversee in the UK.
  • “Telegram, Pro-Democracy Tool, Struggles Over New Fans From Far Right” By Michael Schwirtz — The New York Times. The same features that make messaging app Telegram ideal for warding off attempts by authoritarian regimes to shut down communication make the platform ideal for right-wing extremists in the United States (U.S.). Federal and state authorities may see their attempts to track and monitor domestic terrorism hit the same roadblocks that foiled Moscow’s and Tehran’s attempts to crack down on Telegram. The platform uses end-to-end encrypted communications and has servers all over the world.
  • “Exclusive: The end of the Maher era at Wikipedia” By Felix Salmon — Axios. The CEO who revitalized Wikimedia is leaving the organization stronger than she found it.
  • “After Defending Its Low-Cost Internet Offering, Comcast Agrees To Increase Speeds” By Caroline O’Donovan — BuzzFeed News. The bad publicity seems to have worked on Comcast, as the company is now meeting most of the demands of activists, students, and officials by increasing the speed of its low-cost broadband option. Comcast said the changes will take effect on 1 March.

Other Developments

  • The Federal Communications Commission (FCC) announced that it is “seeking comment on several petitions requesting permission to use E-Rate program funds to support remote learning during the pandemic.” Comments are due by 16 February and reply comments are due by 23 February. The FCC explained:
    • Today’s Public Notice from the FCC’s Wireline Competition Bureau highlights three petitions that cover the bulk of issues presented in other petitions filed with the Commission.  These include petitions filed by a coalition of E-Rate stakeholders led by the Schools, Health & Libraries Broadband (SHLB) Coalition; a petition filed on behalf of the State of Colorado; and a petition filed by the State of Nevada, Nevada Board of Education and Nevada Department of Education. 
    • The FCC noted:
      • The E-Rate program was authorized by Congress as part of the Telecommunications Act of 1996 (the Telecommunications Act), and created by the Commission in 1997 to, among other things, enhance, to the extent technically feasible and economically reasonable, access to advanced telecommunications and information services for all public and nonprofit elementary and secondary schools and libraries. Under the E-Rate program, eligible schools, libraries, and consortia (comprised of eligible schools and libraries) may request universal service discounts for eligible services and/or equipment (collectively, eligible services), including connections necessary to support broadband connectivity to eligible schools and libraries. Eligible services must be used “primarily for educational purposes.” In the case of schools, “educational purposes” is defined as “activities that are integral, immediate, and proximate to the education of students.” In the case of libraries, “educational purposes” is defined as activities that are “integral, immediate, and proximate to the provision of library services to library patrons.”
      • As the pandemic continues to force schools and libraries across the country to remain closed and rely on remote learning and virtual services, either in whole or in part, the need for broadband connections—particularly for those students, teachers, staff, and patrons that lack an adequate connection at home—is more critical than ever.  Eligible schools and libraries explain that they are hampered in their ability to address the connectivity needs brought on, and in many cases exacerbated, by COVID-19 because of the restrictions on off-campus use of E-Rate-funded services and facilities.   Last spring, as the COVID-19 pandemic forced schools and libraries to grapple with the challenges of transitioning to remote learning, the FCC began to receive requests for emergency relief aimed at ensuring that all students have sufficient connectivity at home.
  • The European Commission’s (EC) President appealed to the United States (U.S.) to join the European Union (EU) in jointly regulating technology. At the Davos Agenda, EC President Ursula von der Leyen made remarks, a significant portion of which focused on technological issues and the EU’s proposals, the Digital Services Act and Digital Markets Act. It is unclear to what extent the new administration in Washington will be willing to work with the EU. Undoubtedly, the Biden Administration will interpret a number of EU policies and decisions as being implicitly aimed at the U.S. technology sector, but there may be common ground. Von der Leyen stated:
    • A year ago at Davos, we talked also intensively about digitalisation. The pandemic has massively accelerated the process. The European Union will dedicate 20% of NextGenerationEU to digital projects. To nurture innovative ecosystems, for example where universities, companies, innovators can access data and cooperate. To boost the vibrant start-up scene we have in cities like Sofia and Lisbon and to become a global hub for Artificial Intelligence. So that the 2020s can finally be Europe’s Digital Decade.
    • But for this to be a success, we must also address the darker sides of the digital world. Like for so many of us, the storming of the Capitol came as a shock to me. We are always quick to say: Democracy and values, they are part of our DNA. And that is true. But we must nurture our democracy every day, and defend our institutions against the corrosive power of hate speech, of disinformation, fake news and incitement to violence. In a world where polarising opinions are the loudest, it is a short step from crude conspiracy theories to the death of a police officer. Unfortunately, the storming of the Capitol Hill showed us just how true that is.
    • The business model of online platforms has an impact – and not only on free and fair competition, but also on our democracies, our security and on the quality of our information. That is why we need to contain this immense power of the big digital companies. Because we want the values we cherish in the offline world also to be respected online. At its most basic, this means that what is illegal offline should be illegal online too. And we want the platforms to be transparent about how their algorithms work. Because we cannot accept that decisions, that have a far-reaching impact on our democracy, are taken by computer programmes alone.
    • Right after von der Leyen addressed the unease she and others felt about the U.S. President’s freedom of expression being abridged because of a company’s rules outside of any controlling legal framework, she stated:
      • I want to invite our friends in the United States to join our initiatives. Together, we could create a digital economy rulebook that is valid worldwide: It goes from data protection and privacy to the security of critical infrastructure. A body of rules based on our values: Human rights and pluralism, inclusion and the protection of privacy. So Europe stands ready.
      • The challenges to our democracy, the pandemic, climate change – in his inauguration speech President Joe Biden so aptly spoke of a Cascade of Crises. And indeed, we face an outstanding set of challenges. But we can meet them – if we work together. That is what we all have to learn again after four long years. That it is not a sign of weakness, to reach out and help each other, but a signal of strength.
  • Consumer Reports tried to become an authorized agent under the “California Consumer Privacy Act” (CCPA) (AB 375) to make “do not sell” (i.e., opt-out) requests on behalf of consumers. The CCPA was designed to allow California residents to use services that would handle these preferences at scale; a sketch after the recommendations below shows one machine-readable way such opt-out signals can be sent. In their report on the pilot program, Consumer Reports concluded:
    • Unfortunately, too many companies have made it difficult, if not impossible, for agents and consumers to submit opt-out requests. The AG should enforce companies’ compliance with the law so that the authorized agent provisions work as intended. Moreover, the AG should promulgate additional common-sense rules to make sure that opt outs are simple and effective, even when submitted by an authorized agent.
    • Consumer Reports made these recommendations:
      • The AG should hold companies accountable when they violate the law. The AG needs to hold companies accountable for failure to comply with the CCPA’s authorized agent provisions. Without a viable authorized agent option, consumers could be left to navigate complicated processes or interfaces in order to exercise their California privacy rights themselves. Enforcement will help ensure that companies work harder to make sure that they have appropriate agent flows. The AG should also step in when customer service isn’t effective, and should consider directing enforcement resources to encourage better training in this area.
      • The AG should clarify that data shared for cross-context targeted advertising is a sale, and tighten the restrictions on service providers. Many companies have exploited ambiguities in the definition of sale and the rules surrounding service providers to ignore consumers’ requests to opt out of behavioral advertising. While the newly-passed California Privacy Rights Act will largely address these loopholes, these provisions will not go into effect until January 1, 2023. Thus, the AG should exercise its broad authority to issue rules to clarify that the transfer of data between unrelated companies for any commercial purpose falls under the definition of sale. Another common way for companies to avoid honoring consumers’ right to opt out of behavioral advertising is by claiming a service provider exemption. For example, the Interactive Advertising Bureau (IAB), a trade group that represents the ad tech industry, developed a framework for companies to evade the opt out by abusing a provision in the CCPA meant to permit a company to perform certain limited services on its behalf. To address this problem, the AG should clarify that companies cannot transfer data to service providers for behavioral advertising if the consumer has opted out of sale.
      • The AG should prohibit dark patterns as outlined in the Third Set of Proposed Modifications. We appreciate that the AG has proposed to “require minimal steps to allow the consumer to opt-out” and to prohibit dark patterns, “a method that is designed with the purpose or has the substantial effect of subverting or impairing a consumer’s choice to opt-out[,]” in the Third Set of Proposed Modifications to the CCPA Regulations. This proposal should be finalized as quickly as possible. This is essential, given the difficulties that authorized agents and consumers have experienced in attempting to stop the sale of their information, as demonstrated in the study.
      • The AG should require companies to notify agents when the opt-out request has been received and when it has been honored. Too often, the company provided no information on whether or not the opt-out request had been honored. While the CCPA rules require companies to notify consumers if an opt-out request has been rejected, there is no requirement to provide notice of receipt, or notice of confirmation—nor is there guidance on how to respond to opt-out requests when the company does not possess the consumer’s data. The authorized agent was, in some cases, unable to explain to the consumer whether or not the opt-out process had been completed. To ensure that the authorized agent service is effective, companies must be required to provide notification upon receipt and completion of the opt-out request. Required notification is also important for compliance purposes. For example, the regulations require companies to comply with opt outs within 15 business days. Without providing adequate notification, there’s no way to judge whether or not the company has honored the law and to hold them accountable if not. Further, if the company does sell consumers’ personal information, but does not have personal information about the consumer who is the subject of the request, the company should be required to notify the agent that the request has been received, and that the company will honor the opt out if and when they do collect the consumer’s data. In the case of an agent opt out, the notification should go to the agent. Otherwise, the consumer could end up getting emails from hundreds, if not thousands, of different companies.
      • The AG should clarify that if an agent inadvertently submits a request incorrectly, the company should either accept it or inform the agent how to submit it appropriately. The regulations provide helpful guidance with respect to consumer access and deletion requests, which ensures that even if a consumer inadvertently submits a request incorrectly, there is a process in place to help them submit it properly. If a consumer submits a request in a manner that is not one of the designated methods of submission, or is deficient in some manner unrelated to the verification process, the business shall either: (1) Treat the request as if it had been submitted in accordance with the business’s designated manner, or (2) Provide the consumer with information on how to submit the request or remedy any deficiencies with the request, if applicable. The AG should clarify that this guidance applies to all authorized agent-submitted requests as well.
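For context on how a machine-readable opt-out signal can work in practice, below is a minimal sketch in Python of the Global Privacy Control (GPC) mechanism, one emerging way to communicate do-not-sell preferences; example.com is a placeholder, and whether any given business honors the signal depends on its own compliance posture.

```python
import requests  # third-party HTTP library

# Under the GPC specification, a user agent expresses an opt-out preference
# by sending the Sec-GPC header with its requests.
response = requests.get(
    "https://example.com/",      # placeholder domain
    headers={"Sec-GPC": "1"},    # "1" signals the do-not-sell/share preference
)

# Sites that support GPC can advertise that fact at a well-known URL.
support = requests.get("https://example.com/.well-known/gpc.json")
if support.ok:
    print(support.json())  # e.g., {"gpc": true, "lastUpdate": "2021-01-01"}
```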
  • The Government Accountability Office (GAO) assessed the Department of Defense’s (DOD) efforts to transition to a more secure version of the Global Positioning System (GPS), an initiative that spans back to the administration of former President George W. Bush. The GAO stated “due to the complexity of the technology, M-code remains years away from being widely fielded across DOD. M-code-capable receiver equipment includes different components, and the development and manufacture of each is key to the modernization effort. These include:
    • special M-code application-specific integrated circuit chips,
    • special M-code receiver cards, being developed under the Air Force Military GPS User Equipment (MGUE) programs, and
    • the next generation of GPS receivers capable of using M-code signals from GPS satellites.
    • The GAO added:
      • DOD will need to integrate all of these components into different types of weapon systems… Integration across DOD will be a considerable effort involving hundreds of different weapon systems, including some with complex and unique integration needs or configurations.
    • The GAO further asserted:
      • The Air Force is almost finished—approximately one year behind schedule— developing and testing one M-code card for testing on the Marine Corps Joint Light Tactical Vehicle and the Army Stryker vehicle. However, one card intended for use in aircraft and ships is significantly delayed and missed key program deadlines. The Air Force is revising its schedule for testing this card.
      • The M-code card development delays have had ripple effects on GPS receiver modernization efforts and the weapon systems that intend to use them.
  • The advocate who brought the cases that brought down both the Safe Harbor and Privacy Shield agreements between the United States (U.S.) and European Union (EU) announced that Ireland’s Data Protection Commission (DPC) has agreed to finally decide on the legality of Facebook’s data transfers to the U.S. that gave rise to both lawsuits. The development was announced in a press release from none of your business (noyb). Last fall, noyb announced “[t]he Irish High Court has granted leave for a “Judicial Review” against the Irish DPC today…[and] [t]he legal action by noyb aims to swiftly implement the [Court of Justice of the European Union (CJEU)] Decision prohibiting Facebook’s” transfer of personal data from the European Union to the United States. In September 2020, after the DPC directed Facebook to stop transferring the personal data of European Union citizens to the U.S., the company filed suit in Ireland’s courts to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge.
    • In explaining the most recent development, noyb further asserted:
      • The DPC has agreed with Max Schrems’ demand to swiftly end a 7.5 year battle over EU-US data transfers by Facebook and come to a decision on Facebook’s EU-US data flows. This only came after a Judicial Review against the DPC was filed by Mr Schrems. The case would have been heard by the Irish High Court today.
      • New “own volition” procedure blocked pending complaint from 2013. The Irish DPC oversees the European operations of Facebook. In Summer 2020 the European Court of Justice (CJEU) ruled on a complaint by Mr Schrems that had been pending since 2013 and came before the CJEU for the second time (“Schrems II”): Under the CJEU judgment the DPC must stop Facebook’s EU-US data flows over extreme US Surveillance Laws (like FISA 702). Instead of implementing this ruling, the DPC started a new “own volition” case and paused the original procedure for an indefinite time. Mr Schrems and Facebook brought two Judicial Review procedures against the DPC: While Facebook argued in December that the “own volition” procedure should not go ahead, Mr Schrems argued that his complaints procedure should be heard independently of the “own volition” case.
      • Walls are closing in on Facebook’s EU-US data transfers. The DPC has now settled the second Judicial Review with Mr Schrems just a day before the hearing was to take place, and pledged to finalize his complaints procedure swiftly.
      • As part of the settlement, Mr Schrems will also be heard in the “own volition” procedure and get access to all submissions made by Facebook, should the Court allow the “own volition” investigation to go ahead. Mr Schrems and the DPC further agreed that the case will be dealt with under the GDPR, not the Irish Data Protection Act that was applicable before 2018. The DPC may await the High Court judgement in Facebook’s Judicial Review before investigating the original complaint.
      • This agreement could in essence make the original complaints procedure from 2013 the case that ultimately determines the destiny of Facebook’s EU-US transfers in the wake of the Snowden disclosures. Under the GDPR the DPC has every liberty to issue fines of up to 4% of Facebook’s global turnover and transfer prohibitions, even on the basis of this individual case.
  • The Information Technology Industry Council (ITI), BSA | The Software Alliance, Internet Association, Computer and Communications Industry Association, and the National Foreign Trade Council made recommendations to the Biden Administration on technology policy and asserted in their press release:
    • Prioritize strategic engagement with U.S. trading partners by ensuring continued protected transatlantic data flows, establishing a U.S.-EU Trade & Technology Council, engaging China through prioritization of digital and technology issues, broadening U.S. engagement and leadership in the Asia-Pacific region, addressing key barriers to digital trade with India, and providing capacity building assistance to the African Union;
    • Promote U.S. competitiveness through leadership on digital trade by countering unilateral, targeted digital taxes, building acceptance of state-of-the-art digital trade commitments, promoting workforce development initiatives globally, and more; and
    • Reassert U.S. multilateral leadership by strengthening and leveraging engagement in global fora such as the WTO, OECD, United Nations, G20, G7, APEC, and others, and by expanding existing plurilateral trade agreements.
  • A group of civil rights organizations and public interest organizations issued “Civil Rights, Privacy, and Technology: Recommended 2021 Oversight Priorities for the 117th Congress” that builds upon the October 2020 Civil Rights Principles for the Era of Big Data. These groups stated:
    • The 117th Congress must take action to ensure that technology serves all people in the United States, rather than facilitating discrimination or reinforcing existing inequities.
    • They cited the following areas of policy that need to be addressed:
      • Broadband Internet
      • Democracy: Voting, the Census, and Hateful Content Online
      • Policing and Justice
      • Immigration Surveillance Technology
      • Commercial Data Practices and Privacy
      • Workers, Labor, and Hiring
  • The United Kingdom’s (UK) Information Commissioner Elizabeth Denham sketched out how she is approaching her final year in office in a blog post. Denham stated:
    • The ICO’s immediate focus remains supporting organisations through the impacts of COVID 19. We have prioritised providing advice and support on data protection related aspects of the pandemic since the start, and will continue to do so, adjusting and responding to the new challenges the country will face until, well, ‘all this is finished’. That work includes protecting people’s rights, and making sure data protection is considered at the earliest stage of any innovations.
    • The Age Appropriate Design Code will start to have a real impact, as the transition period around its introduction comes to an end, and we will be working hard to support organisations to make the necessary changes to comply with the law.
    • We’ll also be focused on supporting organisations around data sharing, following the publication of our guidance last month. The guidance is accompanied by practical resources to help organisations share data in line with the law. As I discussed with the House of Lords Public Services Committee this month, data sharing is an important area of focus, and we will also be supporting broader work to encourage the necessary culture change to remove obstacles to data sharing.
    • Other support for organisations planned for this year includes guidance on political campaigning, facial recognition, and codes of conduct and certification schemes, as well as a digital version of our Data Protection Practitioners’ Conference in April. We’ll also have the latest phases of our grants scheme and sandbox programme. Both are an effective way of the ICO supporting original thinking around privacy, illustrated by the innovative data sharing projects we’ve recently worked with.
    • Our operational work will also continue, including the latest phases of our work looking at data broking, the use of sexual crime victims’ personal information, and adtech, including audits focused on digital marketing platforms.

Coming Events

  • On 10 February, the House Homeland Committee will hold a hearing titled “Homeland Cybersecurity: Assessing Cyber Threats and Building Resilience” with these witnesses:
    • Mr. Chris Krebs, Former Director, Cybersecurity and Infrastructure Security Agency, U.S. Department of Homeland Security
    • Ms. Sue Gordon, Former Principal Deputy Director of National Intelligence, Office of the Director of National Intelligence
    • Mr. Michael Daniel, President & CEO, Cyber Threat Alliance
    • Mr. Dmitri Alperovitch, Executive Chairman, Silverado Policy Accelerator
  • The House Judiciary Committee’s Antitrust, Commercial, and Administrative Law Subcommittee will hold a hearing titled “Justice Restored: Ending Forced Arbitration and Protecting Fundamental Rights” on 11 February.
  • The Federal Communications Commission’s (FCC) acting Chair Jessica Rosenworcel will hold a virtual Roundtable on the Emergency Broadband Benefit Program on 12 February, “a new program that would enable eligible households to receive a discount on the cost of broadband service and certain connected devices during the COVID-19 pandemic.” The FCC also noted “[i]n the Consolidated Appropriations Act of 2021, Congress appropriated $3.2 billion” for the program.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Supushpitha Atapattu from Pexels

Further Reading, Other Developments, and Coming Events (8 February 2021)

Further Reading

  • “‘A kiss of death’: Top GOP tech critics are personae non gratae after election challenge” By Cristiano Lima — Politico. I take these articles with a block of salt, not least because many inside-the-Beltway articles lack perspective and a sense of history. For sure, in the short term the Josh Hawleys and Ted Cruzes of the world are radioactive to Democrats, but months down the road things will look different, especially if Democrats need votes or allies in the Senate. For example, former Senator David Vitter’s (R-LA) interesting activities with prostitutes made him radioactive for some time and then all was forgotten because he held a valuable currency: a vote.
  • “I Talked to the Cassandra of the Internet Age” By Charlie Warzel — The New York Times. A sobering read on the implications of the attention economy. We would all be helped by slowing down and choosing what to focus on.
  • “A Vast Web of Vengeance” By Kashmir Hill — The New York Times. A true horror story illustrating the power platforms give anyone to slander others. The more these sorts of stories move to the fore of the consciousness of policymakers, the greater the chances of reform to 47 USC 230 (Section 230), which many companies used to deny requests that they take down defamatory, untrue material.
  • “Amazon says government demands for user data spiked by 800% in 2020” By Zack Whittaker — TechCrunch. In an interesting development, Germany far outpaced the United States (U.S.) in information requests between 1 July and 31 December 2020 for Amazon except for Amazon Web Services (AWS). Regarding AWS, the U.S. accounted for 75% of requests. It bears note there were over 27,000 non-AWS requests and only 523 AWS requests.
  • “Russian hack brings changes, uncertainty to US court system” By MaryClaire Dale — Associated Press. Because the Administrative Office of United States (U.S.) Courts may have been part of the massive SolarWinds hack, lawyers involved with cases that have national security aspects may no longer file materials electronically. It appears these cases will go old school with paper filings only, stored on computers in federal courts that have no connection to the internet. However, it is apparently believed at present that the Foreign Intelligence Surveillance Court system was not compromised by the Russians.

Other Developments

  • Senator Ted Cruz (R-TX) placed a hold on Secretary of Commerce designate Gina Raimondo’s nomination, explaining on Twitter: “I’ll lift the hold when the Biden admin commits to keep the massive Chinese Communist Party spy operation Huawei on the Entity List.” Cruz was one of three Republicans to vote against reporting out Raimondo’s nomination from the Senate Commerce, Science, and Transportation Committee. Even though the Ranking Member, Senator Roger Wicker (R-MS), voted to advance her nomination to the Senate floor, he, too, articulated concerns about Raimondo and the Biden Administration’s refusal to commit to keeping Huawei on the Department of Commerce’s Entity List, a designation that cuts the People’s Republic of China (PRC) company off from needed technology and products. Wicker said “I do remain concerned about the Governor’s reluctance to state unequivocally that she intends to keep Huawei on the department’s entity list…[and] [k]eeping Huawei on this list is important for the security of our networks and I urge the Governor and the administration to make its position clear.” Of course, Republicans’ continuing focus on the PRC seeks to box in the Biden Administration and force it to maintain the Trump Administration’s policies. The new administration has refused to make hard commitments on the PRC thus far and will likely pursue different tactics than the Trump Administration, even though there will likely be agreement on the threat posed by the PRC and its companies.
  • Virginia’s “Consumer Data Protection Act” (SB 1392/HB 2307) advanced from the Virginia Senate to the House of Delegates by a 36-0-1 vote on 5 February. The package was sent to the Communications, Technology and Innovation Subcommittee in the House on 7 February. Last week, it appeared as if the legislature would not have time to finish work on the United States’ second privacy law, but Governor Ralph Northam (D) convened a special session right before the legislature was set to adjourn. Now, there will be more time to address this bill and other priorities.
  • Senators Brian Schatz (D-HI), Deb Fischer (R-NE), Richard Blumenthal (D-CT), Rick Scott (R-FL) and Jacky Rosen (D-NV) introduced “The Safe Connections Act” “to help survivors of domestic violence and other crimes cut ties with their abusers and separate from shared wireless service plans, which can be exploited to monitor, stalk, or control victims” per their press release. The Senators asserted “the Safe Connections Act would help them stay safe and connected by:
    • Allowing survivors to separate a mobile phone line from any shared plan involving an abuser without penalties or other requirements. This includes lines of any dependents in their care;
    • Requiring the Federal Communications Commission (FCC) to initiate a rulemaking proceeding to seek comment on how to help survivors who separate from a shared plan enroll in the Lifeline Program for up to six-months as they become financially stable; and
    • Requiring the FCC to establish rules that would ensure any calls or texts to hotlines do not appear on call logs.
  • The European Commission’s Directorate-General for Justice and Consumers issued the “Report on the implementation of specific provisions of Regulation (EU) 2016/679,” the General Data Protection Regulation (GDPR), in which it determined that implementation of these provisions at the member state level is uneven. The implication of this assessment, released some 2.5 years after the GDPR took effect, is that it may be some time more before each European Union member state has made the statutory and policy changes necessary to give the data protection regime full effect. And so, the Directorate-General stated that “[t]he following general observations can be made in relation to the implementation of the GDPR clauses under assessment:
    • As regards Article 8(1) GDPR (i.e., Conditions applicable to child’s consent in relation to information society services), the majority of the Member States have set an age limit lower than 16 years of age for the validity of the consent of a minor in relation to information society services. Nine Member States set the age limit at 16 years age, while eight Member States opted for that of 13 years, six for that of 14 years and three for 15 years.
    • With respect to Article 9(4) GDPR (i.e., Processing of special categories of personal data), most Member States provide for conditions/limitations with regard to the processing of genetic data, biometric data or data concerning health. Such limitations/conditions typically consist in listing the categories of persons who have access to such data, ensuring that they are subject to confidentiality obligations, or making processing subject to prior authorisation from the competent national authority. No national provision restricting or prohibiting the free movement of personal data within the European Union has been identified.
    • As regards Article 23(1) GDPR, and irrespective of the areas of public interest assessed under Article 23(1)(c) and (e) GDPR (i.e. public security, public administration, public health, taxation and migration), some Member States provide for restrictions in the area of (i) social security; or (ii) supervision of financial market participants, functioning of the guarantee systems and resolution and macroeconomic analyses. Concerning Article 23(1)(c) GDPR, the majority of Member States allow for restrictions of various provisions referred to in Article 23(1) GDPR. Normally there is a general reference to public security, while more specific areas of processing include the processing of personal data for the investigation and prosecution of crimes, and the use of video cameras for surveillance. Most commonly, the restrictions apply only where certain conditions are met. In some Member States the proportionality and necessity test is not contemplated at all, while in most Member States it is established in law, rather than left to the data controller. The overwhelming majority of Member States do not sufficiently implement the conditions and safeguards under Article 23(2) GDPR.
    • As regards Article 23(1)(e) GDPR in relation to public administration, half of the Member States provide for restrictions for such purpose. Normally there is a general reference to general public interest or public administration, while more specific areas of processing include discussions of the Council of Ministers and investigation of judicial or ‘administrative’ police authorities in connection with the commission of a crime or administrative infringement. Most commonly, the restrictions apply only where certain conditions are met. In some Member States the proportionality and necessity test is not contemplated at all, whereas in some other Member States the test is established in law or left to the data controller. No Member State implements all conditions and safeguards under Article 23(2) GDPR.
    • As regards Article 23(1)(e) GDPR in relation to public health, a minority of the Member States provide for restrictions for such purpose. Normally there is a general reference to public health or general public interest, while more specific areas of processing include the security of food chain and medical files. In most Member States, the applicable restrictions apply only where certain conditions are met. The proportionality and necessity test is generally established in the law. No Member State implements all conditions and safeguards under Article 23(2) GDPR.
    • With respect to Article 23(1)(e) GDPR in relation to taxation, a sizeable number of Member States provide restrictions for such purposes. There tends to be a general reference to taxation or general public interest, while more specific areas of processing include recovery of taxes, as well as automated tax data transfer procedures. Normally, the applicable restrictions apply only where certain conditions are met. The proportionality and necessity test is generally left to the data controller. No Member State implements all conditions and safeguards under Article 23(2) GDPR.
    • As regards Article 23(1)(e) GDPR in relation to migration, a minority of the Member States provide for restrictions for such purpose. Normally there is a general reference to migration or general public interest. The applicable restrictions tend to apply only where certain conditions are met. The proportionality and necessity test is generally left to the data controller. No Member State implements all conditions and safeguards under Article 23(2) GDPR.
    • As regards Article 85(1) GDPR (which requires Member States to reconcile by law the right to the protection of personal data with the right to freedom of expression and information), the majority of the Member States provide for provisions aiming to reconcile the right to the protection of personal data with the right to freedom of expression and information. These provisions are usually in the national data protection act implementing the GDPR, however, in some instances there are also specific provisions in media laws to this effect.
    • With respect to Article 85(2) GDPR (Reconciliation of the right to the protection of personal data with the right to freedom of expression and information), most Member States provide exemptions/derogations from the rules set out in Chapters II, III, IV, V, VI, VII and IX GDPR. More often than not, no specific balancing or reconciliation test is identified in the national legislation. A detailed account of the exemptions/derogations can be found in Annex 2 – Implementation of Article 85(2) GDPR.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) announced it is resuming its “investigation into real time bidding (RTB) and the adtech industry,” which it had paused in response to the COVID-19 pandemic. Simon McDougall, ICO Deputy Commissioner – Regulatory Innovation and Technology, stated in a blog posting (a rough sketch of the data an RTB bid request broadcasts appears after this list):
    • Enabling transparency and protecting vulnerable citizens are priorities for the ICO. The complex system of RTB can use people’s sensitive personal data to serve adverts and requires people’s explicit consent, which is not happening right now.
    • Sharing people’s data with potentially hundreds of companies, without properly assessing and addressing the risk of these counterparties, also raises questions around the security and retention of this data.
    • Our work will continue with a series of audits focusing on data management platforms and we will be issuing assessment notices to specific companies in the coming months. The outcome of these audits will give us a clearer picture of the state of the industry.
    • The investigation is vast and complex and, because of the sensitivity of the work, there will be times where it won’t be possible to provide regular updates. However, we are committed to publishing our final findings, once the investigation is concluded.
    • We are also continuing to work with the Competition and Markets Authority (CMA) in considering Google’s Privacy Sandbox proposals to phase out support for third party cookies on Chrome.
  • Washington State Representative Shelley Kloba (D) and cosponsors introduced a bill, HB 1303, to establish a data brokers registry in Washington state that would also levy a 1.8% tax on gross revenue from selling personal data. In her press release, Kloba stated:
    • We are spending more and more of our lives on our phones and devices. From this has arisen a new business model where brokers collect, analyze, and resell personal data collected from applications on our phones and other devices. Currently, this type of business is totally unregulated and untaxed, and these businesses are reselling information with no compensation to the people of Washington. My legislation would shine a light on this very active segment of our economy while also establishing a small tax on the companies that profit from selling our personal data. Brokers that make money from collecting our personal information should contribute their fair share in tax revenue, and there should be more transparency on the number of businesses engaged in this industry.
    • HB 1303 would:
      • Impose a 1.8% Business & Occupation (B&O) tax on gross income arising from the sale of personal data.
      • Require companies that engage in this type of economic activity to register annually with the Department of Revenue (DOR).
      • Require DOR to provide the Legislature with an annual report on this information.
    • Recently, Kloba and cosponsors introduced the “People’s Privacy Act” (HB 1433), a bill to establish a privacy and data protection regime in Washington state. (see here for analysis.) For a rough sense of how HB 1303’s proposed tax would operate, see the short worked example after this list.
  • The Federal Trade Commission (FTC) used its authority under a 2016 statute to police the use of algorithms and automated processes to buy tickets for entertainment and sporting events. The “Better Online Ticket Sales (BOTS) Act” (P.L. 114-274) “was enacted in 2016 and gives the FTC authority to take law enforcement action against individuals and companies that use bots or other means to circumvent limits on online ticket purchases” per the agency’s press release. The FTC stated it is taking “legal action against three ticket brokers based in New York who allegedly used automated software to illegally buy up tens of thousands of tickets for popular concerts and sporting events, then subsequently made millions of dollars reselling the tickets to fans at higher prices.” The FTC added:
    • The three ticket brokers will be subject to a judgment of more than $31 million in civil penalties for violating the Better Online Ticket Sales (BOTS) Act, under a proposed settlement reached with the FTC. Due to their inability to pay, the judgment will be partially suspended, requiring them to pay $3.7 million.
    • The FTC explained that “[u]nder the terms of the proposed orders, judgments will be entered against the defendants for civil penalties,” with the amounts varying by defendant.
  • The National Institute of Standards and Technology (NIST) pushed back the deadline for comments until 26 February 2021 for four guidance documents on the Internet of Things:
    • Draft NIST SP 800-213, IoT Device Cybersecurity Guidance for the Federal Government: Establishing IoT Device Cybersecurity Requirements, has background and recommendations to help federal agencies consider how an IoT device they plan to acquire can integrate into a federal information system. IoT devices and their support for security controls are presented in the context of organizational and system risk management. SP 800-213 provides guidance on considering system security from the device perspective. This allows for the identification of IoT device cybersecurity requirements—the abilities and actions a federal agency will expect from an IoT device and its manufacturer and/or third parties, respectively.
    • Draft NISTIR 8259B, IoT Non-Technical Supporting Capability Core Baseline, complements the NISTIR 8259A device cybersecurity core baseline by detailing additional, non-technical supporting activities typically needed from manufacturers and/or associated third parties. This non-technical baseline collects and makes explicit supporting capabilities like documentation, training, customer feedback, etc.
    • Draft NISTIR 8259C, Creating a Profile Using the IoT Core Baseline and Non-Technical Baseline, describes a process, usable by any organization, that starts with the core baselines provided in NISTIRs 8259A and 8259B and explains how to integrate those baselines with organization- or application-specific requirements (e.g., industry standards, regulatory guidance) to develop an IoT cybersecurity profile suitable for specific IoT device customers or applications. The process in NISTIR 8259C guides organizations needing to define a more detailed set of capabilities responding to the concerns of a specific sector, based on some authoritative source such as a standard or other guidance, and could be used by organizations seeking to procure IoT technology or by manufacturers looking to match their products to customer requirements.
    • Draft NISTIR 8259D, Profile Using the IoT Core Baseline and Non-Technical Baseline for the Federal Government, provides a worked example result of applying the NISTIR 8259C process, focused on the federal government customer space, where the requirements of the FISMA process and the SP 800-53 security and privacy controls catalog are the essential guidance. NISTIR 8259D provides a device-centric, cybersecurity-oriented profile of the NISTIR 8259A and 8259B core baselines, calibrated against the FISMA low baseline described in NIST SP 800-53B as an example of the criteria for minimal securability for federal use cases.
  • The New York State Department of Financial Services (NYDFS) announced “[r]egulated entities and licensed persons must file the Certification of Compliance for the calendar year 2020 by April 15, 2021.” These certifications are due under the NYDFS’ cybersecurity regulations, with which most financial services companies in the state must comply. The regulations took effect in May 2017.
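
As promised above, here is a rough sketch of the kind of bid request an RTB exchange broadcasts to bidders, loosely modeled on the OpenRTB convention. All field names and values are illustrative assumptions rather than anything from the ICO’s materials; the point is how much user detail reaches potentially hundreds of companies before a single ad is served.

```python
import json

# Illustrative bid request, loosely modeled on OpenRTB; not real data.
bid_request = {
    "id": "auction-7f3a",
    "site": {"page": "https://news.example/articles/health-conditions"},
    "device": {
        "ip": "203.0.113.7",            # approximate location can be derived
        "ua": "Mozilla/5.0 (Android)",  # input for device fingerprinting
    },
    "user": {
        "id": "cookie-synced-id-91c2",  # identifier matched across companies
        "data": [{"segment": ["interest:medical", "interest:finance"]}],
    },
}

# Each auction fans the payload out to every participating bidder, winner or
# not -- the data sharing the ICO flags as a consent and retention problem.
for bidder in ["dsp-1.example", "dsp-2.example", "dsp-3.example"]:
    print(f"POST https://{bidder}/bid <- {json.dumps(bid_request)[:60]}...")
```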
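
And here is the worked example of HB 1303’s proposed 1.8% B&O tax mentioned above. Only the rate comes from the bill; the revenue figure is hypothetical.

```python
DATA_SALES_TAX_RATE = 0.018  # 1.8% per HB 1303

def bo_tax(gross_income_from_data_sales: float) -> float:
    """Hypothetical computation of the proposed B&O tax on personal-data sales."""
    return gross_income_from_data_sales * DATA_SALES_TAX_RATE

# A broker with $10 million in gross data-sales revenue would owe $180,000.
print(bo_tax(10_000_000))  # 180000.0
```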

Coming Events

  • On 10 February, the House Homeland Security Committee will hold a hearing titled “Homeland Cybersecurity: Assessing Cyber Threats and Building Resilience” with these witnesses:
    • Mr. Chris Krebs, Former Director, Cybersecurity and Infrastructure Security Agency, U.S. Department of Homeland Security
    • Ms. Sue Gordon, Former Principal Deputy Director of National Intelligence, Office of the Director of National Intelligence
    • Mr. Michael Daniel, President & CEO, Cyber Threat Alliance
    • Mr. Dmitri Alperovitch, Executive Chairman, Silverado Policy Accelerator
  • The House Judiciary Committee’s Antitrust, Commercial, and Administrative Law Subcommittee will hold a hearing titled “Justice Restored: Ending Forced Arbitration and Protecting Fundamental Rights.”
  • The Federal Communications Commission’s (FCC) acting Chair Jessica Rosenworcel will hold a virtual Roundtable on the Emergency Broadband Benefit Program on 12 February, “a new program that would enable eligible households to receive a discount on the cost of broadband service and certain connected devices during the COVID-19 pandemic.” The FCC also noted “[i]n the Consolidated Appropriations Act of 2021, Congress appropriated $3.2 billion” for the program.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.


EDPB and EDPS Issue Opinions On EC’s Draft SCCs

The EU’s two bloc-wide data protection entities weighed in on the EC’s proposed changes to SCCs, meant to satisfy the Schrems II ruling.

The European Union’s (EU) data protection authorities have rendered their joint opinions on the European Commission’s (EC) draft revisions of the Standard Contractual Clauses (SCC) permissible under the General Data Protection Regulation (GDPR). At present, SCCs are the primary means by which companies transfer the personal data of EU residents for processing in nations without adequacy decisions, especially the United States (U.S.). Since the adequacy decision on the U.S. was struck down, companies have been left largely with SCCs, and there are efforts afoot to have the EU’s top court strike down SCCs governing the transfer of personal data to the U.S. on account of what critics call inadequate redress and protection from U.S. surveillance.

Before I turn to the European Data Protection Board (EDPB) and European Data Protection Supervisor’s (EDPS) joint opinions, some background would be helpful. In mid-2020, in a much-anticipated decision, the EU’s top court struck down the adequacy decision underpinning the U.S.-EU Privacy Shield agreement. Under the GDPR, the easiest way for a controller to transfer the personal data of EU residents for processing outside the EU is through such a decision, which essentially says the laws of the other nation are basically equivalent to the EU’s with respect to the rights they provide. The U.S. is the EU’s biggest partner with respect to these data flows, with companies like Facebook and Google generating billions of dollars in economic activity. Consequently, both Washington and Brussels have many reasons to favor the easiest route to making data flows happen. However, the forerunner to Privacy Shield (i.e., Safe Harbor) was also struck down, largely because of the inadequacy of U.S. privacy rights and mass surveillance. The U.S. made some changes, but these, too, proved inadequate, and litigation brought by Austrian activist and privacy advocate Maximilian Schrems against Facebook finally made its way to the Court of Justice of the European Union (CJEU).

In a summary of its decision Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems, Case C-311/18 (Schrems II), the CJEU explained:

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

Ultimately, the CJEU found the U.S. lacks the requisite safeguards needed under EU law, and so the general means of transferring the data of EU citizens from the EU to the U.S. was essentially struck down. This marked the second time in the last five years such an agreement had been found to violate EU law. However, the CJEU left open the question of whether SCCs may permit the continued exporting of EU personal data into the U.S. for companies like Facebook, Google, and many, many others. Consequently, there has been no small amount of interpreting and questioning of whether this may be a way for the trans-Atlantic data flow to continue. And yet, the CJEU seemed clear that additional measures would likely be necessary. Indeed, the CJEU asserted “[c]ontrollers and processors should be encouraged to provide additional safeguards via contractual commitments that supplement standard protection clauses” and “[i]n so far as those standard data protection clauses cannot, having regard to their very nature, provide guarantees beyond a contractual obligation to ensure compliance with the level of protection required under EU law, they may require, depending on the prevailing position in a particular third country, the adoption of supplementary measures by the controller in order to ensure compliance with that level of protection.”

Thereafter the EC stepped into the breach, seemingly to shore up SCCs and protect them from the same fate as Privacy Shield, for it seems only a matter of time before the legality of SCCs is challenged. In mid-November 2020, the EC released for comment a draft revision of SCCs for transfers of personal data to countries outside the EU, with input due by 10 December. The EC had last revised EU law on SCCs in 2010, some years before the GDPR came into force. The EC released draft legislative language and, in an Annex, actual contract language for use by controllers and processors in the form of modules designed to cover a variety of common circumstances (e.g., transfers by controllers to other controllers or by a controller to a processor). However, the EC stressed that SCCs form a floor, and controllers, processors, and other parties are free to add language so long as it does not contradict or denigrate the rights protected by SCCs.

In the implementing decision, the EC asserted:

the standard contractual clauses needed to be updated in light of new requirements in Regulation (EU) 2016/679. Moreover, since the adoption of these decisions, important developments have taken place in the digital economy, with the widespread use of new and more complex processing operations often involving multiple data importers and exporters, long and complex processing chains as well as evolving business relationships. This calls for a modernisation of the standard contractual clauses to better reflect those realities, by covering additional processing and transfer situations and to use a more flexible approach, for example with respect to the number of parties able to join the contract.

The EC continued:

The standard contractual clauses set out in the Annex to this Decision may be used by a controller or a processor in order to provide appropriate safeguards within the meaning of Article 46(1) of Regulation (EU) 2016/679 for the transfer of personal data to a processor or a controller established in a third country. This also includes the transfer of personal data by a controller or processor not established in the Union, to the extent that the processing is subject to Regulation (EU) 2016/679 pursuant to Article 3(2) thereof, because it relates to the offering of goods or services to data subjects in the Union or the monitoring of their behaviour as far as their behaviour takes place within the Union.

The EC explained the design and intent of the SCC language in the Annex:

  • The standard contractual clauses set out in the Annex to this Decision combine general clauses with a modular approach to cater for various transfer scenarios and the complexity of modern processing chains. In addition to the general clauses, controllers and processors should select the module applicable to their situation, which makes it possible to tailor their obligations under the standard contractual clauses to their corresponding role and responsibilities in relation to the data processing at issue. It should be possible for more than two parties to adhere to the standard contractual clauses. Moreover, additional controllers and processors should be allowed to accede to the standard contractual clauses as data exporters or importers throughout the life cycle of the contract of which those clauses form a part.
  • These Clauses set out appropriate safeguards, including enforceable data subject rights and effective legal remedies, pursuant to Article 46(1), and Article 46 (2)(c) of Regulation (EU) 2016/679 and, with respect to data transfers from controllers to processors and/or processors to processors, standard contractual clauses pursuant to Article 28(7) of Regulation (EU) 2016/679, provided they are not modified, except to add or update information in the Annexes. This does not prevent the Parties from including the standard contractual clauses laid down in this Clauses in a wider contract, and to add other clauses or additional safeguards provided that they do not contradict, directly or indirectly, the standard contractual clauses or prejudice the fundamental rights or freedoms of data subjects. These Clauses are without prejudice to obligations to which the data exporter is subject by virtue of the Regulation (EU) 2016/679

On the same day that the EC released its SCC proposals, the EDPB issued guidance documents, which was surely not coincidental. In “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data,” the EDPB explained the genesis and rationale for the document:

  • The GDPR or the [CJEU] do not define or specify the “additional safeguards”, “additional measures” or “supplementary measures” to the safeguards of the transfer tools listed under Article 46.2 of the GDPR that controllers and processors may adopt to ensure compliance with the level of protection required under EU law in a particular third country.
  • The EDPB has decided, on its own initiative, to examine this question and to provide controllers and processors, acting as exporters, with recommendations on the process they may follow to identify and adopt supplementary measures. These recommendations aim at providing a methodology for the exporters to determine whether and which additional measures would need to be put in place for their transfers. It is the primary responsibility of exporters to ensure that the data transferred is afforded in the third country of a level of protection essentially equivalent to that guaranteed within the EU. With these recommendations, the EDPB seeks to encourage consistent application of the GDPR and the Court’s ruling, pursuant to the EDPB’s mandate

Broadly speaking, whether SCCs and supplemental measures pass muster under the GDPR will be determined on a case-by-case basis. The EDPB did not offer much in the way of bright-line rules. Indeed, it will be up to supervisory authorities (SA) to determine if transfers to nations like the U.S. are possible under the GDPR, meaning these recommendations may shed more light on this central question without deciding it. One wonders, as a practical matter, if the SAs will have the capacity, resources, and will to police SCCs to ensure the GDPR and Charter are being met.

Nonetheless, the EDPB stressed the principle of accountability, under which controllers that export personal data must ensure that, whatever mechanism and supplemental measures govern a transfer, the data receive the same protection they would in the EU. The EDPB made the point that EU protections travel with the data: should EU personal data make its way to a country where appropriate protection is not possible, the transfer violates the GDPR. Moreover, these recommendations pertain to both public and private transfers of EU data to private sector entities outside the EU.

These recommendations work like a decision tree, with exporters needing to ask themselves a series of questions to determine whether they must use supplemental measures. This may prove a resource-intensive process, for exporters will need to map all transfers (i.e., know exactly where the data are going). The exporter must understand the laws and practices of the third nation in order to put in place appropriate measures, where possible, to meet the EU’s data protection standards.
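
To make the roadmap concrete, here is a minimal sketch of that decision tree in Python. The step structure tracks the EDPB’s recommendations, but the function names, data structures, and placeholder logic are my own assumptions standing in for the legal analysis an exporter would actually perform.

```python
def law_impinges_on_safeguards(country: str) -> bool:
    # Placeholder for a legal assessment of the third country's surveillance
    # and data protection regime (redress, oversight, proportionality).
    return country not in {"CH", "JP", "NZ"}  # illustrative only

def effective_supplementary_measures(transfer: dict) -> list:
    # Placeholder: e.g., strong encryption with keys held in the EU, or
    # pseudonymization where only the exporter can re-identify data subjects.
    return transfer.get("available_measures", [])

def assess_transfer(transfer: dict) -> str:
    destination = transfer["destination_country"]   # Step 1: map the transfer
    tool = transfer["transfer_tool"]                # Step 2: identify the tool
    if tool == "adequacy_decision":
        return "proceed"
    if not law_impinges_on_safeguards(destination): # Step 3: assess local law
        return f"proceed under {tool}"
    measures = effective_supplementary_measures(transfer)  # Step 4
    if not measures:
        return "suspend or do not start the transfer"
    # Steps 5 and 6: take any required procedural steps, then re-evaluate
    # periodically as the third country's law and practice evolve.
    return "proceed with: " + ", ".join(measures)

print(assess_transfer({
    "destination_country": "US",
    "transfer_tool": "SCCs",
    "available_measures": ["end-to-end encryption, keys held in the EU"],
}))
```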

Reading between the lines leads one to conclude that data exporters may not send personal data to the U.S., for its federal surveillance regime is not “necessary and proportionate,” at least from the EU’s view. The U.S. lacks judicial redress should a U.S. national, let alone a foreign national, object to the sweeping surveillance. The U.S. also has neither a national data protection law nor a dedicated data protection authority. These hints also seem to convey the EDPB’s view on the sorts of legal reforms needed in the U.S. before an adequacy decision would pass muster with the CJEU.

The EDPB said it was still evaluating how Schrems II affects the use of binding corporate rules (BCR) and ad hoc contractual clauses, two of the other means of transferring EU personal data in the absence of an adequacy decision.

Nevertheless, in an annex, the EDPB provided examples of supplementary measures that may be used depending on the circumstances, such as “flawlessly implemented” encryption and pseudonymizing data. However, the EDPB discusses these in the context of different scenarios and calls for more conditions than just those two. Moreover, the EDPB categorically rules out two scenarios as inadequate: “Transfer to cloud services providers or other processors which require access to data in the clear” and “Remote access to data for business purposes.”
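
One of the annex’s scenarios can be sketched briefly. Below is a minimal, assumption-laden illustration of pseudonymizing direct identifiers before export, with the re-identification secret held only by the EU exporter; the EDPB’s actual conditions are stricter (for example, the remaining attributes must not allow data subjects to be singled out).

```python
import hmac
import hashlib

SECRET_KEY = b"held-only-by-the-EU-exporter"  # never shipped with the data

def pseudonymize(identifier: str) -> str:
    # Keyed hash: without SECRET_KEY, the importer cannot reverse the mapping
    # or link the record back to an identifiable person.
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"user": pseudonymize("jane.doe@example.eu"), "steps": 9421}
print(record)  # the exported record carries only the pseudonym
```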

The EDPB also issued an update to guidance published after the first lawsuit brought by Maximilian Schrems resulted in the striking down of the Safe Harbor transfer agreement. The forerunner to the EDPB, the Article 29 Working Party, had drafted and released the European Essential Guarantees, and so, in light of Schrems II, the EDPB updated and published “Recommendations 02/2020 on the European Essential Guarantees for surveillance measures” “to provide elements to examine, whether surveillance measures allowing access to personal data by public authorities in a third country, being national security agencies or law enforcement authorities, can be regarded as a justifiable interference or not” with fundamental EU rights and protections. As the EDPB explains, these recommendations are intended to help data controllers and exporters determine whether other nations have protections and processes in place equivalent to those of the EU vis-à-vis their surveillance programs. The EDPB stressed that these are the essential guarantees, and other features and processes may be needed for a determination of lawfulness under EU law.

The EDPB formulated the four European Essential Guarantees:

A. Processing should be based on clear, precise and accessible rules

B. Necessity and proportionality with regard to the legitimate objectives pursued need to be demonstrated

C. An independent oversight mechanism should exist

D. Effective remedies need to be available to the individual

The new joint opinions of the EDPB and EDPS fit into this process because the EC asked the two bodies to weigh in on its drafts, as noted at the beginning of one of the opinions:

On 12 November 2020, the European Commission requested a joint opinion of the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) on the basis of Article 42(1), (2) of Regulation (EU) 2018/1725 (EU DPR) on these two sets of draft standard contractual clauses and the respective implementing acts.

Consequently, the EDPB and EDPS issued two joint opinions, one on the draft SCCs between controllers and processors and one on the draft SCCs for transfers of personal data to third countries.

In Joint Opinion 1/2021, the two bodies explained:

The EDPB and the EDPS are of the opinion that clauses which merely restate the provisions of Article 28(3) and (4) GDPR and Article 29 (3) and (4) EUDPR are inadequate to constitute standard contractual clauses. The Board and EDPS have therefore decided to analyse the document in its entirety, including the appendices. In the opinion of the Board and the EDPS, a contract under Article 28 GDPR or Article 29 EUDPR should further stipulate and clarify how the provisions will be fulfilled. It is in this light that the Draft SCCs submitted to the Board and EDPS for opinion are analysed.

The EDPB and EDPS go on to ask the EC to better clarify the difference between the decision on SCCs between controllers and processors, which is meant to apply only inside the EU, and the decision on transfers to third countries. They asked for clarity on the scope of the language. The EDPB and EDPS also asked that the EC expand the intra-EU SCC decision to include those nations that have been found adequate (e.g., Israel, Japan, and New Zealand).

The EDPB and EDPS did find much to like, however:

  • Adopted standard contractual clauses constitute a set of guarantees to be used as is, as they are intended to protect data subjects and mitigate specific risks associated with the fundamental principles of data protection.
  • The EDPB and the EDPS welcome in general the adoption of standard contractual clauses as a strong accountability tool that facilitates compliance by controllers and processors to their obligations under the GDPR and the EUDPR.
  • The EDPB already issued opinions on standard contractual clauses prepared by the Danish Supervisory Authority and the Slovenian Supervisory Authority.
  • To ensure a coherent approach to personal data protection throughout the Union, the EDPB and the EDPS strongly welcome the envisaged adoption of SCCs having an EU-wide effect by the Commission.
  • The same set of SCCs will indeed apply irrespective of whether this relationship involves private entities, public authorities of the Member States or EU institutions or bodies. These EU-wide SCCs will ensure further harmonisation and legal certainty.
  • The EDPB and the EDPS also welcome the fact that the same set of SCCs should apply in respect of the relationship between controllers and processors subject to GDPR and EUDPR respectively.

In Joint Opinion 2/2021, the EDPB and EDPS stated:

The Draft SCCs combine general clauses with a modular approach to cater for various transfer scenarios. In addition to the general clauses, controllers and processors should select the module applicable to their situation among the four following modules:

  • Module One: transfer controller to controller;
  • Module Two: transfer controller to processor;
  • Module Three: transfer processor to processor;
  • Module Four: transfer processor to controller.
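
The modular design can be illustrated with a short sketch: the GDPR roles of the exporter and importer determine which module’s clauses apply. The mapping comes from the list above; the function itself is hypothetical, not anything in the draft SCCs.

```python
# Role pairs (exporter, importer) mapped to the draft SCCs' modules.
MODULES = {
    ("controller", "controller"): "Module One",
    ("controller", "processor"):  "Module Two",
    ("processor", "processor"):   "Module Three",
    ("processor", "controller"):  "Module Four",
}

def select_module(exporter_role: str, importer_role: str) -> str:
    return MODULES[(exporter_role, importer_role)]

print(select_module("controller", "processor"))  # Module Two
```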

Again, the EDPB and EDPS wanted greater clarity on the language in this decision, especially regarding SCCs governing EU institutions subject not to the GDPR but to Regulation (EU) 2018/1725 (aka the EUDPR). In general, the EDPB and EDPS had this comment on the actual draft SCCs:

The EDPB and the EDPS welcome the introduction of specific modules for each transfer scenario. However, the EDPB and the EDPS note that it is not clear whether one set of the SCCs can include several modules in practice to address different situations, or whether this should amount to the signing of several sets of the SCCs. In order to achieve maximum readability and easiness in the practical application of the SCCs, the EDPB and the EDPS suggest that the European Commission provides additional guidance (e.g. in the form of flowcharts, publication of Frequently Asked Questions (FAQs), etc.). In particular, it should be made clear that the combination of different modules in a single set of SCCs cannot lead to the blurring of roles and responsibilities among the parties.


France Reaches Agreement On Pay For Media; Australia Clashes With Google and Facebook On Same Issue

Google and France reach agreement on a scheme to compensate media while the company threatens to pull its search engine from Australia.

Not long after Google reached agreement with French media on how to compensate them under France’s law implementing a European Union (EU) Directive, the company threatened to pull its search engine from Australia over legislation introduced last month that would make payment to that country’s media mandatory.

Last week, Google and the Alliance de la Presse d’Information Générale (APIG) announced agreement “about neighboring rights under French law…a major step forward: it is the culmination of months of negotiations within the framework set by the French Competition Authority.” APIG and Google explained:

  • This agreement establishes a framework within which Google will negotiate individual licensing agreements with IPG certified publishers within APIG’s membership, while reflecting the principles of the law.  These agreements will cover publishers’ neighboring rights, and allow for participation in News Showcase, a new licencing program recently launched by Google to provide readers access to enriched content.
  • The remuneration that is included in these licensing agreements is based on criteria such as the publisher’s contribution to political and general information (IPG certified publishers), the daily volume of publications, and its monthly internet traffic.

In April 2020, France’s Competition Authority found that Google had indeed violated French and EU law and must compensate French news media for use of their content. Google appealed, lost, and then agreed to negotiate.

In the weeks before the French appeals court ruled against Google, Google and Alphabet CEO Sundar Pichai announced the company will pay some media outlets up to $1 billion over the next three years “to create and curate high-quality content for a different kind of online news experience” for its new product, Google News Showcase. Pichai claimed:

This approach is distinct from our other news products because it leans on the editorial choices individual publishers make about which stories to show readers and how to present them. It will start rolling out today to readers in Brazil and Germany, and will expand to other countries in the coming months where local frameworks support these partnerships.

In response, the European Publishers Council (EPC) noted:

The French Competition Authority decision from April considered that Google’s practices were likely to constitute an abuse of a dominant position and brought serious and immediate damage to the press sector. It calls on Google, within three months, to conduct negotiations in good faith with publishers and press agencies on the remuneration for their protected content. Google’s appeal in July seeks to get some legal clarity on parts of the decision.

Moreover, the EU Directive on Copyright in the Digital Single Market is being implemented in EU member states and would allow them to require compensation from platforms like Facebook and Google. The EPC claimed:

Many are quite cynical about Google’s perceived strategy. By launching their own product, they can dictate terms and conditions, undermine legislation designed to create conditions for a fair negotiation, while claiming they are helping to fund news production.

The Directive on Copyright in the Digital Single Market provides:

This Directive lays down rules which aim to harmonise further Union law applicable to copyright and related rights in the framework of the internal market, taking into account, in particular, digital and cross-border uses of protected content. It also lays down rules on exceptions and limitations to copyright and related rights, on the facilitation of licences, as well as rules which aim to ensure a well-functioning marketplace for the exploitation of works and other subject matter.

Matters in Australia stand on different ground. In testimony before the Australian Senate Economics Legislation Committee, the Managing Director of Google Australia and New Zealand Melanie Silva said “[i]f this version of the code were to become law, it would give us no real choice but to stop making Google Search available in Australia.” She added “[i]t’s not a threat…[i]t’s a reality.” Silva was testifying on a recently released bill that would require Google, Facebook, and others to pay Australian news media for use of their content. Both tech giants have been fighting this initiative since it was launched in early 2020. Silva added the new law “would set an untenable precedent” and pose “unmanageable financial and operational risk.” Silva later posted a video on Twitter, making the case against the legislation. Additionally, Facebook threatened to limit what users in Australia could post and see if the law goes into effect.

In December 2020, Canberra unveiled the “Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020” that “establishes a mandatory code of conduct to help support the sustainability of the Australian news media sector by addressing bargaining power imbalances between digital platforms and Australian news businesses” according to the Explanatory Memorandum. The legislation comes after the center-right government, the Liberal–National Coalition, tried to negotiate a voluntary agreement with Google and Facebook, but talks fell apart. In late May, Australia’s Treasurer Josh Frydenberg had explained in an op-ed that the government would pursue a mandatory code because Facebook and Google had not come to an agreement with the Australian Competition & Consumer Commission (ACCC) in “facilitat[ing] the development of a voluntary code of conduct governing the relationships between digital platforms and media businesses, the goal of which was to protect consumers, improve transparency and address the power imbalance between the parties.”

In late July 2020, the ACCC released for public consultation a draft of “a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms, specifically Google and Facebook.” In publishing the draft, the ACCC explained:

The code would commence following the introduction and passage of relevant legislation in the Australian Parliament. The ACCC released an exposure draft of this legislation on 31 July 2020, with consultation on the draft due to conclude on 28 August 2020. Final legislation is expected to be introduced to Parliament shortly after conclusion of this consultation process.

This is not the ACCC’s first interaction with the companies. In December 2020, the ACCC sued Facebook and two subsidiaries “for false, misleading or deceptive conduct when promoting Facebook’s Onavo Protect mobile app to Australian consumers.” The Australian regulator is claiming Facebook and its subsidiaries misrepresented VPN services that were offered, in large part, to collect personal data. The ACCC is arguing that Facebook Inc., Facebook Israel Ltd and Onavo, Inc. “made false, misleading or deceptive representations that Onavo Protect would keep users’ personal activity data private, protected and secret, and that such data would not be used for any purpose other than to provide the Onavo Protect services.” The ACCC argued that “a key purpose and use of Onavo Protect, which utilised Facebook’s servers, was for Facebook and Onavo to collect significant personal activity data, including about users’ internet and app activity, for Facebook and Onavo to use for their commercial benefit, including to support market analytics and related activities such as identifying future acquisitions.”

In October 2019, the ACCC announced a legal action against Google “alleging they engaged in misleading conduct and made false or misleading representations to consumers about the personal location data Google collects, keeps and uses” according to the agency’s press release. In its initial filing, the ACCC claims that Google misled and deceived the public in contravention of the Australian Consumer Law, and that Android users were harmed because those who switched off Location Services were unaware that their location information was still being collected and used by Google, as it was not readily apparent that Web & App Activity also needed to be switched off.
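
A toy model may help illustrate the behavior alleged. The sketch below assumes, per the description above, that two independently located settings each permitted collection, so switching off only one changed nothing; the setting names come from the filing as summarized here, and the logic is an illustration of the allegation, not Google’s actual code.

```python
def location_collected(location_services_on: bool, web_app_activity_on: bool) -> bool:
    # Collection stops only when *both* settings are off.
    return location_services_on or web_app_activity_on

print(location_collected(False, True))   # True: one toggle off, data still flows
print(location_collected(False, False))  # False: only now does collection stop
```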

In 2019, the ACCC released its final report in its “Digital Platforms Inquiry” that “proposes specific recommendations aimed at addressing some of the actual and potential negative impacts of digital platforms in the media and advertising markets, and also more broadly on consumers.”


Google Buys Fitbit Even Though U.S. and Australia May Still Oppose

The Google/Fitbit deal could ultimately get blocked.

Even though the European Union (EU) has signed off on Google’s acquisition of Fitbit with some conditions, the United States (U.S.) and Australia are still assessing the deal. Moreover, given that both nations are in the midst of acting against Google and other tech companies, one, if not both, may find the deal violates antitrust or competition laws and seek to force Google to reverse the merger.

In a blog posting, Google Senior Vice President, Devices & Services Rick Osterloh stated “Google has completed its acquisition of Fitbit and I want to personally welcome this talented team to Google.” Osterloh asserted “[y]our privacy and security are paramount to achieving this and we are committed to protecting your health information and putting you in control of your data.” Osterloh claimed:

This deal has always been about devices, not data, and we’ve been clear since the beginning that we will protect Fitbit users’ privacy. We worked with global regulators on an approach which safeguards consumers’ privacy expectations, including a series of binding commitments that confirm Fitbit users’ health and wellness data won’t be used for Google ads and this data will be separated from other Google ads data. We’ll also maintain access to Android APIs that enable devices like fitness trackers and smart watches to interoperate with Android smartphones, and we’ll continue to allow Fitbit users to choose to connect to third-party services so you’ll still be able to sync your favorite health and fitness apps to your Fitbit account. These commitments will be implemented globally so that all consumers can benefit from them. We’ll also continue to work with regulators around the world so that they can be assured that we are living up to these commitments. 

Last month, following the completion of its “in-depth” investigation, the European Commission (EC) cleared Google’s acquisition of Fitbit with certain conditions, removing a significant hurdle for the American multinational in buying the wearable fitness tracker company. In its press release, the EC explained that after its investigation, “the Commission had concerns that the transaction, as initially notified, would have harmed competition in several markets.” To address and allay concerns, Google bound itself for ten years to a set of commitments that can be unilaterally extended by the EC and will be enforced, in part, by the appointment of a trustee to oversee compliance. However, a number of these commitments are binding only in the European Economic Area (EEA) (i.e. the EU plus a handful of non-EU European nations).

The EC was particularly concerned about:

  • Advertising: By acquiring Fitbit, Google would acquire (i) the database maintained by Fitbit about its users’ health and fitness; and (ii) the technology to develop a database similar to that of Fitbit. By increasing the already vast amount of data that Google could use for the personalisation of ads, it would be more difficult for rivals to match Google’s services in the markets for online search advertising, online display advertising, and the entire “ad tech” ecosystem. The transaction would therefore raise barriers to entry and expansion for Google’s competitors for these services to the detriment of advertisers, who would ultimately face higher prices and have less choice.
  • Access to Web Application Programming Interface (‘API’) in the market for digital healthcare: A number of players in this market currently access health and fitness data provided by Fitbit through a Web API, in order to provide services to Fitbit users and obtain their data in return. The Commission found that following the transaction, Google might restrict competitors’ access to the Fitbit Web API. Such a strategy would come especially at the detriment of start-ups in the nascent European digital healthcare space.
  • Wrist-worn wearable devices: The Commission is concerned that following the transaction, Google could put competing manufacturers of wrist-worn wearable devices at a disadvantage by degrading their interoperability with Android smartphones.

As noted, Google made a number of commitments to address competition concerns:

  • Ads Commitment:
    • Google will not use for Google Ads the health and wellness data collected from wrist-worn wearable devices and other Fitbit devices of users in the EEA, including search advertising, display advertising, and advertising intermediation products. This refers also to data collected via sensors (including GPS) as well as manually inserted data.
    • Google will maintain a technical separation of the relevant Fitbit’s user data. The data will be stored in a “data silo” which will be separate from any other Google data that is used for advertising.
    • Google will ensure that European Economic Area (‘EEA’) users will have an effective choice to grant or deny the use of health and wellness data stored in their Google Account or Fitbit Account by other Google services (such as Google Search, Google Maps, Google Assistant, and YouTube).
  • Web API Access Commitment:
    • Google will maintain access to users’ health and fitness data to software applications through the Fitbit Web API, without charging for access and subject to user consent.
  • Android APIs Commitment:
    • Google will continue to license for free to Android original equipment manufacturers (OEMs) those public APIs covering all current core functionalities that wrist-worn devices need to interoperate with an Android smartphone. Such core functionalities include but are not limited to, connecting via Bluetooth to an Android smartphone, accessing the smartphone’s camera or its GPS. To ensure that this commitment is future-proof, any improvements of those functionalities and relevant updates are also covered.
    • It is not possible for Google to circumvent the Android API commitment by duplicating the core interoperability APIs outside the Android Open Source Project (AOSP). This is because, according to the commitments, Google has to keep the functionalities afforded by the core interoperability APIs, including any improvements related to the functionalities, in open-source code in the future. Any improvements to the functionalities of these core interoperability APIs (including if ever they were made available to Fitbit via a private API) also need to be developed in AOSP and offered in open-source code to Fitbit’s competitors.
    • To ensure that wearable device OEMs have also access to future functionalities, Google will grant these OEMs access to all Android APIs that it will make available to Android smartphone app developers including those APIs that are part of Google Mobile Services (GMS), a collection of proprietary Google apps that is not a part of the Android Open Source Project.
    • Google also will not circumvent the Android API commitment by degrading users experience with third party wrist-worn devices through the display of warnings, error messages or permission requests in a discriminatory way or by imposing on wrist-worn devices OEMs discriminatory conditions on the access of their companion app to the Google Play Store.

The EC allowed the deal to move ahead despite concerns about harms to users in the EU. Amnesty International (AI) sent EC Executive Vice-President Margrethe Vestager a letter, arguing “[t]he merger risks further extending the dominance of Google and its surveillance-based business model, the nature and scale of which already represent a systemic threat to human rights.” AI asserted “[t]he deal is particularly troubling given the sensitive nature of the health data that Fitbit holds that would be acquired by Google.” AI argued “[t]he Commission must ensure that the merger does not proceed unless the two business enterprises can demonstrate that they have taken adequate account of the human rights risks and implemented strong and meaningful safeguards that prevent and mitigate these risks in the future.”

In late December, the Australian Competition & Consumer Commission (ACCC) “announced that it will not accept a long-term behavioural undertaking offered by Google that sought to address competition concerns about its proposed acquisition of wearables supplier and manufacturer Fitbit.” In light of the ongoing fights between the ACCC and Google, this was hardly a surprising outcome. The agency added it “will therefore continue its investigation into Google’s proposed acquisition of Fitbit and has set a new decision date of 25 March 2021.” The agency said Google had offered a deal similar to the one accepted by the EC, but ACCC Chair Rod Sims remarked “[w]hile we are aware that the EC recently accepted a similar undertaking from Google, we are not satisfied that a long term behavioural undertaking of this type in such a complex and dynamic industry could be effectively monitored and enforced in Australia.”

The ACCC claimed:

  • The proposed acquisition also further consolidates Google’s leading position in relation to the collection of user data, which supports its significant market power in online advertising and is likely to have applications in health markets.
  • Google sought to address the ACCC’s competition concerns by offering a court enforceable undertaking that it would behave in certain ways towards rival wearable manufacturers, not use health data for advertising and, in some circumstances, allow competing businesses access to health and fitness data.
  • The proposed acquisition has received conditional clearance in Europe, but several other competition authorities, including the U.S. Department of Justice, are yet to make a decision.  Both companies are based in the U.S. and Fitbit’s market share is higher in the U.S. than in most other countries. The ACCC will continue to work closely with overseas agencies on these important competition issues. 

In its June 2020 Statement of Issues on the proposed merger, the ACCC laid out reasons why Google’s offer not to use Fitbit data for Google Ads (an offer the EC accepted) would not stop the use and possible abuse of such data:

The health and fitness data collected by Fitbit will provide Google with access to consumer data that is likely to be an important element of services in several markets.

Google will not use these data in Google Ads, but what about Google Maps? Could it find ways to profitably use people’s health data, perhaps by selling access to population-level health trends through offerings apart from Google Ads? I would think the answer is yes, even if my example is uninformed or unrealistic.

The ACCC added:

In relation to data-dependent health services, the ACCC is concerned that the acquisition may eliminate potential competition between Fitbit (either under current ownership or under alternative ownership) and Google. Google has a strong focus on new and developing markets and will likely become a strong competitor in the supply of data-dependent health services with or without the proposed acquisition. The health and fitness data collected by Fitbit puts Fitbit in a strong position to enter and compete in data-dependent health markets. The proposed acquisition eliminates this potential competition between Google and Fitbit.

The ACCC’s rejection of the terms accepted by the EU must be seen in light of other regulatory actions. In 2019, the ACCC announced a legal action against Google “alleging they engaged in misleading conduct and made false or misleading representations to consumers about the personal location data Google collects, keeps and uses” according to the agency’s press release. In its initial filing, the ACCC claims that Google misled and deceived the public in contravention of the Australian Consumer Law, and that Android users were harmed because those who switched off Location Services were unaware that their location information was still being collected and used by Google, as it was not readily apparent that Web & App Activity also needed to be switched off.

In October 2020, the United States (U.S.) Department of Justice (DOJ) and a number of states finally filed the antitrust suit against Google that has been rumored to be coming since late summer. This anti-trust action centers on Google’s practices of making Google the default search engine on Android devices and paying browsers and other technology entities to make Google the default search engine. Of course, this type of conduct, even if true, does not necessarily bear on the DOJ’s deliberations on whether the U.S. should act against the Google/Fitbit deal. And yet, given the renewed focus on antitrust in Washington, the DOJ under new President Joe Biden might indeed take a look at the deal on the grounds that a massive company is getting much bigger.

In its press release on the October antitrust action, the DOJ claimed:

Today, the Department of Justice — along with eleven state Attorneys General — filed a civil antitrust lawsuit in the U.S. District Court for the District of Columbia to stop Google from unlawfully maintaining monopolies through anticompetitive and exclusionary practices in the search and search advertising markets and to remedy the competitive harms. The participating state Attorneys General offices represent Arkansas, Florida, Georgia, Indiana, Kentucky, Louisiana, Mississippi, Missouri, Montana, South Carolina, and Texas.

The DOJ added:

As one of the wealthiest companies on the planet with a market value of $1 trillion, Google is the monopoly gatekeeper to the internet for billions of users and countless advertisers worldwide. For years, Google has accounted for almost 90 percent of all search queries in the United States and has used anticompetitive tactics to maintain and extend its monopolies in search and search advertising.  

The DOJ claimed:

As alleged in the Complaint, Google has entered into a series of exclusionary agreements that collectively lock up the primary avenues through which users access search engines, and thus the internet, by requiring that Google be set as the preset default general search engine on billions of mobile devices and computers worldwide and, in many cases, prohibiting preinstallation of a competitor. In particular, the Complaint alleges that Google has unlawfully maintained monopolies in search and search advertising by:

  • Entering into exclusivity agreements that forbid preinstallation of any competing search service.
  • Entering into tying and other arrangements that force preinstallation of its search applications in prime locations on mobile devices and make them undeletable, regardless of consumer preference.
  • Entering into long-term agreements with Apple that require Google to be the default – and de facto exclusive – general search engine on Apple’s popular Safari browser and other Apple search tools.
  • Generally using monopoly profits to buy preferential treatment for its search engine on devices, web browsers, and other search access points, creating a continuous and self-reinforcing cycle of monopolization.

These and other anticompetitive practices harm competition and consumers, reducing the ability of innovative new companies to develop, compete, and discipline Google’s behavior. 

In December, two other suits were filed against Google, each targeting the company’s dominance in the search engine and online advertising markets. One suit is led by Colorado’s attorney general and the other by Texas’ attorney general. The two suits have overlapping but different foci, and it is possible these new suits get folded into the suit the DOJ filed against Google. There are also media reports that some of the states that brought these suits may be preparing yet another antitrust action against Google over allegedly anticompetitive behavior in how it operates its Google Play app store (see here for more detail).


ePrivacy Exception Proposed

A broad exception to the EU’s privacy rules aimed at fighting child sexual abuse online has been proposed, but it has not yet been enacted.

My apologies. The first version of this post erroneously asserted the derogation to the ePrivacy Directive had been enacted. It has not, and this post has been re-titled and updated to reflect this fact.

As the European Union (EU) continues to work on enacting a modernized ePrivacy Directive (Directive 2002/58/EC) to complement the General Data Protection Regulation (GDPR), it has proposed an exemption to manage a change in another EU law that sweeps “number-independent interpersonal communications services” into the current regulatory structure for electronic communications. The policy justification for allowing a categorical exemption to the ePrivacy Directive is combatting child sexual abuse online. This derogation of EU law is limited to at most five years, and quite possibly less if the EU can enact a successor to the ePrivacy Directive, an ePrivacy Regulation. However, it is unclear when this derogation will be agreed upon and enacted.

In September 2020, the European Commission (EC) issued “a Proposal for a Regulation on a temporary derogation from certain provisions of the ePrivacy Directive 2002/58/EC as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online.” The change in law that prompted the proposal took effect on 21 December 2020, but the derogation itself has not been enacted. Separately, the EC has issued a draft compromise ePrivacy Regulation, the result of extensive negotiations, for the GDPR was enacted with an update of the ePrivacy Directive in mind.

In early December, an EU Parliament committee approved the proposed derogation, but the full Parliament has not yet acted upon the measure, and the Parliament would still need to reach agreement with the Presidency of the Council and the European Commission. In its press release, the Committee on Civil Liberties, Justice and Home Affairs explained:

The proposed regulation will provide for limited and temporary changes to the rules governing the privacy of electronic communications so that over the top (“OTT”) interpersonal communication services, such as web messaging, voice over Internet Protocol (VoIP), chat and web-based email services, can continue to detect, report and remove child sexual abuse online on a voluntary basis.

Article 1 sets out the scope and aim of the temporary regulation:

This Regulation lays down temporary and strictly limited rules derogating from certain obligations laid down in Directive 2002/58/EC, with the sole objective of enabling providers of number-independent interpersonal communications services to continue the use of technologies for the processing of personal and other data to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services.

The EC explained the legal and policy background for the exemption to the ePrivacy Directive:

  • On 21 December 2020, with the entry into application of the European Electronic Communications Code (EECC), the definition of electronic communications services will be replaced by a new definition, which includes number-independent interpersonal communications services. From that date on, these services will, therefore, be covered by the ePrivacy Directive, which relies on the definition of the EECC. This change concerns communications services like webmail messaging services and internet telephony.
  • Certain providers of number-independent interpersonal communications services are already using specific technologies to detect child sexual abuse on their services and report it to law enforcement authorities and to organisations acting in the public interest against child sexual abuse, and/or to remove child sexual abuse material. These organisations refer to national hotlines for reporting child sexual abuse material, as well as organisations whose purpose is to reduce child sexual exploitation, and prevent child victimisation, located both within the EU and in third countries.
  • Child sexual abuse is a particularly serious crime that has wide-ranging and serious life-long consequences for victims. In hurting children, these crimes also cause significant and long-term social harm. The fight against child sexual abuse is a priority for the EU. On 24 July 2020, the European Commission adopted an EU strategy for a more effective fight against child sexual abuse, which aims to provide an effective response, at EU level, to the crime of child sexual abuse. The Commission announced that it will propose the necessary legislation to tackle child sexual abuse online effectively including by requiring relevant online services providers to detect known child sexual abuse material and oblige them to report that material to public authorities by the second quarter of 2021. The announced legislation will be intended to replace this Regulation, by putting in place mandatory measures to detect and report child sexual abuse, in order to bring more clarity and certainty to the work of both law enforcement and relevant actors in the private sector to tackle online abuse, while ensuring respect of the fundamental rights of the users, including in particular the right to freedom of expression and opinion, protection of personal data and privacy, and providing for mechanisms to ensure accountability and transparency.

The EC baldly asserts that the problem of online child sexual abuse justifies a loophole in the broad prohibition on violating the privacy of EU persons. The EC did note that the fight against this sort of crime is one of its political priorities, one that ostensibly puts the EU close to the views of the Five Eyes nations that have been pressuring technology companies to end the practice of making apps and hardware encrypted by default.

The EC explained:

The present proposal therefore presents a narrow and targeted legislative interim solution with the sole objective of creating a temporary and strictly limited derogation from the applicability of Articles 5(1) and 6 of the ePrivacy Directive, which protect the confidentiality of communications and traffic data. This proposal respects the fundamental rights, including the rights to privacy and protection of personal data, while enabling providers of number-independent interpersonal communications services to continue using specific technologies and continue their current activities to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services, pending the adoption of the announced long-term legislation. Voluntary efforts to detect solicitation of children for sexual purposes (“grooming”) also must be limited to the use of existing, state-of-the-art technology that corresponds to the safeguards set out. This Regulation should cease to apply in December 2025.

The EC added “[i]n case the announced long-term legislation is adopted and enters into force prior to this date, that legislation should repeal the present Regulation.”

In November, the European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski published his opinion on the temporary, limited derogation from the EU’s rules on electronic communications and privacy. Wiewiórowski cautioned that a short-term exception, however well-intended, would lead to future loopholes that would ultimately undermine the purpose of the legislation. Moreover, Wiewiórowski found that the derogation lacked sufficiently specific guidance and safeguards and was not proportionate. Wiewiórowski argued:

  • In particular, he notes that the measures envisaged by the Proposal would constitute an interference with the fundamental rights to respect for private life and data protection of all users of very popular electronic communications services, such as instant messaging platforms and applications. Confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life. Even voluntary measures by private companies constitute an interference with these rights when the measures involve the monitoring and analysis of the content of communications and processing of personal data.
  • The EDPS wishes to underline that the issues at stake are not specific to the fight against child abuse but to any initiative aiming at collaboration of the private sector for law enforcement purposes. If adopted, the Proposal will inevitably serve as a precedent for future legislation in this field. The EDPS therefore considers it essential that the Proposal is not adopted, even in the form of a temporary derogation, until all the necessary safeguards set out in this Opinion are integrated.
  • In particular, in the interest of legal certainty, the EDPS considers that it is necessary to clarify whether the Proposal itself is intended to provide a legal basis for the processing within the meaning of the GDPR, or not. If not, the EDPS recommends clarifying explicitly in the Proposal which legal basis under the GDPR would be applicable in this particular case.
  • In this regard, the EDPS stresses that guidance by data protection authorities cannot substitute compliance with the requirement of legality. It is insufficient to provide that the temporary derogation is “without prejudice” to the GDPR and to mandate prior consultation of data protection authorities. The co-legislature must take its responsibility and ensure that the proposed derogation complies with the requirements of Article 15(1), as interpreted by the CJEU.
  • In order to satisfy the requirement of proportionality, the legislation must lay down clear and precise rules governing the scope and application of the measures in question and imposing minimum safeguards, so that the persons whose personal data is affected have sufficient guarantees that data will be effectively protected against the risk of abuse.
  • Finally, the EDPS is of the view that the five-year period as proposed does not appear proportional given the absence of (a) a prior demonstration of the proportionality of the envisaged measure and (b) the inclusion of sufficient safeguards within the text of the legislation. He considers that the validity of any transitional measure should not exceed 2 years.

The Five Eyes nations (Australia, Canada, New Zealand, the United Kingdom, and the United States) issued a joint statement in which their ministers called for quick action.

In this statement, we highlight how from 21 December 2020, the ePrivacy Directive, applied without derogation, will make it easier for children to be sexually exploited and abused without detection – and how the ePrivacy Directive could make it impossible both for providers of internet communications services, and for law enforcement, to investigate and prevent such exploitation and abuse. It is accordingly essential that the European Union adopt urgently the derogation to the ePrivacy Directive as proposed by the European Commission in order for the essential work carried out by service providers to shield endangered children in Europe and around the world to continue.

Without decisive action, from 21 December 2020 internet-based messaging services and e-mail services captured by the European Electronic Communications Code’s (EECC) new, broader definition of ‘electronic communications services’ are covered by the ePrivacy Directive. The providers of electronic communications services must comply with the obligation to respect the confidentiality of communications and the conditions for processing communications data in accordance with the ePrivacy Directive. In the absence of any relevant national measures made under Article 15 of that Directive, this will have the effect of making it illegal for service providers operating within the EU to use their current tools to protect children, with the impact on victims felt worldwide.

As mentioned, this derogation comes at a time when the EC and the EU nations are trying to finalize and enact an ePrivacy Regulation. In the original 2017 proposal, the EC stated:

The ePrivacy Directive ensures the protection of fundamental rights and freedoms, in particular the respect for private life, confidentiality of communications and the protection of personal data in the electronic communications sector. It also guarantees the free movement of electronic communications data, equipment and services in the Union.

The ePrivacy Regulation is intended to work in concert with the GDPR, and the draft 2020 regulation contains the following passages explaining the intended interplay of the two regulatory schemes:

  • Regulation (EU) 2016/679 regulates the protection of personal data. This Regulation protects in addition the respect for private life and communications. The provisions of this Regulation particularise and complement the general rules on the protection of personal data laid down in Regulation (EU) 2016/679. This Regulation therefore does not lower the level of protection enjoyed by natural persons under Regulation (EU) 2016/679. The provisions particularise Regulation (EU) 2016/679 as regards personal data by translating its principles into specific rules. If no specific rules are established in this Regulation, Regulation (EU) 2016/679 should apply to any processing of data that qualify as personal data. The provisions complement Regulation (EU) 2016/679 by setting forth rules regarding subject matters that are not within the scope of Regulation (EU) 2016/679, such as the protection of the rights of end-users who are legal persons. Processing of electronic communications data by providers of electronic communications services and networks should only be permitted in accordance with this Regulation. This Regulation does not impose any obligations on the end-user. End-users who are legal persons may have rights conferred by Regulation (EU) 2016/679 to the extent specifically required by this Regulation.
  • While the principles and main provisions of Directive 2002/58/EC of the European Parliament and of the Council remain generally sound, that Directive has not fully kept pace with the evolution of technological and market reality, resulting in an inconsistent or insufficiently effective protection of privacy and confidentiality in relation to electronic communications. Those developments include the entrance on the market of electronic communications services that from a consumer perspective are substitutable to traditional services, but do not have to comply with the same set of rules. Another development concerns new techniques that allow for tracking of online behaviour of end-users, which are not covered by Directive 2002/58/EC. Directive 2002/58/EC should therefore be repealed and replaced by this Regulation.
