ePrivacy Exception Proposed

A broad exception to the EU’s privacy regulations has been proposed, but it has not yet been enacted.

My apologies. The first version of this post erroneously asserted the derogation to the ePrivacy Directive had been enacted. It has not, and this post has been re-titled and updated to reflect this fact.

As the European Union (EU) continues to work on enacting a modernized ePrivacy Directive (Directive 2002/58/EC) to complement the General Data Protection Regulation (GDPR), it has proposed an exemption to manage a change in another EU law that sweeps “number-independent interpersonal communications services” into the current regulatory structure for electronic communications. The policy justification for allowing a categorical exemption to the ePrivacy Directive is combatting child sexual abuse online. This derogation of EU law would be limited to at most five years, and quite possibly less if the EU can enact a successor to the ePrivacy Directive, an ePrivacy Regulation. However, it is unclear when this derogation will be agreed upon and enacted.

In September 2020, the European Commission (EC) issued “a Proposal for a Regulation on a temporary derogation from certain provisions of the ePrivacy Directive 2002/58/EC as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online.” The change prompting the proposal, the European Electronic Communications Code’s expanded definition of electronic communications services, took effect on 21 December 2020, but the derogation itself has yet to be enacted. The EC has also issued a draft compromise ePrivacy Regulation, the result of extensive negotiations. The GDPR was enacted with an update of the ePrivacy Directive in mind.

In early December, an EU Parliament committee approved the proposed derogation, but the full Parliament has not yet acted upon the measure. The Parliament also needs to reach agreement with the Presidency of the Council and the European Commission. In its press release, the Committee on Civil Liberties, Justice and Home Affairs explained:

The proposed regulation will provide for limited and temporary changes to the rules governing the privacy of electronic communications so that over the top (“OTT”) communication interpersonal services, such as web messaging, voice over Internet Protocol (VoIP), chat and web-based email services, can continue to detect, report and remove child sexual abuse online on a voluntary basis.

Article 1 sets out the scope and aim of the temporary regulation:

This Regulation lays down temporary and strictly limited rules derogating from certain obligations laid down in Directive 2002/58/EC, with the sole objective of enabling providers of number-independent interpersonal communications services to continue the use of technologies for the processing of personal and other data to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services.

The EC explained the legal and policy background for the exemption to the ePrivacy Directive:

  • On 21 December 2020, with the entry into application of the European Electronic Communications Code (EECC), the definition of electronic communications services will be replaced by a new definition, which includes number-independent interpersonal communications services. From that date on, these services will, therefore, be covered by the ePrivacy Directive, which relies on the definition of the EECC. This change concerns communications services like webmail messaging services and internet telephony.
  • Certain providers of number-independent interpersonal communications services are already using specific technologies to detect child sexual abuse on their services and report it to law enforcement authorities and to organisations acting in the public interest against child sexual abuse, and/or to remove child sexual abuse material. These organisations refer to national hotlines for reporting child sexual abuse material, as well as organisations whose purpose is to reduce child sexual exploitation, and prevent child victimisation, located both within the EU and in third countries.
  • Child sexual abuse is a particularly serious crime that has wide-ranging and serious life-long consequences for victims. In hurting children, these crimes also cause significant and long-term social harm. The fight against child sexual abuse is a priority for the EU. On 24 July 2020, the European Commission adopted an EU strategy for a more effective fight against child sexual abuse, which aims to provide an effective response, at EU level, to the crime of child sexual abuse. The Commission announced that it will propose the necessary legislation to tackle child sexual abuse online effectively including by requiring relevant online services providers to detect known child sexual abuse material and oblige them to report that material to public authorities by the second quarter of 2021. The announced legislation will be intended to replace this Regulation, by putting in place mandatory measures to detect and report child sexual abuse, in order to bring more clarity and certainty to the work of both law enforcement and relevant actors in the private sector to tackle online abuse, while ensuring respect of the fundamental rights of the users, including in particular the right to freedom of expression and opinion, protection of personal data and privacy, and providing for mechanisms to ensure accountability and transparency.

The EC baldly asserts the problem of child online sexual abuse justifies a loophole to the broad prohibition on violating the privacy of EU persons. The EC did note that the fight against this sort of crime is a political priority for the EC, one that ostensibly puts the EU close to the views of the Five Eyes nations that have been pressuring technology companies to end the practice of making apps and hardware encrypted by default.

The EC explained:

The present proposal therefore presents a narrow and targeted legislative interim solution with the sole objective of creating a temporary and strictly limited derogation from the applicability of Articles 5(1) and 6 of the ePrivacy Directive, which protect the confidentiality of communications and traffic data. This proposal respects the fundamental rights, including the rights to privacy and protection of personal data, while enabling providers of number-independent interpersonal communications services to continue using specific technologies and continue their current activities to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services, pending the adoption of the announced long-term legislation. Voluntary efforts to detect solicitation of children for sexual purposes (“grooming”) also must be limited to the use of existing, state-of-the-art technology that corresponds to the safeguards set out. This Regulation should cease to apply in December 2025.

The EC added “[i]n case the announced long-term legislation is adopted and enters into force prior to this date, that legislation should repeal the present Regulation.”

In November, European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski published his opinion on the temporary, limited derogation from the EU’s regulation of electronic communications and privacy. Wiewiórowski cautioned that a short-term exception, however well-intended, would lead to future loopholes that would ultimately undermine the purpose of the legislation. Moreover, Wiewiórowski found that the derogation lacks sufficiently specific guidance and safeguards and is not proportional. Wiewiórowski argued:

  • In particular, he notes that the measures envisaged by the Proposal would constitute an interference with the fundamental rights to respect for private life and data protection of all users of very popular electronic communications services, such as instant messaging platforms and applications. Confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life. Even voluntary measures by private companies constitute an interference with these rights when the measures involve the monitoring and analysis of the content of communications and processing of personal data.
  • The EDPS wishes to underline that the issues at stake are not specific to the fight against child abuse but to any initiative aiming at collaboration of the private sector for law enforcement purposes. If adopted, the Proposal will inevitably serve as a precedent for future legislation in this field. The EDPS therefore considers it essential that the Proposal is not adopted, even in the form of a temporary derogation, until all the necessary safeguards set out in this Opinion are integrated.
  • In particular, in the interest of legal certainty, the EDPS considers that it is necessary to clarify whether the Proposal itself is intended to provide a legal basis for the processing within the meaning of the GDPR, or not. If not, the EDPS recommends clarifying explicitly in the Proposal which legal basis under the GDPR would be applicable in this particular case.
  • In this regard, the EDPS stresses that guidance by data protection authorities cannot substitute compliance with the requirement of legality. It is insufficient to provide that the temporary derogation is “without prejudice” to the GDPR and to mandate prior consultation of data protection authorities. The co-legislature must take its responsibility and ensure that the proposed derogation complies with the requirements of Article 15(1), as interpreted by the CJEU.
  • In order to satisfy the requirement of proportionality, the legislation must lay down clear and precise rules governing the scope and application of the measures in question and imposing minimum safeguards, so that the persons whose personal data is affected have sufficient guarantees that data will be effectively protected against the risk of abuse.
  • Finally, the EDPS is of the view that the five-year period as proposed does not appear proportional given the absence of (a) a prior demonstration of the proportionality of the envisaged measure and (b) the inclusion of sufficient safeguards within the text of the legislation. He considers that the validity of any transitional measure should not exceed 2 years.

The Five Eyes nations (Australia, Canada, New Zealand, the United Kingdom, and the United States) issued a joint statement in which their ministers called for quick action.

In this statement, we highlight how from 21 December 2020, the ePrivacy Directive, applied without derogation, will make it easier for children to be sexually exploited and abused without detection – and how the ePrivacy Directive could make it impossible both for providers of internet communications services, and for law enforcement, to investigate and prevent such exploitation and abuse. It is accordingly essential that the European Union adopt urgently the derogation to the ePrivacy Directive as proposed by the European Commission in order for the essential work carried out by service providers to shield endangered children in Europe and around the world to continue.

Without decisive action, from 21 December 2020 internet-based messaging services and e-mail services captured by the European Electronic Communications Code’s (EECC) new, broader definition of ‘electronic communications services’ are covered by the ePrivacy Directive. The providers of electronic communications services must comply with the obligation to respect the confidentiality of communications and the conditions for processing communications data in accordance with the ePrivacy Directive. In the absence of any relevant national measures made under Article 15 of that Directive, this will have the effect of making it illegal for service providers operating within the EU to use their current tools to protect children, with the impact on victims felt worldwide.

As mentioned, this derogation comes at a time when the EC and the EU nations are trying to finalize and enact an ePrivacy Regulation. In the original 2017 proposal, the EC stated:

The ePrivacy Directive ensures the protection of fundamental rights and freedoms, in particular the respect for private life, confidentiality of communications and the protection of personal data in the electronic communications sector. It also guarantees the free movement of electronic communications data, equipment and services in the Union.

The ePrivacy Regulation is intended to work in concert with the GDPR, and the draft 2020 regulation contains the following passages explaining the intended interplay of the two regulatory schemes:

  • Regulation (EU) 2016/679 regulates the protection of personal data. This Regulation protects in addition the respect for private life and communications. The provisions of this Regulation particularise and complement the general rules on the protection of personal data laid down in Regulation (EU) 2016/679. This Regulation therefore does not lower the level of protection enjoyed by natural persons under Regulation (EU) 2016/679. The provisions particularise Regulation (EU) 2016/679 as regards personal data by translating its principles into specific rules. If no specific rules are established in this Regulation, Regulation (EU) 2016/679 should apply to any processing of data that qualify as personal data. The provisions complement Regulation (EU) 2016/679 by setting forth rules regarding subject matters that are not within the scope of Regulation (EU) 2016/679, such as the protection of the rights of end-users who are legal persons. Processing of electronic communications data by providers of electronic communications services and networks should only be permitted in accordance with this Regulation. This Regulation does not impose any obligations on the end-user. End-users who are legal persons may have rights conferred by Regulation (EU) 2016/679 to the extent specifically required by this Regulation.
  • While the principles and main provisions of Directive 2002/58/EC of the European Parliament and of the Council remain generally sound, that Directive has not fully kept pace with the evolution of technological and market reality, resulting in an inconsistent or insufficient effective protection of privacy and confidentiality in relation to electronic communications. Those developments include the entrance on the market of electronic communications services that from a consumer perspective are substitutable to traditional services, but do not have to comply with the same set of rules. Another development concerns new techniques that allow for tracking of online behaviour of end-users, which are not covered by Directive 2002/58/EC. Directive 2002/58/EC should therefore be repealed and replaced by this Regulation.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Guillaume Périgois on Unsplash

New Cybersecurity Law and Strategy Unveiled

The EU is revising and replacing a 2016 regime to govern cybersecurity across the bloc.

The European Union (EU) is floating a proposal to reform its 2016 cybersecurity law to close gaps the current regime does not address. This proposal was released in concert with a new cybersecurity strategy and a statutory proposal to address physical (i.e., non-cyber) infrastructure. These proposals are the latest in a line of policy changes put forth by the EU’s new leadership to make this decade the EU’s Digital Decade. It may, however, take years for these proposals to become law. For example, the successor to the ePrivacy Directive has been held up in negotiations for the last few years.

New European Commission (EC) President Ursula von der Leyen spelled out her vision for the EU for the years 2019 through 2024, including “A Europe fit for the digital age.” In its February 2020 “Communication: Shaping Europe’s digital future,” the EC detailed how von der Leyen’s vision would be effectuated:

A European cybersecurity strategy, including the establishment of a joint Cybersecurity Unit, a Review of the Security of Network and Information Systems (NIS) Directive and giving a push to the single market for cybersecurity.

To this end, in mid-December 2020, the EC and the High Representative of the Union for Foreign Affairs and Security Policy unveiled a new EU Cybersecurity Strategy and “proposals to address both cyber and physical resilience of critical entities and networks: a Directive on measures for high common level of cybersecurity across the Union (revised NIS Directive or ‘NIS 2′), and a new Directive on the resilience of critical entities.”

Let us turn to the NIS 2 first. This proposal would replace the 2016 “Directive on security of network and information systems (NIS Directive)” ((EU) 2016/1148) currently in effect throughout the EU. NIS 2 would impose new obligations and responsibilities on EU member states and on essential and important entities. The nations of the EU would need to draft and implement cybersecurity frameworks/strategies, which include setting up vulnerability disclosure programs, voluntary cybersecurity information sharing programs, a policy to address information and communications technology (ICT) supply chain risk, and cybersecurity standards for publicly bought and used ICT. EU nations would also need to name “competent” national authorities to enforce NIS 2, for the EC identified lax or non-existent enforcement of existing cybersecurity laws as a rationale for the new proposal. Such authorities must be empowered to issue binding directives and, if necessary, warnings or instructions to cease certain conduct. These authorities must also work with data protection authorities in the event of data breaches. NIS 2 also provides for administrative fines and penalties to be established in the laws of EU nations.

Additionally, all EU nations should have computer security incident response teams (CSIRTs). NIS 2 would apply to a number of public and private entities in sectors deemed “essential”: energy; transport; banking; financial market infrastructures; health; drinking water; waste water; digital infrastructure; public administration; and space. Other public and private entities would be “important” entities subject to NIS 2 in these sectors: postal and courier services; waste management; manufacture, production and distribution of chemicals; food production, processing and distribution; manufacturing; and digital providers. Micro and small entities would largely not be swept up into NIS 2 even if they are part of one of the aforementioned sectors. However, “providers of electronic communications networks or of publicly available electronic communications services, trust service providers, Top-level domain name (TLD) name registries and public administration, and certain other entities” would be governed by NIS 2 regardless of their size.

The EU would also establish a Cooperation Group tasked with helping EU nations work more harmoniously under the NIS 2. However, this new body, unlike, say, the European Data Protection Board (EDPB) created by the General Data Protection Regulation, would not have the power to compel its members to comply with NIS 2.

Notably, NIS 2 would require that: “Member States shall ensure that essential and important entities shall take appropriate and proportionate technical and organisational measures to manage the risks posed to the security of network and information systems which those entities use in the provision of their services.” The law lists a number of elements that must go into these measures. Moreover, “essential and important entities notify, without undue delay, the competent authorities or the CSIRT…of any incident having a significant impact on the provision of their services.” The NIS 2 lays out broad criteria as to what constitutes a “significant impact:”

  • the incident has caused or has the potential to cause substantial operational disruption or financial losses for the entity concerned;
  • the incident has affected or has the potential to affect other natural or legal persons by causing considerable material or non-material losses.

In order to address ICT supply chain risk, EU countries may elect to “require essential and important entities to certify certain ICT products, ICT services and ICT processes under specific European cybersecurity certification schemes adopted” under the legislation that created the European Union Agency for Cybersecurity (ENISA).

As noted earlier, EU nations must establish systems for essential and important entities to share information but need not compel them to do so. Article 26 provides that nations “shall ensure that essential and important entities may exchange relevant cybersecurity information among themselves including information relating to cyber threats, vulnerabilities, indicators of compromise, tactics, techniques and procedures, cybersecurity alerts and configuration tools.” EU countries must also have a system through which entities that are neither essential nor important, or that come from sectors not covered by NIS 2, can voluntarily submit information.

The EC argued that the NIS Directive is now outdated and is in desperate need of revision to reflect current realities:

Notwithstanding its notable achievements, the NIS Directive, which paved the way for a significant change in mind-set, in relation to the institutional and regulatory approach to cybersecurity in many Member States, has also proven its limitations. The digital transformation of society (intensified by the COVID-19 crisis) has expanded the threat landscape and is bringing about new challenges which require adapted and innovative responses. The number of cyber-attacks continues to rise, with increasingly sophisticated attacks coming from a wide range of sources inside and outside the EU.

The EC highlighted some of the limitations in how the NIS Directive has been implemented by EU member states and its failure to drive the adoption of better cyber practices by EU businesses:

The evaluation on the functioning of the NIS Directive, conducted for the purposes of the Impact Assessment, identified the following issues: (1) the low level of cyber resilience of businesses operating in the EU; (2) the inconsistent resilience across Member States and sectors; and (3) the low level of joint situational awareness and lack of joint crisis response. For example, certain major hospitals in a Member State do not fall within the scope of the NIS Directive and hence are not required to implement the resulting security measures, while in another Member State almost every single healthcare provider in the country is covered by the NIS security requirements.

The EC explained how the NIS 2 relates to a proposal released the same day to address physical infrastructure in the EU:

The proposal is therefore closely aligned with the proposal for a Directive on the resilience of critical entities, which aims at enhancing the resilience of critical entities against physical threats in a large number of sectors. The proposal aims to ensure that competent authorities under both legal acts take complementary measures and exchange information as necessary regarding cyber and non-cyber resilience, and that particularly critical operators in the sectors considered to be ‘essential’ per the proposal at hand are also subject to more general resilience-enhancing obligations with an emphasis on non-cyber risks.

The EC’s impact assessment on how well the NIS Directive is working shows limitations in scope and application, some of which may be attributed to changes in the EU and the world:

  • The scope of the NIS Directive is too limited in terms of the sectors covered, mainly due to: (i) increased digitisation in recent years and a higher degree of interconnectedness, (ii) the scope of the NIS Directive no longer reflecting all digitised sectors providing key services to the economy and society as a whole.
  • The NIS Directive is not sufficiently clear when it comes to the scope for operators of essential services and its provisions do not provide sufficient clarity regarding national competence over digital service providers. This has led to a situation in which certain types of entities have not been identified in all Member States and are therefore not required to put in place security measures and report incidents.
  • The NIS Directive allowed wide discretion to the Member States when laying down security and incident reporting requirements for operators of essential services (hereinafter called ‘OES(s)’). The evaluation shows that in some instances Member States have implemented these requirements in significantly different ways, creating additional burden for companies operating in more than one Member State.
  • The supervision and enforcement regime of the NIS Directive is ineffective. For example, Member States have been very reluctant to apply penalties to entities failing to put in place security requirements or report incidents. This can have negative consequences for the cyber resilience of individual entities.
  • The financial and human resources set aside by Member States for fulfilling their tasks (such as OES identification or supervision), and consequently the different levels of maturity in dealing with cybersecurity risks, vary greatly. This further exacerbates the differences in cyber resilience between Member States.
  • Member States do not share information systematically with one another, with negative consequences in particular for the effectiveness of the cybersecurity measures and for the level of joint situational awareness at EU level. This is also the case for information sharing among private entities, and for the engagement between the EU level cooperation structures and private entities.

The EC’s proposal contains a summary of what the new law would do:

  • The Directive, in particular: (a) lays down obligations for the Member States to adopt a national cybersecurity strategy, designate competent national authorities, single points of contact and CSIRTs; (b) provides that Member States shall lay down cybersecurity risk management and reporting obligations for entities referred to as essential entities in Annex I and important entities in Annex II; (c) provides that Member States shall lay down obligations on cybersecurity information sharing.
  • It applies to certain public or private essential entities operating in the sectors listed in Annex I (energy; transport; banking; financial market infrastructures; health, drinking water; waste water; digital infrastructure; public administration and space) and certain important entities operating in the sectors listed in Annex II (postal and courier services; waste management; manufacture, production and distribution of chemicals; food production, processing and distribution; manufacturing and digital providers). Micro and small entities within the meaning of Commission Recommendation 2003/361/EC of 6 May 2003 are excluded from the scope of the Directive, except for providers of electronic communications networks or of publicly available electronic communications services, trust service providers, Top-level domain name (TLD) name registries and public administration, and certain other entities, such as the sole provider of a service in a Member State.

The EC also released “The EU’s Cybersecurity Strategy for the Digital Decade” alongside the NIS 2 “to ensure a global and open Internet with strong guardrails to address the risks to the security and fundamental rights and freedoms of people in Europe.” The EC spelled out its dramatic plan to remake how the bloc regulates, invests in, and structures policies around cybersecurity. The EC claimed “[a]s a key component of Shaping Europe’s Digital Future, the Recovery Plan for Europe and the EU Security Union Strategy, the Strategy will bolster Europe’s collective resilience against cyber threats and help to ensure that all citizens and businesses can fully benefit from trustworthy and reliable services and digital tools.” If the EU follows through, this strategy may have significant effects in the EU and around the world.

The EC further explained:

  • Following the progress achieved under the previous strategies, it contains concrete proposals for deploying three principal instruments – regulatory, investment and policy instruments – to address three areas of EU action – (1) resilience, technological sovereignty and leadership, (2) building operational capacity to prevent, deter and respond, and (3) advancing a global and open cyberspace. The EU is committed to supporting this strategy through an unprecedented level of investment in the EU’s digital transition over the next seven years – potentially quadrupling previous levels – as part of new technological and industrial policies and the recovery agenda.
  • Cybersecurity must be integrated into all these digital investments, particularly key technologies like Artificial Intelligence (AI), encryption and quantum computing, using incentives, obligations and benchmarks. This can stimulate the growth of the European cybersecurity industry and provide the certainty needed to ease the phasing out of legacy systems. The European Defence Fund (EDF) will support European cyber defence solutions, as part of the European defence technological and industrial base. Cybersecurity is included in external financial instruments to support our partners, notably the Neighbourhood, Development and International Cooperation Instrument. Preventing the misuse of technologies, protecting critical infrastructure and ensuring the integrity of supply chains also enables the EU’s adherence to the UN norms, rules and principles of responsible state behavior.

Per the EC’s press release, the “Directive on the resilience of critical entities” “expands both the scope and depth of the 2008 European Critical Infrastructure directive.” The EC added:

Ten sectors are now covered: energy, transport, banking, financial market infrastructures, health, drinking water, waste water, digital infrastructure, public administration and space. Under the proposed directive, Member States would each adopt a national strategy for ensuring the resilience of critical entities and carry out regular risk assessments. These assessments would also help identify a smaller subset of critical entities that would be subject to obligations intended to enhance their resilience in the face of non-cyber risks, including entity-level risk assessments, taking technical and organisational measures, and incident notification. The Commission, in turn, would provide complementary support to Member States and critical entities, for instance by developing a Union-level overview of cross-border and cross-sectoral risks, best practice, methodologies, cross-border training activities and exercises to test the resilience of critical entities.


Image by Prawny from Pixabay

New Google Antitrust Suits Filed

State attorneys general have filed two new suits against Google. If the conduct they detail isn’t illegal behavior, get ready for even more shocking conduct from technology companies to stymie competitors and extract the maximum of any and all rents.

Last month, two new suits were filed against Google, arguing that the company has unlawfully acquired and maintained its dominance in the search engine and online advertising markets. One suit is led by Colorado’s attorney general and the other by Texas’ attorney general. The two suits have overlapping but different foci, and it is possible these new suits get folded into the suit against Google filed by the United States (U.S.) Department of Justice (DOJ). There are also media reports that some of the states that brought these suits may be preparing yet another antitrust action against Google over allegedly monopolistic behavior in how it operates its Google Play app store.

Colorado Attorney General Phil Weiser and 38 other state attorneys general[1] filed their antitrust suit in the United States District Court for the District of Columbia “under Section 2 of the Sherman Act, 15 U.S.C. § 2, to restrain Google from unlawfully restraining trade and maintaining monopolies in markets that include general search services, general search text advertising, and general search advertising in the United States, and to remedy the effects of this conduct.” They are asking the court for a range of relief, including but not limited to permanent injunctions to stop ongoing and future anti-competitive conduct and a possible breakup of the company.

Weiser and his counterparts framed their argument this way:

Google, one of the largest companies in the world, has methodically undertaken actions to entrench and reinforce its general search services and search-related advertising monopolies by stifling competition. As the gateway to the internet, Google has systematically degraded the ability of other companies to access consumers. In doing so, just as Microsoft improperly maintained its monopoly through conduct directed at Netscape, Google has improperly maintained and extended its search-related monopolies through exclusionary conduct that has harmed consumers, advertisers, and the competitive process itself. Google, moreover, cannot establish business justifications or procompetitive benefits sufficient to justify its exclusionary conduct in any relevant market.

They summed up the three forms of anticompetitive conduct at the heart of their legal argument:

  • First, Google uses its massive financial resources to limit the number of consumers who use a Google competitor. For example, according to public estimates Google pays Apple between $8 and $12 billion per year to ensure that Google is enthroned as the default search engine on Apple devices, and it limits general search competition on Android devices with a web of restrictive contracts. Google pursues similar strategies with other devices, such as voice assistants and internet-connected cars.
  • Second, Google’s Search Ads 360 (“SA360”) service, a search advertising marketing tool used by many of the world’s most sophisticated advertisers, has long pledged to offer advertisers a “neutral” means for purchasing and comparing the performance of not only Google’s search advertising, but also that of its closest competitors. But, in reality, Google operates SA360—the single largest such tool used by advertisers—to severely limit the tool’s interoperability with a competitor, thereby disadvantaging SA360 advertisers.
  • Third, Google throttles consumers from bypassing its general search engine and going directly to their chosen destination, especially when those destinations threaten Google’s monopoly power. Google acknowledges its [REDACTED] because of the proliferation of services offered by specialized vertical providers. Specialized vertical providers, like an online travel agency who offer consumers the ability to complete a transaction then and there, do not compete in Google’s search-related markets. Nevertheless, they pose a threat to Google’s monopoly power in those markets because their success would both strengthen general search rivals with whom they partner and lower the artificially high barriers to expansion and entry that protect Google’s monopolies.

In summary, Weiser and his colleagues argued:

  • Google has willfully maintained, abused, and extended its monopoly power in general search services through (a) anticompetitive and exclusionary distribution agreements that lock up the present default positions for search access points on browsers, mobile devices, computers, and other devices as well as emerging device technology; require preinstallation and prominent placement of Google’s apps; and tie Google’s search access points to Google Play and Google APIs; (b) operation of SA360 to limit the tool’s interoperability with a competitor, disadvantaging SA360 advertisers; (c) discriminatory treatment towards specialized vertical providers in certain commercial segments that hinders consumers’ ability to find responsive information; and (d) other restrictions that drive queries to Google at the expense of search rivals.
  • Google has willfully maintained, abused, and extended its monopoly power in general search advertising through (a) anticompetitive and exclusionary distribution agreements that lock up the present default positions for search access points on browsers, mobile devices, computers, and other devices as well as emerging device technology; require preinstallation and prominent placement of Google’s apps; and tie Google’s search access points to Google Play and Google APIs; (b) operation of SA360 to limit the tool’s interoperability with a competitor, disadvantaging SA360 advertisers; (c) discriminatory treatment towards specialized vertical providers in certain commercial segments that hinders consumers’ ability to find responsive information; and (d) other restrictions that drive queries to Google at the expense of search rivals.
  • Google has willfully maintained, abused, and extended its monopoly power in general search text advertising through (a) anticompetitive and exclusionary distribution agreements that lock up the present default positions for search access points on browsers, mobile devices, computers, and other devices as well as emerging device technology; require preinstallation and prominent placement of Google’s apps; and tie Google’s search access points to Google Play and Google APIs; (b) operation of SA360 to limit the tool’s interoperability with a competitor, disadvantaging SA360 advertisers; (c) discriminatory treatment towards specialized vertical providers in certain commercial segments that hinders consumers’ ability to find responsive information; and (d) other restrictions that drive queries to Google at the expense of search rivals.

Texas Attorney General Ken Paxton and nine other attorneys general[2] filed their antitrust action in the Eastern District of Texas and dropped a bomb: they allege Google and Facebook conspired to monopolize the online advertising market after publishers had devised a system to blunt Google’s dominance. Moreover, Paxton and his colleagues argue that Google’s illegal actions have essentially taxed Americans through higher prices and lower-quality products and services because companies are forced to pay a premium to Google to advertise online.

Paxton and the attorneys general summarized their suit and the relief they think appropriate in light of Google’s conduct:

As a result of Google’s anticompetitive conduct, including its unlawful agreement with Facebook, Google has violated and continues to violate Sections 1 and 2 of the Sherman Act, 15 U.S.C. §§ 1, 2. Plaintiff States bring this action to remove the veil of Google’s secret practices and end Google’s abuse of its monopoly power in online advertising markets. Plaintiff States seek to restore free and fair competition to these markets and to secure structural, behavioral, and monetary relief to prevent Google from ever again engaging in deceptive trade practices and abusing its monopoly power to foreclose competition and harm consumers.

They summed up the harm they think Google has wrought:

Plaintiff States have sustained antitrust injury as a direct and proximate cause of Google’s unlawful conduct, in at least the following ways: (1) substantially foreclosing competition in the market for publisher ad servers, and using market power in the publisher ad server market to harm competition in the exchange market; (2) substantially foreclosing competition in the exchange market by denying rivals’ access to publisher inventory and to advertiser demand; (3) substantially foreclosing competition in the market for demand-side buying tools by creating information asymmetry and unfair auctions by virtue of Google’s market dominance in the publisher ad serving tools and exchange markets; (4) increasing barriers to entry and competition in publisher ad server, exchange, and demand-side buying tools markets; (5) harming innovation, which would otherwise benefit publishers, advertisers and competitors; (6) harming publishers’ ability to effectively monetize their content, reducing publishers’ revenues, and thereby reducing output and harming consumers; (7) reducing advertiser demand and participation in the market by maintaining opacity on margins and selling process, harming rival exchanges and buying tools; (8) increasing advertisers’ costs to advertise and reducing the effectiveness of their advertising, and thereby harming businesses’ return on the investment in delivering their products and services, reducing output, and harming consumers; (9) protecting Google’s products from competitive pressures, thereby allowing it to continue to extract high margins while shielded from significant pressure to innovate.

With regard to another possible antitrust action against Google, the suit Epic Games brought against the tech giant for taking 30% of in-app purchases as a condition of being allowed in the Play Store may shed light on what such a suit could look like. In August, Epic Games filed suit against Google on substantially the same grounds as its action against Apple. Google, acting after Apple did, removed Fortnite from its Play Store once Epic Games started offering users a discounted price to buy directly from Epic rather than through Google. Epic asserted:

  • Epic brings claims under Sections 1 and 2 of the Sherman Act and under California law to end Google’s unlawful monopolization and anti-competitive restraints in two separate markets: (1) the market for the distribution of mobile apps to Android users and (2) the market for processing payments for digital content within Android mobile apps. Epic seeks to end Google’s unfair, monopolistic and anti-competitive actions in each of these markets, which harm device makers, app developers, app distributors, payment processors, and consumers.
  • Epic does not seek monetary compensation from this Court for the injuries it has suffered. Epic likewise does not seek a side deal or favorable treatment from Google for itself. Instead, Epic seeks injunctive relief that would deliver Google’s broken promise: an open, competitive Android ecosystem for all users and industry participants. Such injunctive relief is sorely needed.
  • Google has eliminated competition in the distribution of Android apps using myriad contractual and technical barriers. Google’s actions force app developers and consumers into Google’s own monopolized “app store”—the Google Play Store. Google has thus installed itself as an unavoidable middleman for app developers who wish to reach Android users and vice versa. Google uses this monopoly power to impose a tax that siphons monopoly profits for itself every time an app developer transacts with a consumer for the sale of an app or in-app digital content. And Google further siphons off all user data exchanged in such transactions, to benefit its own app designs and advertising business.
  • If not for Google’s anti-competitive behavior, the Android ecosystem could live up to Google’s promise of open competition, providing Android users and developers with competing app stores that offer more innovation, significantly lower prices and a choice of payment processors. Such an open system is not hard to imagine. Two decades ago, through the actions of courts and regulators, Microsoft was forced to open up the Windows for PC ecosystem. As a result, PC users have multiple options for downloading software unto their computers, either directly from developers’ websites or from several competing stores. No single entity controls the ecosystem or imposes a tax on all transactions. And Google, as the developer of software such as the Chrome browser, is a direct beneficiary of this competitive landscape. Android users and developers likewise deserve free and fair competition.

In late October, the DOJ and a number of states filed a long-awaited antitrust suit against Google that had been rumored since late summer 2020. This antitrust action centers on Google’s practices of making Google the default search engine on Android devices and paying browsers and other technology entities for default status. The DOJ and eleven state attorneys general are following in the footsteps of the European Union’s (EU) €4.34 billion fine of Google in 2018 for imposing “illegal restrictions on Android device manufacturers and mobile network operators to cement its dominant position in general internet search.” The European Commission (EC or Commission) claimed that Google:

  • has required manufacturers to pre-install the Google Search app and browser app (Chrome), as a condition for licensing Google’s app store (the Play Store);
  • made payments to certain large manufacturers and mobile network operators on condition that they exclusively pre-installed the Google Search app on their devices; and
  • has prevented manufacturers wishing to pre-install Google apps from selling even a single smart mobile device running on alternative versions of Android that were not approved by Google (so-called “Android forks”).

The EC said its “decision concludes that Google is dominant in the markets for general internet search services, licensable smart mobile operating systems and app stores for the Android mobile operating system.”

And, of course, this is not Google’s first antitrust case in the EU; the company was fined €2.42 billion in June 2017 “for abusing its dominance as a search engine by giving an illegal advantage to Google’s own comparison shopping service.”

Google’s antitrust and anticompetitive issues are not confined to the United States and the EU. In 2019, the Australian Competition and Consumer Commission (ACCC) announced a legal action against Google, “alleging they engaged in misleading conduct and made false or misleading representations to consumers about the personal location data Google collects, keeps and uses,” according to the agency’s press release. In its initial filing, the ACCC claims that Google misled and deceived the public in contravention of the Australian Consumer Law and that Android users were harmed because those who switched off Location Services were unaware their location information was still being collected and used by Google, as it was not readily apparent that Web & App Activity also needed to be switched off.


Image by Hebi B. from Pixabay


[1] The following states are parties to the suit: Colorado, Nebraska, Arizona, Iowa, New York, North Carolina, Tennessee, Utah, Alaska, Connecticut, Delaware, Hawaii, Idaho, Illinois, Kansas, Maine, Maryland, Minnesota, Nevada, New Hampshire, New Jersey, New Mexico, North Dakota, Ohio, Oklahoma, Oregon, Rhode Island, South Dakota, Vermont, Washington, West Virginia, and Wyoming; the Commonwealths of Massachusetts, Pennsylvania, Puerto Rico, and Virginia; the Territory of Guam; and the District of Columbia.

[2] These states sued Google: Texas, Arkansas, Idaho, Indiana, Mississippi, Missouri, North Dakota, South Dakota, Utah, and the Commonwealth of Kentucky.

Further Reading, Other Developments, and Coming Events (5 January 2021)

Further Reading

  • “China Used Stolen Data To Expose CIA Operatives In Africa And Europe;” “Beijing Ransacked Data as U.S. Sources Went Dark in China;” “Tech Giants Are Giving China A Vital Edge In Espionage” By Zach Dorfman — Foreign Policy. This terrifying trio of articles lays bare the 180-degree change in espionage advantage the People’s Republic of China (PRC) seems to hold over the United States (U.S.). Hacking, big data, processing, algorithms, and other technological issues play prominent roles in the PRC’s seeming advantage. It remains to be seen how the U.S. responds to the new status quo.
  • “Singapore police can access COVID-19 contact tracing data for criminal investigations” By Eileen Yu — ZDNet. During questioning in Singapore’s Parliament, it was revealed the police can use existing authority to access the data on a person’s smartphone collected by the nation’s TraceTogether app. Technically, this would entail a person being asked by the police to upload their data, which is stored on devices and encrypted. Nonetheless, this is the very scenario privacy advocates have been saying is all but inevitable with COVID-19 tracing apps on phones.
  • “As Understanding of Russian Hacking Grows, So Does Alarm” By David Sanger, Nicole Perlroth, and Julian Barnes — The New York Times. Like a detonated bomb, the Russian hack of United States (U.S.) public and private systems keeps getting worse in terms of damage and fallout. The scope continues to widen as it may come to pass that thousands of U.S. entities have been compromised in ways that leave them vulnerable to future attacks. Incidentally, the massive hack has tarnished somewhat the triumph of the U.S. intelligence agencies in fending off interference with the 2020 election.
  • Google workers launch unconventional union with help of Communications Workers of America” By Nitasha Tiku — The Washington Post. A new union formed in Google stopped short of seeking certification by the National Labor Relations Board (NLRB), which will block it from collective bargaining. Nonetheless, the new union will collect dues and have a board of directors. This may lead to additional unionizing efforts in union-averse Silicon Valley and throughout the tech world.
  • “‘Break up the groupthink’: Democrats press Biden to diversify his tech picks” By Cristiano Lima — Politico. Key Democratic groups in the House are pushing the Biden team to appoint people of color for key technology positions at agencies such as the Federal Trade Commission (FTC), the Federal Communications Commission (FCC), and the Office of Science and Technology Policy (OSTP).

Other Developments

  • The Congress overrode President Donald Trump’s veto of the FY 2021 National Defense Authorization Act (NDAA), thus enacting the annual defense and national security policy bill, which includes a number of technology provisions that will have effects in the public and private sectors. (See here and here for analysis of these provisions in the “William M. “Mac” Thornberry National Defense Authorization Act for Fiscal Year 2021” (H.R.6395).)
  • A federal court dismissed a lawsuit brought by a civil liberties and privacy advocacy group to stop implementation of President Donald Trump’s executive order aimed at social media companies and their liability protection under 47 U.S.C. 230 (aka Section 230). In June, the Center for Democracy and Technology (CDT) filed suit in federal court to block enforcement of the “Executive Order (EO) on Preventing Online Censorship.” However, the United States District Court for the District of Columbia ruled that CDT is not injured by the executive order (EO) and that any such lawsuit is premature. The court dismissed the lawsuit for lack of jurisdiction.
    • In its complaint, CDT argued the EO “violates the First Amendment in two fundamental respects:
      • First, the Order is plainly retaliatory: it attacks a private company, Twitter, for exercising its First Amendment right to comment on the President’s statements.
      • Second, and more fundamentally, the Order seeks to curtail and chill the constitutionally protected speech of all online platforms and individuals— by demonstrating the willingness to use government authority to retaliate against those who criticize the government.”
  • The Federal Trade Commission (FTC) reached a settlement with a company that sells emergency travel and medical services for failing “to take reasonable steps to secure sensitive consumer information such as health records,” including maintaining an unsecured cloud database, stumbled upon by a security researcher, that held the sensitive data of more than 130,000 people. Moreover, the company claimed a certification of compliance with the Health Insurance Portability and Accountability Act (HIPAA), which turned out to be untrue. In the complaint, the FTC alleged that these and other practices “constitute unfair and/or deceptive acts or practices, in or affecting commerce in violation of Section 5(a) of the Federal Trade Commission Act.” The FTC and the company reached agreement on a consent order that will require the company’s compliance for at least 20 years.
    • In the complaint, the FTC stated that SkyMed “advertises, offers for sale, and sells nationwide a wide array of emergency travel membership plans that cover up to eighteen different emergency travel and medical evacuation services for members who sustain serious illnesses or injuries during travel in certain geographic areas.”
    • The FTC asserted a security researcher discovered SkyMed’s “database, which could be located and accessed by anyone on the internet, contained approximately 130,000 membership records with consumers’ personal information stored in plain text, including information populated in certain fields for names, dates of birth, gender, home addresses, email addresses, phone numbers, membership information and account numbers, and health information.”
    • The FTC noted the company told affected customers that it had investigated and “[t]here was no medical or payment-related information visible and no indication that the information has been misused.” This turned out to be false: the company’s investigation did not determine that consumers’ health information was not stored on the cloud database or that it had not been improperly accessed by an unauthorized third party.
    • The FTC summarized the terms of the consent order and SkyMed’s obligations:
      • Under the proposed settlement, SkyMed is prohibited from misrepresenting how it secures personal data, the circumstances of and response to a data breach, and whether the company has been endorsed by or participates in any government-sponsored privacy or security program. The company also will be required to send a notice to affected consumers detailing the data that was exposed by the data breach.
      • As part of the mandated information security program, the company must identify and document potential internal and external risks and design, implement, and maintain safeguards to protect personal information it collects from those risks. In addition, SkyMed must obtain biennial assessments of its information security program by a third party, which the FTC has authority to approve, to examine the effectiveness of SkyMed’s information security program, identify any gaps or weaknesses, and monitor efforts to address these problems. The settlement also requires a senior SkyMed executive to certify annually that the company is complying with the requirements of the settlement.
  • The European Commission (EC) has communicated its vision for a new cybersecurity strategy to the European Parliament and European Council “to ensure a global and open Internet with strong guardrails to address the risks to the security and fundamental rights and freedoms of people in Europe.” The EC spelled out its dramatic plan to remake how the bloc regulates, invests in, and structures policies around cybersecurity. The EC claimed “[a]s a key component of Shaping Europe’s Digital Future, the Recovery Plan for Europe and the EU Security Union Strategy, the Strategy will bolster Europe’s collective resilience against cyber threats and help to ensure that all citizens and businesses can fully benefit from trustworthy and reliable services and digital tools.” If the European Union (EU) follows through, this strategy may have significant effects in the EU and around the world. The EC further explained:
    • Following the progress achieved under the previous strategies, it contains concrete proposals for deploying three principal instruments –regulatory, investment and policy instruments – to address three areas of EU action – (1) resilience, technological sovereignty and leadership, (2) building operational capacity to prevent, deter and respond, and (3) advancing a global and open cyberspace. The EU is committed to supporting this strategy through an unprecedented level of investment in the EU’s digital transition over the next seven years – potentially quadrupling previous levels – as part of new technological and industrial policies and the recovery agenda
    • Cybersecurity must be integrated into all these digital investments, particularly key technologies like Artificial Intelligence (AI), encryption and quantum computing, using incentives, obligations and benchmarks. This can stimulate the growth of the European cybersecurity industry and provide the certainty needed to ease the phasing out of legacy systems. The European Defence Fund (EDF) will support European cyber defence solutions, as part of the European defence technological and industrial base. Cybersecurity is included in external financial instruments to support our partners, notably the Neighbourhood, Development and International Cooperation Instrument. Preventing the misuse of technologies, protecting critical infrastructure and ensuring the integrity of supply chains also enables the EU’s adherence to the UN norms, rules and principles of responsible state behavior.
    • With respect to actions that might be taken, the EC stated that “[t]he EU should ensure:
      • Adoption of revised NIS Directive;
      • Regulatory measures for an Internet of Secure Things
      • Through the CCCN investment in cybersecurity (notably through the Digital Europe Programme, Horizon Europe and recovery facility) to reach up to €4.5 billion in public and private investments over 2021-2027;
      • An EU network of AI-enabled Security Operation Centres and an ultra-secure communication infrastructure harnessing quantum technologies;
      • Widespread adoption of cybersecurity technologies through dedicated support to SMEs under the Digital Innovation Hubs;
      • Development of an EU DNS resolver service as a safe and open alternative for EU citizens, businesses and public administration to access the Internet;
      • Completion of the implementation of the 5G Toolbox by the second quarter of 2021;
      • Complete the European cybersecurity crisis management framework and determine the process, milestones and timeline for establishing the Joint Cyber Unit;
      • Continue implementation of cybercrime agenda under the Security Union Strategy;
      • Encourage and facilitate the establishment of a Member States’ cyber intelligence working group residing within the EU INTCEN;
      • Advance the EU’s cyber deterrence posture to prevent, discourage, deter and respond to malicious cyber activities;
      • Review the Cyber Defence Policy Framework;
      • Facilitate the development of an EU “Military Vision and Strategy on Cyberspace as a Domain of Operations” for CSDP military missions and operations;
      • Support synergies between civil, defence and space industries;
      • Reinforce cybersecurity of critical space infrastructures under the Space Programme;
      • Define a set of objectives in international standardisation processes, and promote these at international level;
      • Advance international security and stability in cyberspace, notably through the proposal by the EU and its Member States for a Programme of Action to Advance Responsible State Behaviour in Cyberspace (PoA) in the United Nations;
      • Offer practical guidance on the application of human rights and fundamental freedoms in cyberspace;
      • Better protect children against child sexual abuse and exploitation, as well as a Strategy on the Rights of the Child;
      • Strengthen and promote the Budapest Convention on Cybercrime, including through the work on the Second Additional Protocol to the Budapest Convention;
      • Expand EU cyber dialogue with third countries, regional and international organisations, including through an informal EU Cyber Diplomacy Network;
      • Reinforce the exchanges with the multi-stakeholder community, notably by regular and structured exchanges with the private sector, academia and civil society; and
      • Propose an EU External Cyber Capacity Building Agenda and an EU Cyber Capacity Building Board.
  • The U.S.-China Economic and Security Review Commission released its annual report on the People’s Republic of China (PRC) per its mandate “to monitor, investigate, and report to Congress on the national security implications of the bilateral trade and economic relationship between the United States and the People’s Republic of China.” The Commission argued:
    • Left unchecked, the PRC will continue building a new global order anathema to the interests and values that have underpinned unprecedented economic growth and stability among nations in the post-Cold War era. The past 20 years are littered with the Chinese Communist Party’s (CCP) broken promises. In China’s intended new order, there is little reason to believe CCP promises of “win-win” solutions, mutual respect, and peaceful coexistence. A clear understanding of the CCP’s adversarial national security and economic ambitions is essential as U.S. and allied leaders develop the policies and programs that will define the conditions of global freedom and shape our future.
    • The Commission made ten “Key Recommendations:”
      • Congress adopt the principle of reciprocity as foundational in all legislation bearing on U.S.-China relations.
      • Congress expand the authority of the Federal Trade Commission (FTC) to monitor and take foreign government subsidies into account in premerger notification processes.
      • Congress direct the U.S. Department of State to produce an annual report detailing China’s actions in the United Nations and its subordinate agencies that subvert the principles and purposes of the United Nations.
      • Congress hold hearings to consider the creation of an interagency executive Committee on Technical Standards that would be responsible for coordinating U.S. government policy and priorities on international standards.
      • Congress consider establishing a “Manhattan Project”-like effort to ensure that the American public has access to safe and secure supplies of critical lifesaving and life-sustaining drugs and medical equipment, and to ensure that these supplies are available from domestic sources or, where necessary, trusted allies.
      • Congress enact legislation establishing a China Economic Data Coordination Center (CEDCC) at the Bureau of Economic Analysis at the U.S. Department of Commerce.
      • Congress direct the Administration, when sanctioning an entity in the People’s Republic of China for actions contrary to the economic and national security interests of the United States or for violations of human rights, to also sanction the parent entity.
      • Congress consider enacting legislation to make the Director of the American Institute in Taiwan a presidential nomination subject to the advice and consent of the United States Senate.
      • Congress amend the Immigration and Nationality Act to clarify that association with a foreign government’s technology transfer programs may be considered grounds to deny a nonimmigrant visa if the foreign government in question is deemed a strategic competitor of the United States, or if the applicant has engaged in violations of U.S. laws relating to espionage, sabotage, or export controls.
      • Congress direct the Administration to identify and remove barriers to receiving United States visas for Hong Kong residents attempting to exit Hong Kong for fear of political persecution.
  • The Electronic Privacy Information Center, the Center for Digital Democracy, the Campaign for a Commercial-Free Childhood, the Parent Coalition for Student Privacy, and Consumer Federation of America asked the Federal Trade Commission (FTC) “to recommend specific changes to the proposed Consent Order to safeguard the privacy interests of Zoom users” in their comments submitted regarding the FTC’s settlement with Zoom. In November, the FTC split along party lines to approve a settlement with Zoom to resolve allegations that the video messaging platform violated the FTC Act’s ban on unfair and deceptive practices in commerce. Zoom agreed to a consent order mandating a new information security program, third party assessment, prompt reporting of covered incidents, and other requirements over a period of 20 years. The two Democratic Commissioners, Rohit Chopra and Rebecca Kelly Slaughter, dissented for reasons that may be summed up: the FTC let Zoom off with a slap on the wrist, neither punishing the abundant wrongdoing nor dissuading future offenders. Slaughter focused on the majority’s choice to ignore the privacy implications of Zoom’s misdeeds, especially by not including any requirements that Zoom improve its faulty privacy practices.
    • The groups “recommend that the FTC modify the proposed Consent Order and require Zoom to (1) implement a comprehensive privacy program; (2) obtain regular independent privacy assessments and make those assessments available to the public; (3) provide meaningful redress for victims of Zoom’s unfair and deceptive trade practices; and (4) ensure the adequate protection and limits on the collection of children’s data.”

Coming Events

  • On 13 January, the Federal Communications Commission (FCC) will hold its monthly open meeting. The agency has placed the following items on its tentative agenda: “Bureau, Office, and Task Force leaders will summarize the work their teams have done over the last four years in a series of presentations.”
    • Panel One. The Commission will hear presentations from the Wireless Telecommunications Bureau, International Bureau, Office of Engineering and Technology, and Office of Economics and Analytics.
    • Panel Two. The Commission will hear presentations from the Wireline Competition Bureau and the Rural Broadband Auctions Task Force.
    • Panel Three. The Commission will hear presentations from the Media Bureau and the Incentive Auction Task Force.
    • Panel Four. The Commission will hear presentations from the Consumer and Governmental Affairs Bureau, Enforcement Bureau, and Public Safety and Homeland Security Bureau.
    • Panel Five. The Commission will hear presentations from the Office of Communications Business Opportunities, Office of Managing Director, and Office of General Counsel.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Free-Photos from Pixabay

Further Reading, Other Developments, and Coming Events (4 January 2021)

Further Reading

  • “Microsoft Says Russian Hackers Viewed Some of Its Source Code” By Nicole Perlroth — The New York Times. The Sluzhba vneshney razvedki Rossiyskoy Federatsii’s (SVR) hack keeps growing and growing with Microsoft admitting its source code was viewed through an employee account. It may be that authorized Microsoft resellers were one of the vectors by which the SVR accessed SolarWinds, FireEye, and ultimately a number of United States (U.S.) government agencies. Expect more revelations to come about the scope and breadth of entities and systems the SVR compromised.
  • “In 2020, we reached peak Internet. Here’s what worked — and what flopped.” By Geoffrey Fowler — The Washington Post. The newspaper’s tech columnist reviews the technology used during the pandemic and what is likely to stay with us when life returns to some semblance of normal.
  • “Facebook Says It’s Standing Up Against Apple For Small Businesses. Some Of Its Employees Don’t Believe It.” By Craig Silverman and Ryan Mac — BuzzFeed News. Again, two of the best-sourced journalists when it comes to Facebook have exposed employee dissent within the social media and advertising giant, this time over the company’s advertising blitz positioning it as the champion of small businesses that allegedly stand to be hurt when Apple rolls out iOS 14, which will allow users to block the type of tracking across apps and the internet Facebook thrives on. The company’s PR campaign stands in contrast to the anecdotal stories about errors that harmed and impeded small companies in using Facebook to advertise and sell products and services to customers.
  • “SolarWinds hack spotlights a thorny legal problem: Who to blame for espionage?” By Tim Starks — cyberscoop. This piece previews possible and likely inevitable litigation to follow from the SolarWinds hack, including possible securities action on the basis of fishy dumps of stock by executives, breach of contract, and negligence for failing to patch and address vulnerabilities in a timely fashion. Federal and state regulators will probably get on the field, too. But this will probably take years to play out, as Home Depot only settled claims arising from its 2014 breach with state attorneys general in November 2020.
  • “The Tech Policies the Trump Administration Leaves Behind” By Aaron Boyd — Nextgov. A look back at the good, the bad, and the ugly of the Trump Administration’s technology policies, some of which will live on in the Biden Administration.

Other Developments

  • In response to the SolarWinds hack, the Federal Bureau of Investigation (FBI), the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), and the Office of the Director of National Intelligence (ODNI) issued a joint statement indicating that the process established pursuant to Presidential Policy Directive (PPD) 41, an Obama Administration policy, has been activated and a Cyber Unified Coordination Group (UCG) has been formed “to coordinate a whole-of-government response to this significant cyber incident.” The agencies explained “[t]he UCG is intended to unify the individual efforts of these agencies as they focus on their separate responsibilities.”
    • In PPD-41 it is explained that a UCG “shall serve as the primary method for coordinating between and among Federal agencies in response to a significant cyber incident as well as for integrating private sector partners into incident response efforts, as appropriate.” Moreover, “[t]he Cyber UCG is intended to result in unity of effort and not to alter agency authorities or leadership, oversight, or command responsibilities.”
  • Following the completion of its “in-depth” investigation, the European Commission (EC) cleared Google’s acquisition of Fitbit with certain conditions, removing a significant hurdle for the American multinational in buying the wearable fitness tracker company. In its press release, the EC explained that after its investigation, “the Commission had concerns that the transaction, as initially notified, would have harmed competition in several markets.” To address and allay concerns, Google bound itself for ten years to a set of commitments that can be unilaterally extended by the EC and will be enforced, in part, by the appointment of a trustee to oversee compliance.
    • The EC was particularly concerned about:
      • Advertising: By acquiring Fitbit, Google would acquire (i) the database maintained by Fitbit about its users’ health and fitness; and (ii) the technology to develop a database similar to that of Fitbit. By increasing the already vast amount of data that Google could use for the personalisation of ads, it would be more difficult for rivals to match Google’s services in the markets for online search advertising, online display advertising, and the entire “ad tech” ecosystem. The transaction would therefore raise barriers to entry and expansion for Google’s competitors for these services to the detriment of advertisers, who would ultimately face higher prices and have less choice.
      • Access to Web Application Programming Interface (‘API’) in the market for digital healthcare: A number of players in this market currently access health and fitness data provided by Fitbit through a Web API, in order to provide services to Fitbit users and obtain their data in return. The Commission found that following the transaction, Google might restrict competitors’ access to the Fitbit Web API. Such a strategy would come especially at the detriment of start-ups in the nascent European digital healthcare space.
      • Wrist-worn wearable devices: The Commission is concerned that following the transaction, Google could put competing manufacturers of wrist-worn wearable devices at a disadvantage by degrading their interoperability with Android smartphones.
    • As noted, Google made a number of commitments to address competition concerns:
      • Ads Commitment:
        • Google will not use for Google Ads the health and wellness data collected from wrist-worn wearable devices and other Fitbit devices of users in the EEA, including search advertising, display advertising, and advertising intermediation products. This refers also to data collected via sensors (including GPS) as well as manually inserted data.
        • Google will maintain a technical separation of the relevant Fitbit’s user data. The data will be stored in a “data silo” which will be separate from any other Google data that is used for advertising.
        • Google will ensure that European Economic Area (‘EEA’) users will have an effective choice to grant or deny the use of health and wellness data stored in their Google Account or Fitbit Account by other Google services (such as Google Search, Google Maps, Google Assistant, and YouTube).
      • Web API Access Commitment:
        • Google will maintain access to users’ health and fitness data to software applications through the Fitbit Web API, without charging for access and subject to user consent.
      • Android APIs Commitment:
        • Google will continue to license for free to Android original equipment manufacturers (OEMs) those public APIs covering all current core functionalities that wrist-worn devices need to interoperate with an Android smartphone. Such core functionalities include, but are not limited to, connecting via Bluetooth to an Android smartphone, accessing the smartphone’s camera or its GPS. To ensure that this commitment is future-proof, any improvements of those functionalities and relevant updates are also covered.
        • It is not possible for Google to circumvent the Android API commitment by duplicating the core interoperability APIs outside the Android Open Source Project (AOSP). This is because, according to the commitments, Google has to keep the functionalities afforded by the core interoperability APIs, including any improvements related to the functionalities, in open-source code in the future. Any improvements to the functionalities of these core interoperability APIs (including if ever they were made available to Fitbit via a private API) also need to be developed in AOSP and offered in open-source code to Fitbit’s competitors.
        • To ensure that wearable device OEMs have also access to future functionalities, Google will grant these OEMs access to all Android APIs that it will make available to Android smartphone app developers including those APIs that are part of Google Mobile Services (GMS), a collection of proprietary Google apps that is not a part of the Android Open Source Project.
        • Google also will not circumvent the Android API commitment by degrading users’ experience with third party wrist-worn devices through the display of warnings, error messages or permission requests in a discriminatory way, or by imposing on wrist-worn device OEMs discriminatory conditions on the access of their companion app to the Google Play Store.
  • The United States (U.S.) Department of Health and Human Services’ (HHS) Office for Civil Rights (OCR) has proposed a major rewrite of the regulations governing medical privacy in the U.S. As the U.S. lacks a unified privacy regime, the proposed changes would affect only those entities in the medical sector subject to the regime, which is admittedly many such entities. Nevertheless, it is almost certain the Biden Administration will pause this rulemaking and quite possibly withdraw it should it prove crosswise with the new White House’s policy goals.
    • HHS issued a notice of proposed rulemaking “to modify the Standards for the Privacy of Individually Identifiable Health Information (Privacy Rule) under the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH Act).”
      • HHS continued:
        • The Privacy Rule is one of several rules, collectively known as the HIPAA Rules, that protect the privacy and security of individuals’ medical records and other protected health information (PHI), i.e., individually identifiable health information maintained or transmitted by or on behalf of HIPAA covered entities (i.e., health care providers who conduct covered health care transactions electronically, health plans, and health care clearinghouses).
        • The proposals in this NPRM support the Department’s Regulatory Sprint to Coordinated Care (Regulatory Sprint), described in detail below. Specifically, the proposals in this NPRM would amend provisions of the Privacy Rule that could present barriers to coordinated care and case management, or impose other regulatory burdens without sufficiently compensating for, or offsetting, such burdens through privacy protections. These regulatory barriers may impede the transformation of the health care system from a system that pays for procedures and services to a system of value-based health care that pays for quality care.
    • In a press release, OCR asserted:
      • The proposed changes to the HIPAA Privacy Rule include strengthening individuals’ rights to access their own health information, including electronic information; improving information sharing for care coordination and case management for individuals; facilitating greater family and caregiver involvement in the care of individuals experiencing emergencies or health crises; enhancing flexibilities for disclosures in emergency or threatening circumstances, such as the Opioid and COVID-19 public health emergencies; and reducing administrative burdens on HIPAA covered health care providers and health plans, while continuing to protect individuals’ health information privacy interests.
  • The Federal Trade Commission (FTC) has used its power to compel selected regulated entities to provide requested information, asking that “nine social media and video streaming companies…provide data on how they collect, use, and present personal information, their advertising and user engagement practices, and how their practices affect children and teens.” The FTC is using its Section 6(b) authority to compel the information from Amazon.com, Inc., ByteDance Ltd., which operates the short video service TikTok, Discord Inc., Facebook, Inc., Reddit, Inc., Snap Inc., Twitter, Inc., WhatsApp Inc., and YouTube LLC. Failure to respond can result in the FTC fining a non-compliant entity.
    • The FTC claimed in its press release it “is seeking information specifically related to:
      • how social media and video streaming services collect, use, track, estimate, or derive personal and demographic information;
      • how they determine which ads and other content are shown to consumers;
      • whether they apply algorithms or data analytics to personal information;
      • how they measure, promote, and research user engagement; and
      • how their practices affect children and teens.
    • The FTC explained in its sample order:
      • The Commission is seeking information concerning the privacy policies, procedures, and practices of Social Media and Video Streaming Service providers, including the method and manner in which they collect, use, store, and disclose Personal Information about consumers and their devices. The Special Report will assist the Commission in conducting a study of such policies, practices, and procedures.
  • The United States (U.S.) Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) supplemented its Emergency Directive 21-01 to federal civilian agencies in response to the Sluzhba vneshney razvedki Rossiyskoy Federatsii’s (SVR) hack via SolarWinds. In an 18 December update, CISA explained:
    • This section provides additional guidance on the implementation of CISA Emergency Directive (ED) 21-01, to include an update on affected versions, guidance for agencies using third-party service providers, and additional clarity on required actions.
    • In a 30 December update, CISA stated:
      • Specifically, all federal agencies operating versions of the SolarWinds Orion platform other than those identified as “affected versions” below are required to use at least SolarWinds Orion Platform version 2020.2.1HF2. The National Security Agency (NSA) has examined this version and verified that it eliminates the previously identified malicious code. Given the number and nature of disclosed and undisclosed vulnerabilities in SolarWinds Orion, all instances that remain connected to federal networks must be updated to 2020.2.1 HF2 by COB December 31, 2020. CISA will follow up with additional supplemental guidance, to include further clarifications and hardening requirements.
  • Australia’s Attorney-General’s Department published an unclassified version of the four volumes of the “Report of the Comprehensive Review of the Legal Framework of the National Intelligence Community,” an “examination of the legislative framework underpinning the National Intelligence Community (NIC)…the first and largest since the Hope Royal Commissions considered the Australian Intelligence Community (AIC) in the 1970s and 1980s.” Ultimately, the authors of the report concluded:
    • We do not consider the introduction of a common legislative framework, in the form of a single Act governing all or some NIC agencies, to be a practical, pragmatic or proportionate reform. It would be unlikely that the intended benefits of streamlining and simplifying NIC legislation could be achieved due to the diversity of NIC agency functions—from intelligence to law enforcement, regulatory and policy—and the need to maintain differences in powers, immunities and authorising frameworks. The Review estimates that reform of this scale would cost over $200 million and take up to 10 years to complete. This would be an impractical and disproportionate undertaking for no substantial gain. In our view, the significant costs and risks of moving to a single, consolidated Act clearly outweigh the limited potential benefits.
    • While not recommending a common legislative framework for the entire NIC, some areas of NIC legislation would benefit from simplification and modernisation. We recommend the repeal of the TIA Act, Surveillance Devices Act 2004 (SD Act) and parts of the Australian Security Intelligence Organisation Act 1979 (ASIO Act), and their replacement with a single new Act governing the use of electronic surveillance powers—telecommunications interception, covert access to stored communications, computers and telecommunications data, and the use of optical, listening and tracking devices—under Commonwealth law.
  • The National Institute of Standards and Technology (NIST) released additional materials to supplement a major rewrite of a foundational security guidance document. NIST explained “[n]ew supplemental materials for NIST Special Publication (SP) 800-53 Revision 5, Security and Privacy Controls for Information Systems and Organizations, are available for download to support the December 10, 2020 errata release of SP 800-53 and SP 800-53B, Control Baselines for Information Systems and Organizations.” These supplemental materials include:
    • A comparison of the NIST SP 800-53 Revision 5 controls and control enhancements to Revision 4. The spreadsheet describes the changes to each control and control enhancement, provides a brief summary of the changes, and includes an assessment of the significance of the changes. Note that this comparison was authored by The MITRE Corporation for the Director of National Intelligence (DNI) and is being shared with permission by DNI.
    • Mapping of the Appendix J Privacy Controls (Revision 4) to Revision 5. The spreadsheet supports organizations using the privacy controls in Appendix J of SP 800-53 Revision 4 that are transitioning to the integrated control catalog in Revision 5.
    • Mappings between NIST SP 800-53 and other frameworks and standards. The mappings provide organizations a general indication of SP 800-53 control coverage with respect to other frameworks and standards. When leveraging the mappings, it is important to consider the intended scope of each publication and how each publication is used; organizations should not assume equivalency based solely on the mapping tables because mappings are not always one-to-one and there is a degree of subjectivity in the mapping analysis.
  • Via a final rule, the Department of Defense (DOD) codified “the National Industrial Security Program Operating Manual (NISPOM) in regulation…[that] establishes requirements for the protection of classified information disclosed to or developed by contractors, licensees, grantees, or certificate holders (hereinafter referred to as contractors) to prevent unauthorized disclosure.” The DOD stated “[i]n addition to adding the NISPOM to the Code of Federal Regulations (CFR), this rule incorporates the requirements of Security Executive Agent Directive (SEAD) 3, “Reporting Requirements for Personnel with Access to Classified Information or Who Hold a Sensitive Position.” The DOD stated “SEAD 3 requires reporting by all contractor cleared personnel who have been granted eligibility for access to classified information.”
    • The DOD added “[t]his NISPOM rule provides for a single nation-wide implementation plan which will, with this rule, include SEAD 3 reporting by all contractor cleared personnel to report specific activities that may adversely impact their continued national security eligibility, such as reporting of foreign travel and foreign contacts.”
    • The DOD explained “NISP Cognizant Security Agencies (CSAs) shall conduct an analysis of such reported activities to determine whether they pose a potential threat to national security and take appropriate action.”
    • The DOD added that “the rule also implements the provisions of Section 842 of Public Law 115-232, which removes the requirement for a covered National Technology and Industrial Base (NTIB) entity operating under a special security agreement pursuant to the NISP to obtain a national interest determination as a condition for access to proscribed information.”
  • An advisory committee housed at the United States (U.S.) Department of Homeland Security (DHS) is calling for the White House to quickly “operationalize intelligence in a classified space with senior executives and cyber experts from most critical entities in the energy, financial services, and communications sectors working directly with intelligence analysts and other government staff.” In their report, the President’s National Infrastructure Advisory Council (NIAC) proposed the creation of a Critical Infrastructure Command Center (CICC) to “provid[e] real-time collaboration between government and industry…[and] take direct action and provide tactical solutions to mitigate, remediate, and deter threats.” NIAC urged the President to “direct relevant federal agencies to support the private sector in executing the concept, including identifying the required government staff…[and] work with Congress to ensure the appropriate authorities are established to allow the CICC to fully realize its operational functionality.” NIAC recommended “near-term actions to implement the CICC concept:
    • 1. The President should direct the relevant federal agencies to support the private sector in rapidly standing up the CICC concept with the energy, financial services, and communications sectors:
      • a. Within 90 days the private sector will identify the executives who will lead execution of the CICC concept and establish governing criteria (including membership, staffing and rotation, and other logistics).
      • b. Within 120 days the CICC sector executives will identify and assign the necessary CICC staff from the private sector.
      • c. Within 90 days an appropriate venue to house the operational component will be identified and the necessary agreements put in place.
    • 2. The President should direct the Intelligence Community and other relevant government agencies to identify and co-locate the required government staff counterparts to enable the direct coordination required by the CICC. This staff should be pulled from the IC, SSAs, and law enforcement.
    • 3. The President, working with Congress, should establish the appropriate authorities and mission for federal agencies to directly share intelligence with critical infrastructure companies, along with any other authorities required for the CICC concept to be fully successful (identified in Appendix A).
    • 4. Once the CICC concept is fully operational (within 180 days), the responsible executives should deliver a report to the NSC and the NIAC demonstrating how the distinct capabilities of the CICC have been achieved and the impact of the capabilities to date. The report should identify remaining gaps in resources, direction, or authorities.


Image by opsa from Pixabay

EC Finally Unveils Digital Services Act and Digital Markets Act

The EU releases its proposals to remake digital markets.

The European Commission (EC) has released its draft proposals to remake how the European Union (EU) regulates digital markets and digital services, the latest in the bloc’s attempts to rein in what it sees as harms and abuses to people and competition in Europe and the world. At the earliest, these proposals would take effect in 2022; they are sure to be vigorously opposed by large United States (U.S.) multinationals like Google and Facebook and will also likely face more restrained pushback from the U.S. government.

The Digital Markets Act would allow the EU to designate certain core platform services as gatekeepers, either on the basis of quantitative metrics or on a case-by-case basis. Once a company is deemed a gatekeeper, it would be subject to much greater regulation by the EU, and violations of the new act could result in fines of up to 10% of worldwide revenue.

In its press release, the EC asserted:

European values are at the heart of both proposals. The new rules will better protect consumers and their fundamental rights online, and will lead to fairer and more open digital markets for everyone. A modern rulebook across the single market will foster innovation, growth and competitiveness and will provide users with new, better and reliable online services. It will also support the scaling up of smaller platforms, small and medium-sized enterprises, and start-ups, providing them with easy access to customers across the whole single market while lowering compliance costs. Furthermore, the new rules will prohibit unfair conditions imposed by online platforms that have become or are expected to become gatekeepers to the single market. The two proposals are at the core of the Commission’s ambition to make this Europe’s Digital Decade.

In the Digital Markets Act, the EC explained the problem with large platforms dominating certain digital markets. The EC discussed the harm to people and medium and small businesses as some large companies control certain markets and use their size and dominance to extract unfair prices for inferior services and products. The EC listed the core platform services that might be regulated:

  • online intermediation services (incl. for example marketplaces, app stores and online intermediation services in other sectors like mobility, transport or energy)
  • online search engines,
  • social networking
  • video sharing platform services,
  • number-independent interpersonal electronic communication services,
  • operating systems,
  • cloud services and
  • advertising services, including advertising networks, advertising exchanges and any other advertising intermediation services, where these advertising services are being related to one or more of the other core platform services mentioned above.

Clearly, a number of major American firms could easily be considered “core platform services” including Amazon, Apple, Google, Facebook, Instagram, YouTube, WhatsApp, Microsoft, and others. Whether they would be deemed gatekeepers would hinge on whether they meet the quantitative metrics the EU will put in place, and this will be a rebuttable presumption such that if a firm meets the standards, it may present evidence to the contrary and argue it is not a gatekeeper.

The EC detailed the quantitative metrics in Article 3. A company qualifies as a gatekeeper if it meets all three of the following criteria, each of which is tied to further quantitative thresholds:

A provider of core platform services shall be designated as gatekeeper if:

(a) it has a significant impact on the internal market;

(b) it operates a core platform service which serves as an important gateway for business users to reach end users; and

(c) it enjoys an entrenched and durable position in its operations or it is foreseeable that it will enjoy such a position in the near future.

The other metrics include an annual EEA turnover of at least €6.5 billion in each of the last three financial years, or a market capitalization of at least €65 billion, combined with the provision of a core platform service in at least three member states, to show a “significant impact on the internal market.” For the second criterion listed above, a company would need to provide a core platform service to more than 45 million monthly active end users in the EU and more than 10,000 yearly active business users in the EU. And, for the last criterion, meeting the 45 million user and 10,000 business user thresholds in each of the last three financial years would suffice. The act reads:

A provider of core platform services shall be presumed to satisfy:

(a) the requirement in paragraph 1 point (a) where the undertaking to which it belongs achieves an annual EEA turnover equal to or above EUR 6.5 billion in the last three financial years, or where the average market capitalisation or the equivalent fair market value of the undertaking to which it belongs amounted to at least EUR 65 billion in the last financial year, and it provides a core platform service in at least three Member States;

(b) the requirement in paragraph 1 point (b) where it provides a core platform service that has more than 45 million monthly active end users established or located in the Union and more than 10,000 yearly active business users established in the Union in the last financial year; for the purpose of the first subparagraph, monthly active end users shall refer to the average number of monthly active end users throughout the largest part of the last financial year;

(c) the requirement in paragraph 1 point (c) where the thresholds in point (b) were met in each of the last three financial years.
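The cumulative presumption above can be sketched as a simple check. This is an illustrative sketch only, assuming simplified inputs; the class, field, and function names are my own, and it deliberately omits the rebuttal process and the finer points of how turnover and market capitalisation are measured.

```python
from dataclasses import dataclass

# Thresholds from the draft Digital Markets Act, Article 3(2)
TURNOVER_EUR = 6.5e9          # annual EEA turnover, each of last three financial years
MARKET_CAP_EUR = 65e9         # average market capitalisation, last financial year
MIN_MEMBER_STATES = 3
MONTHLY_END_USERS = 45_000_000
YEARLY_BUSINESS_USERS = 10_000

@dataclass
class Provider:
    annual_eea_turnover: list   # EUR, last three financial years
    market_cap: float           # EUR, last financial year
    member_states_served: int
    monthly_end_users: list     # last three financial years
    yearly_business_users: list # last three financial years

def significant_impact(p: Provider) -> bool:
    """Art. 3(2)(a): turnover or market-cap test, plus presence in >= 3 Member States."""
    financial = (all(t >= TURNOVER_EUR for t in p.annual_eea_turnover)
                 or p.market_cap >= MARKET_CAP_EUR)
    return financial and p.member_states_served >= MIN_MEMBER_STATES

def important_gateway(p: Provider) -> bool:
    """Art. 3(2)(b): end-user and business-user thresholds in the last financial year."""
    return (p.monthly_end_users[-1] > MONTHLY_END_USERS
            and p.yearly_business_users[-1] > YEARLY_BUSINESS_USERS)

def entrenched_position(p: Provider) -> bool:
    """Art. 3(2)(c): gateway thresholds met in each of the last three financial years."""
    return (all(u > MONTHLY_END_USERS for u in p.monthly_end_users)
            and all(b > YEARLY_BUSINESS_USERS for b in p.yearly_business_users))

def presumed_gatekeeper(p: Provider) -> bool:
    """All three presumptions must hold; the firm may still rebut the presumption."""
    return (significant_impact(p)
            and important_gateway(p)
            and entrenched_position(p))
```

Even a firm satisfying all three presumptions under this logic would, under the proposal, still have the chance to argue it is not in fact a gatekeeper.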

The EU would also be able to label a provider of core platform services a gatekeeper on a case-by-case basis:

Provision should also be made for the assessment of the gatekeeper role of providers of core platform services which do not satisfy all of the quantitative thresholds, in light of the overall objective requirements that they have a significant impact on the internal market, act as an important gateway for business users to reach end users and benefit from a durable and entrenched position in their operations or it is foreseeable that it will do so in the near future.

It bears noting that a company could be designated a gatekeeper if it is merely foreseeable that it will soon satisfy these criteria. This flexibility could allow the EU to track companies and flag them as gatekeepers before they, in fact, achieve the sort of market dominance this regulation is intended to stop.

Among the relevant excerpts from the “Reasons for and objectives of the proposal” section of the act are:

  • Large platforms have emerged benefitting from characteristics of the sector such as strong network effects, often embedded in their own platform ecosystems, and these platforms represent key structuring elements of today’s digital economy, intermediating the majority of transactions between end users and business users. Many of these undertakings are also comprehensively tracking and profiling end users. A few large platforms increasingly act as gateways or gatekeepers between business users and end users and enjoy an entrenched and durable position, often as a result of the creation of conglomerate ecosystems around their core platform services, which reinforces existing entry barriers.
  • As such, these gatekeepers have a major impact on, have substantial control over the access to, and are entrenched in digital markets, leading to significant dependencies of many business users on these gatekeepers, which leads, in certain cases, to unfair behaviour vis-à-vis these business users. It also leads to negative effects on the contestability of the core platform services concerned. Regulatory initiatives by Member States cannot fully address these effects; without action at EU level, they could lead to a fragmentation of the Internal Market.
  • Unfair practices and lack of contestability lead to inefficient outcomes in the digital sector in terms of higher prices, lower quality, as well as less choice and innovation to the detriment of European consumers. Addressing these problems is of utmost importance in view of the size of the digital economy (estimated at between 4.5% to 15.5% of global GDP in 2019 with a growing trend) and the important role of online platforms in digital markets with its societal and economic implications.
  • Weak contestability and unfair practices in the digital sector are more frequent and pronounced in certain digital services than others. This is the case in particular for widespread and commonly used digital services and infrastructures that mostly directly intermediate between business users and end users.
  • The enforcement experience under EU competition rules, numerous expert reports and studies and the results of the OPC show that there are a number of digital services that have the following features: (i) highly concentrated multi-sided platform services, where usually one or very few large digital platforms set the commercial conditions with considerable autonomy; (ii) a few large digital platforms act as gateways for business users to reach their customers and vice-versa; and (iii) gatekeeper power of these large digital platforms is often misused by means of unfair behaviour vis-à-vis economically dependent business users and customers.
  • The proposal is therefore further limited to a number of ‘core platform services’ where the identified problems are most evident and prominent and where the presence of a limited number of large online platforms that serve as gateways for business users and end users has led or is likely to lead to weak contestability of these services and of the markets in which these intervene. These core platform services include: (i) online intermediation services (incl. for example marketplaces, app stores and online intermediation services in other sectors like mobility, transport or energy) (ii) online search engines, (iii) social networking (iv) video sharing platform services, (v) number-independent interpersonal electronic communication services, (vi) operating systems, (vii) cloud services and (viii) advertising services, including advertising networks, advertising exchanges and any other advertising intermediation services, where these advertising services are being related to one or more of the other core platform services mentioned above.
  • The fact that a digital service qualifies as a core platform service does not mean that issues of contestability and unfair practices arise in relation to every provider of these core platform services. Rather, these concerns appear to be particularly strong when the core platform service is operated by a gatekeeper. Providers of core platform services can be deemed to be gatekeepers if they: (i) have a significant impact on the internal market, (ii) operate one or more important gateways to customers and (iii) enjoy or are expected to enjoy an entrenched and durable position in their operations.
  • Such gatekeeper status can be determined either with reference to clearly circumscribed and appropriate quantitative metrics, which can serve as rebuttable presumptions to determine the status of specific providers as a gatekeeper, or based on a case-by-case qualitative assessment by means of a market investigation.

The Digital Services Act would add new regulation on top of Directive 2000/31/EC (aka the e-Commerce Directive) by “[b]uilding on the key principles set out in the e-Commerce Directive, which remain valid today.” This new scheme “seeks to ensure the best conditions for the provision of innovative digital services in the internal market, to contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services.”

The Digital Services Act is focused mostly on the information and misinformation present all over the online world and the harms they wreak on EU citizens. However, the EC is also seeking to balance fundamental EU rights while more tightly regulating online platforms. Like the Digital Markets Act, this regulation would focus on the largest online content, product, and services providers, which, as a practical matter, would likely be Facebook, Amazon, Google, Spotify, and a handful of other companies. Once 10% or more of the EU’s population uses a company’s offerings, the requirements of the Digital Services Act would be triggered.
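The 10% trigger is defined in terms of monthly recipients, currently estimated at more than 45 million, with the Commission adjusting the figure as the EU’s population changes. The arithmetic can be sketched as follows; the function names and rounding are illustrative assumptions, not language from the regulation.

```python
def very_large_platform_threshold(eu_population: int, share: float = 0.10) -> int:
    """The DSA pegs the 'very large online platform' threshold to roughly 10%
    of the EU population; the Commission would adjust the recipient count so
    it keeps corresponding to that share as the population changes."""
    return round(eu_population * share)

def is_very_large_platform(monthly_recipients: int, threshold: int) -> bool:
    """A platform crosses into the heightened-obligation tier when its
    monthly recipients in the Union exceed the threshold."""
    return monthly_recipients > threshold

# With an EU population of roughly 450 million, the threshold is about 45 million.
threshold = very_large_platform_threshold(450_000_000)
```

This explains why the act’s own summary quotes both a fixed estimate (45 million) and a relative one (10% of the Union’s population): the fixed number is simply the relative rule applied to today’s population.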

Additionally, the Digital Services Act unites two online issues not usually considered together in the United States (U.S.): harmful online content and harmful online products. Even though it seems logical to consider these offerings in tandem, in the U.S. the two issues are clearly bifurcated in how they are regulated, to the extent they are regulated at all, at the federal and state levels.

The Digital Services Act “will introduce a series of new, harmonised EU-wide obligations for digital services, carefully graduated on the basis of those services’ size and impact, such as:

  • Rules for the removal of illegal goods, services or content online;
  • Safeguards for users whose content has been erroneously deleted by platforms;
  • New obligations for very large platforms to take risk-based action to prevent abuse of their systems;
  • Wide-ranging transparency measures, including on online advertising and on the algorithms used to recommend content to users;
  • New powers to scrutinize how platforms work, including by facilitating access by researchers to key platform data;
  • New rules on traceability of business users in online market places, to help track down sellers of illegal goods or services;
  • An innovative cooperation process among public authorities to ensure effective enforcement across the single market.”

The EC explained:

new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole.

The EC spelled out what the Digital Services Act would do:

This Regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes:

(a) a framework for the conditional exemption from liability of providers of intermediary services;

(b) rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services;

(c) rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities.

The EC explained the purpose of the act:

  • this proposal seeks to ensure the best conditions for the provision of innovative digital services in the internal market, to contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services.
  • The proposal defines clear responsibilities and accountability for providers of intermediary services, and in particular online platforms, such as social media and marketplaces. By setting out clear due-diligence obligations for certain intermediary services, including notice-and-action procedures for illegal content and the possibility to challenge the platforms’ content moderation decisions, the proposal seeks to improve users’ safety online across the entire Union and improve the protection of their fundamental rights. Furthermore, an obligation for certain online platforms to receive, store and partially verify and publish information on traders using their services will ensure a safer and more transparent online environment for consumers.
  • Recognising the particular impact of very large online platforms on our economy and society, the proposal sets a higher standard of transparency and accountability on how the providers of such platforms moderate content, on advertising and on algorithmic processes. It sets obligations to assess the risks their systems pose to develop appropriate risk management tools to protect the integrity of their services against the use of manipulative techniques.

The EC summarized how the act will work:

  • The operational threshold for service providers in scope of these obligations includes those online platforms with a significant reach in the Union, currently estimated to be amounting to more than 45 million recipients of the service. This threshold is proportionate to the risks brought by the reach of the platforms in the Union; where the Union’s population changes by a certain percentage, the Commission will adjust the number of recipients considered for the threshold, so that it consistently corresponds to 10 % of the Union’s population. Additionally, the Digital Services Act will set out a co-regulatory backstop, including building on existing voluntary initiatives.
  • This proposal should constitute the appropriate basis for the development of robust technologies to prevent the reappearance of illegal information, accompanied with the highest safeguards to avoid that lawful content is taken down erroneously; such tools could be developed on the basis of voluntary agreements between all parties concerned and should be encouraged by Member States; it is in the interest of all parties involved in the provision of intermediary services to adopt and implement such procedures; the provisions of this Regulation relating to liability should not preclude the development and effective operation, by the different interested parties, of technical systems of protection and identification and of automated recognition made possible by digital technology within the limits laid down by Regulation 2016/679.
  • Union citizens and others are exposed to ever-increasing risks and harms online – from the spread of illegal content and activities, to limitations to express themselves and other societal harms. The envisaged policy measures in this legislative proposal will substantially improve this situation by providing a modern, future-proof governance framework, effectively safeguarding the rights and legitimate interests of all parties involved, most of all Union citizens. The proposal introduces important safeguards to allow citizens to freely express themselves, while enhancing user agency in the online environment, as well as the exercise of other fundamental rights such as the right to an effective remedy, non-discrimination, rights of the child as well as the protection of personal data and privacy online.
  • The proposed Regulation will mitigate risks of erroneous or unjustified blocking of speech, address the chilling effects on speech, stimulate the freedom to receive information and hold opinions, as well as reinforce users’ redress possibilities. Specific groups or persons may be vulnerable or disadvantaged in their use of online services because of their gender, race or ethnic origin, religion or belief, disability, age or sexual orientation. They can be disproportionately affected by restrictions and removal measures following from (unconscious or conscious) biases potentially embedded in the notification systems by users and third parties, as well as replicated in automated content moderation tools used by platforms. The proposal will mitigate discriminatory risks, particularly for those groups or persons and will contribute to the protection of the rights of the child and the right to human dignity online. The proposal will only require removal of illegal content and will impose mandatory safeguards when users’ information is removed, including the provision of explanatory information to the user, complaint mechanisms supported by the service providers as well as external out-of-court dispute resolution mechanisms. Furthermore, it will ensure EU citizens are also protected when using services provided by providers not established in the Union but active on the internal market, since those providers are covered too.
  • With regard to service providers’ freedom to conduct a business, the costs incurred on businesses are offset by reducing fragmentation across the internal market. The proposal introduces safeguards to alleviate the burden on service providers, including measures against repeated unjustified notices and prior vetting of trusted flaggers by public authorities. Furthermore, certain obligations are targeted to very large online platforms, where the most serious risks often occur and which have the capacity to absorb the additional burden.
  • The proposed legislation will preserve the prohibition of general monitoring obligations of the e-Commerce Directive, which in itself is crucial to the required fair balance of fundamental rights in the online world. The new Regulation prohibits general monitoring obligations, as they could disproportionately limit users’ freedom of expression and freedom to receive information, and could burden service providers excessively and thus unduly interfere with their freedom to conduct a business. The prohibition also limits incentives for online surveillance and has positive implications for the protection of personal data and privacy.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Sambeet D from Pixabay

New EU Consumer Agenda

The European Commission frames a number of current and new programs as a paradigm shift in consumer rights in the EU.

The European Commission (EC) has published its New Consumer Agenda that envisions nothing less than a remaking of the European Union’s (EU) approach to a number of key realms, including the digital world. If enacted, these sweeping reforms could drive change in other nations in the same way the General Data Protection Regulation (GDPR) has informed revisions of data protection regimes around the globe. Some of the proposed changes have been rumored for some time, some are already in progress, and some are new. Nonetheless, the EC has repackaged a number of digital initiatives under the umbrella of the New Consumer Agenda in its document detailing the provisions. Incidentally, the document serves as a good crib sheet for getting up to speed on a number of the EU’s digital programs and policy goals. Having said that, much of the New Consumer Agenda may prove aspirational, for there are a number of moving pieces and stakeholders in making policy in the EU, and this can play out over a number of years (e.g., the plans to revise the e-Privacy Directive). So, inclusion in this policy superstructure does not guarantee enactment, and even changes that are made may be years in coming.

The EC stated

The New Consumer Agenda (the Agenda) presents a vision for EU consumer policy from 2020 to 2025, building on the 2012 Consumer Agenda (which expires in 2020) and the 2018 New Deal for Consumers. It also aims to address consumers’ immediate needs in the face of the ongoing COVID-19 pandemic and to increase their resilience. The pandemic has raised significant challenges affecting the daily lives of consumers, in particular in relation to the availability and accessibility of products and services, as well as travel within, and to and from the EU.

The EC identified the five prongs of the Agenda, and given the emphasis the EC’s new leadership has placed on digital matters, one of them is entitled “digital transformation.” The Agenda is meant to work in unison with previously announced and still-to-be-implemented major policy initiatives like the European Green Deal, the Circular Economy Action Plan, and the Communication on shaping the EU’s digital future.

The EC suggests that current EU law and regulation may address what some consider among the worst abuses of the digital age:

Commercial practices that disregard consumers’ right to make an informed choice, abuse their behavioural biases, or distort their decision-making processes, must be tackled. These practices include the use of ‘dark’ patterns, certain personalisation practices often based on profiling, hidden advertising, fraud, false or misleading information and manipulated consumer reviews. Additional guidance is needed on the applicability of consumer law instruments such as the Unfair Commercial Practices Directive and Consumer Rights Directive to these practices. Ultimately, consumers should benefit from a comparable level of protection and fairness online as they enjoy offline.

The EC seems to be suggesting that should those directives be found wanting, they could be revised and expanded to adequately protect EU citizens and residents, at least in the view of the EC.

The EC made reference to two long-anticipated pieces of legislation expected next week in draft form:

  • First, the Commission’s upcoming proposal for a new Digital Services Act (DSA), will aim to define new and enhanced responsibilities and reinforce the accountability of online intermediaries and platforms. The DSA will ensure that consumers are protected effectively against illegal products, content and activities on online platforms as they are offline.
  • Second, to address the problems arising in digital markets prone to market failures, such as the gatekeeper power of certain digital platforms, the Commission is planning to present also a Digital Markets Act. It would combine the ex ante regulation of digital platforms having the characteristics of gatekeepers with a dynamic market investigation framework to examine digital markets prone to market failures. Consumers will be the final beneficiaries of fairer and more contestable digital markets, including lower prices, better and new services and greater choice.

Regarding artificial intelligence (AI), the EC previewed its next steps on putting in place a regulatory structure to regulate the new technology in the consumer space, including extra protection and an appropriate civil liability scheme:

  • a proposal to guarantee a high level of protection of consumer interest and the protection of fundamental rights, in turn building the trust necessary for the societal uptake of AI;
  • as regards civil liability, measures to ensure that victims of damage caused by AI applications have the same level of protection in practice as victims of damage caused by other products or services.

The EC is also floating the idea of revising other consumer protection directives such as “the Machinery Directive, the adoption of delegated acts under the Radio Equipment Directive, and the revision of the General Product Safety Directive.” The EC aspires to refresh the General Product Safety Directive, “which provides the legal framework for the safety of non-food consumer products,” to account for “AI-powered products and connected devices,” the latter of which may be a reference to the Internet of Things (IoT). The EC also remarked on the consumer safety issues posed by a regulatory system that cannot police items sold online that originate from outside the EU. The EC vowed that “[t]he forthcoming proposal for a revision of the General Product Safety Directive, foreseen for 2021, should provide a solid response to these increasing challenges.”

The EC also referred to existing legislative work that would give EU citizens and residents “a universally accepted public electronic identity.” This new system would allow people “to manage the access and use of their data in a fully controlled and secure manner,” “based on the consumers’ choice, their consent and the guarantee that their privacy is fully respected in line with the General Data Protection Regulation (GDPR).”

The EC is addressing another facet of data protection, privacy, and consumers’ rights: data portability. The Commission stated that the “European Strategy for Data aims to facilitate the effective individuals’ right to data portability under the GDPR…[that] has clear potential to put individuals at the centre of the data economy by enabling them to switch between service providers, combine services, use other innovative services and choose the services that offer most data protection.” The EC claimed this strategy “will also drive the creation of a genuine single market for data and the creation of common European data spaces.”

The Commission made note of its Geo-blocking Regulation to address the practice of discriminating “between EU consumers to segment markets along national borders.”

The EC also described laws and regulations to revamp the digital facets of the financial services sector that “will improve consumer protection” in new areas of FinTech and other new realms.

The EC is touting how a previously announced proposal to foster the use of recycled materials and products fits into the Consumer Agenda. The EC stated “the new Circular Economy Action Plan sets out a number of specific initiatives to fight early obsolescence and promote durability, recyclability, reparability, and accessibility of products, and to support action by business.” The EC all but said new regulations are coming to address electronic waste and products designed not to be repairable. The EC noted “[a]dditional regulatory and non-regulatory measures will be needed to address specific groups of goods and services, such as ICT, electronics or textile, and packaging…[f]or instance:

  • The Circular Electronics Initiative aims to ensure that electronic devices are designed for durability, maintenance, repair, disassembly, dismantling, reuse and recycling, and that consumers have a ‘right to repair’ them including software updates.
  • The initiative on a common charger for mobile phones and other portable devices, aims to increase consumer convenience and reduce material use and e-waste associated with production and disposal of this particular item used daily by the vast majority of consumers.”

The EC discussed other aspects of consumer protection with implications for technology policy. The EC stated

The new Consumer Protection Cooperation (CPC) Regulation which entered into force in January 2020, provides a stronger basis for joint EU action. It strengthens enforcement authorities’ online capacity, cooperation mechanisms and intelligence gathering system to address large-scale infringements of EU consumer law, ensure consistent level of consumer protection and offer a ‘one-stop-shop’ for businesses. The Commission will not hesitate to make use of its powers under the Regulation to trigger coordinated enforcement actions on EU-wide issues where necessary.

The EC is proposing to address algorithmic biases and how technology companies are exploiting certain human behavioral inclinations to drive engagement:

  • The risk of discrimination is at times exacerbated by algorithms used by certain goods and services providers, and which may be formulated with certain biases often resulting from pre-existing cultural or social expectations. Whereas this may lead to discrimination among consumers generally, it often affects certain groups more than others, and in particular people from minority ethnic or racial backgrounds. The upcoming proposal for a horizontal legislative framework on Artificial Intelligence will aim to specifically address how to limit risks of bias and discrimination from being built into algorithmic systems.
  • Lastly, evidence from behavioural economics shows that the behaviours of consumers are often affected by cognitive biases, especially online, which can be exploited by traders for commercial purposes. Such new forms of risks can affect virtually all consumers. Transparency obligations are certainly important in tackling information asymmetries (as also mentioned above in the context of the digital transformation), but further assessment is required to determine the need for additional measures to address this dynamic form of vulnerability.

The EC acknowledged the international nature of commerce and vowed to take certain steps to ensure the products, goods, and services entering the EU are safe. Notably, the EC is aiming to develop “an action plan with China for strengthened product safety cooperation for products sold online.”


Photo by eberhard grossgasteiger from Pexels

EU Announces One Antitrust Action Against A Big Tech Firm and Previews Another

The EU commences with one antitrust action against Amazon while investigating other possible violations.

The European Commission (EC) released a summary of its findings in one antitrust investigation against Amazon, finding enough evidence to proceed while also starting the process to investigate another alleged violation by the United States (U.S.) multinational. The EC started its investigation of Amazon in July 2019, and this action follows an announced investigation of Apple earlier this year. Also, the European Union (EU) has fined Google €8.2 billion cumulatively for three separate antitrust violations over the last five or six years. Moreover, the EC is readying a “Digital Markets Act” to update the EU’s competition laws.

Article 102 of the Treaty on the Functioning of the European Union (TFEU) bars a company from abusing a dominant market position. The EC asserts that Amazon holds a dominant position in marketplace services and abuses it by using non-public sales data gathered from third-party sellers on its platform, data the company sometimes uses to undercut those very sellers. According to the EC, this is abuse in violation of Article 102, and it has issued a Statement of Objections. However, the process by which an antitrust action in the EU is brought is not finished at this stage. Amazon will have the opportunity to respond, and any final decision, particularly fines, must be approved by the Advisory Committee, which consists of the EU’s competition authorities.

In its press statement, the EC explained:

  • The European Commission has informed Amazon of its preliminary view that it has breached EU antitrust rules by distorting competition in online retail markets. The Commission takes issue with Amazon systematically relying on non-public business data of independent sellers who sell on its marketplace, to the benefit of Amazon’s own retail business, which directly competes with those third party sellers.
  • The Commission also opened a second formal antitrust investigation into the possible preferential treatment of Amazon’s own retail offers and those of marketplace sellers that use Amazon’s logistics and delivery services.

In its Statement of Objections, the EC further detailed its case that Amazon’s access to and use of private business data of third-party sellers for Amazon’s benefit distorts competition contrary to EU law:

  • Amazon has a dual role as a platform: (i) it provides a marketplace where independent sellers can sell products directly to consumers; and (ii) it sells products as a retailer on the same marketplace, in competition with those sellers.
  • As a marketplace service provider, Amazon has access to non-public business data of third party sellers such as the number of ordered and shipped units of products, the sellers’ revenues on the marketplace, the number of visits to sellers’ offers, data relating to shipping, to sellers’ past performance, and other consumer claims on products, including the activated guarantees.
  • The Commission’s preliminary findings show that very large quantities of non-public seller data are available to employees of Amazon’s retail business and flow directly into the automated systems of that business, which aggregate these data and use them to calibrate Amazon’s retail offers and strategic business decisions to the detriment of the other marketplace sellers. For example, it allows Amazon to focus its offers in the best-selling products across product categories and to adjust its offers in view of non-public data of competing sellers.
  • The Commission’s preliminary view, outlined in its Statement of Objections, is that the use of non-public marketplace seller data allows Amazon to avoid the normal risks of retail competition and to leverage its dominance in the market for the provision of marketplace services in France and Germany, the biggest markets for Amazon in the EU. If confirmed, this would infringe Article 102 of the TFEU that prohibits the abuse of a dominant market position.

The EC also launched another inquiry into the platform’s practices that allegedly favor the company’s items as compared to third-party sellers and also those items offered by third-parties that use Amazon’s logistics and delivery services. The EC explained it “opened a second antitrust investigation into Amazon’s business practices that might artificially favour its own retail offers and offers of marketplace sellers that use Amazon’s logistics and delivery services (the so-called “fulfilment by Amazon or FBA sellers”).” The EC continued:

  • In particular, the Commission will investigate whether the criteria that Amazon sets to select the winner of the “Buy Box” and to enable sellers to offer products to Prime users, under Amazon’s Prime loyalty programme, lead to preferential treatment of Amazon’s retail business or of the sellers that use Amazon’s logistics and delivery services.
  • The “Buy Box” is displayed prominently on Amazon’s websites and allows customers to add items from a specific retailer directly into their shopping carts. Winning the “Buy Box” (i.e. being chosen as the offer that features in this box) is crucial to marketplace sellers as the Buy Box prominently shows the offer of one single seller for a chosen product on Amazon’s marketplaces, and generates the vast majority of all sales. The other aspect of the investigation focusses on the possibility for marketplace sellers to effectively reach Prime users. Reaching these consumers is important to sellers because the number of Prime users is continuously growing and because they tend to generate more sales on Amazon’s marketplaces than non-Prime users.
  • If proven, the practice under investigation may breach Article 102 of the TFEU that prohibits the abuse of a dominant market position.

The EC’s antitrust action may be followed by action from the United States (U.S.) government. Media reports indicate the Federal Trade Commission (FTC) is also investigating Amazon’s conduct vis-à-vis third-party sellers on its platform and could also bring suit. However, the agency may lack the bandwidth and resources to do so if it proceeds with an antitrust action against Facebook, which is rumored to be filed by year’s end.

Moreover, the U.S. House of Representatives’ Judiciary Committee’s Antitrust, Commercial and Administrative Law Subcommittee’s “Investigation into Competition in Online Markets” detailed the same conduct the EU is alleging violates antitrust law:

One of the widely reported ways in which Amazon treats third-party sellers unfairly centers on Amazon’s asymmetric access to and use of third-party seller data. During the investigation, the Subcommittee heard repeated concerns that Amazon leverages its access to third-party sellers’ data to identify and replicate popular and profitable products from among the hundreds of millions of listings on its marketplace. Armed with this information, it appears that Amazon would: (1) copy the product to create a competing private-label product; or (2) identify and source the product directly from the manufacturer to free ride off the seller’s efforts, and then cut that seller out of the equation.

Amazon claims that it has no incentive to abuse sellers’ trust because third-party sales make up nearly 60% of its sales, and that Amazon’s first-party sales are relatively small. Amazon has similarly pointed out that third-party listings far outnumber Amazon’s first-party listings. In a recent shareholder letter, CEO Jeff Bezos wrote, “Third-party sellers are kicking our first-party butt. Badly.” In response to a question from the Subcommittee, however, Amazon admitted that by percentage of sales—a more telling measure—Amazon’s first-party sales are significant and growing in a number of categories. For example, in books, Amazon owns 74% of sales, whereas third-party sellers only account for 26% of sales. At the category level, it does not appear that third-party sellers are kicking Amazon’s first-party butt. Amazon may, in fact, be positioned to overtake its third-party sellers in several categories as its first-party business continues to grow.

As noted, earlier this year, the EC announced two antitrust investigations of Apple regarding allegations of unfair and anticompetitive practices with its App Store and Apple Pay.

In a press release, the EC announced it “has opened a formal antitrust investigation to assess whether Apple’s conduct in connection with Apple Pay violates EU competition rules…[that] concerns Apple’s terms, conditions and other measures for integrating Apple Pay in merchant apps and websites on iPhones and iPads, Apple’s limitation of access to the Near Field Communication (NFC) functionality (“tap and go”) on iPhones for payments in stores, and alleged refusals of access to Apple Pay.” The EC noted that “[f]ollowing a preliminary investigation, the Commission has concerns that Apple’s terms, conditions, and other measures related to the integration of Apple Pay for the purchase of goods and services on merchant apps and websites on iOS/iPadOS devices may distort competition and reduce choice and innovation.” The EC contended “Apple Pay is the only mobile payment solution that may access the NFC “tap and go” technology embedded on iOS mobile devices for payments in stores.” The EC revealed “[t]he investigation will also focus on alleged restrictions of access to Apple Pay for specific products of rivals on iOS and iPadOS smart mobile devices” and “will investigate the possible impact of Apple’s practices on competition in providing mobile payments solutions.”

In a press release issued the same day, the EC explained it had also “opened formal antitrust investigations to assess whether Apple’s rules for app developers on the distribution of apps via the App Store violate EU competition rules.” The EC said “[t]he investigations concern in particular the mandatory use of Apple’s own proprietary in-app purchase system and restrictions on the ability of developers to inform iPhone and iPad users of alternative cheaper purchasing possibilities outside of apps.” The EC added “[t]he investigations concern the application of these rules to all apps, which compete with Apple’s own apps and services in the European Economic Area (EEA)…[and] [t]he investigations follow-up on separate complaints by Spotify and by an e-book/audiobook distributor on the impact of the App Store rules on competition in music streaming and e-books/audiobooks.”

Finally, EU Executive Vice-President Margrethe Vestager recently gave a speech titled “Building trust in technology,” in which she previewed one long-awaited draft EU law on technology and another to address antitrust and anti-competitive practices of large technology companies. Vestager stated “in just a few weeks, we plan to publish two draft laws that will help to create a more trustworthy digital world.” Both drafts are expected on 2 December and represent key pieces of the new EU leadership’s Digital Strategy, the bloc’s initiative to update EU laws to account for changes in technology since the beginning of the century. The Digital Services Act will address and reform the legal treatment of both online commerce and online content. The draft Digital Markets Act would give the EC more tools to combat the same competition and market dominance issues posed by companies like Apple, Amazon, Facebook, and Google. Vestager stated:

  • So, to keep our markets fair and open to competition, it’s vital that we have the right toolkit in place. And that’s what the second set of rules we’re proposing – what we call the Digital Markets Act – is for. 
  • That proposal will have two pillars. The first of those pillars will be a clear list of dos and don’ts for big digital gatekeepers, based on our experience with the sorts of behaviour that can stop markets working well. 
  • For instance, the decisions that gatekeepers take, about how to rank different companies in search results, can make or break businesses in dozens of markets that depend on the platform. And if platforms also compete in those markets themselves, they can use their position as player and referee to help their own services succeed, at the expense of their rivals. For instance, gatekeepers might manipulate the way that they rank different businesses, to show their own services more visibly than their rivals’. So, the proposal that we’ll put forward in a few weeks’ time will aim to ban this particular type of unfair self-preferencing. 
  • We also know that these companies can collect a lot of data about companies that rely on their platform – data which they can then use, to compete against those very same companies in other markets. That can seriously damage fairness in these markets – which is why our proposal aims to ban big gatekeepers from misusing their business users’ data in that way. 
  • These clear dos and don’ts will allow us to act much faster and more effectively, to tackle behaviour that we know can stop markets working well. But we also need to be ready for new situations, where digitisation creates deep, structural failures in the way our markets work.  
  • Once a digital company gets to a certain size, with the big network of users and the huge collections of data that brings, it can be very hard for anyone else to compete – even if they develop a much better service. So, we face a constant risk that big companies will succeed in pushing markets to a tipping point, sending them on a rapid, unstoppable slide towards monopoly – and creating yet another powerful gatekeeper. 
  • One way to deal with risks like this would be to stop those emerging gatekeepers from locking users into their platform. That could mean, for instance, that those gatekeepers would have to make it easier for users to switch platform, or to use more than one service. That would keep the market open for competition, by making it easier for innovative rivals to compete. But right now, we don’t have the power to take this sort of action when we see these risks arising. 
  • It can also be difficult to deal with large companies which use the power of their platforms again and again, to take over one related market after another. We can deal with that issue with a series of cases – but the risk is that we’ll always find ourselves playing catch-up, while platforms move from market to market, using the same strategies to drive out their rivals. 
  • The risk, though, is that we’ll have a fragmented system, with different rules in different EU countries. That would make it hard to tackle huge platforms that operate throughout Europe, and to deal with other problems that you find in digital markets in many EU countries. And it would mean that businesses and consumers across Europe can’t all rely on the same protection. 
  • That’s why the second pillar of the Digital Markets Act would put a harmonised market investigation framework in place across the single market, giving us the power to tackle market failures like this in digital markets, and stop new ones from emerging. That would give us a harmonised set of rules that would allow us to investigate certain structural problems in digital markets. And if necessary, we could take action to make these markets contestable and competitive.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by stein egil liland from Pexels

EDPB Publishes Schrems II Recommendations; EU Parliament Issues Draft SCC Revisions

The EU takes steps to respond to the CJEU’s striking down of the EU-US Privacy Shield by augmenting SCCs and other transfer mechanisms.

The European Data Protection Board (EDPB) published recommendations for entities exporting and importing the personal data of European Union (EU) residents in light of the court decision striking down the adequacy decision that allowed transfers to the United States (U.S.). The EDPB noted that alternate mechanisms like standard contractual clauses (SCC) may still be used for transfers to nations without adequate protections of EU rights provided that supplemental measures are used. It should be noted that the EDPB said supplemental measures will be needed for any transfers to nations that do not guarantee the same level of rights as the EU, including transfers under Binding Corporate Rules (BCR). While the EDPB’s recommendations will undoubtedly prove persuasive with the Supervisory Authorities (SA), each SA will ultimately assess whether the mechanisms and supplementary measures used by entities comport with the General Data Protection Regulation (GDPR) and the EU’s Charter of Fundamental Rights.

In a summary of its decision in Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems, Case C-311/18 (Schrems II), the Court of Justice of the European Union (CJEU) explained:

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

Ultimately, the CJEU found the U.S. lacks the requisite safeguards needed under EU law, and so the general means of transferring the data of EU citizens from the EU to the U.S. was essentially struck down. This marked the second time in the last five years such an agreement had been found to violate EU law. However, the CJEU left open the question of whether SCCs may permit the continued exporting of EU personal data into the U.S. for companies like Facebook, Google, and many, many others. Consequently, there has been no small amount of interpreting and questioning of whether this may be a way for the trans-Atlantic data flow worth billions, perhaps even trillions, of dollars to continue. And yet, the CJEU seemed clear that additional measures would likely be necessary. Indeed, the CJEU asserted “[c]ontrollers and processors should be encouraged to provide additional safeguards via contractual commitments that supplement standard protection clauses” and “[i]n so far as those standard data protection clauses cannot, having regard to their very nature, provide guarantees beyond a contractual obligation to ensure compliance with the level of protection required under EU law, they may require, depending on the prevailing position in a particular third country, the adoption of supplementary measures by the controller in order to ensure compliance with that level of protection.”

In “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data,” the EDPB explained the genesis and rationale for the document:

  • The GDPR or the [CJEU] do not define or specify the “additional safeguards”, “additional measures” or “supplementary measures” to the safeguards of the transfer tools listed under Article 46.2 of the GDPR that controllers and processors may adopt to ensure compliance with the level of protection required under EU law in a particular third country.
  • The EDPB has decided, on its own initiative, to examine this question and to provide controllers and processors, acting as exporters, with recommendations on the process they may follow to identify and adopt supplementary measures. These recommendations aim at providing a methodology for the exporters to determine whether and which additional measures would need to be put in place for their transfers. It is the primary responsibility of exporters to ensure that the data transferred is afforded in the third country of a level of protection essentially equivalent to that guaranteed within the EU. With these recommendations, the EDPB seeks to encourage consistent application of the GDPR and the Court’s ruling, pursuant to the EDPB’s mandate.

Broadly speaking, whether SCCs and supplemental measures will pass muster under the GDPR will be determined on a case-by-case basis. The EDPB did not offer much in the way of bright line rules. Indeed, it will be up to SAs to determine if transfers to nations like the U.S. are possible under the GDPR, meaning these recommendations may shed more light on this central question without deciding it. One wonders, as a practical matter, if the SAs will have the capacity, resources, and will to police SCCs to ensure the GDPR and Charter are being met.

Nonetheless, the EDPB stressed the principle of accountability under which controllers which export personal data must ensure that whatever mechanism and supplemental measures govern a data transfer, the data must receive the same protection it would in the EU. The EDPB made the point that EU protections travel with the data and should EU personal data make its way to a country where it is not possible for appropriate protection to occur, then the transfer violates the GDPR. Moreover, these recommendations pertain to both public and private transfers of EU data to private sector entities outside the EU.

These recommendations work like a decision tree, with exporters needing to ask themselves a series of questions to determine whether they must use supplemental measures. This may prove a resource-intensive process, for exporters will need to map all transfers (i.e., know exactly where the data are going). The exporter must understand the laws and practices of the third nation in order to put in place appropriate measures, if this is possible, to meet the EU’s data protection standards.

Reading between the lines leads one to conclude that data exporters may not send personal data to the U.S., for its federal surveillance regime is not “necessary and proportionate,” at least from the EU’s view. The U.S. lacks judicial redress should a U.S. national, let alone a foreign national, object to the sweeping surveillance. The U.S. also has neither a national data protection law nor a dedicated data protection authority. These hints seem to convey the EDPB’s view on the sorts of legal reforms needed in the U.S. before a new adequacy decision would pass muster with the CJEU.

The EDPB said it was still evaluating how Schrems II affects the use of BCR and ad hoc contractual clauses, two of the other alternate means of transferring EU personal data in the absence of an adequacy agreement.

Nevertheless, in an annex, the EDPB provided examples of supplementary measures that may be used depending on the circumstances, of course, such as “flawlessly implemented” encryption and pseudonymizing data. However, the EDPB discusses these in the context of different scenarios and calls for more conditions than just the two aforementioned. Moreover, the EDPB rules out two scenarios categorically as being inadequate: “Transfer to cloud services providers or other processors which require access to data in the clear” and “Remote access to data for business purposes.”

The EDPB also issued an update to guidance published after the first lawsuit brought by Maximilian Schrems resulted in the striking down of the Safe Harbor transfer agreement. The forerunner to the EDPB, the Working Party 29, had drafted and released the European Essential Guarantees, and so, in light of Schrems II, the EDPB updated and published “Recommendations 02/2020 on the European Essential Guarantees for surveillance measures” “to provide elements to examine, whether surveillance measures allowing access to personal data by public authorities in a third country, being national security agencies or law enforcement authorities, can be regarded as a justifiable interference or not” with fundamental EU rights and protections. As the EDPB explains, these recommendations are intended to help data controllers and exporters determine whether other nations have protections and processes in place equivalent to those of the EU vis-à-vis their surveillance programs. The EDPB stressed that these are the essential guarantees and other features and processes may be needed for a determination of lawfulness under EU law.

The EDPB formulated the four European Essential Guarantees:

A. Processing should be based on clear, precise and accessible rules

B. Necessity and proportionality with regard to the legitimate objectives pursued need to be demonstrated

C. An independent oversight mechanism should exist

D. Effective remedies need to be available to the individual

The European Commission (EC) has also released for comment a draft revision of the SCCs for transfers of personal data to countries outside the EU. The EC is accepting comments and input until 10 December. It may be no accident that the EDPB and EC more or less acted in unison to address the practical and statutory changes necessary to effectuate the CJEU’s striking down of the EU-US Privacy Shield. Whatever the case, the EC released draft legislative language and, in an Annex, actual contract language for use by controllers and processors in the form of modules designed to cover a variety of common circumstances (e.g., transfers from a controller to another controller or from a controller to a processor). However, as the EDPB did, the EC stressed that the SCCs form a floor: controllers, processors, and other parties are free to add language so long as it does not contradict the clauses or derogate from the rights they protect.

In the implementing decision, the EC asserted:

the standard contractual clauses needed to be updated in light of new requirements in Regulation (EU) 2016/679. Moreover, since the adoption of these decisions, important developments have taken place in the digital economy, with the widespread use of new and more complex processing operations often involving multiple data importers and exporters, long and complex processing chains as well as evolving business relationships. This calls for a modernisation of the standard contractual clauses to better reflect those realities, by covering additional processing and transfer situations and to use a more flexible approach, for example with respect to the number of parties able to join the contract.

The EC continued:

The standard contractual clauses set out in the Annex to this Decision may be used by a controller or a processor in order to provide appropriate safeguards within the meaning of Article 46(1) of Regulation (EU) 2016/679 for the transfer of personal data to a processor or a controller established in a third country. This also includes the transfer of personal data by a controller or processor not established in the Union, to the extent that the processing is subject to Regulation (EU) 2016/679 pursuant to Article 3(2) thereof, because it relates to the offering of goods or services to data subjects in the Union or the monitoring of their behaviour as far as their behaviour takes place within the Union.

The EC explained the design and intent of the SCC language in the Annex:

  • The standard contractual clauses set out in the Annex to this Decision combine general clauses with a modular approach to cater for various transfer scenarios and the complexity of modern processing chains. In addition to the general clauses, controllers and processors should select the module applicable to their situation, which makes it possible to tailor their obligations under the standard contractual clauses to their corresponding role and responsibilities in relation to the data processing at issue. It should be possible for more than two parties to adhere to the standard contractual clauses. Moreover, additional controllers and processors should be allowed to accede to the standard contractual clauses as data exporters or importers throughout the life cycle of the contract of which those clauses form a part.
  • These Clauses set out appropriate safeguards, including enforceable data subject rights and effective legal remedies, pursuant to Article 46(1), and Article 46 (2)(c) of Regulation (EU) 2016/679 and, with respect to data transfers from controllers to processors and/or processors to processors, standard contractual clauses pursuant to Article 28(7) of Regulation (EU) 2016/679, provided they are not modified, except to add or update information in the Annexes. This does not prevent the Parties from including the standard contractual clauses laid down in this Clauses in a wider contract, and to add other clauses or additional safeguards provided that they do not contradict, directly or indirectly, the standard contractual clauses or prejudice the fundamental rights or freedoms of data subjects. These Clauses are without prejudice to obligations to which the data exporter is subject by virtue of the Regulation (EU) 2016/679

In October, the Trump Administration released a crib sheet it hopes U.S. multinationals can use to argue to SAs that SCCs, BCRs, and U.S. law satisfy the CJEU’s ruling that struck down the EU-U.S. Privacy Shield. The Trump Administration is basically arguing: sure, we spy, but most EU citizens’ data is not surveilled, and EU governments themselves often share in the proceeds of the surveillance we conduct. Moreover, there are plenty of safeguards and means of redress in the U.S. system because, you know, we say so. It is unlikely this analysis will be very persuasive in the EU, especially since these broad arguments address neither the criticisms the EU has leveled under Privacy Shield about U.S. surveillance and privacy rights nor the basis for the CJEU’s ruling.

Earlier this month, the European Data Protection Supervisor (EDPS) published a strategy detailing how EU agencies and bodies should comply with the CJEU ruling that struck down the EU-US Privacy Shield and threw into question the compliance of SCCs with EU law and the GDPR. The EDPS has already started working with EU institutions, bodies, offices, and agencies (EUIs) to determine whether their transfers of the personal data of people in the EU to the U.S. meet the CJEU’s judgment.

Photo by Anthony Beck from Pexels

Further Reading, Other Developments, and Coming Events (3 November)

Further Reading

  • “How Facebook and Twitter plan to handle election day disinformation” By Sam Dean — The Los Angeles Times; “How Big Tech is planning for election night” By Heather Kelly — The Washington Post; and “What to Expect From Facebook, Twitter and YouTube on Election Day” By Mike Isaac, Kate Conger and Daisuke Wakabayashi — The New York Times. Social media platforms have prepared for today and will use a variety of measures to combat lies, disinformation, and misinformation from sources foreign and domestic. Incidentally, my read is that these tech companies are more worried, as measured by resource allocation, about problematic domestic content.
  • “QAnon received earlier boost from Russian accounts on Twitter, archives show” By Joseph Menn — Reuters. Researchers have delved into Twitter’s archive of taken down or suspended Russian accounts and are now claiming that the Russian Federation was pushing and amplifying the QAnon conspiracy in the United States (U.S.) as early as November 2017. This revised view of Russian involvement differs from conclusions reached in August that Russia played a small but significant role in the proliferation of the conspiracy.
  • “Facial recognition used to identify Lafayette Square protester accused of assault” By Justin Jouvenal and Spencer S. Hsu — The Washington Post. When police were firing pepper balls and rolling gas canisters towards protestors in Lafayette Park in Washington, D.C. on 1 June 2020, a protestor was filmed assaulting two officers. A little known and not publicly revealed facial recognition technology platform available to many federal, state, and local law enforcement agencies in the Capital area resulted in a match with the footage. Now, a man stands accused of crimes during a Black Lives Matter march, and civil liberties and privacy advocates are objecting to the use of the National Capital Region Facial Recognition Investigative Leads System (NCRFRILS) on a number of grounds, including the demonstrated weakness of these systems to accurately identify people of color, the fact it has been used but not disclosed to a number of defendants, and the potential chilling effect it will have on people going to protests. Law enforcement officials claim there are strict privacy and process safeguards and an identification alone cannot be used as the basis of an arrest.
  • “CISA’s political independence from Trump will be an Election Day asset” By Joseph Marks — The Washington Post. The United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) has established its bona fides with nearly all stakeholders inside and outside the U.S. government since its creation in 2018. Many credit its Director, Christopher Krebs, and many also praise the agency for conveying information and tips about cyber and other threats without incurring the ire of a White House and President antithetical to any hint of Russian hacking or interference in U.S. elections. However, today and the next few days may prove the agency’s mettle as it will be in uncharted territory trying to tamp down fear about unfounded rumors and looking to discredit misinformation and disinformation in close to real time.
  • “Trump allies, largely unconstrained by Facebook’s rules against repeated falsehoods, cement pre-election dominance” By Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. Yet another article showing that if Facebook has a bias, it is against liberal viewpoints and content, for conservative material is allowed even if it breaks the rules of the platform. Current and former employees and an analysis support this finding. The Trump Family have been among those who have benefitted from the kid gloves used by the company regarding posts that would have likely resulted in consequences for other users. However, smaller conservative outlets and less prominent conservative figures have faced consequences for content that violates Facebook’s standards.

Other Developments

  • The European Union’s (EU) Executive Vice-President Margrethe Vestager made a speech titled “Building trust in technology,” in which she previewed one long-awaited draft EU law on technology and another to address antitrust and anti-competitive practices of large technology companies. Vestager stated “in just a few weeks, we plan to publish two draft laws that will help to create a more trustworthy digital world.” Both drafts are expected on 2 December and represent key pieces of the new EU leadership’s Digital Strategy, the bloc’s initiative to update EU laws to account for changes in technology since the beginning of the century. The Digital Services Act will address and reform the legal treatment of both online commerce and online content. The draft Digital Markets Act would give the European Commission (EC) more tools to combat the same competition and market dominance issues the United States (U.S.), Australia, and other nations are starting to tackle vis-à-vis companies like Apple, Amazon, Facebook, and Google.
    • Regarding the Digital Services Act, Vestager asserted:
      • One part of those rules will be the Digital Services Act, which will update the E-Commerce Directive, and require digital services to take more responsibility for dealing with illegal content and dangerous products. 
      • Those services will have to put clear and simple procedures in place, to deal with notifications about illegal material on their platforms. They’ll have to make it harder for dodgy traders to use the platform, by checking sellers’ IDs before letting them on the platform. And they’ll also have to provide simple ways for users to complain, if they think their material should not have been removed – protecting the right of legitimate companies to do business, and the right of individuals to freedom of expression. 
      • Those new responsibilities will help to keep Europeans just as safe online as they are in the physical world. They’ll protect legitimate businesses, which follow the rules, from being undercut by others who sell cheap, dangerous products. And by applying the same standards, all over Europe, they’ll make sure every European can rely on the same protection – and that digital businesses of all sizes can easily operate throughout Europe, without having to meet the costs of complying with different rules in different EU countries. 
      • The new rules will also require digital services – especially the biggest platforms – to be open about the way they shape the digital world that we see. They’ll have to report on what they’ve done to take down illegal material. They’ll have to tell us how they decide what information and products to recommend to us, and which ones to hide – and give us the ability to influence those decisions, instead of simply having them made for us. And they’ll have to tell us who’s paying for the ads that we see, and why we’ve been targeted by a certain ad.  
      • But to really give people trust in the digital world, having the right rules in place isn’t enough. People also need to know that those rules really work – that even the biggest companies will actually do what they’re supposed to. And to make sure that happens, there’s no substitute for effective enforcement.  
      • And effective enforcement is a vital part of the draft laws that we’ll propose in December. For instance, the Digital Services Act will improve the way national authorities cooperate, to make sure the rules are properly enforced, throughout the EU. Our proposal won’t change the fundamental principle, that digital services should be regulated by their home country. But it will set up a permanent system of cooperation that will help those regulators work more effectively, to protect consumers all across Europe. And it will give the EU power to step in, when we need to, to enforce the rules against some very large platforms.
    • Vestager also discussed the competition law the EC hopes to enact:
      • So, to keep our markets fair and open to competition, it’s vital that we have the right toolkit in place. And that’s what the second set of rules we’re proposing – what we call the Digital Markets Act – is for. 
      • That proposal will have two pillars. The first of those pillars will be a clear list of dos and don’ts for big digital gatekeepers, based on our experience with the sorts of behaviour that can stop markets working well. 
      • For instance, the decisions that gatekeepers take, about how to rank different companies in search results, can make or break businesses in dozens of markets that depend on the platform. And if platforms also compete in those markets themselves, they can use their position as player and referee to help their own services succeed, at the expense of their rivals. For instance, gatekeepers might manipulate the way that they rank different businesses, to show their own services more visibly than their rivals’. So, the proposal that we’ll put forward in a few weeks’ time will aim to ban this particular type of unfair self-preferencing. 
      • We also know that these companies can collect a lot of data about companies that rely on their platform – data which they can then use, to compete against those very same companies in other markets. That can seriously damage fairness in these markets – which is why our proposal aims to ban big gatekeepers from misusing their business users’ data in that way. 
      • These clear dos and don’ts will allow us to act much faster and more effectively, to tackle behaviour that we know can stop markets working well. But we also need to be ready for new situations, where digitisation creates deep, structural failures in the way our markets work.  
      • Once a digital company gets to a certain size, with the big network of users and the huge collections of data that brings, it can be very hard for anyone else to compete – even if they develop a much better service. So, we face a constant risk that big companies will succeed in pushing markets to a tipping point, sending them on a rapid, unstoppable slide towards monopoly – and creating yet another powerful gatekeeper. 
      • One way to deal with risks like this would be to stop those emerging gatekeepers from locking users into their platform. That could mean, for instance, that those gatekeepers would have to make it easier for users to switch platform, or to use more than one service. That would keep the market open for competition, by making it easier for innovative rivals to compete. But right now, we don’t have the power to take this sort of action when we see these risks arising. 
      • It can also be difficult to deal with large companies which use the power of their platforms again and again, to take over one related market after another. We can deal with that issue with a series of cases – but the risk is that we’ll always find ourselves playing catch-up, while platforms move from market to market, using the same strategies to drive out their rivals. 
      • The risk, though, is that we’ll have a fragmented system, with different rules in different EU countries. That would make it hard to tackle huge platforms that operate throughout Europe, and to deal with other problems that you find in digital markets in many EU countries. And it would mean that businesses and consumers across Europe can’t all rely on the same protection. 
      • That’s why the second pillar of the Digital Markets Act would put a harmonised market investigation framework in place across the single market, giving us the power to tackle market failures like this in digital markets, and stop new ones from emerging. That would give us a harmonised set of rules that would allow us to investigate certain structural problems in digital markets. And if necessary, we could take action to make these markets contestable and competitive.
  • California Governor Gavin Newsom (D) vetoed one of the bills sent to him to amend the “California Consumer Privacy Act” (AB 375) last week. In mid-October, he signed two bills that amended the CCPA, but one will only take effect if the “California Privacy Rights Act” (CPRA) (Ballot Initiative 24) is not enacted by voters in the November election. Moreover, if the CPRA is enacted via ballot, then the two signed bills would likely become dead letters, as the CCPA and its amendments would become moot once the CPRA becomes effective in 2023.
    • Newsom vetoed AB 1138, which would have amended the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtained explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.” In his veto message, Newsom explained that while he agrees with the spirit of the legislation, it would create unnecessary confusion and overlap with federal law without any increase in protection for children. He signaled an openness to working with the legislature on this issue, however.
    • Newsom signed AB 1281 that would extend the carveout for employers to comply with the CCPA from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
    • Newsom also signed AB 713 that would:
      • except from the CCPA information that was deidentified in accordance with specified federal law, or was derived from medical information, protected health information, individually identifiable health information, or identifiable private information, consistent with specified federal policy, as provided.
      • except from the CCPA a business associate of a covered entity, as defined, that is governed by federal privacy, security, and data breach notification rules if the business associate maintains, uses, and discloses patient information in accordance with specified requirements.
      • except information that is collected for, used in, or disclosed in research, as defined.
      • additionally prohibit a business or other person from reidentifying information that was deidentified, unless a specified exception is met.
      • beginning January 1, 2021, require a contract for the sale or license of deidentified information to include specified provisions relating to the prohibition of reidentification, as provided.
  • The Government Accountability Office (GAO) published a report requested by the chair of a House committee on a Federal Communications Commission (FCC) program that subsidizes broadband providers to serve high-cost areas, typically people in rural or hard-to-reach areas. Even though the FCC provided nearly $5 billion in 2019 through the Universal Service Fund’s (USF) high-cost program, the agency lacks data to determine whether the goals of the program are being met. House Energy and Commerce Committee Chair Frank Pallone Jr (D-NJ) had asked for the assessment, which could well form the basis for future changes to how the FCC funds and sets conditions for use of said funds for broadband.
    • The GAO noted:
      • According to stakeholders GAO interviewed, FCC faces three key challenges to accomplish its high-cost program performance goals: (1) accuracy of FCC’s broadband deployment data, (2) broadband availability on tribal lands, and (3) maintaining existing fixed-voice infrastructure and attaining universal mobile service. For example, although FCC adopted a more precise method of collecting and verifying broadband availability data, stakeholders expressed concern the revised data would remain inaccurate if carriers continue to overstate broadband coverage for marketing and competitive reasons. Overstating coverage impairs FCC’s efforts to promote universal voice and broadband since an area can become ineligible for high-cost support if a carrier reports that service already exists in that area. FCC has also taken actions to address the lack of broadband availability on tribal lands, such as making some spectrum available to tribes for wireless broadband in rural areas. However, tribal stakeholders told GAO that some tribes are unable to secure funding to deploy the infrastructure necessary to make use of spectrum for wireless broadband purposes.
    • The GAO concluded:
      • The effective use of USF resources is critical given the importance of ensuring that Americans have access to voice and broadband services. Overall, FCC’s high-cost program performance goals and measures are often not conducive to providing FCC with high-quality performance information. The performance goals lack measurable or quantifiable bases upon which numeric targets can be set. Further, while FCC’s performance measures met most of the key attributes of effective measures, the measures often lacked linkage, clarity, and objectivity. Without such clarity and specific desired outcomes, FCC lacks performance information that could help FCC make better-informed decisions about how to allocate the program’s resources to meet ongoing and emerging challenges to ensuring universal access to voice and broadband services. Furthermore, the absence of public reporting of this information leaves stakeholders, including Congress, uncertain about the program’s effectiveness to deploy these services.
    • The GAO recommended that
      • The Chairman of FCC should revise the high-cost performance goals so that they are measurable and quantifiable.
      • The Chairman of FCC should ensure high-cost performance measures align with key attributes of successful performance measures, including ensuring that measures clearly link with performance goals and have specified targets.
      • The Chairman of FCC should ensure the high-cost performance measure for the goal of minimizing the universal service contribution burden on consumers and businesses takes into account user-fee leading practices, such as equity and sustainability considerations.
      • The Chairman of FCC should publicly and periodically report on the progress it has made for its high-cost program’s performance goals, for example, by including relevant performance information in its Annual Broadband Deployment Report or the USF Monitoring Report.
  • An Irish court has ruled that the Data Protection Commission (DPC) must cover the legal fees of Maximillian Schrems for litigating and winning his case in which the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). It has been estimated that Schrems’ legal costs for his complaint against Facebook could total between €2 and €5 million, the low end of which would represent 10% of the DPC’s budget this year. In addition, the DPC has itself spent almost €3 million litigating the case.
  • House Transportation and Infrastructure Chair Peter DeFazio (D-OR) and Ranking Member Sam Graves (R-MO) asked that the Government Accountability Office (GAO) review the implications of the Federal Communications Commission’s (FCC) 2019 proposal to allow half of the Safety Band (i.e. 5.9 GHz spectrum) to be used for wireless communications even though the United States (U.S.) Department of Transportation weighed in against this decision. DeFazio and Graves are asking for a study of “the safety implications of sharing more than half of the 5.9 GHz spectrum band” that is currently “reserved exclusively for transportation safety purposes.” This portion of the spectrum was set aside in 1999 for connected vehicle technologies, according to DeFazio and Graves, and given the rise of automobile technology that requires spectrum, they think the FCC’s proposed proceeding “may significantly affect the efficacy of current and future applications of vehicle safety technologies.” DeFazio and Graves asked the GAO to review the following:
    • What is the current status of 5.9 GHz wireless transportation technologies, both domestically and internationally?
    • What are the views of industry and other stakeholders on the potential uses of these technologies, including scenarios where the 5.9 GHz spectrum is shared among different applications?
    • In a scenario in which the 5.9 GHz spectrum band is shared among different applications, what effect would this have on the efficacy, including safety, of current wireless transportation technology deployments?
    • What options exist for automakers and highway management agencies to ensure the safe deployment of connected vehicle technologies in a scenario in which the 5.9 GHz spectrum band is shared among different applications?
    • How, if at all, have DOT and FCC assessed current and future needs for 5.9 GHz spectrum in transportation applications? 
    • How, if at all, have DOT and FCC coordinated to develop a federal spectrum policy for connected vehicles?
  • The Harvard Law School’s Cyberlaw Clinic and the Electronic Frontier Foundation (EFF) have published “A Researcher’s Guide to Some Legal Risks of Security Research,” which is timely given the case before the Supreme Court of the United States that will determine the scope of the “Computer Fraud and Abuse Act” (CFAA). The Cyberlaw Clinic and EFF stated:
    • Just last month, over 75 prominent security researchers signed a letter urging the Supreme Court not to interpret the CFAA, the federal anti-hacking/computer crime statute, in a way that would criminalize swaths of valuable security research.
    • In the report, the Cyberlaw Clinic and EFF explained:
      • This guide overviews broad areas of potential legal risk related to security research, and the types of security research likely implicated. We hope it will serve as a useful starting point for concerned researchers and others. While the guide covers what we see as the main areas of legal risk for security researchers, it is not exhaustive.
    • In Van Buren v. United States, the Court will consider the question of “[w]hether a person who is authorized to access information on a computer for certain purposes violates Section 1030(a)(2) of the Computer Fraud and Abuse Act if he accesses the same information for an improper purpose.” In this case, the defendant was a police officer who took money as part of a sting operation to illegally use his access to Georgia’s database of license plates to obtain information about a person. The Eleventh Circuit Court of Appeals denied his appeal of his conviction under the CFAA per a previous ruling in that circuit that “a defendant violates the CFAA not only when he obtains information that he has no “rightful[]” authorization whatsoever to acquire, but also when he obtains information “for a nonbusiness purpose.”
    • As the Cyberlaw Clinic and EFF noted in the report:
      • Currently, courts are divided about whether the statute’s prohibition on “exceed[ing] authorized access” applies to people who have authorization to access data (for some purpose), but then access it for a (different) purpose that violates a contractual terms of service or computer use policy. Some courts have found that the CFAA covers activities that do not circumvent any technical access barriers, from making fake profiles that violate Facebook’s terms of service to running a search on a government database without permission. Other courts, disagreeing with the preceding approach, have held that a verbal or contractual prohibition alone cannot render access punishable under the CFAA.
  • The United Kingdom’s Institution of Engineering and Technology (IET) and the National Cyber Security Centre (NCSC) have published “Code of Practice: Cyber Security and Safety” to help engineers develop technological and digital products with both safety and cybersecurity in mind. The IET and NCSC explained:
    • The aim of this Code is to help organizations accountable for safety-related systems manage cyber security vulnerabilities that could lead to hazards. It does this by setting out principles that when applied will improve the interaction between the disciplines of system safety and cyber security, which have historically been addressed as distinct activities.
    • The objectives for organizations that follow this Code are: (a) to manage the risk to safety from any insecurity of digital technology; (b) to be able to provide assurance that this risk is acceptable; and (c) where there is a genuine conflict between the controls used to achieve safety and security objectives, to ensure that these are properly resolved by the appropriate authority.
    • It should be noted that the focus of this Code is on the safety and security of digital technology, but it is recognized that addressing safety and cyber security risk is not just a technological issue: it involves people, process, physical and technological aspects. It should also be noted that whilst digital technology is central to the focus, other technologies can be used in pursuit of a cyber attack. For example, the study of analogue emissions (electromagnetic, audio, etc.) may give away information being digitally processed and thus analogue interfaces could be used to provide an attack surface.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Pexels from Pixabay