ePrivacy Exception Proposed

A broad exception to the EU’s privacy rules has been proposed, but it has not yet been enacted.

My apologies. The first version of this post erroneously asserted the derogation to the ePrivacy Directive had been enacted. It has not, and this post has been re-titled and updated to reflect this fact.

As the European Union (EU) continues to work on enacting a modernized ePrivacy Directive (Directive 2002/58/EC) to complement the General Data Protection Regulation (GDPR), it has proposed an exemption to manage a change in another EU law that sweeps “number-independent interpersonal communications services” into the current regulatory structure for electronic communications. The policy justification for allowing a categorical exemption to the ePrivacy Directive is combatting child sexual abuse online. This derogation of EU law would be limited to at most five years, and quite possibly less if the EU can enact a successor to the ePrivacy Directive, an ePrivacy Regulation. However, it is unclear when this derogation will be agreed upon and enacted.

In September 2020, the European Commission (EC) issued “a Proposal for a Regulation on a temporary derogation from certain provisions of the ePrivacy Directive 2002/58/EC as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online.” The change in law prompting the proposal, the European Electronic Communications Code, took effect on 21 December 2020, but the derogation itself has not been enacted. Separately, the EC has also issued a draft compromise ePrivacy Regulation, the result of extensive negotiations. The GDPR was enacted with an update of the ePrivacy Directive in mind.

In early December, an EU Parliament committee approved the proposed derogation, but the full Parliament has not yet acted upon the measure. The Parliament would then need to reach agreement with the Presidency of the Council and the European Commission. In its press release, the Committee on Civil Liberties, Justice and Home Affairs explained:

The proposed regulation will provide for limited and temporary changes to the rules governing the privacy of electronic communications so that over-the-top (“OTT”) interpersonal communication services, such as web messaging, voice over Internet Protocol (VoIP), chat and web-based email services, can continue to detect, report and remove child sexual abuse online on a voluntary basis.

Article 1 sets out the scope and aim of the temporary regulation:

This Regulation lays down temporary and strictly limited rules derogating from certain obligations laid down in Directive 2002/58/EC, with the sole objective of enabling providers of number-independent interpersonal communications services to continue the use of technologies for the processing of personal and other data to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services.

The EC explained the legal and policy background for the exemption to the ePrivacy Directive:

  • On 21 December 2020, with the entry into application of the European Electronic Communications Code (EECC), the definition of electronic communications services will be replaced by a new definition, which includes number-independent interpersonal communications services. From that date on, these services will, therefore, be covered by the ePrivacy Directive, which relies on the definition of the EECC. This change concerns communications services like webmail messaging services and internet telephony.
  • Certain providers of number-independent interpersonal communications services are already using specific technologies to detect child sexual abuse on their services and report it to law enforcement authorities and to organisations acting in the public interest against child sexual abuse, and/or to remove child sexual abuse material. These organisations refer to national hotlines for reporting child sexual abuse material, as well as organisations whose purpose is to reduce child sexual exploitation, and prevent child victimisation, located both within the EU and in third countries.
  • Child sexual abuse is a particularly serious crime that has wide-ranging and serious life-long consequences for victims. In hurting children, these crimes also cause significant and long-term social harm. The fight against child sexual abuse is a priority for the EU. On 24 July 2020, the European Commission adopted an EU strategy for a more effective fight against child sexual abuse, which aims to provide an effective response, at EU level, to the crime of child sexual abuse. The Commission announced that it will propose the necessary legislation to tackle child sexual abuse online effectively including by requiring relevant online services providers to detect known child sexual abuse material and oblige them to report that material to public authorities by the second quarter of 2021. The announced legislation will be intended to replace this Regulation, by putting in place mandatory measures to detect and report child sexual abuse, in order to bring more clarity and certainty to the work of both law enforcement and relevant actors in the private sector to tackle online abuse, while ensuring respect of the fundamental rights of the users, including in particular the right to freedom of expression and opinion, protection of personal data and privacy, and providing for mechanisms to ensure accountability and transparency.

The EC baldly asserts that the problem of online child sexual abuse justifies a loophole in the broad prohibition on violating the privacy of EU persons. The EC did note that the fight against this sort of crime is a political priority for the EC, one that ostensibly puts the EU close to the views of the Five Eyes nations that have been pressuring technology companies to end the practice of making apps and hardware encrypted by default.

The EC explained:

The present proposal therefore presents a narrow and targeted legislative interim solution with the sole objective of creating a temporary and strictly limited derogation from the applicability of Articles 5(1) and 6 of the ePrivacy Directive, which protect the confidentiality of communications and traffic data. This proposal respects the fundamental rights, including the rights to privacy and protection of personal data, while enabling providers of number-independent interpersonal communications services to continue using specific technologies and continue their current activities to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services, pending the adoption of the announced long-term legislation. Voluntary efforts to detect solicitation of children for sexual purposes (“grooming”) also must be limited to the use of existing, state-of-the-art technology that corresponds to the safeguards set out. This Regulation should cease to apply in December 2025.

The EC added “[i]n case the announced long-term legislation is adopted and enters into force prior to this date, that legislation should repeal the present Regulation.”

In November, the European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski published his opinion on the temporary, limited derogation from the EU’s rules on electronic communications and privacy. Wiewiórowski cautioned that a short-term exception, however well-intended, would lead to future loopholes that would ultimately undermine the purpose of the legislation. Moreover, Wiewiórowski found that the derogation lacked sufficiently specific guidance and safeguards and was not proportionate. Wiewiórowski argued:

  • In particular, he notes that the measures envisaged by the Proposal would constitute an interference with the fundamental rights to respect for private life and data protection of all users of very popular electronic communications services, such as instant messaging platforms and applications. Confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life. Even voluntary measures by private companies constitute an interference with these rights when the measures involve the monitoring and analysis of the content of communications and processing of personal data.
  • The EDPS wishes to underline that the issues at stake are not specific to the fight against child abuse but to any initiative aiming at collaboration of the private sector for law enforcement purposes. If adopted, the Proposal will inevitably serve as a precedent for future legislation in this field. The EDPS therefore considers it essential that the Proposal is not adopted, even in the form of a temporary derogation, until all the necessary safeguards set out in this Opinion are integrated.
  • In particular, in the interest of legal certainty, the EDPS considers that it is necessary to clarify whether the Proposal itself is intended to provide a legal basis for the processing within the meaning of the GDPR, or not. If not, the EDPS recommends clarifying explicitly in the Proposal which legal basis under the GDPR would be applicable in this particular case.
  • In this regard, the EDPS stresses that guidance by data protection authorities cannot substitute compliance with the requirement of legality. It is insufficient to provide that the temporary derogation is “without prejudice” to the GDPR and to mandate prior consultation of data protection authorities. The co-legislature must take its responsibility and ensure that the proposed derogation complies with the requirements of Article 15(1), as interpreted by the CJEU.
  • In order to satisfy the requirement of proportionality, the legislation must lay down clear and precise rules governing the scope and application of the measures in question and imposing minimum safeguards, so that the persons whose personal data is affected have sufficient guarantees that data will be effectively protected against the risk of abuse.
  • Finally, the EDPS is of the view that the five-year period as proposed does not appear proportional given the absence of (a) a prior demonstration of the proportionality of the envisaged measure and (b) the inclusion of sufficient safeguards within the text of the legislation. He considers that the validity of any transitional measure should not exceed 2 years.

The Five Eyes nations (Australia, Canada, New Zealand, the United Kingdom, and the United States) issued a joint statement in which their ministers called for quick action.

In this statement, we highlight how from 21 December 2020, the ePrivacy Directive, applied without derogation, will make it easier for children to be sexually exploited and abused without detection – and how the ePrivacy Directive could make it impossible both for providers of internet communications services, and for law enforcement, to investigate and prevent such exploitation and abuse. It is accordingly essential that the European Union adopt urgently the derogation to the ePrivacy Directive as proposed by the European Commission in order for the essential work carried out by service providers to shield endangered children in Europe and around the world to continue.

Without decisive action, from 21 December 2020 internet-based messaging services and e-mail services captured by the European Electronic Communications Code’s (EECC) new, broader definition of ‘electronic communications services’ are covered by the ePrivacy Directive. The providers of electronic communications services must comply with the obligation to respect the confidentiality of communications and the conditions for processing communications data in accordance with the ePrivacy Directive. In the absence of any relevant national measures made under Article 15 of that Directive, this will have the effect of making it illegal for service providers operating within the EU to use their current tools to protect children, with the impact on victims felt worldwide.

As mentioned, this derogation comes at a time when the EC and the EU nations are trying to finalize and enact an ePrivacy Regulation. In the original 2017 proposal, the EC stated:

The ePrivacy Directive ensures the protection of fundamental rights and freedoms, in particular the respect for private life, confidentiality of communications and the protection of personal data in the electronic communications sector. It also guarantees the free movement of electronic communications data, equipment and services in the Union.

The ePrivacy Regulation is intended to work in concert with the GDPR, and the draft 2020 regulation contains the following passages explaining the intended interplay of the two regulatory schemes:

  • Regulation (EU) 2016/679 regulates the protection of personal data. This Regulation protects in addition the respect for private life and communications. The provisions of this Regulation particularise and complement the general rules on the protection of personal data laid down in Regulation (EU) 2016/679. This Regulation therefore does not lower the level of protection enjoyed by natural persons under Regulation (EU) 2016/679. The provisions particularise Regulation (EU) 2016/679 as regards personal data by translating its principles into specific rules. If no specific rules are established in this Regulation, Regulation (EU) 2016/679 should apply to any processing of data that qualify as personal data. The provisions complement Regulation (EU) 2016/679 by setting forth rules regarding subject matters that are not within the scope of Regulation (EU) 2016/679, such as the protection of the rights of end-users who are legal persons. Processing of electronic communications data by providers of electronic communications services and networks should only be permitted in accordance with this Regulation. This Regulation does not impose any obligations on the end-user. End-users who are legal persons may have rights conferred by Regulation (EU) 2016/679 to the extent specifically required by this Regulation.
  • While the principles and main provisions of Directive 2002/58/EC of the European Parliament and of the Council remain generally sound, that Directive has not fully kept pace with the evolution of technological and market reality, resulting in an inconsistent or insufficiently effective protection of privacy and confidentiality in relation to electronic communications. Those developments include the entrance on the market of electronic communications services that from a consumer perspective are substitutable to traditional services, but do not have to comply with the same set of rules. Another development concerns new techniques that allow for tracking of online behaviour of end-users, which are not covered by Directive 2002/58/EC. Directive 2002/58/EC should therefore be repealed and replaced by this Regulation.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Guillaume Périgois on Unsplash

Sponsors Take A New Run At Privacy Law in Washington State

Perhaps the third time is the charm? Legislators seek to pass a privacy law in Washington state for the third year in a row.

A group of senators in Washington state have introduced a slightly altered version of a privacy bill they floated last summer. A committee of jurisdiction will hold a hearing on SB 5062 on 14 January 2021. Of course, this marks the third year in a row legislators have tried to enact the Washington Privacy Act. The new bill (SB 5062) tracks closely with the two bills produced by the Washington Senate and House last year that lawmakers could not ultimately reconcile. However, there are no provisions on facial recognition technology, which was largely responsible for sinking a privacy bill in Washington state two years ago. The sponsors have also taken the unusual step of appending language covering the collection and processing of personal data to combat infectious diseases like COVID-19.

I analyzed the discussion draft that Washington State Senator Reuven Carlyle (D-Seattle) released over the summer, and so I will not recite everything about the new bill. It should suffice to highlight the differences between the discussion draft and the introduced legislation. Big picture, the bill still uses the concepts of data controllers and processors most famously enshrined in the European Union’s (EU) General Data Protection Regulation (GDPR). Like other privacy bills, generally, people in Washington state would not need to consent before an entity could collect and process their information. People would be able to opt out of some activities, but most data collection and processing could still occur as it presently does.

The date on which the bill would take effect was pushed back from 120 days in the discussion draft to 31 July 2022 in the introduced bill. And while SB 5062, unlike the discussion draft, would cover non-profits, institutions of higher education, airlines, and others, the effective date for covering these entities would be 31 July 2026. The right of a person to access personal data a controller is processing is narrowed slightly: a person would no longer receive the personal data the controller has but rather the categories of personal data. The time controllers would have to respond to a certain class of requests would be decreased from 45 to 15 days. This class includes requests to opt out of targeted advertising, the sale of personal data, and any profiling in furtherance of decisions with legal effects. Section 106’s requirement that processors have reasonable security measures has been massaged, rephrased, and possibly weakened a bit.

One of the activities controllers and processors could undertake without meeting the requirements of the act was removed. Notably, they would no longer be able to “conduct internal research solely to improve or repair products, services, or technology.” There is also a clarification that using any of the exemptions in Section 110 does not make an entity a controller for purposes of the bill. There is a new requirement that the State Office of Privacy and Data Protection examine current technology that allows for mass or global opt-out or opt-in and then report to the legislature. Finally, two of the congressional stakeholders on privacy and data security hail from Washington state, and consideration and possible passage of a state law may limit their latitude on a federal bill they could support. Senator Maria Cantwell (D-WA) and Representative Cathy McMorris Rodgers (R-WA), the ranking members of the Senate Commerce, Science, and Transportation Committee and the House Energy and Commerce Committee respectively, are expected to be involved in drafting their committees’ privacy bills, and a Washington state statute may affect their positions in much the same way the “California Consumer Privacy Act” (CCPA) (AB 375) has informed a number of California members’ positions on privacy legislation, especially with respect to bills seen as weaker than the CCPA.

Image by Kranich17 from Pixabay

UK and EU Defer Decision On Data Flows

Whether there will be an adequacy decision allowing the free flow of personal data under the GDPR from the EU to the recently departed UK has been punted. And, its recent status as a member of the EU notwithstanding, the UK might not get an adequacy decision.

In settling many aspects of the United Kingdom’s (UK) exit from the European Union (EU), negotiators did not reach agreement on whether the EU would permit the personal data of EU persons to continue flowing to the UK under the easiest means possible. Instead, the EU and UK agreed to let the status quo continue until an adequacy decision is made or six months lapse. Data flows between the UK and EU were valued at more than £100 billion in 2017, according to British estimates, with the majority of this trade flowing from the UK to the EU.

Under the General Data Protection Regulation (GDPR), the personal data of EU persons can be transferred to other nations for most purposes once the European Commission (EC) has found the other nation provides protections essentially equivalent to those granted in the EU. Of course, this has been an ongoing issue with data flows to the United States (U.S.), as two agreements (Safe Harbor and Privacy Shield) and their EC adequacy decisions were ruled illegal, in large part because, according to the EU’s highest court, U.S. law does not provide EU persons with the same rights they have in the EU. Most recently, this occurred in 2020 when the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the EU-United States Privacy Shield (aka Schrems II). It bears note that transfers of personal data may occur through other means under the GDPR that may prove more resource intensive: standard data protection clauses (SCCs), binding corporate rules (BCRs), and others.

Nevertheless, an adequacy decision is seen as the most desirable means of transfer, and the question of whether the UK’s laws are sufficient has lingered over the Brexit discussions, with some claiming that the nation’s membership in the Five Eyes surveillance alliance with the U.S. and others may disqualify the UK. Given the range of thorny issues the UK and EU punted (e.g., how to handle the border between Northern Ireland and Ireland), it is not surprising that the GDPR and data flows were also punted.

The UK-EU Trade and Cooperation Agreement (TCA) spells out the terms of the data flow agreement and, as noted, in the short term the status quo will continue, with data flows to the UK being treated as if the UK were still part of the EU. This state of affairs will persist until the EC reaches an adequacy decision, or for four months, with another two months of the status quo possible in the absence of an adequacy decision so long as neither the UK nor the EU objects (a short sketch of this timing logic follows the TCA excerpt below). Moreover, these provisions are operative only so long as the UK keeps its GDPR-compliant data protection law (i.e., the UK Data Protection Act 2018) in place and does not exercise specified “designated powers.” The UK has also deemed EU, European Economic Area (EEA), and European Free Trade Association (EFTA) nations adequate for purposes of data transfers from the UK on a transitional basis.

Specifically, the TCA provides:

For the duration of the specified period, transmission of personal data from the Union to the United Kingdom shall not be considered as transfer to a third country under Union law, provided that the data protection legislation of the United Kingdom on 31 December 2020, as it is saved and incorporated into United Kingdom law by the European Union (Withdrawal) Act 2018 and as modified by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 (“the applicable data protection regime”), applies and provided that the United Kingdom does not exercise the designated powers without the agreement of the Union within the Partnership Council.
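
To make the timing rules just described concrete, here is a minimal Python sketch. The start date and month arithmetic are simplifying assumptions of mine (the TCA counts calendar months from its application, and I have fixed illustrative dates); the function name is hypothetical.

```python
from datetime import date

# Sketch of the TCA "specified period" described above (illustrative assumptions).
START = date(2021, 1, 1)         # assumed start: provisional application of the TCA
BASE_END = date(2021, 5, 1)      # four months after the start
EXTENDED_END = date(2021, 7, 1)  # two further months, absent an objection

def bridge_active(today: date, adequacy_adopted: bool, party_objected: bool) -> bool:
    # The period ends early once an adequacy decision is adopted; otherwise it
    # runs four months, or six if neither party objects to the extension.
    if adequacy_adopted:
        return False
    end = BASE_END if party_objected else EXTENDED_END
    return today < end

print(bridge_active(date(2021, 6, 1), adequacy_adopted=False, party_objected=False))  # True
```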

The UK also agreed to notify the EU if it “enters into a new instrument which can be relied on to transfer personal data to a third country under Article 46(2)(a) of the UK GDPR or section 75(1)(a) of the UK Data Protection Act 2018 during the specified period.” However, if the EU were to object, it appears from the terms of the TCA that all the EU could do is force the UK “to discuss the relevant objection.” And yet, should the UK sign a treaty allowing personal data to flow to a nation the EU deems inadequate, this could obviously adversely affect the UK’s prospects of getting an adequacy decision.

Not surprisingly, the agreement also pertains to the continued flow of personal data as part of criminal investigations and law enforcement matters but not national security matters. Moreover, these matters fall outside the scope of the GDPR and would not be affected in many ways by an adequacy decision or a lack of one. In a British government summary, it is stated that the TCA

provide[s] for law enforcement and judicial cooperation between the UK, the EU and its Member States in relation to the prevention, investigation, detection and prosecution of criminal offences and the prevention of and fight against money laundering and financing of terrorism.

The text of the TCA makes clear that national security matters vis-à-vis data flows and information sharing are not covered:

This Part only applies to law enforcement and judicial cooperation in criminal matters taking place exclusively between the United Kingdom, on the one side, and the Union and the Member States, on the other side. It does not apply to situations arising between the Member States, or between Member States and Union institutions, bodies, offices and agencies, nor does it apply to the activities of authorities with responsibilities for safeguarding national security when acting in that field.

The TCA also affirms:

  • The cooperation provided for in this Part is based on the Parties’ long-standing commitment to ensuring a high level of protection of personal data.
  • To reflect that high level of protection, the Parties shall ensure that personal data processed under this Part is subject to effective safeguards in the Parties’ respective data protection regimes…

The United Kingdom’s data protection authority (DPA), the Information Commissioner’s Office (ICO), issued an explanation of how British law enforcement entities should act in light of the TCA. The ICO explained to British entities how law enforcement-related data transfers to the UK will work:

  • We are now a ‘third country’ for EU data protection purposes. If you receive personal data from a law enforcement partner in the EU, this means the sender will need to comply with the transfer provisions under their national data protection law (which are likely to be similar to those in Part 3 of the DPA 2018).
  • This means the EU sender needs to make sure other appropriate safeguards are in place – probably through a contract or other binding legal instrument, or by making their own assessment of appropriate safeguards. The sender can take into account the protection provided by the DPA 2018 itself when making this assessment.
  • If you receive personal data from other types of organisations in the EU or EEA who are subject to the GDPR, the sender will need to comply with the transfer provisions of the UK GDPR. You may want to consider putting standard contractual clauses (SCCs) in place to ensure adequate safeguards in these cases. We have produced an interactive tool to help you use the SCCs.

The ICO explained for transfers from the UK to the EU (but not the EEA):

  • There is a transitional adequacy decision in place to cover transfers to EU member states and Gibraltar. This will not extend to EEA countries outside the EU, where you should continue to consider other safeguards.
  • This means you can continue to send personal data from the UK to your law enforcement partners in the EU, as long as you can show the transfer is necessary for law enforcement purposes. You can also transfer personal data to non-law enforcement bodies in the EU if you can meet some additional conditions, but you will need to notify the ICO.

Turning back to an adequacy decision and commercial transfers of personal data from the EU to the UK, in what may well be a preview of a world in which there is no adequacy decision between the UK and EU, the European Data Protection Board (EDPB) issued an “information note” in mid-December that spells out how the GDPR would be applied:

  • In the absence of an adequacy decision applicable to the UK as per Article 45 GDPR, such transfers will require appropriate safeguards (e.g., standard data protection clauses, binding corporate rules, codes of conduct…), as well as enforceable data subject rights and effective legal remedies for data subjects, in accordance with Article 46 GDPR.
  • Subject to specific conditions, it may still be possible to transfer personal data to the UK based on a derogation listed in Article 49 GDPR. However, Article 49 GDPR has an exceptional nature and the derogations it contains must be interpreted restrictively and mainly relate to processing activities that are occasional and non-repetitive.
  • Moreover, where personal data are transferred to the UK on the basis of Article 46 GDPR safeguards, supplementary measures might be necessary to bring the level of protection of the data transferred up to the EU standard of essential equivalence, in accordance with the Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data.

Regarding commercial data transfers, the ICO issued a statement urging British entities to start setting up “alternative transfer mechanisms” to ensure data continues to flow from the EU to the UK:

  • The Government has announced that the Treaty agreed with the EU will allow personal data to flow freely from the EU (and EEA) to the UK, until adequacy decisions have been adopted, for no more than six months.
  • This will enable businesses and public bodies across all sectors to continue to freely receive data from the EU (and EEA), including law enforcement agencies.
  • As a sensible precaution, before and during this period, the ICO recommends that businesses work with EU and EEA organisations who transfer personal data to them, to put in place alternative transfer mechanisms, to safeguard against any interruption to the free flow of EU to UK personal data.

However, even though these more restrictive means of transferring personal data to the UK exist, there will likely be legal challenges. It bears note that in light of Schrems II, EU DPAs are likely to apply a much higher level of scrutiny to SCCs, and challenges to the legality of using SCCs to transfer personal data to the U.S. have already commenced. It seems all but certain the legality of using SCCs to transfer data to the UK would be challenged as well.

However, returning to the preliminary issue of whether the EC will give the UK an adequacy decision, there may be a number of obstacles to a finding that the UK’s data protection and surveillance laws are indeed adequate under EU law[1]. Firstly, the UK’s surveillance practices may prove difficult for the EC to stomach in light of a recent set of CJEU rulings. In 2020, the CJEU handed down a pair of rulings (here and here) on the extent to which European Union (EU) nations may engage in bulk, indiscriminate collection of two types of data related to electronic communications. The CJEU found that while EU member nations may conduct these activities to combat crime or national security threats during periods limited by necessity and subject to oversight, they may not generally require the providers of electronic communications to indiscriminately store and provide location data and traffic data in response to an actual or prospective national security danger. The CJEU combined three cases from the UK, France, and Belgium into two rulings to elucidate the reach of the Privacy and Electronic Communications Directive in relation to foundational EU laws.

The UK is, of course, one of the U.S.’s staunchest allies and partners when it comes to government surveillance of electronic communications. On this point, the CJEU summarized the beginning of the case out of the UK:

  • At the beginning of 2015, the existence of practices for the acquisition and use of bulk communications data by the various security and intelligence agencies of the United Kingdom, namely GCHQ, MI5 and MI6, was made public, including in a report by the Intelligence and Security Committee of Parliament (United Kingdom). On 5 June 2015, Privacy International, a non-governmental organisation, brought an action before the Investigatory Powers Tribunal (United Kingdom) against the Secretary of State for Foreign and Commonwealth Affairs, the Secretary of State for the Home Department and those security and intelligence agencies, challenging the lawfulness of those practices.

Secondly, the government of Prime Minister Boris Johnson may aspire to change data laws in ways the EU does not. In media accounts, unnamed EC officials were critical of the UK’s 2020 “National Data Strategy,” particularly references to “legal barriers (real and perceived)” to accessing data that “must be addressed.”

Thirdly, it may become a matter of politics. The EU has incentives to make the UK’s exit from the EU difficult to dissuade other nations from following the same path. Moreover, having previously been the second largest economy in the EU as measured by GDP, the UK may prove a formidable economic competitor, lending more weight to the view that the EU may not want to help the UK’s businesses compete with the EU’s.

Image by succo from Pixabay


[1] European Union Parliament, “The EU-UK relationship beyond Brexit: options for Police Cooperation and Judicial Cooperation in Criminal Matters,” Page 8: Although the UK legal framework is currently broadly in line with the EU legal framework and the UK is a signatory to the European Convention on Human Rights (ECHR), there are substantial questions over whether the Data Protection Act fully incorporates the data protection elements required by the Charter of Fundamental Rights, concerning the use of the national security exemption from the GDPR used by the UK, the retention of data and bulk powers granted to its security services, and over its onward transfer of this data to third country security partners such as the ‘Five Eyes’ partners (Britain, the USA, Australia, New Zealand and Canada).

EU Regulators Settle Dispute Over Proper Punishment of Twitter For Breach

The EDPB uses its GDPR powers to manage a dispute between DPAs.

The European Data Protection Board (EDPB) concluded its first use of powers granted under the General Data Protection Regulation (GDPR) (Regulation (EU) 2016/679 of the European Parliament and of the Council) to resolve a dispute among EU regulators on how to apply the GDPR in punishing a violator. In this case, the EDPB had to referee how Twitter should be punished for a data breach arising from a bug affecting users of its Android app. Ireland’s Data Protection Commission (DPC) and unnamed concerned supervisory authorities (CSAs) disagreed about how Twitter should be fined for the GDPR violations, and so a previously unused article of the GDPR was triggered, putting the EDPB in charge of resolving the dispute. The EDPB considered the objections raised by other EU agencies and found that the DPC needed to recalculate its fine, which had been proposed at a maximum of $300,000 out of a possible $69.2 million. Thereafter, the DPC revised its decision and concluded that “an administrative fine of €450,000 on Twitter” is “an effective, proportionate and dissuasive measure.”

The DPC issued a revised decision incorporating the EDPB’s ruling in the case, which arose from a glitch that changed a person’s protected tweets to unprotected. Twitter users may protect their tweets, meaning only certain people, usually just followers, can see this content. However, a bug affecting Twitter on the Android OS resulted in a person’s desire to protect their tweets being thwarted, as the DPC explained:

The bug that resulted in this data breach meant that, if a user operating an Android device changed the email address associated with that Twitter account, their tweets became unprotected and consequently were accessible to the wider public without the user’s knowledge.

The DPC said this breach occurred between September 2017 and January 2019, affecting 88,726 EU and European Economic Area (EEA) users, and on 8 January 2019, Twitter alerted the DPC, triggering an investigation. Twitter revealed:

On 26 December 2018, we received a bug report through our bug bounty program that if a Twitter user with a protected account, using Twitter for Android, changed their email address the bug would result in their account being unprotected.

Article 33(1) of the GDPR requires breaches to be reported to a DPA within 72 hours in most cases:

In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay.

However, Twitter conceded as much in explaining why it had not reported the breach within the 72-hour window:

The severity of the issue – and that it was reportable – was not appreciated until 3 January 2019 at which point Twitter’s incident response process was put into action.

Additionally, Article 33(5) would become relevant during the DPC investigation:

The controller shall document any personal data breaches, comprising the facts relating to the personal data breach, its effects and the remedial action taken. That documentation shall enable the supervisory authority to verify compliance with this Article.

Consequently, Twitter had a responsibility as the controller to document all the relevant facts about the data breach and, subject to a range of exceptions, to report the breach within 72 hours of becoming aware of it.
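
To make the Article 33(1) clock concrete, here is a minimal Python sketch using the dates reported in this case; the variable names and the use of midnight timestamps are my own simplifying assumptions.

```python
from datetime import datetime, timedelta, timezone

# Illustrative only: Article 33(1)'s 72-hour clock runs from when the controller
# becomes "aware" of the breach, not from when the underlying bug was first reported.
aware = datetime(2019, 1, 3, tzinfo=timezone.utc)     # severity appreciated (per the post)
notified = datetime(2019, 1, 8, tzinfo=timezone.utc)  # DPC notified (per the post)
deadline = aware + timedelta(hours=72)

print(f"Notification deadline: {deadline:%d %B %Y}")  # 06 January 2019
print(f"Notified late: {notified > deadline}")        # True, hence the Article 33(1) issue
```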

Shortly thereafter, the DPC named itself the lead supervisory authority (LSA), investigated, and reached its proposed decision in late April, submitting it to the concerned supervisory authorities. And this is where the need for the EDPB to step in began.

Irish Data Protection Commissioner Helen Dixon explained the scope of the subsequent investigation:

  1. Whether Twitter International Company (TIC) complied with its obligations, in accordance with Article 33(1) GDPR, to notify the Commission of the Breach without undue delay and, where feasible, not later than 72 hours after having become aware of it; and
  2. Whether TIC complied with its obligation under Article 33(5) to document the Breach.

Dixon found that TIC did not comply with Article 33(1), rejecting TIC’s main claim that it did not need to meet the 72-hour window because Twitter, Inc., its processor under EU law, had not alerted TIC in a timely fashion. Moreover, Dixon found TIC did not meet its Article 33(5) obligation to document the breach in a way that would allow its compliance with Article 33 to be verified. However, the size of the fine became the issue necessitating that the EDPB step in, because the Austrian Supervisory Authority (Österreichische Datenschutzbehörde), the German Supervisory Authority (Der Hamburgische Beauftragte für Datenschutz und Informationsfreiheit) and the Italian Supervisory Authority (Garante per la protezione dei dati personali) made “relevant and reasoned” objections.

Per the GDPR, the EDPB intervened. Article 60 of the GDPR provides that if a CSA “expresses a relevant and reasoned objection to the draft decision [of the LSA], the lead supervisory authority shall, if it does not follow the relevant and reasoned objection or is of the opinion that the objection is not relevant or reasoned, submit the matter to the consistency mechanism.” Article 65 also provides that where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned,” the EDPB must step in and work towards a final binding decision. This process was installed so that enforcement of the GDPR would be uniform throughout the EU and to forestall the possibility that one DPA or a small group of DPAs would construe the data protection regime in ways contrary to its intention. As it is, there have already been allegations that some DPAs have been ineffective or lenient towards alleged offenders.

In its mid-November statement, the EDPB said it “adopted by a 2/3 majority of its members its first dispute resolution decision on the basis of Art. 65 GDPR.” The EDPB stated:

The Irish supervisory authority (SA) issued the draft decision following an own-volition inquiry and investigations into Twitter International Company, after the company notified the Irish SA of a personal data breach on 8 January 2019. In May 2020, the Irish SA shared its draft decision with the concerned supervisory authorities (CSAs) in accordance with Art. 60 (3) GDPR. The CSAs then had four weeks to submit their relevant and reasoned objections (RROs.) Among others, the CSAs issued RROs on the infringements of the GDPR identified by the lead supervisory authority (LSA), the role of Twitter International Company as the (sole) data controller, and the quantification of the proposed fine. 

It appears from the EDPB’s statement that other DPAs/SAs had objected to the size of the fine (which can be as high as 2% of annual revenue), how Twitter violated the GDPR, and Twitter’s culpability based on whether it was the only controller of the personal data or whether other controllers might also be held responsible.

According to the DPC, the EDPB ultimately decided that

…the [DPC] is required to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC, and to amend its Draft Decision by increasing the level of the fine in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.

Dixon went back and reasoned through the breach and compliance. She stressed that the GDPR infringements were largely separate from the substance of the breach, which is why the administrative fine was low. Nonetheless, Dixon reexamined the evidence in light of the EDPB’s decision and concluded in relevant part:

  • I therefore consider that the nature of the obligations arising under Article 33(1) and Article 33(5) are such that, compliance is central to the overall functioning of the supervision and enforcement regime performed by supervisory authorities in relation to both the specific issue of personal data breaches but also the identification and assessment of wider issues of non-compliance by controllers. As such, non-compliance with these obligations has serious consequences in that it risks undermining the effective exercise by supervisory authorities of their functions under the GDPR. With regard to the nature of the specific infringements in these circumstances, it is clear, having regard to the foregoing, that in the circumstances of this case, the delayed notification under Article 33(1) inevitably delayed the Commission’s assessment of the Breach. With regard to Article 33(5), the deficiencies in the “documenting” of the Breach by TIC impacted on the Commission’s overall efficient assessment of the Breach, necessitating the raising of multiple queries concerning the facts and sequencing surrounding the notification of the Breach.
  • Accordingly, having regard to the potential for damage to data subjects caused by the delayed notification to the Commission (which I have set out above in the context of Article 83(2)(a)), the corollary of this is that any category of personal data could have been affected by the delayed notification. Whilst, as stated above, there was no direct evidence of damage, at the same time, it cannot be definitively said that there was no damage to data subjects or no affected categories of personal data.

Dixon also recalculated the fine, which she noted was capped at €10 million or 2% of annual worldwide revenue, after once again turning aside TIC’s argument that it should be treated as independent of Twitter, Inc. for purposes of determining a fine. Dixon determined the appropriate administrative fine would be about $500,000; Twitter’s worldwide revenue was $3.46 billion in 2019, meaning a maximum penalty of $69.2 million (a quick arithmetic check follows the excerpt below). Dixon explained:

Having regard to all of the foregoing, and, in particular, having had due regard to all of the factors which I am required to consider under Articles 83(2)(a) to (k), as applicable, and in the interests of effectiveness, proportionality and deterrence, and in light of the re-assessment of the elements I have implemented and documented above in accordance with the EDPB Decision, I have decided to impose an administrative fine of $500,000, which equates (in my estimation for this purpose) to €450,000. In deciding to impose a fine in this amount, I have had regard to the previous range of the fine, set out in the Draft Decision (of $150,000 – $300,000), and to the binding direction in the EDPB Decision, at paragraph 207 thereof, that the level of the fine should be increased “..in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.”
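
As a quick check of the arithmetic behind these figures, here is a minimal sketch. The revenue and fine figures are those cited above; the 2% prong of the cap controls here because Twitter’s turnover far exceeds the point where 2% surpasses the €10 million floor.

```python
# Back-of-the-envelope check of the fine figures cited above (illustrative only).
worldwide_revenue_usd = 3.46e9          # Twitter's 2019 worldwide revenue, as cited
cap_usd = 0.02 * worldwide_revenue_usd  # the 2% prong; it dwarfs the EUR 10 million floor here
print(f"Maximum penalty: ${cap_usd / 1e6:.1f} million")         # $69.2 million

fine_usd = 500_000                      # the fine imposed (equated to EUR 450,000)
print(f"Fine as a share of the cap: {fine_usd / cap_usd:.2%}")  # ~0.72%
```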

In its Article 65 decision, the EDPB judged the various objections to the DPC’s proposed decision against Article 4(24) of the GDPR:

‘relevant and reasoned objection’ means an objection to a draft decision as to whether there is an infringement of this Regulation, or whether envisaged action in relation to the controller or processor complies with this Regulation, which clearly demonstrates the significance of the risks posed by the draft decision as regards the fundamental rights and freedoms of data subjects and, where applicable, the free flow of personal data within the Union;

The EDPB ultimately decided “the fine proposed in the Draft Decision is too low and therefore does not fulfil its purpose as a corrective measure, in particular it does not meet the requirements of Article 83(1) GDPR of being effective, dissuasive and proportionate.” The EDPB directed the DPC “to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC so as to ensure it is appropriate to the facts of the case.” However, the EDPB turned aside a number of other objections raised by EU DPAs as failing to meet the standard of review in Article 4(24):

  • the competence of the LSA;
  • the qualification of the roles of TIC and Twitter, Inc., respectively;
  • the infringements of the GDPR identified by the LSA;
  • the existence of possible additional (or alternative) infringements of the GDPR;
  • the lack of a reprimand;

Nevertheless, the EDPB stressed:

Regarding the objections deemed not to meet the requirements stipulated by Art 4(24) GDPR, the EDPB does not take any position on the merit of any substantial issues raised by these objections. The EDPB reiterates that its current decision is without any prejudice to any assessments the EDPB may be called upon to make in other cases, including with the same parties, taking into account the contents of the relevant draft decision and the objections raised by the CSAs.

Image by papagnoc from Pixabay

EC Finally Unveils Digital Services Act and Digital Markets Act

The EU releases its proposals to remake digital markets.

The European Commission (EC) has released its draft proposals to remake how the European Union (EU) regulates digital markets and digital services, the latest in the bloc’s attempts to rein in what it sees as harms and abuses to people and competition in Europe and the world. At the earliest, these proposals would take effect in 2022. They are sure to be vigorously opposed by large United States (U.S.) multinationals like Google and Facebook and will also likely face more restrained pushback from the U.S. government.

The Digital Markets Act would allow the EU to designate certain providers of core platform services as gatekeepers, subject to certain quantitative metrics or on a case-by-case basis. Once a company is deemed a gatekeeper, it would be subject to much greater regulation by the EU, and violations of the new act could result in fines of up to 10% of worldwide revenue.

In its press release, the EC asserted:

European values are at the heart of both proposals. The new rules will better protect consumers and their fundamental rights online, and will lead to fairer and more open digital markets for everyone. A modern rulebook across the single market will foster innovation, growth and competitiveness and will provide users with new, better and reliable online services. It will also support the scaling up of smaller platforms, small and medium-sized enterprises, and start-ups, providing them with easy access to customers across the whole single market while lowering compliance costs. Furthermore, the new rules will prohibit unfair conditions imposed by online platforms that have become or are expected to become gatekeepers to the single market. The two proposals are at the core of the Commission’s ambition to make this Europe’s Digital Decade.

In the Digital Markets Act, the EC explained the problem with large platforms dominating certain digital markets. The EC discussed the harm to people and medium and small businesses as some large companies control certain markets and use their size and dominance to extract unfair prices for inferior services and products. The EC listed the core platform services that might be regulated:

  • online intermediation services (incl. for example marketplaces, app stores and online intermediation services in other sectors like mobility, transport or energy)
  • online search engines,
  • social networking
  • video sharing platform services,
  • number-independent interpersonal electronic communication services,
  • operating systems,
  • cloud services and
  • advertising services, including advertising networks, advertising exchanges and any other advertising intermediation services, where these advertising services are being related to one or more of the other core platform services mentioned above.

Clearly, a number of major American firms could easily be considered “core platform services” including Amazon, Apple, Google, Facebook, Instagram, YouTube, WhatsApp, Microsoft, and others. Whether they would be deemed gatekeepers would hinge on whether they meet the quantitative metrics the EU will put in place, and this will be a rebuttable presumption such that if a firm meets the standards, it may present evidence to the contrary and argue it is not a gatekeeper.

The EC detailed the quantitative metrics in Article 3. A company may qualify if it meets all three of the following criteria subject to further metrics:

A provider of core platform services shall be designated as gatekeeper if:

(a) it has a significant impact on the internal market;

(b) it operates a core platform service which serves as an important gateway for business users to reach end users; and

(c) it enjoys an entrenched and durable position in its operations or it is foreseeable that it will enjoy such a position in the near future.

The other metrics include annual EEA turnover of at least €6.5 billion in each of the last three financial years, or a €65 billion market capitalization, plus the provision of a core platform service in at least three member states, to show a “significant impact on the internal market.” For the second category listed above, a company would need to provide a core platform service to more than 45 million monthly active end users in the EU and more than 10,000 yearly active business users in the EU. And, for the last category, passing the 45 million user and 10,000 business user thresholds for three consecutive years would suffice. A short sketch of this threshold logic follows the excerpt. The act reads:

A provider of core platform services shall be presumed to satisfy:

(a) the requirement in paragraph 1 point (a) where the undertaking to which it belongs achieves an annual EEA turnover equal to or above EUR 6.5 billion in the last three financial years, or where the average market capitalisation or the equivalent fair market value of the undertaking to which it belongs amounted to at least EUR 65 billion in the last financial year, and it provides a core platform service in at least three Member States;

(b) the requirement in paragraph 1 point (b) where it provides a core platform service that has more than 45 million monthly active end users established or located in the Union and more than 10,000 yearly active business users established in the Union in the last financial year; for the purpose of the first subparagraph, monthly active end users shall refer to the average number of monthly active end users throughout the largest part of the last financial year;

(c) the requirement in paragraph 1 point (c) where the thresholds in point (b) were met in each of the last three financial years.
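
To make the cumulative structure of these presumptions concrete, here is a minimal Python sketch of the Article 3 tests as excerpted above. The data structure and function names are illustrative assumptions of mine, not anything in the proposal, and the proposal’s finer points (e.g., averaging monthly users over “the largest part of the last financial year”) are simplified.

```python
from dataclasses import dataclass

# Thresholds from the draft Digital Markets Act, Article 3, as excerpted above.
TURNOVER_EUR = 6.5e9     # annual EEA turnover, each of the last three financial years
MARKET_CAP_EUR = 65e9    # average market capitalisation, last financial year
MIN_MEMBER_STATES = 3
END_USERS = 45_000_000   # monthly active end users in the Union
BUSINESS_USERS = 10_000  # yearly active business users in the Union

@dataclass
class Provider:
    annual_eea_turnover: list[float]  # last three financial years, oldest first
    market_cap_eur: float
    member_states: int
    monthly_end_users: list[int]      # last three financial years, oldest first
    yearly_business_users: list[int]  # last three financial years, oldest first

def presumed_gatekeeper(p: Provider) -> bool:
    # (a) significant impact on the internal market: turnover OR market cap,
    #     plus a core platform service in at least three member states
    impact = (all(t >= TURNOVER_EUR for t in p.annual_eea_turnover)
              or p.market_cap_eur >= MARKET_CAP_EUR) and p.member_states >= MIN_MEMBER_STATES
    # (b) important gateway: user thresholds exceeded in the last financial year
    gateway = (p.monthly_end_users[-1] > END_USERS
               and p.yearly_business_users[-1] > BUSINESS_USERS)
    # (c) entrenched and durable position: (b)'s thresholds met in each of the last three years
    entrenched = (all(u > END_USERS for u in p.monthly_end_users)
                  and all(b > BUSINESS_USERS for b in p.yearly_business_users))
    return impact and gateway and entrenched  # all three criteria are cumulative
```

Note that meeting all three tests only creates a rebuttable presumption; as discussed above, a provider could still present evidence that it is not a gatekeeper.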

The EU would also be able to label a provider of core platform services a gatekeeper on a case-by-case basis:

Provision should also be made for the assessment of the gatekeeper role of providers of core platform services which do not satisfy all of the quantitative thresholds, in light of the overall objective requirements that they have a significant impact on the internal market, act as an important gateway for business users to reach end users and benefit from a durable and entrenched position in their operations or it is foreseeable that it will do so in the near future.

It bears note that a company would be found to be a gatekeeper if it is merely foreseeable that it will satisfy these criteria soon. This flexibility could allow the EU to track companies and flag them as gatekeepers before they, in fact, achieve the sort of market dominance this regulation is intended to stop.

Among the relevant excerpts from the “Reasons for and objectives of the proposal” section of the act are:

  • Large platforms have emerged benefitting from characteristics of the sector such as strong network effects, often embedded in their own platform ecosystems, and these platforms represent key structuring elements of today’s digital economy, intermediating the majority of transactions between end users and business users. Many of these undertakings are also comprehensively tracking and profiling end users. A few large platforms increasingly act as gateways or gatekeepers between business users and end users and enjoy an entrenched and durable position, often as a result of the creation of conglomerate ecosystems around their core platform services, which reinforces existing entry barriers.
  • As such, these gatekeepers have a major impact on, have substantial control over the access to, and are entrenched in digital markets, leading to significant dependencies of many business users on these gatekeepers, which leads, in certain cases, to unfair behaviour vis-à-vis these business users. It also leads to negative effects on the contestability of the core platform services concerned. Regulatory initiatives by Member States cannot fully address these effects; without action at EU level, they could lead to a fragmentation of the Internal Market.
  • Unfair practices and lack of contestability lead to inefficient outcomes in the digital sector in terms of higher prices, lower quality, as well as less choice and innovation to the detriment of European consumers. Addressing these problems is of utmost importance in view of the size of the digital economy (estimated at between 4.5% to 15.5% of global GDP in 2019 with a growing trend) and the important role of online platforms in digital markets with its societal and economic implications.
  • Weak contestability and unfair practices in the digital sector are more frequent and pronounced in certain digital services than others. This is the case in particular for widespread and commonly used digital services and infrastructures that mostly directly intermediate between business users and end users.
  • The enforcement experience under EU competition rules, numerous expert reports and studies and the results of the OPC show that there are a number of digital services that have the following features: (i) highly concentrated multi-sided platform services, where usually one or very few large digital platforms set the commercial conditions with considerable autonomy; (ii) a few large digital platforms act as gateways for business users to reach their customers and vice-versa; and (iii) gatekeeper power of these large digital platforms is often misused by means of unfair behaviour vis-à-vis economically dependent business users and customers.
  • The proposal is therefore further limited to a number of ‘core platform services’ where the identified problems are most evident and prominent and where the presence of a limited number of large online platforms that serve as gateways for business users and end users has led or is likely to lead to weak contestability of these services and of the markets in which these intervene. These core platform services include: (i) online intermediation services (incl. for example marketplaces, app stores and online intermediation services in other sectors like mobility, transport or energy), (ii) online search engines, (iii) social networking, (iv) video sharing platform services, (v) number-independent interpersonal electronic communication services, (vi) operating systems, (vii) cloud services and (viii) advertising services, including advertising networks, advertising exchanges and any other advertising intermediation services, where these advertising services are being related to one or more of the other core platform services mentioned above.
  • The fact that a digital service qualifies as a core platform service does not mean that issues of contestability and unfair practices arise in relation to every provider of these core platform services. Rather, these concerns appear to be particularly strong when the core platform service is operated by a gatekeeper. Providers of core platform services can be deemed to be gatekeepers if they: (i) have a significant impact on the internal market, (ii) operate one or more important gateways to customers and (iii) enjoy or are expected to enjoy an entrenched and durable position in their operations.
  • Such gatekeeper status can be determined either with reference to clearly circumscribed and appropriate quantitative metrics, which can serve as rebuttable presumptions to determine the status of specific providers as a gatekeeper, or based on a case-by-case qualitative assessment by means of a market investigation.

The Digital Services Act would add new regulation on top of Directive 2000/31/EC (aka the e-Commerce Directive) by “[b]uilding on the key principles set out in the e-Commerce Directive, which remain valid today.” This new scheme “seeks to ensure the best conditions for the provision of innovative digital services in the internal market, to contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services.”

The Digital Services Act is focused mostly on the information and misinformation present all over the online world and the harms they wreak on EU citizens. However, the EC is also seeking to balance fundamental EU rights in more tightly regulating online platforms. Like the Digital Markets Act, this regulation would focus on the largest online content, product, and services providers, which, as a practical matter, would likely be Facebook, Amazon, Google, Spotify, and a handful of other companies. Once a company has 10% or more of the EU’s population using its offerings, the requirements of the Digital Services Act would be triggered.
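As a back-of-the-envelope check on that trigger, the 45 million recipient figure quoted later in the proposal is calibrated to roughly 10% of the Union’s population, and the Commission would adjust it as the population changes. A minimal sketch, with an illustrative population figure of my choosing:

```python
# Rough arithmetic behind the Digital Services Act trigger: obligations
# attach to platforms reaching about 10% of the Union's population,
# currently estimated at more than 45 million recipients of the service.
eu_population = 447_000_000  # approximate EU-27 population; illustrative only
threshold = int(round(0.10 * eu_population, -6))  # 10%, rounded to the nearest million
print(threshold)  # 45000000 -> the ~45 million figure in the proposal
```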

Additionally, the Digital Services Act unites two online issues not usually considered together in the United States (U.S.): harmful online content and harmful online products. Even though it seems logical to consider these online offerings in tandem, there is a clear bifurcation in the U.S. in how these two issues are regulated, to the extent they are regulated at all, at the federal and state levels.

The Digital Services Act “will introduce a series of new, harmonised EU-wide obligations for digital services, carefully graduated on the basis of those services’ size and impact, such as:

  • Rules for the removal of illegal goods, services or content online;
  • Safeguards for users whose content has been erroneously deleted by platforms;
  • New obligations for very large platforms to take risk-based action to prevent abuse of their systems;
  • Wide-ranging transparency measures, including on online advertising and on the algorithms used to recommend content to users;
  • New powers to scrutinize how platforms work, including by facilitating access by researchers to key platform data;
  • New rules on traceability of business users in online market places, to help track down sellers of illegal goods or services;
  • An innovative cooperation process among public authorities to ensure effective enforcement across the single market.”

The EC explained:

new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole.

The EC spelled out what the Digital Services Act would do:

This Regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes:

(a) a framework for the conditional exemption from liability of providers of intermediary services;

(b) rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services;

(c) rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities.

The EC explained the purpose of the act:

  • this proposal seeks to ensure the best conditions for the provision of innovative digital services in the internal market, to contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services.
  • The proposal defines clear responsibilities and accountability for providers of intermediary services, and in particular online platforms, such as social media and marketplaces. By setting out clear due-diligence obligations for certain intermediary services, including notice-and-action procedures for illegal content and the possibility to challenge the platforms’ content moderation decisions, the proposal seeks to improve users’ safety online across the entire Union and improve the protection of their fundamental rights. Furthermore, an obligation for certain online platforms to receive, store and partially verify and publish information on traders using their services will ensure a safer and more transparent online environment for consumers.
  • Recognising the particular impact of very large online platforms on our economy and society, the proposal sets a higher standard of transparency and accountability on how the providers of such platforms moderate content, on advertising and on algorithmic processes. It sets obligations to assess the risks their systems pose to develop appropriate risk management tools to protect the integrity of their services against the use of manipulative techniques.

The EC summarized how the act will work:

  • The operational threshold for service providers in scope of these obligations includes those online platforms with a significant reach in the Union, currently estimated to be amounting to more than 45 million recipients of the service. This threshold is proportionate to the risks brought by the reach of the platforms in the Union; where the Union’s population changes by a certain percentage, the Commission will adjust the number of recipients considered for the threshold, so that it consistently corresponds to 10 % of the Union’s population. Additionally, the Digital Services Act will set out a co-regulatory backstop, including building on existing voluntary initiatives.
  • This proposal should constitute the appropriate basis for the development of robust technologies to prevent the reappearance of illegal information, accompanied with the highest safeguards to avoid that lawful content is taken down erroneously; such tools could be developed on the basis of voluntary agreements between all parties concerned and should be encouraged by Member States; it is in the interest of all parties involved in the provision of intermediary services to adopt and implement such procedures; the provisions of this Regulation relating to liability should not preclude the development and effective operation, by the different interested parties, of technical systems of protection and identification and of automated recognition made possible by digital technology within the limits laid down by Regulation 2016/679.
  • Union citizens and others are exposed to ever-increasing risks and harms online – from the spread of illegal content and activities, to limitations to express themselves and other societal harms. The envisaged policy measures in this legislative proposal will substantially improve this situation by providing a modern, future-proof governance framework, effectively safeguarding the rights and legitimate interests of all parties involved, most of all Union citizens. The proposal introduces important safeguards to allow citizens to freely express themselves, while enhancing user agency in the online environment, as well as the exercise of other fundamental rights such as the right to an effective remedy, non-discrimination, rights of the child as well as the protection of personal data and privacy online.
  • The proposed Regulation will mitigate risks of erroneous or unjustified blocking of speech, address the chilling effects on speech, stimulate the freedom to receive information and hold opinions, as well as reinforce users’ redress possibilities. Specific groups or persons may be vulnerable or disadvantaged in their use of online services because of their gender, race or ethnic origin, religion or belief, disability, age or sexual orientation. They can be disproportionately affected by restrictions and removal measures following from (unconscious or conscious) biases potentially embedded in the notification systems by users and third parties, as well as replicated in automated content moderation tools used by platforms. The proposal will mitigate discriminatory risks, particularly for those groups or persons and will contribute to the protection of the rights of the child and the right to human dignity online. The proposal will only require removal of illegal content and will impose mandatory safeguards when users’ information is removed, including the provision of explanatory information to the user, complaint mechanisms supported by the service providers as well as external out-of-court dispute resolution mechanisms. Furthermore, it will ensure EU citizens are also protected when using services provided by providers not established in the Union but active on the internal market, since those providers are covered too.
  • With regard to service providers’ freedom to conduct a business, the costs incurred on businesses are offset by reducing fragmentation across the internal market. The proposal introduces safeguards to alleviate the burden on service providers, including measures against repeated unjustified notices and prior vetting of trusted flaggers by public authorities. Furthermore, certain obligations are targeted to very large online platforms, where the most serious risks often occur and which have the capacity to absorb the additional burden.
  • The proposed legislation will preserve the prohibition of general monitoring obligations of the e-Commerce Directive, which in itself is crucial to the required fair balance of fundamental rights in the online world. The new Regulation prohibits general monitoring obligations, as they could disproportionately limit users’ freedom of expression and freedom to receive information, and could burden service providers excessively and thus unduly interfere with their freedom to conduct a business. The prohibition also limits incentives for online surveillance and has positive implications for the protection of personal data and privacy.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Sambeet D from Pixabay

Privacy Shield Hearing

The focus was on how the U.S. and EU can reach agreement on an arrangement that will not be struck down by the EU’s highest court.

Last week, the Senate Commerce, Science, and Transportation Committee held a hearing on the now invalidated European Union (EU)-United States (U.S.) Privacy Shield, a mechanism that allowed companies to transfer the personal data of EU residents to the U.S. The EU’s highest court struck down the adequacy decision that underpinned the system on the basis of U.S. surveillance activities and a lack of redress that violated EU law. This is the second time in a decade the EU’s top court has invalidated a transfer arrangement, the first being the Safe Harbor system. Given the estimated billions, or even trillions, of dollars in value realized from data flows between the EU and U.S., there is keen interest on both sides of the Atlantic in finding a legal path forward. However, absent significant curtailment of U.S. surveillance and/or a significant expansion of the means by which EU nationals could have violations of their rights rectified, it would appear a third agreement may not withstand the inevitable legal challenges. Moreover, there are questions as to the legality of other transfer tools in light of the Court of Justice of the European Union’s (CJEU) decision in the case known as Schrems II, and some Standard Contractual Clauses (SCC) and Binding Corporate Rules (BCR) may soon be found in violation of EU law, too.

Consequently, a legislative fix, or some portion thereof, could be attached to federal privacy legislation. Hence, the striking down of Privacy Shield may provide additional impetus to Congress and the next Administration to reach a deal on privacy. Moreover, the lapsed reauthorization of some Foreign Intelligence Surveillance Act (FISA) authorities may be another legislative opportunity for the U.S. to craft an approach amenable to the EU in order to obtain either an adequacy decision or a successor agreement to the Privacy Shield.

Chair Roger Wicker (R-MS) approached the issue from the perspective of international trade and the economic benefit accruing to businesses on both sides of the Atlantic. His opening remarks pertained less to the privacy and surveillance aspects of the CJEU’s ruling. Wicker appeared to be making the case that the EU misunderstands that redress rights in the U.S. are more than adequate and that the U.S.’s surveillance regime is similar to those of some EU nations. One wonders if the CJEU is inclined to agree with this position. Nonetheless, Wicker expressed hope that the EU and U.S. can reach “a durable and lasting data transfer framework…that provides meaningful data protections to consumers, sustains the free flow of information across the Atlantic, and encourages continued economic and strategic partnership with our European allies – a tall order but an essential order.” He worried about the effect of the CJEU’s ruling on SCCs. Wicker made the case that the EU and U.S. share democratic values and hinted that the ongoing talks in the committee to reach a federal data privacy law might include augmented redress rights that might satisfy the CJEU.

Ranking Member Maria Cantwell (D-WA) spoke very broadly about a range of issues related to data transfers and privacy. She stressed the importance of data flows in the context of larger trade relations. Cantwell also stressed the shared values between the U.S. and the EU and her hope that the two entities work “together on these very important national concerns, trade and technology, so that we can continue to improve economic opportunities and avoid moves towards protectionism.” She also called for federal privacy legislation but hinted that states should still be able to regulate privacy, suggesting her commitment to having a federal law be a floor for state laws. Cantwell also asserted that bulk surveillance, the likes of which the National Security Agency has engaged in, may simply not be legal under EU law.

Deputy Assistant Secretary of Commerce for Services James Sullivan blurred the issues presented by Schrems II much like Cantwell did. The CJEU’s decision that focused on U.S. surveillance practices and the lack of meaningful recourse in the U.S. if an EU resident’s rights were violated was merged into a call for like-minded nations to unite against authoritarian nations. Sullivan distinguished between U.S. surveillance and the surveillance conducted by the People’s Republic of China (without naming the nation) and other regimes as if this should satisfy the EU as to the legality and propriety of U.S. treatment of EU personal data. Sullivan stated:

  • The Schrems II decision has created enormous uncertainties for U.S. companies and the transatlantic economy at a particularly precarious time. Immediately upon issuance of the ruling, the 5,400 Privacy Shield participants and their business partners in the EU could no longer rely on the Framework as a lawful basis for transferring personal data from Europe to the United States. Because neither the Court nor European data protection authorities provided for any enforcement grace period, Privacy Shield companies were left with three choices: (1) risk facing potentially huge fines (of up to 4 percent of total global turnover in the preceding year) for violating GDPR, (2) withdraw from the European market, or (3) switch right away to another more expensive data transfer mechanism.
  • Unfortunately, because of the Court’s ruling in the Privacy Shield context that U.S. laws relating to government access to data do not confer adequate protections for EU personal data, the use of other mechanisms like SCCs and BCRs to transfer EU personal data to the United States is now in question as well.
  • The objective of any potential agreement between the United States and the European Commission to address Schrems II is to restore the continuity of transatlantic data flows and the Framework’s privacy protections by negotiating targeted enhancements to Privacy Shield that address the Court’s concerns in Schrems II. Any such enhancements must respect the U.S. Government’s security responsibilities to our citizens and allies.
  • To be clear, we expect that any enhancements to the Privacy Shield Framework would also cover transfers under all other EU-approved data transfer mechanisms like SCCs and BCRs as well.
  • The Schrems II decision has underscored the need for a broader discussion among likeminded democracies on the issue of government access to data. Especially as a result of the extensive U.S. surveillance reforms since 2015, the United States affords privacy protections relating to national security data access that are equivalent to or greater than those provided by many other democracies in Europe and elsewhere.
  • To minimize future disruptions to data transfers, we have engaged with the European Union and other democratic nations in a multilateral discussion to develop principles based on common practices for addressing how best to reconcile law enforcement and national security needs for data with protection of individual rights.
  • It is our view that democracies should come together to articulate shared principles regarding government access to personal data—to help make clear the distinction between democratic societies that respect civil liberties and the rule of law and authoritarian governments that engage in the unbridled collection of personal data to surveil, manipulate, and control their citizens and other individuals without regard to personal privacy and human rights. Such principles would allow us to work with like-minded partners in preserving and promoting a free and open Internet enabled by the seamless flow of data.

Federal Trade Commission (FTC) Commissioner Noah Joshua Phillips stressed he was speaking in a personal capacity and not for the FTC. He extolled the virtues of the “free and open” internet model in the U.S. with the double implication that it is superior both to the models of nations like the PRC and Russia and to the EU model. Phillips seemed to be advocating for talking the EU into accepting that the U.S.’s privacy regime and civil liberties protections are stronger than any other nation’s. He also made the case, like other witnesses, that the U.S. data privacy and protection regime is more similar to the EU’s than to those of the PRC, Russia, and others. Phillips also sought to blur the issues and recast Privacy Shield in the context of the global struggle between democracies and authoritarian regimes. Phillips asserted:

  • First, we need to find a path forward after Schrems II, to permit transfers between the U.S. and EU. I want to recognize the efforts of U.S. and EU negotiators to find a replacement for Privacy Shield. While no doubt challenging, I have confidence in the good faith and commitment of public servants like Jim Sullivan, with whom I have the honor of appearing today, and our partners across the Atlantic. I have every hope and expectation that protecting cross-border data flows will be a priority for the incoming Administration, and I ask for your help in ensuring it is.
  • Second, we must actively engage with nations evaluating their approach to digital governance, something we at the FTC have done, to share and promote the benefits of a free and open Internet. There is an active conversation ongoing internationally, and at every opportunity—whether in public forums or via private assistance—we must ensure our voice and view is heard.
  • Third, we should be vocal in our defense of American values and policies. While we as Americans always look to improve our laws—and I commend the members of this committee on their important work on privacy legislation and other critical matters—we do not need to apologize to the world. When it comes to civil liberties or the enforcement of privacy laws, we are second to none. Indeed, in my view, the overall U.S. privacy framework—especially with the additional protections built into Privacy Shield—should certainly qualify as adequate under EU standards.
  • Fourth, as European leaders call to strengthen ties with the U.S., we should prioritize making our regimes compatible for the free flow of data. This extends to the data governance regimes of like-minded countries outside of Europe as well. Different nations will have different rules, but relatively minor differences need not impede mutually-beneficial commerce. We need not and should not purport to aim for a single, identical system of data governance. And we should remind our allies, and remind ourselves, that far more unites liberal democracies than divides us.
  • Fifth and finally, if we must draw lines, those lines should be drawn between allies with shared values—the U.S., Europe, Japan, Australia, and others—and those, like China and Russia, that offer a starkly different vision. I am certainly encouraged when I hear recognition of this distinction from Europe. European Data Protection Supervisor Wojciech Wiewiórowski recently noted that the U.S. is much closer to Europe than is China and that he has a preference for data being processed by countries that share values with Europe. Some here in the U.S. are even proposing agreements to solidify the relationships among technologically advanced democracies, an idea worth exploring in more detail.

Washington University Professor of Law Neil Richards stressed that the Schrems II decision spells out how the U.S. would achieve adequacy: reforming surveillance and providing meaningful redress for alleged privacy violations. Consequently, FISA would need to be rewritten and narrowed and a means for EU residents to seek relief beyond the current Ombudsman system is needed, possibly a statutory right to sue. Moreover, he asserted strong data protection and privacy laws are needed and some of the bills introduced in this Congress could fit the bill. Richards asserted:

In sum, the Schrems litigation is a creature of distrust, and while it has created problems for American law and commerce, it has also created a great opportunity. That opportunity lies before this Committee – the chance to regain American leadership in global privacy and data protection by passing a comprehensive law that provides appropriate safeguards, enforceable rights, and effective legal remedies for consumers. I believe that the way forward can not only safeguard the ability to share personal data across the Atlantic, but it can do so in a way that builds trust between the United States and our European trading partners and between American companies and their American and European customers. I believe that there is a way forward, but it requires us to recognize that strong, clear, trust-building rules are not hostile to business interest, that we need to push past the failed system of “notice and choice,” that we need to preserve effective consumer remedies and state-level regulatory innovation, and seriously consider a duty of loyalty. In that direction, I believe, lies not just consumer protection, but international cooperation and economic prosperity.

Georgia Tech University Professor Peter Swire explained that the current circumstances make the next Congress the best possibility in memory to enact privacy legislation because of the need for a Privacy Shield replacement, passage of the new California Privacy Rights Act (Proposition 24), and the Biden Administration’s likely support for such legislation. Swire made the following points:

  1. The European Data Protection Board (EDPB) in November issued draft guidance with an extremely strict interpretation of how to implement the Schrems II case.
  2. The decision in Schrems II is based on EU constitutional law. There are varying current interpretations in Europe of what is required by Schrems II, but constitutional requirements may restrict the range of options available to EU and U.S. policymakers.
  3. Strict EU rules about data transfers, such as the draft EDPB guidance, would appear to result in strict data localization, creating numerous major issues for EU- and U.S.-based businesses, as well as affecting many online activities of EU individuals.
  4. Along with concerns about lack of individual redress, the CJEU found that the EU Commission had not established that U.S. surveillance was “proportionate” in its scope and operation. Appendix 2 to this testimony seeks to contribute to an informed judgment on proportionality, by cataloguing developments in U.S. surveillance safeguards since the Commission’s issuance of its Privacy Shield decision in 2016.
  5. Negotiating an EU/U.S. adequacy agreement is important in the short term.
  6. A short-run agreement would assist in creating a better overall long-run agreement or agreements.
  7. As the U.S. considers its own possible legal reforms in the aftermath of Schrems II, it is prudent and a normal part of negotiations to seek to understand where the other party – the EU – may have flexibility to reform its own laws.
  8. Issues related to Schrems II have largely been bipartisan in the U.S., with substantial continuity across the Obama and Trump administrations, and expected as well for a Biden administration.
  9. Passing comprehensive privacy legislation would help considerably in EU/U.S. negotiations.
  10. This Congress may have a unique opportunity to enact comprehensive commercial privacy legislation for the United States.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Dooffy Design from Pixabay

Canada Releases Privacy Bill

Canada’s newly released privacy bill shares commonalities with U.S. bills but features a stronger enforcement regime that could result in fines of up to 5% of annual worldwide revenue for the worst violations.

The government in Ottawa has introduced in Parliament the “Digital Charter Implementation Act, 2020” (Bill C-11) that would dramatically reform the nation’s privacy laws and significantly expand the power of the Office of the Privacy Commissioner (OPC). The bill consists of two main parts, the Consumer Privacy Protection Act (CPPA) and the Personal Information and Data Protection Tribunal Act, and would partially repeal Canada’s federal privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA). Notably, the bill would allow the OPC to levy fines of up to 5% of worldwide revenue or $25 million CAD (roughly $20 million USD), whichever is higher. Canadians would also get a private right of action under certain conditions.

Broadly, this bill shares many characteristics with a number of bills introduced in the United States Congress by Democratic Members. Consent would be needed in most cases where a Canadian’s personal information is collected, processed, used, shared, or disclosed although there are some notable exceptions. Canada’s federal privacy regulator would be able to seek and obtain stiff fines for non-compliance.

The bill explains its purpose:

The purpose of this Act is to establish — in an era in which data is constantly flowing across borders and geographical boundaries and significant economic activity relies on the analysis, circulation and exchange of personal information — rules to govern the protection of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.

The Department of Industry (aka Innovation, Science and Economic Development Canada) released this summary of the bill:

The Government of Canada has tabled the Digital Charter Implementation Act, 2020 to strengthen privacy protections for Canadians as they engage in commercial activities. The Act will create the Consumer Privacy Protection Act (CPPA), which will modernize Canada’s existing private sector privacy law, and will also create the new Personal Information and Data Protection Tribunal Act, which will create the Personal Information and Data Protection Tribunal, an entity that can impose administrative monetary penalties for privacy violations. Finally, the Act will repeal Part 2 of the existing Personal Information Protection and Electronic Documents Act (PIPEDA) and turn it into stand-alone legislation, the Electronic Documents Act. With each of these steps, the government is building a Canada where citizens have confidence that their data is safe and privacy is respected, while unlocking innovation that promotes a strong economy.

The Department added:

  • Changes enabled by CPPA will enhance individuals’ control over their personal information, such as by requesting its deletion, creating new data mobility rights that promote consumer choice and innovation, and by creating new transparency requirements over uses of personal information in areas such as artificial intelligence systems.
  • CPPA will also promote responsible innovation by reducing regulatory burden. A new exception to consent will address standard business practices; a new regime will clarify how organizations are to handle de-identified personal information; and another new exception to consent will allow organizations to disclose personal information for socially beneficial purposes, such as public health research, for example.
  • The new legislative changes will strengthen privacy enforcement and oversight in a manner similar to certain provinces and some of Canada’s foreign trading partners. It does so by: granting the Office of the Privacy Commissioner of Canada (OPC) order-making powers, which can compel organizations to comply with the law; force them to stop certain improper activities or uses of personal information; and order organizations to preserve information relevant to an OPC investigation. The new law will also enable administrative monetary penalties for serious contraventions of the law, subject to a maximum penalty of 3% of global revenues.
  • The introduction of the Personal Information and Data Tribunal Act will establish a new Data Tribunal, which will be responsible for determining whether to assign administrative monetary penalties that are recommended by the OPC following its investigations, determining the amount of any penalties and will also hear appeals of OPC orders and decisions. The Tribunal will provide for access to justice and contribute to the further development of privacy expertise by providing expeditious reviews of the OPC’s orders.
  • The Electronic Documents Act will take the electronic documents provisions of PIPEDA and enact them in standalone legislation. This change will simplify federal privacy laws and will better align the federal electronic documents regime to support service delivery initiatives by the Treasury Board Secretariat.

In a summary, the Department explained:

Under the CPPA, the Privacy Commissioner would have broad order-making powers, including the ability to force an organization to comply with its requirements under the CPPA and the ability to order a company to stop collecting data or using personal information. In addition, the Privacy Commissioner would also be able to recommend that the Personal Information and Data Protection Tribunal impose a fine. The legislation would provide for administrative monetary penalties of up to 3% of global revenue or $10 million [CAD] for non-compliant organizations. It also contains an expanded range of offences for certain serious contraventions of the law, subject to a maximum fine of 5% of global revenue or $25 million [CAD].
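As a back-of-the-envelope illustration of the two penalty ceilings described in that summary, consider the following sketch; the function name and example revenue figure are mine:

```python
# Sketch of the CPPA's two penalty ceilings as described in the summary:
# administrative monetary penalties of up to 3% of global revenue or
# $10 million CAD, and fines for serious offences of up to 5% of global
# revenue or $25 million CAD, taking the higher amount in each case.
def cppa_penalty_ceiling(global_revenue_cad: float, serious_offence: bool = False) -> float:
    if serious_offence:
        return max(0.05 * global_revenue_cad, 25_000_000)
    return max(0.03 * global_revenue_cad, 10_000_000)

# Example: an organization with $2 billion CAD in global revenue
print(cppa_penalty_ceiling(2e9))                        # 60000000.0 (3% exceeds the $10M floor)
print(cppa_penalty_ceiling(2e9, serious_offence=True))  # 100000000.0
```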

The CPPA broadly defines what constitutes “personal information” and what is therefore covered and protected by the bill. It would be “information about an identifiable individual,” a much wider scope than almost all the legislation in the United States, for example. Consequently, even information derived through processing that was not directly or indirectly collected from a person would seem to be covered by the bill. And, speaking of processing, the CPPA limits how personal information may be collected and used, specifically “only for purposes that a reasonable person would consider appropriate in the circumstances.”

Moreover, an entity may only collect personal information needed for purposes disclosed before or at the time of collection and only with the consent of the person. However, the CPPA would allow for “implied consent” if “the organization establishes that it is appropriate…taking into account the reasonable expectations of the individual and the sensitivity of the personal information that is to be collected, used or disclosed.” And, if the entity decides to collect and use personal information for any new purpose, it must obtain the consent of people in Canada before doing so. What’s more, organizations cannot condition the provision of products or services on people providing consent for collection of personal information beyond what is necessary. And, of course, consent gained under false, deceptive, or misleading pretenses is not valid, and people may withdraw consent at any time.

In terms of the disclosures an organization must make about its purposes, the CPPA would require more than most proposed U.S. federal privacy laws. For example, an entity must tell people the specific personal information to be collected, processed, used, or disclosed, the reasonable consequences of any of the aforementioned, and the names of third parties or types of third parties with whom personal information would be shared.

The CPPA is very much like U.S. privacy bills in that there are numerous exceptions as to when consent is not needed for collecting, processing, and using personal information. Principally, this would be when a reasonable person would expect or understand this could happen, or so long as the collection and processing activities are not intended to influence a person’s decisions or behavior. Activities that would fall in the former category are things such as collection, use, and processing needed to deliver a product or service, protecting the organization’s systems and information security, or the due diligence necessary to protect the organization from commercial risk. Moreover, if collection, use, and processing are in the public interest and consent cannot be readily obtained, then the organization may proceed. The same is true if there is an emergency situation that imperils the life or health of a person, so long as disclosure to the person is made in writing expeditiously afterwards. However, neither consent nor knowledge is required for transfers of personal information to service providers, in employment settings, to prevent fraud, and for a number of other enumerated purposes.

There are wide exceptions to the consent requirement relating to collection and use of personal information in the event of investigations of breaches of agreements or contravention of federal or provincial law. Likewise, consent may not be needed if an organization is disclosing personal information to government institutions. Similarly, the collection and use of public information is authorized subject to regulations.

However, the CPPA makes clear that certain exceptions to the consent and knowledge requirements are simply not operative when the personal information in question is an “electronic address” or is stored on a computer system. In these cases, consent or knowledge would be needed before such collection of personal information is legal.

Organizations must dispose of personal information when it is no longer needed for the purpose it was originally collected except for personal information collected and used for decision making. In this latter case, information must be retained in case the person about whom the decision was made wants access. Organizations must dispose of personal information about a person upon his or her request unless doing so would result in the disposal of other people’s information or there is a Canadian law barring such disposal. If the organization refuses the request to dispose, it must inform the person in writing. If the organization grants the request, it must direct service providers to do the same and confirm destruction.

Organizations would have a duty to ensure personal information is accurate, and the applicability of this duty would turn on whether the information is being used to make decisions, is being shared with third parties, and if the information is being used on an ongoing basis.

The CPPA would impose security requirements for organizations collecting, using, and holding personal information. These data would need protection “through physical, organizational and technological security safeguards” appropriate to the sensitivity of the information. Specifically, these security safeguards “must protect personal information against, among other things, loss, theft and unauthorized access, disclosure, copying, use and modification.” Breaches must be reported as soon as feasible to the OPC and to affected people if there is a reasonable belief of “real risk of significant harm to an individual.” Significant harm is defined as “bodily harm, humiliation, damage to reputation or relationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property.” Real risk of significant harm is determined on the basis of

  • the sensitivity of the personal information involved in the breach;
  • the probability that the personal information has been, is being or will be misused; and
  • any other prescribed factor.

Organizations will also have a duty to explain their policies and practices under this act in plain language, including:

  • a description of the type of personal information under the organization’s control;
  • a general account of how the organization makes use of personal information, including how the organization applies the exceptions to the requirement to obtain consent under this Act;
  • a general account of the organization’s use of any automated decision system to make predictions, recommendations or decisions about individuals that could have significant impacts on them;
  • whether or not the organization carries out any international or interprovincial transfer or disclosure of personal information that may have reasonably foreseeable privacy implications;
  • how an individual may make a request for disposal under section 55 or access under section 63; and
  • the business contact information of the individual to whom complaints or requests for information may be made.

Canadian nationals and residents would be able to access their personal information. Notably, “[o]n request by an individual, an organization must inform them of whether it has any personal information about them, how it uses the information and whether it has disclosed the information.” Access must also be granted to the requesting person. If the organization has disclosed a person’s information, when she makes a request to access, she must be told the names or types of third parties to whom her information was disclosed. Moreover, organizations using automated decision-making processes would have further responsibilities: “[i]f the organization has used an automated decision system to make a prediction, recommendation or decision about the individual, the organization must, on request by the individual, provide them with an explanation of the prediction, recommendation or decision and of how the personal information that was used to make the prediction, recommendation or decision was obtained.” Additionally, if a person has been granted access to his personal information and it “is not accurate, up-to-date or complete,” then the organization must amend it and send the corrected information to third parties that have access to the information.

There are provisions requiring data portability (termed data mobility by the CPPA). All organizations subject to the data mobility framework must transfer personal information upon request. People must be able to lodge complaints with organizations over compliance with the CPPA regarding their personal information. Organizations may not re-identify de-identified personal information.

Organizations would be able to draft and submit codes of conduct to the OPC for approval so long as they “provide[] for substantially the same or greater protection of personal information as some or all of the protection provided under this Act.” Likewise, an entity may apply to the OPC “for approval of a certification program that includes

(a) a code of practice that provides for substantially the same or greater protection of personal information as some or all of the protection provided under this Act;

(b) guidelines for interpreting and implementing the code of practice;

(c) a mechanism by which an entity that operates the program may certify that an organization is in compliance with the code of practice;

(d) a mechanism for the independent verification of an organization’s compliance with the code of practice;

(e) disciplinary measures for non-compliance with the code of practice by an organization, including the revocation of an organization’s certification; and

(f) anything else that is provided in the regulations.

However, complying with an approved code of conduct or certification program does not by itself mean an entity is complying with the CPPA.

The OPC would be granted a range of new powers to enforce the CPPA, including compliance orders (which resemble administrative actions taken by the United States Federal Trade Commission) that can be appealed to a new Personal Information and Data Protection Tribunal (Tribunal) and ultimately enforced in federal court if necessary. People in Canada would also get the right to sue in the event the OPC or the new Tribunal finds an entity has contravened the CPPA.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by James Wheeler from Pixabay

Further Reading, Other Developments, and Coming Events (7 December)

Further Reading

  • “Facebook steps up campaign to ban false information about coronavirus vaccines” By Elizabeth Dwoskin — The Washington Post. In its latest step to find and remove lies, misinformation, and disinformation, the social media giant is now committing to removing and blocking untrue material about COVID-19 vaccines, especially from the anti-vaccine community. Will the next step be to take on anti-vaccination proponents generally?
  • “Comcast’s 1.2 TB data cap seems like a ton of data—until you factor in remote work” By Rob Pegoraro — Fast Company. Despite many people and children working and learning from home, Comcast is reimposing a 1.2 terabyte limit on data for homes. Sounds like quite a lot until you factor in video meetings, streaming, etc. So far, other providers have not set a cap.
  • “Google’s star AI ethics researcher, one of a few Black women in the field, says she was fired for a critical email” By Drew Harwell and Nitasha Tiku — The Washington Post. Timnit Gebru, a top-flight artificial intelligence (AI) computer scientist, was fired for questioning Google’s review of a paper she wanted to present at an AI conference that is likely critical of the company’s AI projects. Google claims she resigned, but Gebru says she was fired. She has long been an advocate for women and minorities in tech and AI, and her ouster will likely only increase scrutiny of and questions about Google’s commitment to diversity and an ethical approach to the development and deployment of AI. It will also probably lead to more employee disenchantment with the company, following the protests over Google’s involvement with the United States Department of Defense’s Project Maven and its hiring of former United States Department of Homeland Security chief of staff Miles Taylor, who was involved with the policies that resulted in caging children and separating families at the southern border of the United States.
  • “Humans Can Help Clean Up Facebook and Twitter” By Greg Bensinger — The New York Times. In this opinion piece, the argument is made that if social media platforms redeployed their human monitors to the accounts that violate terms of service most frequently (e.g., President Donald Trump) and more aggressively labeled and removed untrue or inflammatory content, they would have a greater impact on lies, misinformation, and disinformation.
  • “Showdown looms over digital services tax” By Ashley Gold — Axios. Because the Organization for Economic Cooperation and Development (OECD) has not reached a deal on digital services taxes, a number of the United States’ (U.S.) allies could move forward with taxes on U.S. multinationals like Amazon, Google, and Apple. The Trump Administration has taken an adversarial position, threatening to retaliate against countries like France, which enacted a tax it has not collected during the OECD negotiations. The U.S. also withdrew from talks. It is probable the Biden Administration will be more willing to work in a multilateral fashion and may strike a deal on an issue that is not going away, as the United Kingdom, Italy, and Canada also have plans for a digital tax.
  • “Trump’s threat to veto defense bill over social-media protections is heading to a showdown with Congress” By Karoun Demirjian and Tony Romm — The Washington Post. I suppose I should mention the President’s demand that the FY 2021 National Defense Authorization Act (NDAA) contain a repeal of 47 U.S.C. 230 (Section 230 of the Communications Act), which came at the eleventh hour and fifty-ninth minute of negotiations on a final version of the bill. Via Twitter, Donald Trump threatened to veto the bill, which has been passed annually for decades. Republicans were not having it, however, even if they agreed with Trump’s desire to remove liability protection for technology companies. And yet, if Trump continues to insist on a repeal, Republicans may find themselves in a bind, and the bill could conceivably get pulled until President-elect Joe Biden is sworn in. On the other hand, Trump’s veto threats about renaming military bases currently bearing the names of Confederate figures have not been renewed even though the final version of the bill contains language instituting a process to do just that.

Other Developments

  • The Senate Judiciary Committee held over its most recent bill to narrow 47 U.S.C. 230 (Section 230 of the Communications Act), which provides liability protection for technology companies for third-party material posted on their platforms and any decisions to edit, alter, or remove such content. The committee opted to hold the “Online Content Policy Modernization Act” (S.4632), which may mean the bill’s chances of making it to the Senate floor are low. What’s more, even if the Senate passes Section 230 legislation, it is not clear there will be sufficient agreement with Democrats in the House to get a final bill to the President before the end of this Congress. On 1 October, the committee also decided to hold over the bill to try to reconcile the fifteen amendments submitted for consideration. The Committee could soon meet again to formally mark up and report out this legislation.
    • At the earlier hearing, Chair Lindsey Graham (R-SC) submitted an amendment revising the bill’s reforms to Section 230 that incorporates some of the amendments below but includes new language. For example, the bill includes a definition of “good faith,” a term not currently defined in Section 230. This term would be construed as a platform taking down or restricting content only according to its publicly available terms of service, not as a pretext, and equally to all similarly situated content. Moreover, good faith would require alerting the user and giving him or her an opportunity to respond, subject to certain exceptions. The amendment also makes clear that certain existing means of suing are still available to users (e.g., suing for breach of contract).
    • Senator Mike Lee (R-UT) offered a host of amendments:
      • EHF20913 would remove “user[s]” from the reduced liability shield that online platforms would receive under the bill. Consequently, users would still not be legally liable for the content posted by another user.
      • EHF20914 would revise the language regarding the type of content platforms could take down with legal protection to make clear it would not just be “unlawful” content but rather content “in violation of a duly enacted law of the United States,” possibly meaning federal laws and not state laws. Or, more likely, the intent would be to foreclose the possibility a platform would say it is acting in concert with a foreign law and still assert immunity.
      • EHF20920 would add language making clear that taking down material that violates terms of service or use according to an objectively reasonable belief would be shielded from liability.
      • OLL20928 would expand legal protection to platforms for removing or restricting spam.
      • OLL20929 would bar the Federal Communications Commission (FCC) from a rulemaking on Section 230.
      • OLL20930 adds language making clear if part of the revised Section 230 is found unconstitutional, the rest of the law would still be applicable.
      • OLL20938 revises the definition of an “information content provider,” the term of art in Section 230 that identifies a platform, to expand when platforms may be responsible for the creation or development of information and consequently liable for a lawsuit.
    • Senator Josh Hawley (R-MO) offered an amendment that would create a new right of action allowing people to sue large platforms for taking down their content if not done in “good faith.” The amendment limits this right only to “edge providers,” which are platforms with more than 30 million users in the U.S., 300 million users worldwide, and revenues of more than $1.5 billion. This would likely exclude all platforms except for Twitter, Facebook, Instagram, TikTok, Snapchat, and a select group of a few others.
    • Senator John Kennedy (R-LA) offered an amendment that removes all Section 230 legal immunity from platforms that collect personal data and then use an “automated function” to deliver targeted or tailored content to a user unless a user “knowingly and intentionally elect[s]” to receive such content.
  • The Massachusetts Institute of Technology’s (MIT) Work of the Future Task Force issued its final report and drew the following conclusions:
    • Technological change is simultaneously replacing existing work and creating new work. It is not eliminating work altogether.
    • Momentous impacts of technological change are unfolding gradually.
    • Rising labor productivity has not translated into broad increases in incomes because labor market institutions and policies have fallen into disrepair.
    • Improving the quality of jobs requires innovation in labor market institutions.
    • Fostering opportunity and economic mobility necessitates cultivating and refreshing worker skills.
    • Investing in innovation will drive new job creation, speed growth, and meet rising competitive challenges.
    • The Task Force stated:
      • In the two-and-a-half years since the Task Force set to work, autonomous vehicles, robotics, and AI have advanced remarkably. But the world has not been turned on its head by automation, nor has the labor market. Despite massive private investment, technology deadlines have been pushed back, part of a normal evolution as breathless promises turn into pilot trials, business plans, and early deployments — the diligent, if prosaic, work of making real technologies work in real settings to meet the demands of hard-nosed customers and managers.
      • Yet, if our research did not confirm the dystopian vision of robots ushering workers off of factory floors or artificial intelligence rendering superfluous human expertise and judgment, it did uncover something equally pernicious: Amidst a technological ecosystem delivering rising productivity, and an economy generating plenty of jobs (at least until the COVID-19 crisis), we found a labor market in which the fruits are so unequally distributed, so skewed towards the top, that the majority of workers have tasted only a tiny morsel of a vast harvest.
      • As this report documents, the labor market impacts of technologies like AI and robotics are taking years to unfold. But we have no time to spare in preparing for them. If those technologies deploy into the labor institutions of today, which were designed for the last century, we will see similar effects to recent decades: downward pressure on wages, skills, and benefits, and an increasingly bifurcated labor market. This report, and the MIT Work of the Future Task Force, suggest a better alternative: building a future for work that harvests the dividends of rapidly advancing automation and ever-more powerful computers to deliver opportunity and economic security for workers. To channel the rising productivity stemming from technological innovations into broadly shared gains, we must foster institutional innovations that complement technological change.
  • The European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski published his preliminary opinion on the European Commission’s (EC) Communication on “A European strategy for data” and the creation of a common data space in the area of health, the European Health Data Space (EHDS). The EDPS lauded the goal of the EHDS, “the prevention, detection and cure of diseases, as well as for evidence-based decisions in order to enhance effectiveness, accessibility and sustainability of the healthcare systems.” However, Wiewiórowski articulated his concern that the EC needs to think through the applicability of the General Data Protection Regulation (GDPR), among other European Union (EU) laws, before it can legally move forward. The EDPS stated:
    • The EDPS calls for the establishment of a thought-through legal basis for the processing operations under the EHDS in line with Article 6(1) GDPR and also recalls that such processing must comply with Article 9 GDPR for the processing of special categories of data.
    • Moreover, the EDPS highlights that due to the sensitivity of the data to be processed within the EHDS, the boundaries of what constitutes a lawful processing and a compatible further processing of the data must be crystal-clear for all the stakeholders involved. Therefore, the transparency and the public availability of the information relating to the processing on the EHDS will be key to enhance public trust in the EHDS.
    • The EDPS also calls on the Commission to clarify the roles and responsibilities of the parties involved and to clearly identify the precise categories of data to be made available to the EHDS. Additionally, he calls on the Member States to establish mechanisms to assess the validity and quality of the sources of the data.
    • The EDPS underlines the importance of vesting the EHDS with a comprehensive security infrastructure, including both organisational and state-of-the-art technical security measures to protect the data fed into the EHDS. In this context, he recalls that Data Protection Impact Assessments may be a very useful tool to determine the risks of the processing operations and the mitigation measures that should be adopted.
    • The EDPS recommends paying special attention to the ethical use of data within the EHDS framework, for which he suggests taking into account existing ethics committees and their role in the context of national legislation.
    • The EDPS is convinced that the success of the EHDS will depend on the establishment of a strong data governance mechanism that provides for sufficient assurances of a lawful, responsible, ethical management anchored in EU values, including respect for fundamental rights. The governance mechanism should regulate, at least, the entities that will be allowed to make data available to the EHDS, the EHDS users, the Member States’ national contact points/permit authorities, and the role of DPAs within this context.
    • The EDPS is interested in policy initiatives to achieve ‘digital sovereignty’ and has a preference for data being processed by entities sharing European values, including privacy and data protection. Moreover, the EDPS calls on the Commission to ensure that the stakeholders taking part in the EHDS, and in particular, the controllers, do not transfer personal data unless data subjects whose personal data are transferred to a third country are afforded a level of protection essentially equivalent to that guaranteed within the European Union.
    • The EDPS calls on Member States to guarantee the effective implementation of the right to data portability specifically in the EHDS, together with the development of the necessary technical requirements. In this regard, he considers that a gap analysis might be required regarding the need to integrate the GDPR safeguards with other regulatory safeguards, provided e.g. by competition law or ethical guidelines.
  • The Office of Management and Budget (OMB) extended a guidance memorandum directing agencies to consolidate data centers after Congress pushed back the sunset date for the program. OMB extended OMB Memorandum M-19-19, Update to Data Center Optimization Initiative (DCOI), through 30 September 2022; it applies “to the 24 Federal agencies covered by the Chief Financial Officers (CFO) Act of 1990, which includes the Department of Defense.” The DCOI was codified in the “Federal Information Technology Acquisition Reform” (FITARA) (P.L. 113-291) and extended in 2018 until 1 October 2020. This sunset was pushed back another two years in the FY 2020 National Defense Authorization Act (NDAA) (P.L. 116-92).
    • In March 2020, the Government Accountability Office (GAO) issued another of its periodic assessments of the DCOI, started in 2012 by the Obama Administration to shrink the federal government’s footprint of data centers, increase efficiency and security, save money, and reduce energy usage.
    • The GAO found that 23 of the 24 agencies participating in the DCOI met or planned to meet their FY 2019 goals to close 286 of the 2,727 data centers considered part of the DCOI. The latter figure deserves some discussion, for the Trump Administration changed the definition of a data center to exclude smaller ones (so-called non-tiered data centers). GAO asserted that “recent OMB DCOI policy changes will reduce the number of data centers covered by the policy and both OMB and agencies may lose important visibility over the security risks posed by these facilities.” Nonetheless, these agencies project savings of $241.5 million once all 286 data centers planned for closure in FY 2019 actually close. It bears noting that the GAO admitted in a footnote it “did not independently validate agencies’ reported cost savings figures,” so these numbers may not be reliable.
    • In terms of how to improve the DCOI, the GAO stated that “[i]n addition to reiterating our prior open recommendations to the agencies in our review regarding their need to meet DCOI’s closure and savings goals and optimization metrics, we are making a total of eight new recommendations—four to OMB and four to three of the 24 agencies. Specifically:
      • The Director of the Office of Management and Budget should (1) require that agencies explicitly document annual data center closure goals in their DCOI strategic plans and (2) track those goals on the IT Dashboard. (Recommendation 1)
      • The Director of the Office of Management and Budget should require agencies to report in their quarterly inventory submissions those facilities previously reported as data centers, even if those facilities are not subject to the closure and optimization requirements of DCOI. (Recommendation 2)
      • The Director of the Office of Management and Budget should document OMB’s decisions on whether to approve individual data centers when designated by agencies as either a mission critical facility or as a facility not subject to DCOI. (Recommendation 3)
      • The Director of the Office of Management and Budget should take action to address the key performance measurement characteristics missing from the DCOI optimization metrics, as identified in this report. (Recommendation 4)
  • Australia’s Inspector-General of Intelligence and Security (IGIS) released its first report on how well the nation’s security services observed the law with respect to COVID app data. The IGIS “is satisfied that the relevant agencies have policies and procedures in place and are taking reasonable steps to avoid intentional collection of COVID app data.” The IGIS revealed that “[i]ncidental collection in the course of the lawful collection of other data has occurred (and is permitted by the Privacy Act); however, there is no evidence that any agency within IGIS jurisdiction has decrypted, accessed or used any COVID app data.” The IGIS is also “satisfied that the intelligence agencies within IGIS jurisdiction which have the capability to incidentally collect at least some types of COVID app data:
    • Are aware of their responsibilities under Part VIIIA of the Privacy Act and are taking active steps to minimise the risk that they may collect COVID app data.
    • Have appropriate policies and procedures in place to respond to any incidental collection of COVID app data that they become aware of.
    • Are taking steps to ensure any COVID app data is not accessed, used or disclosed.
    • Are taking steps to ensure any COVID app data is deleted as soon as practicable.
    • Have not decrypted any COVID app data.
    • Are applying the usual security measures in place in intelligence agencies such that a ‘spill’ of any data, including COVID app data, is unlikely.
  • New Zealand’s Government Communications Security Bureau’s National Cyber Security Centre (NCSC) has released its annual Cyber Threat Report that found that “nationally significant organisations continue to be frequently targeted by malicious cyber actors of all types…[and] state-sponsored and non-state actors targeted public and private sector organisations to steal information, generate revenue, or disrupt networks and services.” The NCSC added:
    • Malicious cyber actors have shown their willingness to target New Zealand organisations in all sectors using a range of increasingly advanced tools and techniques. Newly disclosed vulnerabilities in products and services, alongside the adoption of new services and working arrangements, are rapidly exploited by state-sponsored actors and cyber criminals alike. A common theme this year, which emerged prior to the COVID-19 pandemic, was the exploitation of known vulnerabilities in internet-facing applications, including corporate security products, remote desktop services and virtual private network applications.
  • The former Director of the United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) wrote an opinion piece disputing President Donald Trump’s claims that the 2020 Presidential Election was fraudulent. Christopher Krebs asserted:
    • While I no longer regularly speak to election officials, my understanding is that in the 2020 results no significant discrepancies attributed to manipulation have been discovered in the post-election canvassing, audit and recount processes.
    • This point cannot be emphasized enough: The secretaries of state in Georgia, Michigan, Arizona, Nevada and Pennsylvania, as well as officials in Wisconsin, all worked overtime to ensure there was a paper trail that could be audited or recounted by hand, independent of any allegedly hacked software or hardware.
    • That’s why Americans’ confidence in the security of the 2020 election is entirely justified. Paper ballots and post-election checks ensured the accuracy of the count. Consider Georgia: The state conducted a full hand recount of the presidential election, a first of its kind, and the outcome of the manual count was consistent with the computer-based count. Clearly, the Georgia count was not manipulated, resoundingly debunking claims by the president and his allies about the involvement of CIA supercomputers, malicious software programs or corporate rigging aided by long-gone foreign dictators.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


New EU Consumer Agenda

The European Commission frames a number of current and new programs as a paradigm shift in consumer rights in the EU.

The European Commission (EC) has published its New Consumer Agenda that envisions nothing less than a remaking of the European Union’s (EU) approach to a number of key realms, including the digital world. If enacted, these sweeping reforms could drive change in other nations much as the General Data Protection Regulation (GDPR) has informed revisions of data protection regimes around the globe. Some of the proposed changes have been rumored for some time, some are already in progress, and some are new. Nonetheless, the EC has repackaged a number of digital initiatives under the umbrella of the New Consumer Agenda in its document detailing the provisions. Incidentally, the document serves as a good crib sheet for getting up to speed on a number of the EU’s digital programs and policy goals. Having said that, much of the New Consumer Agenda may prove aspirational, for there are a number of moving pieces and stakeholders in EU policymaking, and the process can play out over a number of years (e.g., the plans to revise the e-Privacy Directive). So, inclusion in this policy superstructure does not guarantee enactment, and any changes that are made may be years in coming.

The EC stated:

The New Consumer Agenda (the Agenda) presents a vision for EU consumer policy from 2020 to 2025, building on the 2012 Consumer Agenda (which expires in 2020) and the 2018 New Deal for Consumers. It also aims to address consumers’ immediate needs in the face of the ongoing COVID-19 pandemic and to increase their resilience. The pandemic has raised significant challenges affecting the daily lives of consumers, in particular in relation to the availability and accessibility of products and services, as well as travel within, and to and from the EU.

The EC identified the five prongs of the Agenda, and given the emphasis the EC’s new leadership has placed on digital matters, one of them is entitled “digital transformation.” The Agenda is meant to work in unison with previously announced and still-to-be-implemented major policy initiatives like the European Green Deal, the Circular Economy Action Plan, and the Communication on shaping the EU’s digital future.

The EC suggests that current EU law and regulation may already address what some consider among the worst abuses of the digital age:

Commercial practices that disregard consumers’ right to make an informed choice, abuse their behavioural biases, or distort their decision-making processes, must be tackled. These practices include the use of ‘dark’ patterns, certain personalisation practices often based on profiling, hidden advertising, fraud, false or misleading information and manipulated consumer reviews. Additional guidance is needed on the applicability of consumer law instruments such as the Unfair Commercial Practices Directive and Consumer Rights Directive to these practices. Ultimately, consumers should benefit from a comparable level of protection and fairness online as they enjoy offline.

The EC seems to be suggesting that should those directives be found wanting, they could be revised and expanded to adequately protect EU citizens and residents.

The EC made reference to two long-anticipated pieces of legislation expected next week in draft form:

  • First, the Commission’s upcoming proposal for a new Digital Services Act (DSA) will aim to define new and enhanced responsibilities and reinforce the accountability of online intermediaries and platforms. The DSA will ensure that consumers are protected as effectively against illegal products, content and activities on online platforms as they are offline.
  • Second, to address the problems arising in digital markets prone to market failures, such as the gatekeeper power of certain digital platforms, the Commission is also planning to present a Digital Markets Act. It would combine the ex ante regulation of digital platforms having the characteristics of gatekeepers with a dynamic market investigation framework to examine digital markets prone to market failures. Consumers will be the final beneficiaries of fairer and more contestable digital markets, including lower prices, better and new services and greater choice.

Regarding artificial intelligence (AI), the EC previewed its next steps in putting in place a structure to regulate the new technology in the consumer space, including extra protections and an appropriate civil liability scheme:

  • a proposal to guarantee a high level of protection of consumer interest and the protection of fundamental rights, in turn building the trust necessary for the societal uptake of AI;
  • as regards civil liability, measures to ensure that victims of damage caused by AI applications have the same level of protection in practice as victims of damage caused by other products or services.

The EC is also floating the idea of revising other consumer protection directives, pointing to “the Machinery Directive, the adoption of delegated acts under the Radio Equipment Directive, and the revision of the General Product Safety Directive.” The EC aspires to refresh the General Product Safety Directive, “which provides the legal framework for the safety of non-food consumer products,” to account for “AI-powered products and connected devices,” the latter of which may be a reference to the Internet of Things (IoT). The EC also remarked on the consumer safety issues posed by a regulatory system that cannot police items sold online that originate from outside the EU. The EC vowed that “[t]he forthcoming proposal for a revision of the General Product Safety Directive, foreseen for 2021, should provide a solid response to these increasing challenges.”

The EC also referred to ongoing lawmaking that would allow EU citizens and residents to use “a universally accepted public electronic identity.” This new system would allow people “to manage the access and use of their data in a fully controlled and secure manner” “based on the consumers’ choice, their consent and the guarantee that their privacy is fully respected in line with the General Data Protection Regulation (GDPR).”

The EC is addressing another facet of data protection, privacy, and consumers’ rights: data portability. The Commission stated that the “European Strategy for Data aims to facilitate the effective [exercise of] individuals’ right to data portability under the GDPR…[that] has clear potential to put individuals at the centre of the data economy by enabling them to switch between service providers, combine services, use other innovative services and choose the services that offer most data protection.” The EC claimed this strategy “will also drive the creation of a genuine single market for data and the creation of common European data spaces.”

The Commission made note of its Geo-blocking Regulation to address the practice of discriminating “between EU consumers to segment markets along national borders.”

The EC also described laws and regulations to revamp the digital facets of the financial services sector that “will improve consumer protection” in FinTech and other emerging realms.

The EC is touting how a previously announced proposal to foster the use of recycled materials and products fits into the Consumer Agenda. The EC stated “the new Circular Economy Action Plan sets out a number of specific initiatives to fight early obsolescence and promote durability, recyclability, reparability, and accessibility of products, and to support action by business.” The EC all but said new regulations are coming to address electronic waste and products designed not to be repairable. The EC noted “[a]dditional regulatory and non-regulatory measures will be needed to address specific groups of goods and services, such as ICT, electronics or textile, and packaging…[f]or instance:

  • The Circular Electronics Initiative aims to ensure that electronic devices are designed for durability, maintenance, repair, disassembly, dismantling, reuse and recycling, and that consumers have a ‘right to repair’ them including software updates.
  • The initiative on a common charger for mobile phones and other portable devices aims to increase consumer convenience and reduce material use and e-waste associated with production and disposal of this particular item used daily by the vast majority of consumers.”

The EC discussed other aspects of consumer protection with implications for technology policy. The EC stated:

The new Consumer Protection Cooperation (CPC) Regulation which entered into force in January 2020, provides a stronger basis for joint EU action. It strengthens enforcement authorities’ online capacity, cooperation mechanisms and intelligence gathering system to address large-scale infringements of EU consumer law, ensure consistent level of consumer protection and offer a ‘one-stop-shop’ for businesses. The Commission will not hesitate to make use of its powers under the Regulation to trigger coordinated enforcement actions on EU-wide issues where necessary.

The EC is proposing to address algorithmic biases and how technology companies are exploiting certain human behavioral inclinations to drive engagement:

  • The risk of discrimination is at times exacerbated by algorithms used by certain goods and services providers, and which may be formulated with certain biases often resulting from pre-existing cultural or social expectations. Whereas this may lead to discrimination among consumers generally, it often affects certain groups more than others, and in particular people from minority ethnic or racial backgrounds. The upcoming proposal for a horizontal legislative framework on Artificial Intelligence will aim to specifically address how to limit risks of bias and discrimination from being built into algorithmic systems.
  • Lastly, evidence from behavioural economics shows that the behaviours of consumers are often affected by cognitive biases, especially online, which can be exploited by traders for commercial purposes. Such new forms of risks can affect virtually all consumers. Transparency obligations are certainly important in tackling information asymmetries (as also mentioned above in the context of the digital transformation), but further assessment is required to determine the need for additional measures to address this dynamic form of vulnerability.

The EC acknowledged the international nature of commerce and vowed to take certain steps to ensure the products, goods, and services entering the EU are safe. Notably, the EC is aiming to develop “an action plan with China for strengthened product safety cooperation for products sold online.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (17 November)

Further Reading

  • “How the U.S. Military Buys Location Data from Ordinary Apps” By Joseph Cox — Vice’s Motherboard. This article confirms the entirely foreseeable: the Department of Defense and its contractors are obtaining and using personal information from smartphones all over the world. Given this practice is common among United States (U.S.) law enforcement agencies, it is little surprise the U.S. military is doing the same. Perhaps this practice has been one of the animating forces behind the Trump Administration’s moves against applications from the People’s Republic of China (PRC)?
  • “Regulators! Stand Back: Under a Biden administration, Big Tech is set for a field day” By Lizzie O’Shea — The Baffler. This piece argues that a Biden Administration may amount to little more than a return to the Obama Administration’s favorable view of big tech and its largely laissez-faire regulatory approach. At least one expert worries the next administration may do just enough on big tech to appear to be doing something but not nearly enough to change current market and societal dynamics.
  • “Cheating-detection companies made millions during the pandemic. Now students are fighting back.” By Drew Harwell — The Washington Post. There are scores of problems with online testing platforms, including weak or easily compromised data security and privacy safeguards. Many students report getting flagged for stretching, looking off-screen, and even needing to go to the restroom. However, the companies in the market are in growth-mode and seem unresponsive to such criticisms.
  • “Zuckerberg defends not suspending ex-Trump aide Bannon from Facebook: recording” By Katie Paul — Reuters. On an internal company call, Facebook CEO Mark Zuckerberg defended the platform’s decision not to deactivate former White House advisor Steve Bannon’s account after he “metaphorically” advocated for the beheadings of Federal Bureau of Investigation Director Christopher Wray and National Institute of Allergy and Infectious Diseases (NIAID) Director Anthony Fauci. Zuckerberg also reassured employees that a Biden Administration would not necessarily be entirely adversarial to Facebook.
  • “How Trump uses Twitter to distract the media – new research” By Ullrich Ecker, Michael Jetter, and Stephan Lewandowsky — The Conversation. Research backs up the assertion that President Donald Trump has tweeted bizarre non-sequiturs to distract from what he perceived to be negative stories, and it worked because the media reported on the tweets almost every time. Trump is not the only politician or leader using this strategy.
  • “Bumble Vulnerabilities Put Facebook Likes, Locations And Pictures Of 95 Million Daters At Risk” By Thomas Brewster — Forbes. Users of the dating app Bumble were at risk due to weak security that white-hat hacker researchers easily circumvented. Worse still, it took the company months to address and fix these vulnerabilities after being informed.

Other Developments

  • A number of United States (U.S.) election security stakeholders issued a statement carefully and tactfully refuting the claims of President Donald Trump and other Republicans that President-elect Joe Biden won the election only because of massive fraud. These officials declared “[t]he November 3rd election was the most secure in American history” and “[t]here is no evidence that any voting system deleted or lost votes, changed votes, or was in any way compromised.”
    • The officials seemed to flatly contradict Trump and others:
      • While we know there are many unfounded claims and opportunities for misinformation about the process of our elections, we can assure you we have the utmost confidence in the security and integrity of our elections, and you should too.
    • The members of Election Infrastructure Government Coordinating Council (GCC) Executive Committee – Cybersecurity and Infrastructure Security Agency (CISA) Assistant Director Bob Kolasky, U.S. Election Assistance Commission Chair Benjamin Hovland, National Association of Secretaries of State (NASS) President Maggie Toulouse Oliver, National Association of State Election Directors (NASED) President Lori Augino, and Escambia County (Florida) Supervisor of Elections David Stafford – and the members of the Election Infrastructure Sector Coordinating Council (SCC) – Chair Brian Hancock (Unisyn Voting Solutions), Vice Chair Sam Derheimer (Hart InterCivic), Chris Wlaschin (Election Systems & Software), Ericka Haas (Electronic Registration Information Center), and Maria Bianchi (Democracy Works) issued the statement.
  • President Donald Trump signed an executive order that would bar from United States’ (U.S.) securities markets those companies from the People’s Republic of China (PRC) connected to the PRC’s “military-industrial complex.” This order would take effect on 11 January 2021 and seeks, as a matter of national security, to cut off access to U.S. capital for these PRC companies because “the PRC exploits United States investors to finance the development and modernization of its military.” Consequently, Trump declared a national emergency with respect to the PRC’s behavior, which triggers a host of powers the Administration may use to deny funds and access to the targets of such an order. It remains to be seen whether the Biden Administration will rescind or keep in place this executive order, which takes effect ten days before the new administration takes office. Nevertheless, Trump asserted:
    • that the PRC is increasingly exploiting United States capital to resource and to enable the development and modernization of its military, intelligence, and other security apparatuses, which continues to allow the PRC to directly threaten the United States homeland and United States forces overseas, including by developing and deploying weapons of mass destruction, advanced conventional weapons, and malicious cyber-enabled actions against the United States and its people.
  • Microsoft revealed it has “detected cyberattacks from three nation-state actors targeting seven prominent companies directly involved in researching vaccines and treatments for Covid-19.” Microsoft attributed these attacks to Russian and North Korean hackers and tied the announcement to the company’s advocacy at the Paris Peace Forum, where the United States (U.S.) multinational reiterated its calls for “the world’s leaders to affirm that international law protects health care facilities and to take action to enforce the law.” Microsoft sought to position its cyber efforts within larger diplomatic efforts to define the norms of cyberspace and to bring cyber action into the body of international law. The company asserted:
    • In recent months, we’ve detected cyberattacks from three nation-state actors targeting seven prominent companies directly involved in researching vaccines and treatments for Covid-19. The targets include leading pharmaceutical companies and vaccine researchers in Canada, France, India, South Korea and the United States. The attacks came from Strontium, an actor originating from Russia, and two actors originating from North Korea that we call Zinc and Cerium.
    • Among the targets, the majority are vaccine makers that have Covid-19 vaccines in various stages of clinical trials. One is a clinical research organization involved in trials, and one has developed a Covid-19 test. Multiple organizations targeted have contracts with or investments from government agencies from various democratic countries for Covid-19 related work.
    • Strontium continues to use password spray and brute force login attempts to steal login credentials. These are attacks that aim to break into people’s accounts using thousands or millions of rapid attempts. Zinc has primarily used spear-phishing lures for credential theft, sending messages with fabricated job descriptions pretending to be recruiters. Cerium engaged in spear-phishing email lures using Covid-19 themes while masquerading as World Health Organization representatives. The majority of these attacks were blocked by security protections built into our products. We’ve notified all organizations targeted, and where attacks have been successful, we’ve offered help.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) announced a £1.25 million fine of Ticketmaster UK for failing “to put appropriate security measures in place to prevent a cyber-attack on a chat-bot installed on its online payment page” in violation of the General Data Protection Regulation (GDPR). The ICO explained:
    • The breach began in February 2018 when Monzo Bank customers reported fraudulent transactions. The Commonwealth Bank of Australia, Barclaycard, Mastercard and American Express all reported suggestions of fraud to Ticketmaster. But the company failed to identify the problem.
    • In total, it took Ticketmaster nine weeks from being alerted to possible fraud to monitoring the network traffic through its online payment page.
    • The ICO’s investigation found that Ticketmaster’s decision to include the chat-bot, hosted by a third party, on its online payment page allowed an attacker access to customers’ financial details.
    • Although the breach began in February 2018, the penalty only relates to the breach from 25 May 2018, when new rules under the GDPR came into effect. The chat-bot was completely removed from Ticketmaster UK Limited’s website on 23 June 2018.
    • The ICO added:
      • The data breach, which included names, payment card numbers, expiry dates and CVV numbers, potentially affected 9.4 million of Ticketmaster’s customers across Europe including 1.5 million in the UK.
      • Investigators found that, as a result of the breach, 60,000 payment cards belonging to Barclays Bank customers had been subjected to known fraud. Another 6,000 cards were replaced by Monzo Bank after it suspected fraudulent use.
      • The ICO found that Ticketmaster failed to:
        • Assess the risks of using a chat-bot on its payment page
        • Identify and implement appropriate security measures to negate the risks
        • Identify the source of suggested fraudulent activity in a timely manner
  • The Office of the Comptroller of the Currency, the Board of Governors of the Federal Reserve System, and the Federal Deposit Insurance Corporation issued an interagency paper titled “Sound Practices to Strengthen Operational Resilience.” The agencies stated the paper “generally describes standards for operational resilience set forth in the agencies’ existing rules and guidance for domestic banking organizations that have average total consolidated assets greater than or equal to (1) $250 billion or (2) $100 billion and have $75 billion or more in average cross-jurisdictional activity, average weighted short-term wholesale funding, average nonbank assets, or average off-balance-sheet exposure.” The agencies explained the paper also:
    • promotes a principles-based approach for effective governance, robust scenario analysis, secure and resilient information systems, and thorough surveillance and reporting.
    • includes an appendix focused on sound practices for managing cyber risk.
    • In the appendix, while the agencies stressed they could not “endorse the use of any particular tool,” they did state:
      • To manage cyber risk and assess cybersecurity preparedness of its critical operations, core business lines and other operations, services, and functions, firms may choose to use standardized tools that are aligned with common industry standards and best practices. Some of the tools that firms can choose from include the Federal Financial Institutions Examination Council (FFIEC) Cybersecurity Assessment Tool, the National Institute of Standards and Technology (NIST) Cybersecurity Framework, the Center for Internet Security Critical Security Controls, and the Financial Services Sector Coordinating Council Cybersecurity Profile.
  • A class action was filed in the United Kingdom (UK) against Facebook over the Cambridge Analytica scandal. Facebook You Owe Us announced its legal action “for the illegal use of one million users’ data in England and Wales.” The campaign claimed:
    • Group legal actions like Facebook You Owe Us will pave the way for consumers in the UK to gain redress and compensation for the persistent mass misuse of personal data by the world’s largest companies.  
    • Facebook has exhibited a pattern of unethical behaviour including allegations of election interference and failing to remove fake news. The Information Commissioner’s Office noted when issuing a £500,000 fine against Facebook for the Cambridge Analytica data breach that “protection of personal information and personal privacy is of fundamental importance, not only for the rights of individuals, but also as we now know, for the preservation of a strong democracy.” Facebook You Owe Us aims to fight back by holding the company to account for failing to protect Facebook users’ personal data and showing that Facebook is not above the law.
    • The launch of Facebook You Owe Us follows Google You Owe Us’ victory in the Court of Appeal. The Google You Owe Us case has been appealed by Google and is now scheduled to be heard before the Supreme Court in April 2021. If successful, the case will demonstrate that personal data is of value to individuals and that companies cannot simply take it and profit from it illegally. Both cases are led by James Oldnall at Milberg London LLP, with Richard Lloyd, the former executive director of Which?. 

Coming Events

  • On 18 November, the Senate Homeland Security and Governmental Affairs Committee’s Regulatory Affairs and Federal Management Subcommittee will hold a hearing on how to modernize telework in light of lessons learned during the COVID-19 pandemic.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.
  • On 27 November, the European Data Protection Board (EDPB) “is organising a remote stakeholder workshop on the topic of Legitimate Interest.” The EDPB explained “[p]laces will be allocated on a first come, first served basis, depending on availability.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.