Further Reading, Other Developments, and Coming Events (18 February 2021)

Further Reading

  • “Google, Microsoft, Qualcomm Protest Nvidia’s Acquisition of Arm Ltd.” By David McLaughlin, Ian King, and Dina Bass — Bloomberg. Major United States (U.S.) tech multinationals are telling the U.S. government that Nvidia’s proposed purchase of Arm will hurt competition in the semiconductor market, an interesting position for an industry renowned for being acquisition-hungry. The British firm, Arm, is a key player in the semiconductor business that deals with all comers, and the fear articulated by firms like Qualcomm, Microsoft, and Google is that Nvidia will cut supply and increase prices once it controls Arm. According to one report, Arm has designed something like 95% of the chip architecture for the world’s smartphones and 95% of the chips made in the People’s Republic of China (PRC). The deal has to clear U.S., British, EU, and PRC regulators. In the U.S., the Federal Trade Commission (FTC) has reportedly made very large document requests, which indicates its interest in digging into the deal and suggests it may come out against the acquisition. The FTC may also be waiting to read the mood in Washington, as there is renewed, bipartisan concern about antitrust and competition and about the semiconductor industry. Finally, acting FTC Chair Rebecca Kelly Slaughter has come out against a lax approach to so-called vertical mergers such as the proposed Nvidia-Arm deal, which may well be the ultimate position of a Democratic FTC.
  • “Are Private Messaging Apps the Next Misinformation Hot Spot?” By Brian X. Chen and Kevin Roose — The New York Times. The conclusion these two tech writers reach is that, on balance, private messaging apps like Signal and Telegram are better for society than not. Moreover, they reason it is better to have extremists migrate from platforms like Facebook to ones where it is much harder to spread their views and proselytize.
  • “Amazon Has Transformed the Geography of Wealth and Power” By Vauhini Vara — The Atlantic. A harrowing view of the rise of Amazon cast against the decline of the middle class and the middle of the United States (U.S.). Correlation is not causation, of course, but the company has sped the decline of a number of industries and arguably a number of cities.
  • “Zuckerberg responds to Apple’s privacy policies: ‘We need to inflict pain’” By Samuel Axon — Ars Technica. Relations between the companies have worsened as their CEOs have taken personal shots at each other in public and private, culminating in Apple’s change to iOS requiring apps to get users’ consent before tracking them across the internet, which is Facebook’s bread and butter. Expect things to get worse, as both Tim Cook and Mark Zuckerberg think augmented reality or mixed reality is the next major frontier in tech, suggesting the competition may intensify.
  • “Inside the Making of Facebook’s Supreme Court” By Kate Klonick — The New Yorker. A very immersive piece on the genesis and design of the Facebook Oversight Board, originally conceived of as a supreme court for content moderation. However, not all content moderation decisions can be referred to the Board; in fact, a person has a right to appeal only when Facebook decides to take down content. Otherwise, one must depend on the company’s beneficence. So, for example, if Facebook decided to leave up content that is racist toward Muslims, a Facebook user could not appeal the decision. Additionally, Board decisions are not precedential, which, in plain English, means that if the Board decides a takedown of, say, Nazi propaganda comports with Facebook’s rules, the company would not be obligated to take down similar Nazi content thereafter. This latter wrinkle will ultimately serve to limit the power of the Board. The piece quotes critics, including many involved with the design and establishment of the Board, who see the final form as little more than a fig leaf for public relations.

Other Developments

  • The Department of Health and Human Services (HHS) was taken to task by a federal appeals court in a blunt opinion decrying the agency’s failure to articulate even the most basic rationale for a multimillion-dollar fine of a major Houston hospital for its data security and data privacy violations. HHS’ Office for Civil Rights (OCR) had levied a $4.348 million fine on the University of Texas M.D. Anderson Cancer Center (M.D. Anderson) for violations of the regulations promulgated pursuant to the “Health Insurance Portability and Accountability Act of 1996” (P.L. 104-191) and the “Health Information Technology for Economic and Clinical Health Act” (HITECH Act) (P.L. 111-5) governing the security and privacy of certain classes of health information. M.D. Anderson appealed the decision, losing at each stage, until it reached the United States Court of Appeals for the Fifth Circuit (Fifth Circuit). In its ruling, the Fifth Circuit held that OCR’s “decision was arbitrary, capricious, and contrary to law.” The Fifth Circuit vacated the penalty and sent the matter back to HHS for further consideration.
    • In its opinion, the Fifth Circuit explained the facts:
      • First, back in 2012, an M.D. Anderson faculty member’s laptop was stolen. The laptop was not encrypted or password-protected but contained “electronic protected health information (ePHI) for 29,021 individuals.” Second, also in 2012, an M.D. Anderson trainee lost an unencrypted USB thumb drive during her evening commute. That thumb drive contained ePHI for over 2,000 individuals. Finally, in 2013, a visiting researcher at M.D. Anderson misplaced another unencrypted USB thumb drive, this time containing ePHI for nearly 3,600 individuals.
      • M.D. Anderson disclosed these incidents to HHS. Then HHS determined that M.D. Anderson had violated two federal regulations. HHS promulgated both of those regulations under the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) and the Health Information Technology for Economic and Clinical Health Act of 2009 (the “HITECH Act”). The first regulation requires entities covered by HIPAA and the HITECH Act to “[i]mplement a mechanism to encrypt” ePHI or adopt some other “reasonable and appropriate” method to limit access to patient data. 45 C.F.R. §§ 164.312(a)(2)(iv), 164.306(d) (the “Encryption Rule”). The second regulation prohibits the unpermitted disclosure of protected health information. Id. § 164.502(a) (the “Disclosure Rule”).
      • HHS also determined that M.D. Anderson had “reasonable cause” to know that it had violated the rules. 42 U.S.C. § 1320d-5(a)(1)(B) (setting out the “reasonable cause” culpability standard). So, in a purported exercise of its power under 42 U.S.C. § 1320d-5 (HIPAA’s enforcement provision), HHS assessed daily penalties of $1,348,000 for the Encryption Rule violations, $1,500,000 for the 2012 Disclosure Rule violations, and $1,500,000 for the 2013 Disclosure Rule violations. In total, HHS imposed a civil monetary penalty (“CMP” or “penalty”) of $4,348,000.
      • M.D. Anderson unsuccessfully worked its way through two levels of administrative appeals. Then it petitioned our court for review. See 42 U.S.C. § 1320a-7a(e) (authorizing judicial review). After M.D. Anderson filed its petition, the Government conceded that it could not defend its penalty and asked us to reduce it by a factor of 10 to $450,000.
  • The Australian Senate Standing Committee for the Scrutiny of Bills has weighed in on both the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 and the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, two major legislative proposals put forth in December 2020. This committee plays a special role in legislating in the Senate, for it must “scrutinise each bill introduced into the Parliament as to whether the bills, by express words or otherwise:
    • (i)  trespass unduly on personal rights and liberties;
    • (ii)  make rights, liberties or obligations unduly dependent upon insufficiently defined administrative powers;
    • (iii)  make rights, liberties or obligations unduly dependent upon non-reviewable decisions;
    • (iv)  inappropriately delegate legislative powers; or
    • (v)  insufficiently subject the exercise of legislative power to parliamentary scrutiny.
    • Regarding the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 (see here for analysis), the committee explained:
      • The bill seeks to amend the Surveillance Devices Act 2004 (SD Act), the Crimes Act 1914 (Crimes Act) and associated legislation to introduce three new types of warrants available to the Australian Federal Police (AFP) and the Australian Criminal Intelligence Commission (ACIC) for investigating and disrupting online crime. These are:
        • data disruption warrants, which enable the AFP and the ACIC to modify, add, copy or delete data for the purposes of frustrating the commission of serious offences online;
        • network activity warrants, which permit access to devices and networks used by suspected criminal networks, and
        • account takeover warrants, which provide the AFP and the ACIC with the ability to take control of a person’s online account for the purposes of gathering evidence to further a criminal investigation.
    • The committee flagged concerns about the bill in these categories:
      • Authorisation of coercive powers
        • Issuing authority
        • Time period for warrants
        • Mandatory considerations
        • Broad scope of offences
      • Use of coercive powers without a warrant
        • Emergency authorisations
      • Innocent third parties
        • Access to third party computers, communications in transit and account-based data
        • Compelling third parties to provide information
        • Broad definition of ‘criminal network of individuals’
      • Use of information obtained through warrant processes
        • Prohibitions on use
        • Storage and destruction of records
      • Presumption of innocence—certificate constitutes prima facie evidence
      • Reversal of evidential burden of proof
      • Broad delegation of administrative powers
        • Appropriate authorising officers of the ACIC
    • The committee asked for the following feedback from the government on the bill:
      • The committee requests the minister’s detailed advice as to:
        • why it is considered necessary and appropriate to enable law enforcement officers to disrupt or access data or take over an online account without a warrant in certain emergency situations (noting the coercive and intrusive nature of these powers and the ability to seek a warrant via the telephone, fax or email);
        • the appropriateness of retaining information obtained under an emergency authorisation that is subsequently not approved by a judge or AAT member;
        • and the appropriateness of enabling law enforcement agencies to act to conceal any thing done under a warrant after the warrant has ceased to be in force, and whether the bill could be amended to provide a process for obtaining a separate concealment of access warrant if the original warrant has ceased to be in force.
      • The committee requests the minister’s detailed advice as to:
        • the effect of Schedules 1-3 on the privacy rights of third parties and a detailed justification for the intrusion on those rights, in particular:
        • why proposed sections 27KE and 27KP do not specifically require the judge or nominated AAT member to consider the privacy implications for third parties of authorising access to a third party computer or communication in transit;
        • why the requirement that an issuing authority be satisfied that an assistance order is justifiable and proportionate, having regard to the offences to which it would relate, only applies to an assistance order with respect to data disruption warrants, and not to all warrants; and
        • whether the breadth of the definitions of ‘electronically linked group of individuals’ and ‘criminal network of individuals’ can be narrowed to reduce the potential for intrusion on the privacy rights of innocent third parties.
    • The committee requests the minister’s detailed advice as to:
      • whether all of the exceptions to the restrictions on the use, recording or disclosure of protected information obtained under the warrants are appropriate and whether any exceptions are drafted in broader terms than is strictly necessary; and
      • why the bill does not require review of the continued need for the retention of records or reports comprising protected information on a more regular basis than a period of five years.
    • As the explanatory materials do not adequately address these issues, the committee requests the minister’s detailed advice as to:
      • why it is considered necessary and appropriate to provide for evidentiary certificates to be issued in connection with a data disruption warrant or emergency authorisation, a network access warrant, or an account takeover warrant;
      • the circumstances in which it is intended that evidentiary certificates would be issued, including the nature of any relevant proceedings; and
      • the impact that issuing evidentiary certificates may have on individuals’ rights and liberties, including on the ability of individuals to challenge the lawfulness of actions taken by law enforcement agencies.
    • As the explanatory materials do not address this issue, the committee requests the minister’s advice as to why it is proposed to use offence-specific defences (which reverse the evidential burden of proof) in this instance. The committee’s consideration of the appropriateness of a provision which reverses the burden of proof is assisted if it explicitly addresses relevant principles as set out in the Guide to Framing Commonwealth Offences.
    • The committee requests the minister’s advice as to why it is considered necessary to allow for executive level members of staff of the ACIC to be ‘appropriate authorising officers’, in particular with reference to the committee’s scrutiny concerns in relation to the use of coercive powers without judicial authorisation under an emergency authorisation.
    • Regarding the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, the committee asserted the bill “seeks to establish a mandatory code of conduct to support the sustainability of the Australian news media sector by addressing bargaining power imbalances between digital platforms and Australian news businesses.” The committee requested less input on this bill:
      • The committee requests the Treasurer’s advice as to why it is considered necessary and appropriate to leave the determination of which digital platforms must participate in the News Media and Digital Platforms Mandatory Bargaining Code to delegated legislation.
      • If it is considered appropriate to leave this matter to delegated legislation, the committee requests the Treasurer’s advice as to whether the bill can be amended to require the positive approval of each House of the Parliament before determinations made under proposed section 52E come into effect.
  • The European Data Protection Board (EDPB) issued a statement “on new draft provisions of the second additional protocol to the Council of Europe Convention on Cybercrime (Budapest Convention),” the second time it has weighed in on the rewrite of “the first international treaty on crimes committed via the Internet and other computer networks, dealing particularly with infringements of copyright, computer-related fraud, child pornography and violations of network security.” The EDPB took issue with the process of meeting and drafting new provisions:
    • Following up on the publication of new draft provisions of the second additional protocol to the Budapest Convention, the EDPB therefore, once again, wishes to provide an expert and constructive contribution with a view to ensure that data protection considerations are duly taken into account in the overall drafting process of the additional protocol, considering that the meetings dedicated to the preparation of the additional protocol are being held in closed sessions and that the direct involvement of data protection authorities in the drafting process has not been foreseen in the T-CY Terms of Reference.
    • The EDPB offered itself again as a resource and key stakeholder that needs to be involved with the effort:
      • In November 2019, the EDPB also published its latest contribution to the consultation on a draft second additional protocol, indicating that it remained available for further contributions and called for an early and more proactive involvement of data protection authorities in the preparation of these specific provisions, in order to ensure an optimal understanding and consideration of data protection safeguards (emphasis in the original).
    • The EDPB further asserted:
      • The EDPB remains fully aware that situations where judicial and law enforcement authorities are faced with a “cross-border situation” with regards to access to personal data as part of their investigations can be a challenging reality and recognises the legitimate objective of enhancing international cooperation on cybercrime and access to information. In parallel, the EDPB reiterates that the protection of personal data and legal certainty must be guaranteed, thus contributing to the objective of establishing sustainable arrangements for the sharing of personal data with third countries for law enforcement purposes, which are fully compatible with the EU Treaties and the Charter of Fundamental Rights of the EU. The EDPB furthermore considers it essential to frame the preparation of the additional protocol within the framework of the Council of Europe core values and principles, and in particular human rights and the rule of law.
  • The European Commission (EC) published a statement on how artificial intelligence (AI) “can transform Europe’s health sector.” The EC sketched out legislation it hopes to introduce soon on regulating AI in the European Union (EU). The EC asserted:
    • A high-standard health system, rich health data and a strong research and innovation ecosystem are Europe’s key assets that can help transform its health sector and make the EU a global leader in health-related artificial intelligence applications. 
    • The use of artificial intelligence (AI) applications in healthcare is increasing rapidly.
    • Before the COVID-19 pandemic, challenges linked to our ageing populations and shortages of healthcare professionals were already driving up the adoption of AI technologies in healthcare. 
    • The pandemic has all but accelerated this trend. Real-time contact tracing apps are just one example of the many AI applications used to monitor the spread of the virus and to reinforce the public health response to it.
    • AI and robotics are also key for the development and manufacturing of new vaccines against COVID-19.
    • The European Commission is currently preparing a comprehensive package of measures to address issues posed by the introduction of AI, including a European legal framework for AI to address fundamental rights and safety risks specific to the AI systems, as well as rules on liability related to new technologies.
  • The House Energy and Commerce Committee Chair Frank Pallone, Jr. (D-NJ) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) wrote to Apple CEO Tim Cook “urging review and improvement of Apple’s new App Privacy labels in light of recent reports suggesting they are often misleading or inaccurate.” Pallone and Schakowsky are working from a Washington Post article, in which the paper’s tech columnist learned that Apple’s purported ratings system to inform consumers about the privacy practices of apps is largely illusory and possibly illegally deceptive. Pallone and Schakowsky asserted:
    • According to recent reports, App Privacy labels can be highly misleading or blatantly false. Using software that logs data transmitted to trackers, a reporter discovered that approximately one third of evaluated apps that said they did not collect data had inaccurate labels. For example, a travel app labeled as collecting no data was sending identifiers and other data to a massive search engine and social media company, an app-analytics company, and even a Russian Internet company. A ‘slime simulator’ rated for ages 4 and older had a ‘Data Not Collected’ label, even though the app shares identifying information with major tech companies and shared data about the phone’s battery level, storage, general location, and volume level with a video game software development company.
    • Simplifying and enhancing privacy disclosures is a laudable goal, but consumer trust in privacy labeling approaches may be undermined if Apple’s App Privacy labels disseminate false and misleading information. Without meaningful, accurate information, Apple’s tool of illumination and transparency may become a source of consumer confusion and harm. False and misleading privacy labels can dupe privacy-conscious consumers into downloading data-intensive apps, ultimately eroding the credibility and integrity of the labels. A privacy label without credibility and integrity also may dull the competitive forces encouraging app developers to improve their data practices.
    • A privacy label is no protection if it is false. We urge Apple to improve the validity of its App Privacy labels to ensure consumers are provided meaningful information about their apps’ data practices and that consumers are not harmed by these potentially deceptive practices.
    • Pallone and Schakowsky stated “[t]o better understand Apple’s practices with respect to the privacy labels, we request that you provide written response to the following questions by February 23, 2021:
      • 1. Apple has stated that it conducts routine and ongoing audits of the information provided by developers and works with developers to correct any inaccuracies.
        • a. Please detail the process by which Apple audits the privacy information provided by app developers. Please explain how frequently audits are conducted, the criteria by which Apple selects which apps to audit, and the methods for verifying the accuracy of the privacy information provided by apps.
        • b. How many apps have been audited since the implementation of the App Privacy label? Of those, how many were found to have provided inaccurate or misleading information? 
      • 2. Does Apple ensure that App Privacy labels are corrected upon the discovery of inaccuracies or misleading information? If not, why not? For each app that has been found to have provided inaccurate or misleading information, how quickly was that label corrected?
      • 3. Please detail Apple’s enforcement policies when an app fails to provide accurate privacy information for the App Privacy label.
      • 4. Does Apple require more in-depth privacy disclosures and conduct more stringent oversight of apps targeted to children under the age of 13? If not, why not? If so, please describe the additional disclosures required and the oversight actions employed for these apps.
      • 5. Providing clear and easily comprehendible privacy information at the point of sale is certainly valuable, but privacy policies are not static. Does Apple notify users when one of their app’s privacy labels has materially changed? If not, why not? If so, how are users notified of such changes?
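The cross-check described in the reporting, comparing an app’s declared App Privacy label against the network destinations its traffic is observed reaching, is mechanical once the traffic is logged. A minimal sketch follows; the app names, label text, and tracker domains are invented stand-ins, not real audit data:

```python
# Hypothetical sketch: flag apps whose "Data Not Collected" privacy label
# conflicts with observed network traffic to known tracking domains.
# All names and domains below are invented for illustration.

OBSERVED_TRACKERS = {"analytics.example", "ads.example", "search-giant.example"}

apps = [
    {"name": "travel app",      "label": "Data Not Collected",
     "contacted": ["search-giant.example", "analytics.example"]},
    {"name": "slime simulator", "label": "Data Not Collected",
     "contacted": ["analytics.example"]},
    {"name": "notes app",       "label": "Data Not Collected",
     "contacted": []},
]

def audit(apps, trackers):
    """Return names of apps whose 'no collection' label contradicts their traffic."""
    return [
        app["name"]
        for app in apps
        if app["label"] == "Data Not Collected"
        and any(host in trackers for host in app["contacted"])
    ]

flagged = audit(apps, OBSERVED_TRACKERS)
print(flagged)   # ['travel app', 'slime simulator']
print(f"{len(flagged)} of {len(apps)} labels look inaccurate")
```

A real audit would have to instrument device traffic rather than start from a curated list, which is presumably the expensive part of the process Pallone and Schakowsky are asking Apple to describe.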
  • The United Kingdom’s Department for Digital, Culture, Media & Sport (DCMS) “published its draft rules of the road for governing the future use of digital identities…[and] [i]t is part of plans to make it quicker and easier for people to verify themselves using modern technology and create a process as trusted as using passports or bank statements” according to its press release. The DCMS wants feedback by 11 March 2021 on the draft trust framework. The DCMS stated:
    • Digital identity products allow people to prove who they are, where they live or how old they are. They are set to revolutionise transactions such as buying a house, when people are often required to prove their identity multiple times to a bank, conveyancer or estate agent, and buying age-restricted goods online or in person.
    • The new ‘trust framework’ lays out the draft rules of the road organisations should follow. It includes the principles, policies, procedures and standards governing the use of digital identity to allow for the sharing of information to check people’s identities or personal details, such as a user’s address or age, in a trusted and consistent way. This will enable interoperability and increase public confidence.
    • The framework, once finalised, is expected to be brought into law. It has specific standards and requirements for organisations which provide or use digital identity services including:
      • Having a data management policy which explains how they create, obtain, disclose, protect, and delete data;
      • Following industry standards and best practice for information security and encryption;
      • Telling the user if any changes, for example an update to their address, have been made to their digital identity;
      • Where appropriate, having a detailed account recovery process and notifying users if organisations suspect someone has fraudulently accessed their account or used their digital identity;
      • Following guidance on how to choose secure authenticators for their service.
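Two of the listed requirements, keeping a data management record and telling the user when their digital identity changes, are concrete enough to sketch. The class, field names, and notification text below are invented for illustration and are not taken from the draft framework:

```python
# Hypothetical sketch of two draft trust-framework requirements: recording
# create/update/delete events for identity data, and notifying the user
# whenever an attribute of their digital identity changes.

class DigitalIdentity:
    def __init__(self, user, attributes):
        self.user = user
        self.attributes = dict(attributes)
        self.audit_log = []       # data management record: what happened to the data
        self.notifications = []   # stand-in for an email/SMS channel to the user

    def update(self, field, value):
        old = self.attributes.get(field)
        self.attributes[field] = value
        self.audit_log.append(("update", field))
        # Requirement: tell the user if any changes have been made
        self.notifications.append(
            f"Your {field} was changed from {old!r} to {value!r}"
        )

ident = DigitalIdentity("alice", {"address": "1 High St"})
ident.update("address", "2 Low Rd")
print(ident.notifications[-1])   # Your address was changed from '1 High St' to '2 Low Rd'
```

The framework’s other requirements (encryption standards, account recovery, fraud notification) sit around this same core: every operation on identity data leaves a record, and the subject of the data hears about it.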
  • The European Commission (EC) “opened infringement procedures against 24 Member States for failing to enact new EU telecom rules.”
    • The EC asserted:
      • The European Electronic Communications Code modernises the European regulatory framework for electronic communications, to enhance consumers’ choices and rights, for example by ensuring clearer contracts, quality of services, and competitive markets. The Code also ensures higher standards of communication services, including more efficient and accessible emergency communications. Furthermore, it allows operators to benefit from rules incentivising investments in very-high capacity networks, as well as from enhanced regulatory predictability, leading to more innovative digital services and infrastructures.
      • The European Electronic Communications Code that brings the regulatory framework governing the European telecom sector up to date with the new challenges came into force in December 2018, and Member States have had two years to implement its rules. It is a central piece of legislation to achieve Europe’s Gigabit society and ensure full participation of all EU citizens in the digital economy and society.

Coming Events

  • On 18 February, the House Financial Services Committee will hold a hearing titled “Game Stopped? Who Wins and Loses When Short Sellers, Social Media, and Retail Investors Collide” with Reddit Co-Founder and Chief Executive Officer Steve Huffman testifying along with other witnesses.
  • The U.S.-China Economic and Security Review Commission will hold a hearing titled “Deterring PRC Aggression Toward Taiwan” on 18 February.
  • On 24 February, the House Energy and Commerce Committee’s Communications and Technology Subcommittee will hold a hearing titled “Fanning the Flames: Disinformation and Extremism in the Media.”
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Estúdio Bloom on Unsplash

Further Reading, Other Developments, and Coming Events (8 February 2021)

Further Reading

  • “‘A kiss of death’: Top GOP tech critics are personae non gratae after election challenge” By Cristiano Lima — Politico. I take these articles with a block of salt, not least because many inside-the-Beltway articles lack perspective and a sense of history. For sure, in the short term the Josh Hawleys and Ted Cruzes of the world are radioactive to Democrats, but months down the road things will look different, especially if Democrats need votes or allies in the Senate. For example, former Senator David Vitter’s (R-LA) interesting activities with prostitutes made him radioactive for some time, and then all was forgotten because he held a valuable currency: a vote.
  • “I Talked to the Cassandra of the Internet Age” By Charlie Warzel — The New York Times. A sobering read on the implications of the attention economy. We would all be helped by slowing down and choosing what to focus on.
  • “A Vast Web of Vengeance” By Kashmir Hill — The New York Times. A true horror story illustrating the power platforms give anyone to slander others. The more these sorts of stories move to the fore of the consciousness of policymakers, the greater the chances of reform to 47 U.S.C. 230 (Section 230), which many companies have used to deny requests that they take down defamatory, untrue material.
  • “Amazon says government demands for user data spiked by 800% in 2020” By Zack Whittaker — TechCrunch. In an interesting development, Germany far outpaced the United States (U.S.) in information requests to Amazon between 1 July and 31 December 2020, except with respect to Amazon Web Services (AWS). Regarding AWS, the U.S. accounted for 75% of requests. It bears note there were over 27,000 non-AWS requests and only 523 AWS requests.
  • “Russian hack brings changes, uncertainty to US court system” By MaryClaire Dale — Associated Press. Because the Administrative Office of the United States (U.S.) Courts may have been part of the massive SolarWinds hack, lawyers involved with cases that have national security aspects may no longer file materials electronically. It appears these cases will go old school, with paper filings only, stored on computers in federal courts that have no connection to the internet. However, it is apparently believed at present that the Foreign Intelligence Surveillance Court system was not compromised by the Russians.

Other Developments

  • Senator Ted Cruz (R-TX) placed a hold on Secretary of Commerce designate Gina Raimondo’s nomination, explaining on Twitter: “I’ll lift the hold when the Biden admin commits to keep the massive Chinese Communist Party spy operation Huawei on the Entity List.” Cruz was one of three Republicans to vote against reporting out Raimondo’s nomination from the Senate Commerce, Science, and Transportation Committee. Even though the Ranking Member, Senator Roger Wicker (R-MS), voted to advance her nomination to the Senate floor, he, too, articulated concerns about Raimondo and the Biden Administration’s refusal to commit to keeping Huawei on the Department of Commerce’s Entity List, a designation that cuts off the People’s Republic of China (PRC) company from needed technology and products. Wicker said “I do remain concerned about the Governor’s reluctance to state unequivocally that she intends to keep Huawei on the department’s entity list…[and] [k]eeping Huawei on this list is important for the security of our networks and I urge the Governor and the administration to make its position clear.” Of course, the continuing Republican focus on the PRC seeks to box in the Biden Administration and force it to maintain the Trump Administration’s policies. The new administration has refused to make hard commitments on the PRC thus far and will likely pursue different tactics than the Trump Administration, even though there will likely be agreement on the threat posed by the PRC and its companies.
  • Virginia’s “Consumer Data Protection Act” (SB 1392/HB 2307) advanced from the Virginia Senate to the House of Delegates by a 36-0-1 vote on 5 February. The package was sent to the Communications, Technology and Innovation Subcommittee in the House on 7 February. Last week, it appeared as if the legislature would not have time to finish work on what would be the United States’ second comprehensive state privacy law, but Governor Ralph Northam (D) convened a special session right before the legislature was set to adjourn. Now, there will be more time to address this bill and other priorities.
  • Senators Brian Schatz (D-HI), Deb Fischer (R-NE), Richard Blumenthal (D-CT), Rick Scott (R-FL) and Jacky Rosen (D-NV) introduced “The Safe Connections Act” “to help survivors of domestic violence and other crimes cut ties with their abusers and separate from shared wireless service plans, which can be exploited to monitor, stalk, or control victims” per their press release. The Senators asserted “the Safe Connections Act would help them stay safe and connected by:
    • Allowing survivors to separate a mobile phone line from any shared plan involving an abuser without penalties or other requirements. This includes lines of any dependents in their care;
    • Requiring the Federal Communications Commission (FCC) to initiate a rulemaking proceeding to seek comment on how to help survivors who separate from a shared plan enroll in the Lifeline Program for up to six months as they become financially stable; and
    • Requiring the FCC to establish rules that would ensure any calls or texts to hotlines do not appear on call logs.
  • The European Commission’s Directorate-General for Justice and Consumers issued the “Report on the implementation of specific provisions of Regulation (EU) 2016/679,” the General Data Protection Regulation (GDPR), in which it determined that implementation of these provisions at the member state level is uneven. The implication of this assessment, released some 2.5 years after the GDPR took effect, is that it may be some time yet before each European Union state has made the statutory and policy changes necessary to give the data protection regime full effect. Accordingly, the Directorate-General stated that “[t]he following general observations can be made in relation to the implementation of the GDPR clauses under assessment:
    • As regards Article 8(1) GDPR (i.e., Conditions applicable to child’s consent in relation to information society services), the majority of the Member States have set an age limit lower than 16 years of age for the validity of the consent of a minor in relation to information society services. Nine Member States set the age limit at 16 years of age, while eight Member States opted for that of 13 years, six for that of 14 years and three for 15 years.
    • With respect to Article 9(4) GDPR (i.e., Processing of special categories of personal data), most Member States provide for conditions/limitations with regard to the processing of genetic data, biometric data or data concerning health. Such limitations/conditions typically consist in listing the categories of persons who have access to such data, ensuring that they are subject to confidentiality obligations, or making processing subject to prior authorisation from the competent national authority. No national provision restricting or prohibiting the free movement of personal data within the European Union has been identified.
    • As regards Article 23(1) GDPR, and irrespective of the areas of public interest assessed under Article 23(1)(c) and (e) GDPR (i.e. public security, public administration, public health, taxation and migration), some Member States provide for restrictions in the area of (i) social security; or (ii) supervision of financial market participants, functioning of the guarantee systems and resolution and macroeconomic analyses. Concerning Article 23(1)(c) GDPR, the majority of Member States allow for restrictions of various provisions referred to in Article 23(1) GDPR. Normally there is a general reference to public security, while more specific areas of processing include the processing of personal data for the investigation and prosecution of crimes, and the use of video cameras for surveillance. Most commonly, the restrictions apply only where certain conditions are met. In some Member States the proportionality and necessity test is not contemplated at all, while in most Member States it is established in law, rather than left to the data controller. The overwhelming majority of Member States do not sufficiently implement the conditions and safeguards under Article 23(2) GDPR.
    • As regards Article 23(1)(e) GDPR in relation to public administration, half of the Member States provide for restrictions for such purpose. Normally there is a general reference to general public interest or public administration, while more specific areas of processing include discussions of the Council of Ministers and investigation of judicial or ‘administrative’ police authorities in connection with the commission of a crime or administrative infringement. Most commonly, the restrictions apply only where certain conditions are met. In some Member States the proportionality and necessity test is not contemplated at all, whereas in some other Member States the test is established in law or left to the data controller. No Member State implements all conditions and safeguards under Article 23(2) GDPR.
    • As regards Article 23(1)(e) GDPR in relation to public health, a minority of the Member States provide for restrictions for such purpose. Normally there is a general reference to public health or general public interest, while more specific areas of processing include the security of food chain and medical files. In most Member States, the applicable restrictions apply only where certain conditions are met. The proportionality and necessity test is generally established in the law. No Member State implements all conditions and safeguards under Article 23(2) GDPR.
    • With respect to Article 23(1)(e) GDPR in relation to taxation, a sizeable number of Member States provide restrictions for such purposes. There tends to be a general reference to taxation or general public interest, while more specific areas of processing include recovery of taxes, as well as automated tax data transfer procedures. Normally, the applicable restrictions apply only where certain conditions are met. The proportionality and necessity test is generally left to the data controller. No Member State implements all conditions and safeguards under Article 23(2) GDPR.
    • As regards Article 23(1)(e) GDPR in relation to migration, a minority of the Member States provide for restrictions for such purpose. Normally there is a general reference to migration or general public interest. The applicable restrictions tend to apply only where certain conditions are met. The proportionality and necessity test is generally left to the data controller. No Member State implements all conditions and safeguards under Article 23(2) GDPR.
    • As regards Article 85(1) GDPR (which requires Member States to reconcile by law the right to the protection of personal data with the right to freedom of expression and information), the majority of the Member States provide for provisions aiming to reconcile the right to the protection of personal data with the right to freedom of expression and information. These provisions are usually in the national data protection act implementing the GDPR, however, in some instances there are also specific provisions in media laws to this effect.
    • With respect to Article 85(2) GDPR (Reconciliation of the right to the protection of personal data with the right to freedom of expression and information), most Member States provide exemptions/derogations from the rules set out in Chapters II, III, IV, V, VI, VII and IX GDPR. More often than not, no specific balancing or reconciliation test is identified in the national legislation. A detailed account of the exemptions/derogations can be found in Annex 2 – Implementation of Article 85(2) GDPR.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) announced it is resuming its “investigation into real time bidding (RTB) and the adtech industry,” which had been paused in response to the COVID-19 pandemic. Simon McDougall, ICO Deputy Commissioner – Regulatory Innovation and Technology, stated in a blog posting:
    • Enabling transparency and protecting vulnerable citizens are priorities for the ICO. The complex system of RTB can use people’s sensitive personal data to serve adverts and requires people’s explicit consent, which is not happening right now.
    • Sharing people’s data with potentially hundreds of companies, without properly assessing and addressing the risk of these counterparties, also raises questions around the security and retention of this data.
    • Our work will continue with a series of audits focusing on data management platforms and we will be issuing assessment notices to specific companies in the coming months. The outcome of these audits will give us a clearer picture of the state of the industry.
    • The investigation is vast and complex and, because of the sensitivity of the work, there will be times where it won’t be possible to provide regular updates. However, we are committed to publishing our final findings, once the investigation is concluded.
    • We are also continuing to work with the Competition and Markets Authority (CMA) in considering Google’s Privacy Sandbox proposals to phase out support for third party cookies on Chrome.
  • Washington State Representative Shelley Kloba (D) and cosponsors introduced a bill, HB 1303, to establish a data brokers registry in Washington state that would also levy a 1.8% tax on gross revenue from selling personal data. In her press release, Kloba stated:
    • We are spending more and more of our lives on our phones and devices. From this has arisen a new business model where brokers collect, analyze, and resell personal data collected from applications on our phones and other devices. Currently, this type of business is totally unregulated and untaxed, and these businesses are reselling information with no compensation to the people of Washington. My legislation would shine a light on this very active segment of our economy while also establishing a small tax on the companies that profit from selling our personal data. Brokers that make money from collecting our personal information should contribute their fair share in tax revenue, and there should be more transparency on the number of businesses engaged in this industry.
    • HB 1303 would:
      • Impose a 1.8% Business & Occupation (B&O) tax on gross income arising from the sale of personal data.
      • Require companies that engage in this type of economic activity to register annually with the Department of Revenue (DOR).
      • Require DOR to provide the Legislature with an annual report on this information.
    • Recently, Kloba and cosponsors introduced the “People’s Privacy Act” (HB 1433), a bill to establish a privacy and data protection regime in Washington state (see here for analysis).
  • The Federal Trade Commission (FTC) used recently granted authority to police the use of algorithms and automated processes to buy tickets for entertainment and sporting events. The “Better Online Ticket Sales (BOTS) Act” (P.L. 114-274) “was enacted in 2016 and gives the FTC authority to take law enforcement action against individuals and companies that use bots or other means to circumvent limits on online ticket purchases” per the agency’s press release. The FTC stated it is taking “legal action against three ticket brokers based in New York who allegedly used automated software to illegally buy up tens of thousands of tickets for popular concerts and sporting events, then subsequently made millions of dollars reselling the tickets to fans at higher prices.” The FTC added:
    • The three ticket brokers will be subject to a judgment of more than $31 million in civil penalties for violating the Better Online Ticket Sales (BOTS) Act, under a proposed settlement reached with the FTC. Due to their inability to pay, the judgment will be partially suspended, requiring them to pay $3.7 million.
    • The FTC explained “[u]nder the terms of the proposed orders, judgments will be entered against the defendants for civil penalties as follows:
  • The National Institute of Standards and Technology (NIST) pushed back the deadline for comments until 26 February 2021 for four guidance documents on the Internet of Things:
    • Draft NIST SP 800-213, IoT Device Cybersecurity Guidance for the Federal Government: Establishing IoT Device Cybersecurity Requirements, has background and recommendations to help federal agencies consider how an IoT device they plan to acquire can integrate into a federal information system. IoT devices and their support for security controls are presented in the context of organizational and system risk management. SP 800-213 provides guidance on considering system security from the device perspective. This allows for the identification of IoT device cybersecurity requirements—the abilities and actions a federal agency will expect from an IoT device and its manufacturer and/or third parties, respectively.
    • Draft NISTIR 8259B, IoT Non-Technical Supporting Capability Core Baseline, complements the NISTIR 8259A device cybersecurity core baseline by detailing additional, non-technical supporting activities typically needed from manufacturers and/or associated third parties. This non-technical baseline collects and makes explicit supporting capabilities like documentation, training, customer feedback, etc.
    • Draft NISTIR 8259C, Creating a Profile Using the IoT Core Baseline and Non-Technical Baseline, describes a process, usable by any organization, that starts with the core baselines provided in NISTIRs 8259A and 8259B and explains how to integrate those baselines with organization- or application-specific requirements (e.g., industry standards, regulatory guidance) to develop an IoT cybersecurity profile suitable for specific IoT device customers or applications. The process in NISTIR 8259C guides organizations needing to define a more detailed set of capabilities responding to the concerns of a specific sector, based on some authoritative source such as a standard or other guidance, and could be used by organizations seeking to procure IoT technology or by manufacturers looking to match their products to customer requirements.
    • Draft NISTIR 8259D, Profile Using the IoT Core Baseline and Non-Technical Baseline for the Federal Government, provides a worked example result of applying the NISTIR 8259C process, focused on the federal government customer space, where the requirements of the FISMA process and the SP 800-53 security and privacy controls catalog are the essential guidance. NISTIR 8259D provides a device-centric, cybersecurity-oriented profile of the NISTIR 8259A and 8259B core baselines, calibrated against the FISMA low baseline described in NIST SP 800-53B as an example of the criteria for minimal securability for federal use cases.
  • The New York State Department of Financial Services (NYDFS) announced “[r]egulated entities and licensed persons must file the Certification of Compliance for the calendar year 2020 by April 15, 2021.” These certificates are due under the NYDFS’ cybersecurity regulations, with which most financial services companies in the state must comply. These regulations took effect in May 2017.

Coming Events

  • On 10 February, the House Homeland Security Committee will hold a hearing titled “Homeland Cybersecurity: Assessing Cyber Threats and Building Resilience” with these witnesses:
    • Mr. Chris Krebs, Former Director, Cybersecurity and Infrastructure Security Agency, U.S. Department of Homeland Security
    • Ms. Sue Gordon, Former Principal Deputy Director of National Intelligence, Office of the Director of National Intelligence
    • Mr. Michael Daniel, President & CEO, Cyber Threat Alliance
    • Mr. Dmitri Alperovitch, Executive Chairman, Silverado Policy Accelerator
  • The House Judiciary Committee’s Antitrust, Commercial, and Administrative Law Subcommittee will hold a hearing titled “Justice Restored: Ending Forced Arbitration and Protecting Fundamental Rights.”
  • The Federal Communications Commission’s (FCC) acting Chair Jessica Rosenworcel will hold a virtual Roundtable on Emergency Broadband Benefit Program on 12 February, “a new program that would enable eligible households to receive a discount on the cost of broadband service and certain connected devices during the COVID-19 pandemic.” The FCC also noted “[i]n the Consolidated Appropriations Act of 2021, Congress appropriated $3.2 billion” for the program.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Martin Ceralde on Unsplash

Further Reading, Other Developments, and Coming Events (4 February 2021)

Further Reading

  • “Global Privacy Control wants to succeed where Do Not Track failed” By Russell Brandom — The Verge. A new effort to block the tracking of people across the internet and the sale of their information has launched: the Global Privacy Control. The initiative looks to leverage a provision currently effective in the “California Consumer Privacy Act” (CCPA) (AB 375), and also included in the recently enacted “California Privacy Rights Act” (CPRA) (aka Proposition 24), that requires covered entities to honor a global opt-out. This browser add-on will transmit the message to websites and other entities that the user does not want to have her data sold, which will have to be honored under California law. The piece cites a Tweet from outgoing California Attorney General Xavier Becerra (D) endorsing the notion generally. Of course, much remains to unfold on this front, but it may prove an easy, effective way for people to guard their privacy.
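As a rough sketch of the mechanism described above: a GPC-capable browser attaches a `Sec-GPC: 1` header to outgoing requests, which a covered business’s server can check to register the opt-out. The helper below is a hypothetical illustration in Python (the function name and the plain-dict interface are illustrative assumptions, not drawn from the article or the GPC proposal):

```python
def gpc_opt_out_requested(headers: dict) -> bool:
    """Return True if a request carries the Global Privacy Control signal.

    Per the GPC proposal, supporting browsers send `Sec-GPC: 1` with each
    request; under the CCPA/CPRA reading discussed above, a covered business
    would treat that as a do-not-sell opt-out for the requesting user.
    """
    # HTTP header names are case-insensitive, so normalize before checking.
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"

# Example: a request from a GPC-enabled browser vs. one without the signal.
print(gpc_opt_out_requested({"Sec-GPC": "1", "User-Agent": "x"}))  # True
print(gpc_opt_out_requested({"User-Agent": "x"}))                  # False
```

In a real web framework the headers object would come from the request context; the logic is the same single header check.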
  • “A Former Comcast Employee Explains Why Low-Income WiFi Packages Aren’t Helping Students” By Caroline O’Donovan — BuzzFeed News. Comcast’s Internet Essentials seems insufficient for low-income families with multiple children needing to use videoconferencing for school. A group of students in Baltimore tried working with the company to increase the speed of this low-cost package, but the company did nothing more than offer to help the students doing the advocacy. There are other stakeholders in the government and other sectors who think Comcast’s efforts are not enough in the midst of a pandemic.
  • “Facebook Ad Services Let Anyone Target US Military Personnel” By Lily Hay Newman — WIRED. Researchers have turned up evidence that United States military personnel could be easily targeted with misinformation as part of attempts to radicalize them or run psychological operations on them. Facebook, naturally, denies there is any such capability with its targeted advertising system, and this new type of threat seems outside the scope of what most experts consider as the main threats from social media.
  • “Nextdoor Is Quietly Replacing the Small-Town Paper” By Will Oremus — OneZero. There is another social media platform on which misinformation may be flourishing, perhaps at the cost of local media losing revenue. Nextdoor allows neighbors (but only those with snail mail addresses, screening out the homeless) to share information, data, rumors, biases, paranoia, etc. And while the platform fences off each community (e.g., members of the Savannah, Georgia cohort cannot get access to the Jacksonville, Florida group), there is no seemingly effective mechanism to fight lies and misinformation. So it sounds much like the neighborhood WhatsApp group I’m on, where one gentleman is forever spamming everyone with anti-vaccine claims and news about how well Sweden was handling COVID-19 by doing nothing, at least until the government in Stockholm disavowed that approach. I find the WhatsApp group a breeding ground for racial and class biases, and a number of Nextdoor users are reporting the same. Moreover, the platform is competing with local media for some of the same advertisers, exacerbating the trend of reduced revenue for media since Facebook and Google came to dominate the advertising market.
  • “Google switches ad tracking tech ahead of Apple privacy update” By Rae Hodge — c/net. Google is taking a quieter path than Facebook in pushing back against Apple’s forthcoming change to its iOS that will prompt iPhone users to agree to letting apps track them (i.e., App Tracking Transparency (ATT) policy). Google is switching from the use of IDFA to another Apple tool, SKAdNetwork, which is considered not as good as IDFA.
  • “Facebook strikes back against Apple privacy change, prompts users to accept tracking to get ‘better ads experience’” By Salvador Rodriguez — CNBC. Speaking of Apple’s pending change, Facebook seems to be moving preemptively to start offering iPhone and iPad users a choice on letting the social media giant use their information to show them personalized ads. The Facebook popup will appear before Apple’s popup. We should probably expect an Apple countermove soon.

Other Developments

  • The Biden White House issued a “Memorandum on Restoring Trust in Government Through Scientific Integrity and Evidence-Based Policymaking” that will change how the United States (U.S.) government uses and deploys data and evidence. It directs a range of actions for agencies inside the White House and across the Administration to neutralize and remove procedures put in place during the Trump Administration that disregarded science.
    • In relevant part, the memorandum says:
      • Scientific findings should never be distorted or influenced by political considerations.  When scientific or technological information is considered in policy decisions, it should be subjected to well-established scientific processes, including peer review where feasible and appropriate, with appropriate protections for privacy.  Improper political interference in the work of Federal scientists or other scientists who support the work of the Federal Government and in the communication of scientific facts undermines the welfare of the Nation, contributes to systemic inequities and injustices, and violates the trust that the public places in government to best serve its collective interests.
  • The Facebook Oversight Board issued its first decisions, overturning Facebook in four of the five cases. Facebook has committed itself to being bound by these decisions. The panel also made “nine policy recommendations to the company” in the decisions. The Oversight Board explained:
    • Facebook now has seven days to restore content in line with the Board’s decisions. The company will also examine whether identical content with parallel context associated with the Board’s decisions should remain on its platform. In addition, Facebook must publicly respond to any policy recommendations the Board has made in its decisions within 30 days.
    • The Oversight Board made the following decisions:
      • Overturned Facebook’s decision on case 2020-002-FB-UA to remove a post under its Community Standard on Hate Speech. The post commented on the supposed lack of reaction to the treatment of Uyghur Muslims in China, compared to the violent reaction to cartoons in France. Click here for more information.
      • Upheld Facebook’s decision on case 2020-003-FB-UA to remove a post under its Community Standard on Hate Speech. The post used the Russian word “тазики” (“taziks”) to describe Azerbaijanis, who the user claimed have no history compared to Armenians. Click here for more information.
      • Overturned Facebook’s original decision on case 2020-004-IG-UA to remove a post under its Community Standard on Adult Nudity and Sexual Activity. The post included photos of breast cancer symptoms which, in some cases, showed uncovered female nipples. Click here for more information.
      • Overturned Facebook’s decision on case 2020-005-FB-UA to remove a post under its Community Standard on Dangerous Individuals and Organizations. The post included an alleged quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany. Click here for more information.
      • Overturned Facebook’s decision on case 2020-006-FB-FBR to remove a post under its Community Standard on Violence and Incitement. The post criticized the lack of a health strategy in France and included claims that a cure for COVID-19 exists. Click here for more information.
  • The House Armed Services Committee announced the creation of a new cyber-focused subcommittee that will split off from the existing Intelligence and Emerging Threats and Capabilities Subcommittee. The former chair of that subcommittee, Representative James Langevin (D-RI), will chair the Cyber, Innovative Technologies, and Information Systems (CITI) Subcommittee with jurisdiction over the following:
    • Cyber Security, Operations, and Forces
    • Information Technology, Systems, and Operations
    • Science and Technology Programs and Policy
    • Defense-Wide Research and Development (except Missile Defense and Space)
    • Artificial Intelligence Policy and Programs
    • Electromagnetic Spectrum Policy
    • Electronic Warfare Policy
    • Computer Software Acquisition Policy
    • Now the House Armed Services Committee will match the Senate Armed Services Committee, which has had a Cybersecurity Subcommittee since the late Senator John McCain (R-AZ) chaired the full committee.
  • The European Union Agency for Cybersecurity (ENISA) published “Data Pseudonymisation: Advanced Techniques and Use Cases,” a report on pseudonymisation for personal data protection that provides “a technical analysis of cybersecurity measures in personal data protection and privacy.” ENISA stated:
    • As there is no one-size-fits-all pseudonymisation technique, a high level of competence is needed to reduce threats and maintain efficiency in processing pseudonymised data across different scenarios. The ENISA report aims to support data controllers and processors in implementing pseudonymisation by providing possible techniques and use cases that could fit different scenarios.
    • The report underlines the need to take steps that include the following:
      • Each case of personal data processing needs to be analysed to determine the most suitable technical option in relation to pseudonymisation;
      • An in-depth look into the context of personal data processing before data pseudonymisation is applied;
      • Continuous analysis of state-of-the-art in the field of data pseudonymisation, as new research and business models break new ground;
      • Developing advanced pseudonymisation scenarios for more complex cases, for example when the risks of personal data processing are deemed to be high;
      • Further discussion on the broader adoption of data pseudonymisation at EU and Member States levels alike.
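Techniques of the kind such reports survey commonly include keyed cryptographic hashing, where a secret key held separately by the data controller turns identifiers into stable pseudonyms. The sketch below is a minimal, hypothetical illustration in Python (the function and key are my own for illustration, not drawn from the ENISA report):

```python
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Map an identifier to a stable pseudonym via a keyed hash (HMAC-SHA256).

    The mapping is deterministic, so the same input always yields the same
    pseudonym and records stay linkable for analysis. Reversing or rebuilding
    the mapping requires the secret key, which plays the role of the
    "additional information" the GDPR requires to be kept separately.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical key held only by the controller, apart from the dataset.
key = b"example-secret-held-by-the-controller"

p1 = pseudonymise("jane.doe@example.com", key)
p2 = pseudonymise("jane.doe@example.com", key)
assert p1 == p2  # deterministic: the same person links across records
```

As the report's first recommendation suggests, whether a keyed hash, encryption, or another technique is the right fit depends on the specific processing scenario; this is only one option.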
  • The United States (U.S.) Chamber of Commerce’s Center for Capital Markets Competitiveness (CCMC) released a new report, “Digital Assets: A Framework for Regulation to Maintain the United States’ Status as an Innovation Leader,” “providing recommendations to help guide policymakers in developing a more closely coordinated response to the regulation of digital assets.” In its press release, the CCMC explained the “report has a focus on financial services regulatory systems due to their significant impact on digital assets and related blockchain innovation, and outlines several recommendations for promoting innovation in the digital assets space, including:
    • Implement technology-neutral regulation
    • Implement principles-based regulation
    • Avoid regulation by enforcement
    • Ensure good faith compliance
    • Establish regulatory flexibility
    • Create digital asset categorization
    • Establish a White House Task Force focused on digital assets
  • The Australian Securities and Investments Commission (ASIC) revealed that “an unidentified threat actor accessed an ASIC server containing attachments to Australian credit licence applications submitted to ASIC between 1 July 2020 and 28 December 2020.” ASIC added:
    • The cyber incident occurred due to a vulnerability in a file transfer appliance (FTA) provided by California-based Accellion and used by ASIC to receive attachments to Australian credit licence applications.
    • ASIC has determined that the credit licence application forms held within the server were not accessed. Analysis by ASIC’s independent forensic investigators shows no evidence that attachments were opened or downloaded.
    • However, the filenames of attachments for credit licence applications that were submitted to ASIC between 1 July 2020 and 28 December 2020 may have been viewed by the threat actor. For example, the credit licence applicant’s name or the name of an individual responsible manager, if these were used in the filename of the attachment (e.g. police check, CV) may have been viewed by the threat actor.
  • In a blog posting, the United Kingdom’s (UK) Information Commissioner’s Office (ICO) addressed “the recently agreed UK and EU Trade and Cooperation Agreement (TCA).” Information Commissioner Elizabeth Denham explained her view on data protection in the UK during the period in which data transfers to the UK will be treated as if the European Union (EU) had issued an adequacy decision on UK law:
    • High standards and co-operation 
      • I must begin by welcoming the commitment by both the EU and UK to ensuring a high level of personal data protection, and to working together to promote high international standards.
      • As envisaged by the TCA, I look forward to developing a new regulatory relationship with European data protection authorities, sharing ideas and data protection expertise and co-operating on enforcement actions where appropriate. As evidenced by our work globally, regulatory cooperation remains key to ensuring we can protect the public’s personal data wherever it resides. The ICO will also continue to develop its international strategy.
    • Data flows: short term bridging provisions and adequacy
      • The TCA contains an important safety net, allowing transfers of data from the EU to UK to continue without restriction for four months whilst the EU considers the UK’s application for adequacy. This is the usual mechanism used by the EU to allow for continued data flow with third countries. This is very welcome news and was the best possible outcome for UK organisations given the risks and impacts of no adequacy. This bridge contained within the TCA will provide a legally robust mechanism that can give UK organisations confidence to continue digital trade in the coming months.
      • The EU has committed (in a Declaration alongside the TCA) to consider promptly the UK’s adequacy application. The Government is taking the lead on that process, with the ICO providing independent regulatory advice when appropriate. We’ll publish more details in due course as the outcome of the adequacy process becomes clear.
      • Whilst we wait for an adequacy decision, for the bridge to continue any new UK adequacy regulations, standard contractual clauses or ICO approvals of international transfer mechanisms, must be put before the TCA’s oversight mechanisms.
    • Data flows: keeping us safe
      • Our police and other law enforcement authorities, in the UK and EU, rely on sharing information with each other to prevent, investigate and prosecute crimes, and ultimately to keep us all safe.
      • Part three of the TCA sets out detailed provisions allowing data sharing for law enforcement. It includes arrangements for the transfer of DNA data, fingerprints, vehicle registrations and Passenger Name Record (PNR) data. It also allows for the UK to access data from EUROPOL and EUROJUST. Part three also contains important commitments to key elements of data protection and for the ICO to be consulted about data protection assessments related to PNR data.
      • I welcome the provisions in the TCA which bake-in the importance of high standards of data protection and international data flows for UK citizens and for the UK economy – they keep us safe, they support our economy, they keep us connected. In our ever-innovating, inter-connected world, my role is to make sure that data flows continue, and continue to protect UK citizens, so they can continue to enjoy digital services underpinned by a seamless flow of data.

Coming Events

  • The Federal Communications Commission’s (FCC) acting Chair Jessica Rosenworcel will hold a virtual Roundtable on Emergency Broadband Benefit Program on 12 February to discuss “a new program that would enable eligible households to receive a discount on the cost of broadband service and certain connected devices during the COVID-19 pandemic.” The FCC also noted “[i]n the Consolidated Appropriations Act of 2021, Congress appropriated $3.2 billion” for the program.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (1 February 2021)

Further Reading

  • Facebook and Apple Are Beefing Over the Future of the Internet” By Gilad Edelman — WIRED. The battle over coming changes to Apple’s iOS continues to escalate. Apple CEO Tim Cook said the changes, which will move iPhone users to an opt-in system for tracking people across the internet, would help protect both privacy and democracy. This latter claim is a shot at Facebook and its role in the rise of extremist groups in the United States and elsewhere. Facebook CEO Mark Zuckerberg claimed this change was of a piece with Apple’s long-term interest in driving the app market from a free to a paid model that would benefit the Cupertino giant through its 30% fees on all in-app purchases. Zuckerberg also reiterated Facebook’s arguments that such a change by Apple will harm small businesses that will have a harder time advertising. Facebook is also making noise about suing Apple in the same way Epic Games has for its allegedly anti-competitive app store practices. Experts expect Apple’s change will take as much as 10% off of Facebook’s bottom line until it and other advertising players adjust their tactics. These will not be the last shots fired between the two tech giants.
  • Democratic Congress Prepares to Take On Big Tech” By Cecilia Kang — The New York Times. Senator Amy Klobuchar (D-MN) is vowing to introduce antitrust legislation this spring that could rein in big technology companies in the future. Klobuchar’s proposal will receive serious consideration because she now chairs the Senate Judiciary Committee’s subcommittee with jurisdiction over antitrust and competition policy. Klobuchar also plans to release a book this spring with her views on antitrust. Any proposal to reform antitrust law faces a steep uphill battle to 60 votes in the Senate.
  • Pressure builds on Biden, Democrats to revive net neutrality rules” By Tony Romm — The Washington Post. Until the Federal Communications Commission (FCC) has a third Democratic vote, pressure from the left will focus on whom the Biden Administration chooses to nominate. Once a Democratic majority is in place, there will be substantial pressure to re-promulgate the Obama Administration’s net neutrality order.
  • Why Google’s Internet-Beaming Balloons Ran Out of Air” By Aaron Mak — Slate. The reasons Alphabet pulled the plug on Loon, its attempt to provide internet service in areas without it, include the costs, the lack of revenue since the areas without service tend to be poorer, the price barriers to people acquiring 4G devices, and resistance or indifference from governments and regulators.
  • A big hurdle for older Americans trying to get vaccinated: Using the internet” By Rebecca Heilweil — recode. Not surprisingly, the digital divide and basic digital literacy are barriers to the elderly, especially the poorer and minority segments of that demographic, securing online appointments for COVID-19 vaccination.

Other Developments

  • A group of House and Senate Democrats have reintroduced the “Public Health Emergency Privacy Act,” a bill that follows legislation of the same title introduced last spring to address gaps in United States (U.S.) privacy law turned up by the promise of widespread use of COVID-19 tracking apps. And while adoption and usage of these apps have largely underperformed expectations, the gaps and issues have not. And so Representatives Suzan DelBene (D-WA), Anna Eshoo (D-CA), and Jan Schakowsky (D-IL) and Senators Richard Blumenthal (D-CT) and Mark Warner (D-VA) have introduced the “Public Health Emergency Privacy Act” (S.81) but have not made bill text available, so it is not yet possible to determine how closely it matches last year’s bill, the “Public Health Emergency Privacy Act” (S.3749/H.R.6866) (see here for my analysis of last year’s bill). However, in a sign that the bills may be identical or very close in their wording, the summary provided in May 2020 and the one provided last week are exactly the same:
    • Ensure that data collected for public health is strictly limited for use in public health;
    • Explicitly prohibit the use of health data for discriminatory, unrelated, or intrusive purposes, including commercial advertising, e-commerce, or efforts to gate access to employment, finance, insurance, housing, or education opportunities;
    • Prevent the potential misuse of health data by government agencies with no role in public health;
    • Require meaningful data security and data integrity protections – including data minimization and accuracy – and mandate deletion by tech firms after the public health emergency;
    • Protect voting rights by prohibiting conditioning the right to vote based on a medical condition or use of contact tracing apps;
    • Require regular reports on the impact of digital collection tools on civil rights;
    • Give the public control over their participation in these efforts by mandating meaningful transparency and requiring opt-in consent; and
    • Provide for robust private and public enforcement, with rulemaking from an expert agency while recognizing the continuing role of states in legislation and enforcement.
  • The United States Department of Justice (DOJ) filed charges against a United States (U.S.) national for “conspiring with others in advance of the 2016 U.S. Presidential Election to use various social media platforms to disseminate misinformation designed to deprive individuals of their constitutional right to vote.” While the DOJ goes out of its way in its complaint not to mention which candidate in the presidential election the accused was working to elect, contemporaneous reporting on the individual made clear he supported Donald Trump and sought to depress the vote for former Secretary of State Hillary Clinton. In its press release, the DOJ asserted:
    • The complaint alleges that in 2016, Mackey established an audience on Twitter with approximately 58,000 followers. A February 2016 analysis by the MIT Media Lab ranked Mackey as the 107th most important influencer of the then-upcoming Election, ranking his account above outlets and individuals such as NBC News (#114), Stephen Colbert (#119) and Newt Gingrich (#141).
    • As alleged in the complaint, between September 2016 and November 2016, in the lead up to the Nov. 8, 2016, U.S. Presidential Election, Mackey conspired with others to use social media platforms, including Twitter, to disseminate fraudulent messages designed to encourage supporters of one of the presidential candidates (the “Candidate”) to “vote” via text message or social media, a legally invalid method of voting.
    • For example, on Nov. 1, 2016, Mackey allegedly tweeted an image that featured an African American woman standing in front of an “African Americans for [the Candidate]” sign.  The image included the following text: “Avoid the Line. Vote from Home. Text ‘[Candidate’s first name]’ to 59925[.] Vote for [the Candidate] and be a part of history.”  The fine print at the bottom of the image stated: “Must be 18 or older to vote. One vote per person. Must be a legal citizen of the United States. Voting by text not available in Guam, Puerto Rico, Alaska or Hawaii. Paid for by [Candidate] for President 2016.”
    • The tweet included the typed hashtags “#Go [Candidate]” and another slogan frequently used by the Candidate. On or about and before Election Day 2016, at least 4,900 unique telephone numbers texted “[Candidate’s first name]” or some derivative to the 59925 text number, which was used in multiple deceptive campaign images tweeted by the defendant and his co-conspirators.
  • Six European and two North American nations worked in coordinated fashion to take down a botnet. Europol announced that “[l]aw enforcement and judicial authorities worldwide have this week disrupted one of most significant botnets of the past decade: EMOTET…[and] [i]nvestigators have now taken control of its infrastructure in an international coordinated action” per their press release. Europol added:
    • EMOTET has been one of the most professional and long lasting cybercrime services out there. First discovered as a banking Trojan in 2014, the malware evolved into the go-to solution for cybercriminals over the years. The EMOTET infrastructure essentially acted as a primary door opener for computer systems on a global scale. Once this unauthorised access was established, these were sold to other top-level criminal groups to deploy further illicit activities such as data theft and extortion through ransomware.
  • On 26 January, Senator Ed Markey (D-MA) “asked Facebook why it continues to recommend political groups to users despite committing to stopping the practice” at an October 2020 hearing. Markey pressed CEO Mark Zuckerberg to “explain the apparent discrepancy between its promises to stop recommending political groups and what it has delivered.” Markey added:
    • Unfortunately, it appears that Facebook has failed to keep commitments on this topic that you made to me, other members of Congress, and your users. You and other senior Facebook officials have committed, and reiterated your commitment, to stop your platform’s practice of recommending political groups. First, on October 28, 2020, you appeared before the U.S. Senate Committee on Commerce, Science, and Transportation and stated that Facebook had stopped recommending groups with political content and social issues. When I raised concerns about Facebook’s system of recommending groups, you stated, “Senator, we have taken the step of stopping recommendations in groups for all political content or social issue groups as a precaution for this.”
    • It does not appear, however, that Facebook has kept these commitments. According to The Markup, Facebook “continued to recommend political groups to its users throughout December[of 2020]” — well after you responded to my question at the Commerce Committee hearing.
    • On 27 January, Zuckerberg announced on an earnings call that the platform would stop recommending political and civic groups to users.
  • The United States (U.S.) Department of Transportation’s National Highway Traffic Safety Administration “announced the expansion of the Automated Vehicle Transparency and Engagement for Safe Testing (AV TEST) Initiative from a pilot to a full program” according to a press release. NHTSA announced the “new web pilot of the Department initiative to improve the safety and testing transparency of automated driving systems” in June 2020 that “aligns with the Department’s leadership on automated driving system vehicles, including AV 4.0:  Ensuring American Leadership in Automated Vehicle Technologies.”
  • The United Kingdom’s (UK) House of Lords amended the government’s Trade Bill, which would allow for an agreement with the United States (U.S.), in a way that would block the U.S. position of essentially exporting 47 U.S.C. 230 (Section 230) to the UK. The Lords agreed to this language:
    • (1) The United Kingdom may only become a signatory to an international trade agreement if the conditions in subsection (2) are satisfied.
    • (2) International trade agreements must be consistent with—
      • (a) other international treaties to which the United Kingdom is a party, and the domestic law of England and Wales (including any changes to the law after the trade agreement is signed), regarding the protection of children and other vulnerable user groups using the internet;
      • (b) the provisions on data protection for children, as set out in the age appropriate design code under section 123 of the Data Protection Act 2018 (age-appropriate design code) and other provisions of that Act which impact children; and
      • (c) online protections provided for children in the United Kingdom that the Secretary of State considers necessary.
    • However, the House of Commons disagreed with this change, arguing “it is not an effective means of ensuring the protection of children online.”
    • In a House of Lords briefing document, it is explained:
      • The bill introduces measures to support the UK in implementing an independent trade policy, having left the European Union. It would:
        • enable the UK to implement obligations arising from acceding to the international Agreement on Government Procurement in its own right;
        • enable the UK to implement in domestic law obligations arising under international trade agreements the UK signs with countries that had an existing international trade agreement with the EU;
        • formally establish a new Trade Remedies Authority;
        • enable HM Revenue and Customs (HMRC) to collect information on the number of exporters in the UK; and
        • enable data sharing between HMRC and other private and public sector bodies to fulfil public functions relating to trade.
  • According to their press release, “a coalition of education advocates petitioned the Federal Communications Commission (FCC) to close the remote learning gap for the estimated 15 to 16 million students who lack home internet access” through the E-rate program. This petition follows an Executive Order (EO) signed by President Joe Biden on the first day of his Administration, calling on the FCC to expand broadband connectivity for children across the United States to help them with schooling and studies.
    • In their petition, the groups argued:
      • In one of his first Executive Orders, President Biden stated: “The Federal Communications Commission is encouraged, consistent with applicable law, to increase connectivity options for students lacking reliable home broadband, so that they can continue to learn if their schools are operating remotely.”
      • Consistent with [Biden’s EO], the Commission can dramatically improve circumstances for these underserved students, and for schools all over the country that are struggling to educate all of their students, by taking the temporary, limited measures requested in this Petition.
      • As shown below, these actions are well within the Commission’s authority, and in fact all of the actions requested in this Petition could be taken by the Wireline Competition Bureau on delegated authority.
      • As noted above, the Petitioners ask that the Commission issue a declaratory ruling to clarify that, for the duration of the pandemic, the off-campus use of E-rate-supported services to enable remote learning constitutes an “educational purpose” and is therefore allowed under program rules.
      • The declaratory ruling will allow schools and libraries to extend E-rate-funded broadband networks and services outside of a school or library location during Funding Years 2020 and 2021, without losing E-rate funds they are otherwise eligible to receive. Importantly, this requested action would not require the collection of any additional Universal Service funds.
      • Given the severity of our current national emergency, the Petitioners ask that the Bureau release hundreds of millions of dollars—currently not designated for use but held in the E-rate program—to support remote learning. There is little justification for keeping E-rate funds in reserve when the country is facing such an enormous educational crisis.
      • The Commission should use the program’s existing discount methodologies, which take into account socioeconomic status and rural location, in calculating the amount of funding that applicants may receive.  Applicants will have the incentive to make cost-effective purchases because they will have to pay a share of the total cost of services.  
      • To facilitate the distribution of additional funding, Petitioners ask that the Commission direct the Universal Service Administrative Company (USAC) to establish a “remote learning application window” as soon as practicable for the specific purpose of allowing applicants to submit initial or revised requests for E-rate funding for off-campus services used for educational purposes during Funding Years 2020 and 2021.  
      • The Petitioners ask the Commission to waive all rules necessary to effectuate these actions for remote learning funding applications, including the competitive bidding, eligible services, and application rules, pursuant to section 1.3 of the Commission’s rules.
      • The Petitioners respectfully request expedited review of this petition, so that schools and libraries may take action to deploy solutions as soon as possible.
  • “A group of more than 70 organizations have sent a letter to Congress and the Biden/Harris administration warning against responding to the violence in the U.S. Capitol by renewing injudicious attacks on Section 230 of the Communications Decency Act” per their press release. They further urged “lawmakers to consider impacts on marginalized communities before making changes to Section 230, and call on lawmakers to take meaningful action to hold Big Tech companies accountable, including enforcement of existing anti-trust and civil rights law, and passing Federal data privacy legislation.” The signatories characterized themselves as “racial justice, LGBTQ+, Muslim, prison justice, sex worker, free expression, immigration, HIV advocacy, child protection, gender justice, digital rights, consumer, and global human rights organizations.” In terms of the substance of their argument, they asserted:
    • Gutting Section 230 would make it more difficult for web platforms to combat the type of dangerous rhetoric that led to the attack on the Capitol. And certain carve outs to the law could threaten human rights and silence movements for social and racial justice that are needed now more than ever. 
    • Section 230 is a foundational law for free expression and human rights when it comes to digital speech. It makes it possible for websites and online forums to host the opinions, photos, videos, memes, and creativity of ordinary people, rather than just content that is backed by corporations. 
    • The danger posed by uncareful changes to Section 230 is not theoretical. The last major change to the law, the passage of SESTA/FOSTA in 2018, put lives in danger. The impacts of this law were immediate and destructive, limiting the accounts of sex workers and making it more difficult to find and help those who were being trafficked online. This was widely seen as a disaster that made vulnerable communities less safe and led to widespread removal of speech online.
    • We share lawmakers’ concerns with the growing power of Big Tech companies and their unwillingness to address the harm their products are causing. Google and Facebook are just some of the many companies that compromise the privacy and safety of the public by harvesting our data for their own corporate gain, and allowing advertisers, racists and conspiracy theorists to use that data to target us. These surveillance-based business models are pervasive and an attack on human rights. But claims that Section 230 immunizes tech companies that break the law, or disincentivizes them from removing illegal or policy-violating content, are false. In fact, Amazon has invoked Section 230 to defend itself against a lawsuit over its decision to drop Parler from Amazon Web Services due to unchecked threats of violence on Parler’s platform. Additionally, because Section 230 protects platforms’ decisions to remove objectionable content, the law played a role in enabling the removal of Donald Trump from platforms, who could act without fear of excessive litigation.

Coming Events

  • On 3 February, the Senate Commerce, Science, and Transportation Committee will consider the nomination of Rhode Island Governor Gina Raimondo to be the Secretary of Commerce.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.



Further Reading, Other Developments, and Coming Events (20 and 21 January 2021)

Further Reading

  • Amazon’s Ring Neighbors app exposed users’ precise locations and home addresses” By Zack Whittaker — Tech Crunch. Once again, Amazon’s home security platform suffers problems by way of users’ data being exposed or inadequately protected.
  • Harassment of Chinese dissidents was warning signal on disinformation” By Shawna Chen and Bethany Allen-Ebrahimian — Axios. An example of how malicious online activities can spill into the real world: a number of Chinese dissidents were set upon by protestors.
  • How Social Media’s Obsession with Scale Supercharged Disinformation” By Joan Donovan — Harvard Business Review. Companies like Facebook and Twitter emphasized scale over safety in trying to grow as quickly as possible. This led to a proliferation of fake accounts and provided fertile ground for the seeds of misinformation.
  • The Moderation War Is Coming to Spotify, Substack, and Clubhouse” By Alex Kantrowitz — OneZero. The same issues with objectionable and abusive content plaguing Twitter, Facebook, YouTube and others will almost certainly become an issue for the newer platforms, and in fact already are.
  • Mexican president mounts campaign against social media bans” By Mark Stevenson — The Associated Press. Mexico’s leftist President Andrés Manuel López Obrador is vowing to lead international efforts to stop social media companies from censoring what he considers free speech. Whether this materializes into something substantial is not clear.
  • As Trump Clashes With Big Tech, China’s Censored Internet Takes His Side” By Li Yuan — The New York Times. The government in Beijing is framing the ban of former President Donald Trump after the attempted insurrection by social media platforms as proof there is no untrammeled freedom of speech. This position helps bolster the oppressive policing of online content the People’s Republic of China (PRC) wages against its citizens. And quite separately many Chinese people (or what appear to be actual people) are questioning what is often deemed the censoring of Trump in the United States (U.S.), a nation ostensibly committed to free speech. There is also widespread misunderstanding about the First Amendment rights of social media platforms not to host content with which they disagree and the power of platforms to make such determinations without fear that the U.S. government will punish them as is often the case in the PRC.
  • Trump admin slams China’s Huawei, halting shipments from Intel, others – sources” By Karen Freifeld and Alexandra Alper — Reuters. On its way out of the proverbial door, the Trump Administration delivered parting shots to Huawei and the People’s Republic of China (PRC) by revoking one license and denying others to sell the PRC tech giant semiconductors. The companies, including Intel, could appeal. Additionally, there are an estimated $400 million worth of applications for similar licenses pending at the Department of Commerce that are now the domain of the new regime in Washington. It is too early to discern whether the Biden Administration will maintain, modify, or reverse these actions and broader Trump Administration policy towards the PRC.
  • Behind a Secret Deal Between Google and Facebook” By Daisuke Wakabayashi and Tiffany Hsu — The New York Times. The newspaper got its hands on an unredacted copy of the antitrust suit Texas Attorney General Ken Paxton and other attorneys general filed against Google, and it has details on the deal Facebook and Google allegedly struck to divide the online advertising world. Not only did Facebook ditch an effort launched by publishers to defeat Google’s overwhelming advantages in online advertising bidding, it joined Google’s rival effort with a guarantee that it would win a specified number of bids and more time to bid on ads. Google and Facebook naturally deny any wrongdoing.
  • Biden and Trump Voters Were Exposed to Radically Different Coverage of the Capitol Riot on Facebook” By Colin Lecher and Jon Keegan — The Markup. Using a browser tool the organization pays Facebook users to install, The Markup can track the type of material they see in their feeds. Facebook’s algorithm fed people material about the 6 January 2021 attempted insurrection based on their political views. Many have pointed out that this very dynamic creates filter bubbles that poison democracy and public discourse.
  • Banning Trump won’t fix social media: 10 ideas to rebuild our broken internet – by experts” By Julia Carrie Wong — The Guardian. There are some fascinating proposals in this piece that could help address the problems of social media.
  • Misinformation dropped dramatically the week after Twitter banned Trump and some allies” By Elizabeth Dwoskin and Craig Timberg — The Washington Post. Research showed that lies, misinformation, and disinformation about election fraud dropped by three-quarters after former President Donald Trump was banned from Twitter and other platforms. Other research showed that a small group of conservatives were responsible for up to 20% of misinformation on this and other conspiracies.
  • This Was WhatsApp’s Plan All Along” By Shoshana Wodinsky — Gizmodo. This piece does a great job of breaking down into plain English the proposed changes to terms of service on WhatsApp that so enraged users that competitors Signal and Telegram have seen record-breaking downloads. Basically, it is all about reaping advertising dollars for Facebook through businesses and third-party partners using user data from business-related communications. Incidentally, WhatsApp has delayed changes until March because of the pushback.
  • Brussels eclipsed as EU countries roll out their own tech rules” By Laura Kayali and Mark Scott — Politico EU. The European Union (EU) had a hard enough task in trying to reach final language on a Digital Services Act and Digital Markets Act without nations like France, Germany, Poland, and others picking and choosing text from draft bills and enacting them into law. Brussels is not happy with this trend.

Other Developments

  • Federal Trade Commission (FTC) Chair Joseph J. Simons announced his resignation from the FTC effective on 29 January 2021 in keeping with tradition and past practice. This resignation clears the way for President Joe Biden to name the chair of the FTC, and along with FTC Commissioner Rohit Chopra’s nomination to head the Consumer Financial Protection Bureau (CFPB), the incoming President will get to nominate two Democratic FTC Commissioners, tipping the political balance of the FTC and likely ushering in a period of more regulation of the technology sector.
    • Simons also announced the resignation of senior staff: General Counsel Alden F. Abbott; Bureau of Competition Director Ian Conner; Bureau of Competition Deputy Directors Gail Levine and Daniel Francis; Bureau of Consumer Protection Director Andrew Smith; Bureau of Economics Director Andrew Sweeting; Office of Public Affairs Director Cathy MacFarlane; and Office of Policy Planning Director Bilal Sayyed.
  • In a speech last week before he was sworn in, President Joe Biden announced his $1.9 trillion American Rescue Plan, and according to a summary, Biden will ask Congress to provide $10 billion for a handful of government-facing programs to improve technology. Notably, Biden “is calling on Congress to launch the most ambitious effort ever to modernize and secure federal IT and networks.” Biden is proposing to dramatically increase funding for a fund that would allow agencies to borrow and then pay back funds to update their technology. Moreover, Biden is looking to push more money to a program to aid officials at agencies who oversee technology development and procurement.
    • Biden stated “[t]o remediate the SolarWinds breach and boost U.S. defenses, including of the COVID-19 vaccine process, President-elect Biden is calling on Congress to:
      • Expand and improve the Technology Modernization Fund. ​A $9 billion investment will help the U.S. launch major new IT and cybersecurity shared services at the Cyber Security and Information Security Agency (CISA) and the General Services Administration and complete modernization projects at federal agencies. ​In addition, the president-elect is calling on Congress to change the fund’s reimbursement structure in order to fund more innovative and impactful projects.
      • Surge cybersecurity technology and engineering expert hiring​. Providing the Information Technology Oversight and Reform fund with $200 million will allow for the rapid hiring of hundreds of experts to support the federal Chief Information Security Officer and U.S. Digital Service.
      • Build shared, secure services to drive transformational projects. Investing $300 million in no-year funding for Technology Transformation Services in the General Services Administration will drive secure IT projects forward without the need of reimbursement from agencies.
      • Improving security monitoring and incident response activities. ​An additional $690M for CISA will bolster cybersecurity across federal civilian networks, and support the piloting of new shared security and cloud computing services.
  • The United States (U.S.) Department of Commerce issued an interim final rule pursuant to an executive order (EO) issued by former President Donald Trump to secure the United States (U.S.) information and communications supply chain. This rule will undoubtedly be reviewed by the Biden Administration and may be withdrawn or modified depending on the fate of the EO on which the rule relies.
    • In the interim final rule, Commerce explained:
      • These regulations create the processes and procedures that the Secretary of Commerce will use to identify, assess, and address certain transactions, including classes of transactions, between U.S. persons and foreign persons that involve information and communications technology or services designed, developed, manufactured, or supplied, by persons owned by, controlled by, or subject to the jurisdiction or direction of a foreign adversary; and pose an undue or unacceptable risk. While this interim final rule will become effective on March 22, 2021, the Department of Commerce continues to welcome public input and is thus seeking additional public comment. Once any additional comments have been evaluated, the Department is committed to issuing a final rule.
      • On November 27, 2019, the Department of Commerce (Department) published a proposed rule to implement the terms of the Executive Order. (84 FR 65316). The proposed rule set forth processes for (1) how the Secretary would evaluate and assess transactions involving ICTS to determine whether they pose an undue risk of sabotage to or subversion of the ICTS supply chain, or an unacceptable risk to the national security of the United States or the security and safety of U.S. persons; (2) how the Secretary would notify parties to transactions under review of the Secretary’s decision regarding the ICTS Transaction, including whether the Secretary would prohibit or mitigate the transaction; and (3) how parties to transactions reviewed by the Secretary could comment on the Secretary’s preliminary decisions. The proposed rule also provided that the Secretary could act without complying with the proposed procedures where required by national security. Finally, the Secretary would establish penalties for violations of mitigation agreements, the regulations, or the Executive Order.
      • In addition to seeking general public comment, the Department requested comments from the public on five specific questions: (1) Whether the Secretary should consider categorical exclusions or whether there are classes of persons whose use of ICTS cannot violate the Executive Order; (2) whether there are categories of uses or of risks that are always capable of being reliably and adequately mitigated; (3) how the Secretary should monitor and enforce any mitigation agreements applied to a transaction; (4) how the terms, “transaction,” “dealing in,” and “use of” should be clarified in the rule; and (5) whether the Department should add record-keeping requirements for information related to transactions.
      • The list of “foreign adversaries” consists of the following foreign governments and non-government persons: The People’s Republic of China, including the Hong Kong Special Administrative Region (China); the Republic of Cuba (Cuba); the Islamic Republic of Iran (Iran); the Democratic People’s Republic of Korea (North Korea); the Russian Federation (Russia); and Venezuelan politician Nicolás Maduro (Maduro Regime).
  • The Federal Trade Commission (FTC) adjusted its penalty amounts for inflation, including a boost to the per-violation penalty that virtually all the privacy bills introduced in the last Congress would allow the agency to wield against first-time violators. The penalty for certain unfair and deceptive acts or practices was increased from $43,280 to $43,792.
  • The United States (U.S.) Department of State stood up its new Bureau of Cyberspace Security and Emerging Technologies (CSET) as it has long planned. At the beginning of the Trump Administration, the Department of State dismantled the Cyber Coordinator Office and gave its cybersecurity portfolio to the Bureau of Economic Affairs, which displeased Congressional stakeholders. In 2019, the department notified Congress of its plan to establish CSET. The department asserted:
    • The need to reorganize and resource America’s cyberspace and emerging technology security diplomacy through the creation of CSET is critical, as the challenges to U.S. national security presented by China, Russia, Iran, North Korea, and other cyber and emerging technology competitors and adversaries have only increased since the Department notified Congress in June 2019 of its intent to create CSET.
    • The CSET bureau will lead U.S. government diplomatic efforts on a wide range of international cyberspace security and emerging technology policy issues that affect U.S. foreign policy and national security, including securing cyberspace and critical technologies, reducing the likelihood of cyber conflict, and prevailing in strategic cyber competition.  The Secretary’s decision to establish CSET will permit the Department to posture itself appropriately and engage as effectively as possible with partners and allies on these pressing national security concerns.
    • The Congressional Members of the Cyberspace Solarium Commission made clear their disapproval of the decision. Senators Angus King (I-ME) and Ben Sasse (R-NE) and Representatives Mike Gallagher (R-WI) and Jim Langevin (D-RI) said:
      • In our report, we emphasize the need for a greater emphasis on international cyber policy at State. However, unlike the bipartisan Cyber Diplomacy Act, the State Department’s proposed Bureau will reinforce existing silos and […] hinder the development of a holistic strategy to promote cyberspace stability on the international stage. We urge President-elect Biden to pause this reorganization when he takes office in two weeks and work with Congress to enact meaningful reform to protect our country in cyberspace.
  • The Australian Cyber Security Centre (ACSC) issued the Risk Identification Guidance “developed to assist organisations in identifying risks associated with their use of suppliers, manufacturers, distributors and retailers (i.e. businesses that constitute their cyber supply chain)” and the Risk Management Guidance because “[c]yber supply chain risk management can be achieved by identifying the cyber supply chain, understanding cyber supply chain risk, setting cyber security expectations, auditing for compliance, and monitoring and improving cyber supply chain security practices.”
  • The United Kingdom’s Surveillance Camera Commissioner (SCC) issued “best practice guidance, ‘Facing the Camera’, to all police forces in England and Wales.” The SCC explained that “[t]he provisions of this document only apply to the use of facial recognition technology and the inherent processing of images by the police where such use is integral to a surveillance camera system being operated in ‘live time’ or ‘near real time’ operational scenarios.” Last summer, a British appeals court overturned a decision that had found a police force’s use of facial recognition technology in a pilot program using live footage to be legal. The appeals court found the use of this technology by the South Wales Police Force a violation of “the right to respect for private life under Article 8 of the European Convention on Human Rights, data protection legislation, and the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010.” The SCC stated:
    • The SCC considers surveillance to be an intrusive investigatory power where it is conducted by the police which impacts upon those fundamental rights and freedoms of people, as set out by the European Convention of Human Rights (ECHR) and the Human Rights Act 1998. In the context of surveillance camera systems which make use of facial recognition technology, the extent of state intrusion in such matters is significantly increased by the capabilities of algorithms which are in essence, integral to the surveillance conduct seeking to harvest information, private information, metadata, data, personal data, intelligence and evidence. Each of the aforementioned are bound by laws and rules which ought to be separately and jointly considered and applied in a manner which is demonstrably lawful and ethical and engenders public trust and confidence.
    • Whenever the police seek to use technology in pursuit of a legitimate aim, the key question arises as to whether the degree of intrusion which is caused to the fundamental freedoms of citizens by the police surveillance conduct using surveillance algorithms (biometric or otherwise) is necessary in a democratic society when considered alongside the legality and proportionality of their endeavours and intent. The type of equipment/technology/modality which they choose to use to that end (e.g. LFR, ANPR, thermal imaging, gait analysis, movement sensors etc), the manner in which such technological means are deployed, (such as using static cameras at various locations, used with body worn cameras or other mobile means), and whether such technology is used overtly alongside or networked with other surveillance technologies, are all factors which may significantly influence the depth of intrusion caused by police conduct upon citizen’s rights.
  • The Senate confirmed the nomination of Avril Haines to be the new Director of National Intelligence by an 84-10 vote after Senator Tom Cotton (R-AR) removed his hold on her nomination. However, Senator Josh Hawley (R-MO) placed a hold on the nomination of Alejandro Mayorkas to be the next Secretary of Homeland Security and explained his action this way:
    • On Day 1 of his administration, President-elect Biden has said he plans to unveil an amnesty plan for 11 million immigrants in this nation illegally. This comes at a time when millions of American citizens remain out of work and a new migrant caravan has been attempting to reach the United States. Mr. Mayorkas has not adequately explained how he will enforce federal law and secure the southern border given President-elect Biden’s promise to roll back major enforcement and security measures. Just today, he declined to say he would enforce the laws Congress has already passed to secure the border wall system. Given this, I cannot consent to skip the standard vetting process and fast-track this nomination when so many questions remain unanswered.
  • Former Trump White House Cyber Coordinator Rob Joyce will replace the National Security Agency’s (NSA) Director of Cybersecurity Anne Neuberger, who has been named the Biden White House’s Deputy National Security Advisor for Cyber and Emerging Technology. Neuberger’s portfolio at the NSA included “lead[ing] NSA’s cybersecurity mission, including emerging technology areas like quantum-resistant cryptography,” and presumably Joyce will have the same responsibilities. Joyce was purged when former National Security Advisor John Bolton restructured the National Security Council (NSC) in 2018, forcing out Joyce and his boss, former Homeland Security Advisor Tom Bossert. At the NSC, Neuberger will work to coordinate cybersecurity and emerging technology policy across agencies and funnel policy options up to the full NSC and ultimately the President, work that will include coordination with Joyce.
  • The Supreme Court of the United States (SCOTUS) heard oral arguments on whether the Federal Trade Commission (FTC) Act gives the agency the power to seek monetary damages and restitution alongside permanent injunctions under Section 13(b). In AMG Capital Management, LLC v. FTC, the parties opposing the FTC argue that the plain language of the statute does not allow restitution and monetary damages to be sought under this specific section of the FTC Act, while the agency argues that long-accepted past practice and Congressional intent do, in fact, allow this relief to be sought when the FTC is seeking to punish violators of Section 5. The FTC is working on a separate track to get a fix from Congress, which could rewrite the FTC Act to make clear this sort of relief is legal. However, some stakeholders in the debate over privacy legislation may be using the case as leverage.
    • In October 2020, the FTC wrote the House and Senate committees with jurisdiction over the agency, asking for language to resolve the litigation over the power to seek and obtain restitution for victims of those who have violated Section 5 of the FTC Act and disgorgement of ill-gotten gains. The FTC is also asking that Congress clarify that the agency may act against violators even if their conduct has stopped, as it has done for more than four decades. Two federal appeals courts have ruled in ways that have limited the FTC’s long-used powers, and now the Supreme Court of the United States is set to rule on these issues later this year. The FTC is claiming, however, that defendants are playing for time in the hopes that the FTC’s authority to seek and receive monetary penalties will ultimately be limited by the United States’ highest court. Judging by language tucked into a privacy bill introduced by the former chair of one of the committees, Congress may be willing to act soon.
    • The FTC asked the House Energy and Commerce and Senate Commerce, Science, and Transportation Committees “to take quick action to amend Section 13(b) [of the FTC Act i.e. 15 U.S.C. § 53(b)] to make clear that the Commission can bring actions in federal court under Section 13(b) even if conduct is no longer ongoing or impending when the suit is filed and can obtain monetary relief, including restitution and disgorgement, if successful.” The agency asserted “[w]ithout congressional action, the Commission’s ability to use Section 13(b) to provide refunds to consumer victims and to enjoin illegal activity is severely threatened.” All five FTC Commissioners signed the letter.
    • The FTC explained that adverse rulings by two federal appeals courts are constraining the agency from seeking relief for victims and punishment for violators of the FTC Act in federal courts below those two specific courts, but elsewhere defendants are either asking courts for a similar ruling or using delaying tactics in the hopes the Supreme Court upholds the two federal appeals courts:
      • …[C]ourts of appeals in the Third and Seventh Circuits have recently ruled that the agency cannot obtain any monetary relief under Section 13(b). Although review in the Supreme Court is pending, these lower court decisions are already inhibiting our ability to obtain monetary relief under 13(b). Not only do these decisions already prevent us from obtaining redress for consumers in the circuits where they issued, prospective defendants are routinely invoking them in refusing to settle cases with agreed-upon redress payments.
      • Moreover, defendants in our law enforcement actions pending in other circuits are seeking to expand the rulings to those circuits and taking steps to delay litigation in anticipation of a potential Supreme Court ruling that would allow them to escape liability for any monetary relief caused by their unlawful conduct. This is a significant impediment to the agency’s effectiveness, its ability to provide redress to consumer victims, and its ability to prevent entities who violate the law from profiting from their wrongdoing.
  • The United Kingdom’s Information Commissioner’s Office (ICO) issued guidance for British entities that may be affected by the massive SolarWinds hack that has compromised many key systems in the United States. The ICO advised:
    • Organisations should immediately check whether they are using a version of the software that has been compromised. These are versions 2019.4 HF 5, 2020.2 with no hotfix installed, and 2020.2 HF 1.
    • Organisations must also determine if the personal data they hold has been affected by the cyber-attack. If a reportable personal data breach is found, UK data controllers are required to inform the ICO within 72 hours of discovering the breach. Reports can be submitted online or organisations can call the ICO’s personal data breach helpline for advice on 0303 123 1113, option 2.
    • Organisations subject to the NIS Regulation will also need to determine if this incident has led to a ‘substantial impact on the provision’ of their digital services and report to the ICO.
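The ICO’s first recommended step can be sketched as a simple version check. The snippet below is a minimal illustration, assuming the affected releases are exactly the three the ICO names (2019.4 HF 5, 2020.2 with no hotfix, and 2020.2 HF 1); the version-string formats and the function name are illustrative assumptions, not vendor or ICO tooling.

```python
# Releases the ICO identifies as compromised, as literal version strings.
COMPROMISED_VERSIONS = {
    "2019.4 HF 5",
    "2020.2",        # 2020.2 with no hotfix installed
    "2020.2 HF 1",
}

def is_compromised(installed_version: str) -> bool:
    """Return True if the installed version matches a compromised release."""
    return installed_version.strip() in COMPROMISED_VERSIONS

if __name__ == "__main__":
    for v in ("2020.2 HF 1", "2020.2 HF 2"):
        print(v, "->", is_compromised(v))
```

An organisation finding a match would then move to the ICO’s second step: assessing whether personal data was affected and, if a reportable breach is found, notifying the ICO within 72 hours.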
  • Europol announced the takedown of “the world’s largest illegal marketplace on the dark web” in an operation coordinated by the following nations: “Germany, Australia, Denmark, Moldova, Ukraine, the United Kingdom (the National Crime Agency), and the USA (DEA, FBI, and IRS).” Europol added:
    • The Central Criminal Investigation Department in the German city of Oldenburg arrested an Australian citizen who is the alleged operator of DarkMarket near the German-Danish border over the weekend. The investigation, which was led by the cybercrime unit of the Koblenz Public Prosecutor’s Office, allowed officers to locate and close the marketplace, switch off the servers and seize the criminal infrastructure – more than 20 servers in Moldova and Ukraine supported by the German Federal Criminal Police office (BKA). The stored data will give investigators new leads to further investigate moderators, sellers, and buyers. 
  • The Enforcement Bureau (Bureau) of the Federal Communications Commission (FCC) issued an enforcement advisory intended to remind people that using amateur and personal radios to commit crimes is itself a criminal offense that could warrant prosecution. The advisory was issued because the FCC says it is aware of discussions about how these means of communication may be superior to social media platforms, which have been cracking down on extremist material since the attempted insurrection at the United States Capitol on 6 January. The Bureau stated:
    • The Bureau has become aware of discussions on social media platforms suggesting that certain radio services regulated by the Commission may be an alternative to social media platforms for groups to communicate and coordinate future activities.  The Bureau recognizes that these services can be used for a wide range of permitted purposes, including speech that is protected under the First Amendment of the U.S. Constitution.  Amateur and Personal Radio Services, however, may not be used to commit or facilitate crimes. 
    • Specifically, the Bureau reminds amateur licensees that they are prohibited from transmitting “communications intended to facilitate a criminal act” or “messages encoded for the purpose of obscuring their meaning.” Likewise, individuals operating radios in the Personal Radio Services, a category that includes Citizens Band radios, Family Radio Service walkie-talkies, and General Mobile Radio Service, are prohibited from using those radios “in connection with any activity which is against Federal, State or local law.” Individuals using radios in the Amateur or Personal Radio Services in this manner may be subject to severe penalties, including significant fines, seizure of the offending equipment, and, in some cases, criminal prosecution.
  • The European Data Protection Board (EDPB) issued its “Strategy for 2021-2023” in order “[t]o be effective in confronting the main challenges ahead.” The EDPB cautioned:
    • This Strategy does not provide an exhaustive overview of the work of the EDPB in the years to come. Rather it sets out the four main pillars of our strategic objectives, as well as set of key actions to help achieve those objectives. The EDPB will implement this Strategy within its Work Program, and will report on the progress achieved in relation to each Pillar as part of its annual reports.
    • The EDPB listed and explained the four pillars of its strategy:
      • PILLAR 1: ADVANCING HARMONISATION AND FACILITATING COMPLIANCE. The EDPB will continue to strive for a maximum degree of consistency in the application of data protection rules and limit fragmentation among Member States. In addition to providing practical, easily understandable and accessible guidance, the EDPB will develop and promote tools that help to implement data protection into practice, taking into account practical experiences of different stakeholders on the ground.
      • PILLAR 2: SUPPORTING EFFECTIVE ENFORCEMENT AND EFFICIENT COOPERATION BETWEEN NATIONAL SUPERVISORY AUTHORITIES. The EDPB is fully committed to support cooperation between all national supervisory authorities that work together to enforce European data protection law. We will streamline internal processes, combine expertise and promote enhanced coordination. We intend not only to ensure a more efficient functioning of the cooperation and consistency mechanisms, but also to strive for the development of a genuine EU-wide enforcement culture among supervisory authorities.
      • PILLAR 3: A FUNDAMENTAL RIGHTS APPROACH TO NEW TECHNOLOGIES. The protection of personal data helps to ensure that technology, new business models and society develop in accordance with our values, such as human dignity, autonomy and liberty. The EDPB will continuously monitor new and emerging technologies and their potential impact on the fundamental rights and daily lives of individuals. Data protection should work for all people, particularly in the face of processing activities presenting the greatest risks to individuals’ rights and freedoms (e.g. to prevent discrimination). We will help to shape Europe’s digital future in line with our common values and rules. We will continue to work with other regulators and policymakers to promote regulatory coherence and enhanced protection for individuals.
      • PILLAR 4: THE GLOBAL DIMENSION. The EDPB is determined to set and promote high EU and global standards for international data transfers to third countries in the private and the public sector, including in the law enforcement sector. We will reinforce our engagement with the international community to promote EU data protection as a global model and to ensure effective protection of personal data beyond EU borders.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) revealed that all but one of the videoconferencing platforms have responded to the July 2020 letter it and other data protection authorities (DPA) sent urging them to “adopt principles to guide them in addressing some key privacy risks.” The ICO explained:
    • Microsoft, Cisco, Zoom and Google replied to the open letter. The joint signatories thank these companies for engaging on this important matter and for acknowledging and responding to the concerns raised. In their responses the companies highlighted various privacy and security best practices, measures, and tools that they advise are implemented or built-in to their video teleconferencing services.
    • The information provided by these companies is encouraging. It is a constructive foundation for further discussion on elements of the responses that the joint signatories feel would benefit from more clarity and additional supporting information.
    • The ICO stated:
      • The joint signatories have not received a response to the open letter from Houseparty. They strongly encourage Houseparty to engage with them and respond to the open letter to address the concerns raised.
  • The European Union Agency for Cybersecurity (ENISA) “launched a public consultation, which runs until 7 February 2021, on its draft of the candidate European Union Cybersecurity Certification Scheme on Cloud Services (EUCS)…[that] aims to further improve the Union’s internal market conditions for cloud services by enhancing and streamlining the services’ cybersecurity guarantees.” ENISA stated:
    • There are challenges to the certification of cloud services, such as a diverse set of market players, complex systems and a constantly evolving landscape of cloud services, as well as the existence of different schemes in Member States. The draft EUCS candidate scheme tackles these challenges by calling for cybersecurity best practices across three levels of assurance and by allowing for a transition from current national schemes in the EU. The draft EUCS candidate scheme is a horizontal and technological scheme that intends to provide cybersecurity assurance throughout the cloud supply chain, and form a sound basis for sectoral schemes.
    • More specifically, the draft EUCS candidate scheme:
      • Is a voluntary scheme;
      • Provides certificates that will be applicable across the EU Member States;
      • Is applicable for all kinds of cloud services – from infrastructure to applications;
      • Boosts trust in cloud services by defining a reference set of security requirements;
      • Covers three assurance levels: ‘Basic’, ‘Substantial’ and ‘High’;
      • Proposes a new approach inspired by existing national schemes and international standards;
      • Defines a transition path from national schemes in the EU;
      • Grants a three-year certification that can be renewed;
      • Includes transparency requirements such as the location of data processing and storage.

Coming Events

  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on the nomination of Gina Raimondo to be the Secretary of Commerce on 26 January.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Peggy und Marco Lachmann-Anke from Pixabay

Further Reading, Other Developments, and Coming Events (19 January 2021)

Further Reading

  • Hong Kong telecoms provider blocks website for first time, citing security law” — Reuters; “A Hong Kong Website Gets Blocked, Raising Censorship Fears” By Paul Mozur and Aaron Krolik — The New York Times. The Hong Kong Broadband Network (HKBN) blocked access to a website about the 2019 protests against the People’s Republic of China (PRC) (called HKChronicles) under a recently enacted security law critics had warned would lead to exactly this sort of outcome. Allegedly, the Hong Kong police had invoked the National Security Law for the first time, and other telecommunications companies have followed suit.
  • Biden to counter China tech by urging investment in US: adviser” By Yifan Yu — Nikkei Asia. President-elect Joe Biden’s head of the National Economic Council said at a public event that the Biden Administration would focus less on tariffs and other similar instruments to counter the People’s Republic of China (PRC). Instead, the incoming President would try to foster investment in United States companies and technologies to fend off the PRC’s growing strength in a number of crucial fields. Also, a Biden Administration would work more with traditional U.S. allies to contest policies from Beijing.
  • Revealed: walkie-talkie app Zello hosted far-right groups who stormed Capitol” By Micah Loewinger and Hampton Stall — The Guardian. Some of the rioters and insurrectionists who attacked the United States Capitol on 6 January were using another, lesser known communications app, Zello, to coordinate their actions. The app has since taken down a number of right-wing and extremist groups that had flourished for months if not years on the platform. It remains to be seen how smaller platforms will be scrutinized under a Biden Presidency. Zello had reportedly been aware that these groups were using its platform and opted not to police their conduct.
  • They Used to Post Selfies. Now They’re Trying to Reverse the Election.” By Stuart A. Thompson and Charlie Warzel — The New York Times. A fascinating series of profiles of three people who amassed considerable extremist followings, each of whom seems to be part believer and part opportunist.
  • Telegram tries, and fails, to remove extremist content” By Mark Scott — Politico. Platforms other than Facebook and Twitter are struggling to moderate right-wing and extremist content that violates their policies and terms of service.

Other Developments

  • The Biden-Harris transition team announced that a statutorily established science advisor will now be a member of the Cabinet and named its nominees for this and other positions. The Office of Science and Technology Policy (OSTP) was created by executive order in the Ford Administration and then codified by Congress. However, the OSTP Director has not been a member of the Cabinet alongside the Senate-confirmed Secretaries and others. President-elect Joe Biden has decided to elevate the OSTP Director to the Cabinet, likely to signal the importance of science and technology in his Administration. The current OSTP, under Associate Director Michael Kratsios, has exercised unusual influence in the Trump Administration and shaped policy in realms such as artificial intelligence and national security.
    • In the press release, the transition team explained:
      • Dr. Eric Lander will be nominated as Director of the OSTP and serve as the Presidential Science Advisor. The president-elect is elevating the role of science within the White House, including by designating the Presidential Science Advisor as a member of the Cabinet for the first time in history. One of the country’s leading scientists, Dr. Lander was a principal leader of the Human Genome Project and has been a pioneer in the field of genomic medicine. He is the founding director of the Broad Institute of MIT and Harvard, one of the nation’s leading research institutes. During the Obama-Biden administration, he served as external Co-Chair of the President’s Council of Advisors on Science and Technology. Dr. Lander will be the first life scientist to serve as Presidential Science Advisor.
      • Dr. Alondra Nelson will serve as OSTP Deputy Director for Science and Society. A distinguished scholar of science, technology, social inequality, and race, Dr. Nelson is president of the Social Science Research Council, an independent, nonprofit organization linking social science research to practice and policy. She is also a professor at the Institute for Advanced Study, one of the nation’s most distinguished research institutes, located in Princeton, NJ.
      • Dr. Frances H. Arnold and Dr. Maria Zuber will serve as the external Co-Chairs of the President’s Council of Advisors on Science and Technology (PCAST). An expert in protein engineering, Dr. Arnold is the first American woman to win the Nobel Prize in Chemistry. Dr. Zuber, an expert in geophysics and planetary science, is the first woman to lead a NASA spacecraft mission and has chaired the National Science Board. They are the first women to serve as co-chairs of PCAST.
      • Dr. Francis Collins will continue serving in his role as Director of the National Institutes of Health.
      • Kei Koizumi will serve as OSTP Chief of Staff and is one of the nation’s leading experts on the federal science budget.
      • Narda Jones, who will serve as OSTP Legislative Affairs Director, was Senior Technology Policy Advisor and Counsel for the Democratic staff of the U.S. Senate Committee on Commerce, Science and Transportation.
  • The United States (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) released a report on supply chain security produced by a public-private advisory body, one line of effort in the U.S. government’s push to better secure technology and electronics that emanate from the People’s Republic of China (PRC). CISA’s National Risk Management Center co-chairs the Information and Communications Technology (ICT) Supply Chain Risk Management (SCRM) Task Force along with the Information Technology Sector Coordinating Council and the Communications Sector Coordinating Council. The ICT SCRM Task Force published its Year 2 Report that “builds upon” its Interim Report and asserted:
    • Over the past year, the Task Force has expanded upon its first-year progress to advance meaningful partnership around supply chain risk management. Specifically, the Task Force:
      • Developed reference material to support overcoming legal obstacles to information sharing
      • Updated the Threat Evaluation Report, which evaluates threats to suppliers, with additional scenarios and mitigation measures for the corresponding threat scenarios
      • Produced a report and case studies providing in-depth descriptions of control categories and information regarding when and how to use a Qualified List to manage supply chain risks
      • Developed a template for SCRM compliance assessments and internal evaluations of alignment to industry standards
      • Analyzed the current and potential impacts from the COVID-19 pandemic, and developed a system map to visualize ICT supply chain routes and identify chokepoints
      • Surveyed supply chain-related programs and initiatives that provide opportunities for potential Task Force engagement
    • Congress established an entity to address and help police supply chain risk at the end of 2018 in the “Strengthening and Enhancing Cyber-capabilities by Utilizing Risk Exposure Technology Act” (SECURE Act) (P.L. 115-390). The Federal Acquisition Security Council (FASC) has a number of responsibilities, including:
      • developing an information sharing process for agencies to circulate decisions throughout the federal government made to exclude entities determined to be IT supply chain risks
      • establishing a process by which entities determined to be IT supply chain risks may be excluded from procurement government-wide (exclusion orders) or suspect IT must be removed from government systems (removal orders)
      • creating an exception process under which IT from an entity subject to a removal or exclusion order may be used if warranted by national interest or national security
      • issuing recommendations for agencies on excluding entities and IT from the IT supply chain and “consent for a contractor to subcontract” and mitigation steps entities would need to take in order for the Council to rescind a removal or exclusion order
      • In September 2020, the FASC released an interim regulation that took effect upon being published that “implement[s] the requirements of the laws that govern the operation of the FASC, the sharing of supply chain risk information, and the exercise of its authorities to recommend issuance of removal and exclusion orders to address supply chain security risks…”
  • The Australian government has released its bill to remake how platforms like Facebook, Google, and others may use the content of news media, including provision for payment. The “Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020” “establishes a mandatory code of conduct to help support the sustainability of the Australian news media sector by addressing bargaining power imbalances between digital platforms and Australian news businesses.” The agency charged with developing the legislation, the Australian Competition and Consumer Commission (ACCC), has tussled with Google in particular over what this law would look like, with the technology giant threatening to withdraw from Australia altogether. The ACCC had determined in its July 2019 Digital Platform Inquiry:
    • that there is a bargaining power imbalance between digital platforms and news media businesses so that news media businesses are not able to negotiate for a share of the revenue generated by the digital platforms and to which the news content created by the news media businesses contributes. Government intervention is necessary because of the public benefit provided by the production and dissemination of news, and the importance of a strong independent media in a well-functioning democracy.
    • In an Explanatory Memorandum, it is explained:
      • The Bill establishes a mandatory code of conduct to address bargaining power imbalances between digital platform services and Australian news businesses…by setting out six main elements:
        • bargaining – which require the responsible digital platform corporations and registered news business corporations that have indicated an intention to bargain, to do so in good faith;
        • compulsory arbitration – where parties cannot come to a negotiated agreement about remuneration relating to the making available of covered news content on designated digital platform services, an arbitral panel will select between two final offers made by the bargaining parties;
        • general requirements – which, among other things, require responsible digital platform corporations to provide registered news business corporations with advance notification of planned changes to an algorithm or internal practice that will have a significant effect on covered news content;
        • non-differentiation requirements – responsible digital platform corporations must not differentiate between the news businesses participating in the Code, or between participants and non-participants, because of matters that arise in relation to their participation or non-participation in the Code;
        • contracting out – the Bill recognises that a digital platform corporation may reach a commercial bargain with a news business outside the Code about remuneration or other matters. It provides that parties who notify the ACCC of such agreements would not need to comply with the general requirements, bargaining and compulsory arbitration rules (as set out in the agreement); and
        • standard offers – digital platform corporations may make standard offers to news businesses, which are intended to reduce the time and cost associated with negotiations, particularly for smaller news businesses. If the parties notify the ACCC of an agreed standard offer, those parties do not need to comply with bargaining and compulsory arbitration (as set out in the agreement);
  • The Federal Trade Commission (FTC) has reached a settlement with a mobile advertising company over “allegations that it failed to provide in-game rewards users were promised for completing advertising offers.” The FTC unanimously agreed to the proposed settlement with Tapjoy, Inc., under which the company is barred from misleading users about the rewards they can earn and must monitor its third-party advertiser partners to ensure they do what is necessary to enable Tapjoy to deliver promised rewards to consumers. The FTC drafted a 20-year settlement that will obligate Tapjoy, Inc. to refrain from certain practices that violate the FTC Act; in this case, that includes not making false claims about the rewards people can get if they take or do not take some action in an online game. Tapjoy, Inc. will also need to submit compliance reports, keep records, and make materials available to the FTC upon demand. Any failure to meet the terms of the settlement could prompt the FTC to seek redress in federal court, including more than $43,000 per violation.
    • In the complaint, the FTC outlined Tapjoy, Inc.’s illegal conduct:
      • Tapjoy operates an advertising platform within mobile gaming applications (“apps”). On the platform, Tapjoy promotes offers of in-app rewards (e.g., virtual currency) to consumers who complete an action, such as taking a survey or otherwise engaging with third-party advertising. Often, these consumers must divulge personal information or spend money. In many instances, Tapjoy never issues the promised reward to consumers who complete an action as instructed, or only issues the currency after a substantial delay. Consumers who attempt to contact Tapjoy to complain about missing rewards find it difficult to do so, and many consumers who complete an action as instructed and are able to submit a complaint nevertheless do not receive the promised reward.  Tapjoy has received hundreds of thousands of complaints concerning its failure to issue promised rewards to consumers. Tapjoy nevertheless has withheld rewards from consumers who have completed all required actions.
    • In its press release, the FTC highlighted the salient terms of the settlement:
      • As part of the proposed settlement, Tapjoy is prohibited from misrepresenting the rewards it offers consumers and the terms under which they are offered. In addition, the company must clearly and conspicuously display the terms under which consumers can receive such rewards and must specify that the third-party advertisers it works with determine if a reward should be issued. Tapjoy also will be required to monitor its advertisers to ensure they are following through on promised rewards, investigate complaints from consumers who say they did not receive their rewards, and discipline advertisers who deceive consumers.
    • FTC Commissioners Rohit Chopra and Rebecca Kelly Slaughter issued a joint statement, and in their summary section, they asserted:
      • The explosive growth of mobile gaming has led to mounting concerns about harmful practices, including unlawful surveillance, dark patterns, and facilitation of fraud.
      • Tapjoy’s failure to properly police its mobile gaming advertising platform cheated developers and gamers out of promised compensation and rewards.
      • The Commission must closely scrutinize today’s gaming gatekeepers, including app stores and advertising middlemen, to prevent harm to developers and gamers.
    • On the last point, Chopra and Slaughter argued:
      • We should all be concerned that gatekeepers can harm developers and squelch innovation. The clearest example is rent extraction: Apple and Google charge mobile app developers on their platforms up to 30 percent of sales, and even bar developers from trying to avoid this tax through offering alternative payment systems. While larger gaming companies are pursuing legal action against these practices, developers and small businesses risk severe retaliation for speaking up, including outright suspension from app stores – an effective death sentence.
      • This market structure also has cascading effects on gamers and consumers. Under heavy taxation by Apple and Google, developers have been forced to adopt alternative monetization models that rely on surveillance, manipulation, and other harmful practices.
  • The United Kingdom’s (UK) High Court ruled against the use of general warrants for online surveillance by the UK’s security agencies (MI5, MI6, and the Government Communications Headquarters (GCHQ)). Privacy International (PI), a British advocacy organization, had brought the suit after Edward Snowden revealed the scope of the United States National Security Agency’s (NSA) surveillance activities, including bulk collection of information, a significant portion of which required hacking. PI sued in a special tribunal formed to resolve claims against British security agencies, where the government asserted general warrants would suffice for purposes of mass hacking. PI disagreed, arguing this ran counter to 250 years of established UK law requiring warrants to be based on reasonable suspicion, specific in what is being sought, and proportionate. The High Court agreed with PI.
    • In its statement after the ruling, PI asserted:
      • Because general warrants are by definition not targeted (and could therefore apply to hundreds, thousands or even millions of people) they violate individuals’ right not to have their property searched without lawful authority, and are therefore illegal.
      • The adaptation of these 250-year-old principles to modern government hacking and property interference is of great significance. The Court signals that fundamental constitutional principles still need to be applied in the context of surveillance and that the government cannot circumvent traditional protections afforded by the common law.
  • In Indiana, the attorney general is calling on the governor to “to adopt a safe harbor rule I proposed that would incentivize companies to take strong data protection measures, which will reduce the scale and frequency of cyberattacks in Indiana.” Attorney General Curtis Hill urged Governor Eric J. Holcomb to allow a change in the state’s data security regulations to be made effective.
    • The proposed rule provides:
      • Procedures adopted under IC 24-4.9-3-3.5(c) are presumed reasonable if the procedures comply with this section, including one (1) of the following applicable standards:
        • (1) A covered entity implements and maintains a cybersecurity program that complies with the National Institute of Standards and Technology (NIST) cybersecurity framework and follows the most recent version of one (1) of the following standards:
          • (A) NIST Special Publication 800-171.
          • (B) NIST SP 800-53.
          • (C) The Federal Risk and Authorization Management Program (FedRAMP) security assessment framework.
          • (D) International Organization for Standardization/International Electrotechnical Commission 27000 family – information security management systems.
        • (2) A covered entity is regulated by the federal or state government and complies with one (1) of the following standards as it applies to the covered entity:
          • (A) The federal USA Patriot Act (P.L. 107-56).
          • (B) Executive Order 13224.
          • (C) The federal Driver’s Privacy Protection Act (18 U.S.C. 2721 et seq.).
          • (D) The federal Fair Credit Reporting Act (15 U.S.C. 1681 et seq.).
          • (E) The federal Health Insurance Portability and Accountability Act (HIPAA) (P.L. 104-191).
        • (3) A covered entity complies with the current version of the payment card industry data security standard in place at the time of the breach of security of data, as published by the Payment Card Industry Security Standard Council.
      • The regulations further provide that if a database owner can show “its data security plan was reasonably designed, implemented, and executed to prevent the breach of security of data” then it “will not be subject to a civil action from the office of the attorney general arising from the breach of security of data.”
  • The Tech Transparency Project (TTP) is claiming that Apple “has removed apps in China at the government’s request” the majority of which “involve activities like illegal gambling and porn.” However, TTP is asserting that its analysis “suggests Apple is proactively blocking scores of other apps that are politically sensitive for Beijing.”

Coming Events

  • On 19 January, the Senate Intelligence Committee will hold a hearing on the nomination of Avril Haines to be the Director of National Intelligence.
  • The Senate Homeland Security and Governmental Affairs Committee will hold a hearing on the nomination of Alejandro N. Mayorkas to be Secretary of Homeland Security on 19 January.
  • On 19 January, the Senate Armed Services Committee will hold a hearing on the nomination of former General Lloyd Austin III to be Secretary of Defense.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.


ePrivacy Exception Proposed

The European Union has proposed, but not yet enacted, a broad exception to its privacy regulations.

My apologies. The first version of this post erroneously asserted the derogation to the ePrivacy Directive had been enacted. It has not, and this post has been re-titled and updated to reflect this fact.

As the European Union (EU) continues to work on enacting a modernized ePrivacy Directive (Directive 2002/58/EC) to complement the General Data Protection Regulation (GDPR), it proposed an exemption to manage a change in another EU law to sweep “number-independent interpersonal communications services” into the current regulatory structure of electronics communication. The policy justification for allowing a categorical exemption to the ePrivacy Directive is for combatting child sexual abuse online. This derogation of EU law is limited to at most five years and quite possibly less time if the EU can enact a successor to the ePrivacy Directive, an ePrivacy Regulation. However, it is unclear when this derogation will be agreed upon and enacted.

In September 2020, the European Commission (EC) issued “a Proposal for a Regulation on a temporary derogation from certain provisions of the ePrivacy Directive 2002/58/EC as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online.” The European Electronic Communications Code (EECC) entered into application on 21 December 2020, bringing these services within the scope of the ePrivacy Directive. The EC has also issued a draft compromise ePrivacy Regulation, the result of extensive negotiations. The GDPR was enacted with an update of the ePrivacy Directive in mind.

In early December, an EU Parliament committee approved the proposed derogation, but the full Parliament has not yet acted upon the measure. The Parliament needs to reach agreement with the Presidency of the Council and the European Commission. In its press release, the Civil Liberties, Justice and Home Affairs Committee explained:

The proposed regulation will provide for limited and temporary changes to the rules governing the privacy of electronic communications so that over the top (“OTT”) communication interpersonal services, such as web messaging, voice over Internet Protocol (VoIP), chat and web-based email services, can continue to detect, report and remove child sexual abuse online on a voluntary basis.

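The voluntary detection the committee describes is commonly implemented by comparing hashes of uploaded content against lists of known abusive material. The following Python sketch illustrates the general idea using exact cryptographic hashes for simplicity; real deployments use robust perceptual hashes (such as Microsoft's PhotoDNA) that tolerate re-encoding and resizing, and the hash lists are maintained by child-protection organisations. All names and data below are hypothetical.

```python
import hashlib


def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of a message attachment."""
    return hashlib.sha256(data).hexdigest()


def screen_attachment(data: bytes, known_hashes: set[str]) -> bool:
    """Return True if the attachment's digest appears on a known-content hash list.

    Note: an exact hash misses any modified copy of an image, which is why
    production systems rely on perceptual hashing instead.
    """
    return sha256_digest(data) in known_hashes


# Hypothetical hash list and attachments, purely for illustration.
known = {sha256_digest(b"known-prohibited-image-bytes")}
print(screen_attachment(b"known-prohibited-image-bytes", known))   # flagged
print(screen_attachment(b"ordinary-holiday-photo-bytes", known))   # passes
```

The design choice at issue in the derogation debate is that this screening requires the provider to process the content of communications, which is exactly what Articles 5(1) and 6 of the ePrivacy Directive otherwise protect.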
Article 1 sets out the scope and aim of the temporary regulation:

This Regulation lays down temporary and strictly limited rules derogating from certain obligations laid down in Directive 2002/58/EC, with the sole objective of enabling providers of number-independent interpersonal communications services to continue the use of technologies for the processing of personal and other data to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services.

The EC explained the legal and policy background for the exemption to the ePrivacy Directive:

  • On 21 December 2020, with the entry into application of the European Electronic Communications Code (EECC), the definition of electronic communications services will be replaced by a new definition, which includes number-independent interpersonal communications services. From that date on, these services will, therefore, be covered by the ePrivacy Directive, which relies on the definition of the EECC. This change concerns communications services like webmail messaging services and internet telephony.
  • Certain providers of number-independent interpersonal communications services are already using specific technologies to detect child sexual abuse on their services and report it to law enforcement authorities and to organisations acting in the public interest against child sexual abuse, and/or to remove child sexual abuse material. These organisations refer to national hotlines for reporting child sexual abuse material, as well as organisations whose purpose is to reduce child sexual exploitation, and prevent child victimisation, located both within the EU and in third countries.
  • Child sexual abuse is a particularly serious crime that has wide-ranging and serious life-long consequences for victims. In hurting children, these crimes also cause significant and long- term social harm. The fight against child sexual abuse is a priority for the EU. On 24 July 2020, the European Commission adopted an EU strategy for a more effective fight against child sexual abuse, which aims to provide an effective response, at EU level, to the crime of child sexual abuse. The Commission announced that it will propose the necessary legislation to tackle child sexual abuse online effectively including by requiring relevant online services providers to detect known child sexual abuse material and oblige them to report that material to public authorities by the second quarter of 2021. The announced legislation will be intended to replace this Regulation, by putting in place mandatory measures to detect and report child sexual abuse, in order to bring more clarity and certainty to the work of both law enforcement and relevant actors in the private sector to tackle online abuse, while ensuring respect of the fundamental rights of the users, including in particular the right to freedom of expression and opinion, protection of personal data and privacy, and providing for mechanisms to ensure accountability and transparency.

The EC baldly asserts the problem of child online sexual abuse justifies a loophole to the broad prohibition on violating the privacy of EU persons. The EC did note that the fight against this sort of crime is a political priority for the EC, one that ostensibly puts the EU close to the views of the Five Eyes nations that have been pressuring technology companies to end the practice of making apps and hardware encrypted by default.

The EC explained:

The present proposal therefore presents a narrow and targeted legislative interim solution with the sole objective of creating a temporary and strictly limited derogation from the applicability of Articles 5(1) and 6 of the ePrivacy Directive, which protect the confidentiality of communications and traffic data. This proposal respects the fundamental rights, including the rights to privacy and protection of personal data, while enabling providers of number-independent interpersonal communications services to continue using specific technologies and continue their current activities to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services, pending the adoption of the announced long- term legislation. Voluntary efforts to detect solicitation of children for sexual purposes (“grooming”) also must be limited to the use of existing, state-of-the-art technology that corresponds to the safeguards set out. This Regulation should cease to apply in December 2025.

The EC added “[i]n case the announced long-term legislation is adopted and enters into force prior to this date, that legislation should repeal the present Regulation.”

In November, the European Data Protection Supervisor (EDPS) Wojciech Wiewiorówski published his opinion on the temporary, limited derogation from the EU’s regulation of electronic communications and privacy. Wiewiorówski cautioned that a short-term exception, however well-intended, would lead to future loopholes that would ultimately undermine the purpose of the legislation. Moreover, Wiewiorówski found that the derogation lacked sufficiently specific guidance and safeguards and was not proportionate. Wiewiorówski argued:

  • In particular, he notes that the measures envisaged by the Proposal would constitute an interference with the fundamental rights to respect for private life and data protection of all users of very popular electronic communications services, such as instant messaging platforms and applications. Confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life. Even voluntary measures by private companies constitute an interference with these rights when the measures involve the monitoring and analysis of the content of communications and processing of personal data.
  • The EDPS wishes to underline that the issues at stake are not specific to the fight against child abuse but to any initiative aiming at collaboration of the private sector for law enforcement purposes. If adopted, the Proposal will inevitably serve as a precedent for future legislation in this field. The EDPS therefore considers it essential that the Proposal is not adopted, even in the form of a temporary derogation, until all the necessary safeguards set out in this Opinion are integrated.
  • In particular, in the interest of legal certainty, the EDPS considers that it is necessary to clarify whether the Proposal itself is intended to provide a legal basis for the processing within the meaning of the GDPR, or not. If not, the EDPS recommends clarifying explicitly in the Proposal which legal basis under the GDPR would be applicable in this particular case.
  • In this regard, the EDPS stresses that guidance by data protection authorities cannot substitute compliance with the requirement of legality. It is insufficient to provide that the temporary derogation is “without prejudice” to the GDPR and to mandate prior consultation of data protection authorities. The co-legislature must take its responsibility and ensure that the proposed derogation complies with the requirements of Article 15(1), as interpreted by the CJEU.
  • In order to satisfy the requirement of proportionality, the legislation must lay down clear and precise rules governing the scope and application of the measures in question and imposing minimum safeguards, so that the persons whose personal data is affected have sufficient guarantees that data will be effectively protected against the risk of abuse.
  • Finally, the EDPS is of the view that the five-year period as proposed does not appear proportional given the absence of (a) a prior demonstration of the proportionality of the envisaged measure and (b) the inclusion of sufficient safeguards within the text of the legislation. He considers that the validity of any transitional measure should not exceed 2 years.

The Five Eyes nations (Australia, Canada, New Zealand, the United Kingdom, and the United States) issued a joint statement in which their ministers called for quick action.

In this statement, we highlight how from 21 December 2020, the ePrivacy Directive, applied without derogation, will make it easier for children to be sexually exploited and abused without detection – and how the ePrivacy Directive could make it impossible both for providers of internet communications services, and for law enforcement, to investigate and prevent such exploitation and abuse. It is accordingly essential that the European Union adopt urgently the derogation to the ePrivacy Directive as proposed by the European Commission in order for the essential work carried out by service providers to shield endangered children in Europe and around the world to continue.

Without decisive action, from 21 December 2020 internet-based messaging services and e-mail services captured by the European Electronic Communications Code’s (EECC) new, broader definition of ‘electronic communications services’ are covered by the ePrivacy Directive. The providers of electronic communications services must comply with the obligation to respect the confidentiality of communications and the conditions for processing communications data in accordance with the ePrivacy Directive. In the absence of any relevant national measures made under Article 15 of that Directive, this will have the effect of making it illegal for service providers operating within the EU to use their current tools to protect children, with the impact on victims felt worldwide.

As mentioned, this derogation comes at a time when the EC and the EU nations are trying to finalize and enact an ePrivacy Regulation. In the original 2017 proposal, the EC stated:

The ePrivacy Directive ensures the protection of fundamental rights and freedoms, in particular the respect for private life, confidentiality of communications and the protection of personal data in the electronic communications sector. It also guarantees the free movement of electronic communications data, equipment and services in the Union.

The ePrivacy Regulation is intended to work in concert with the GDPR, and the draft 2020 regulation contains the following passages explaining the intended interplay of the two regulatory schemes:

  • Regulation (EU) 2016/679 regulates the protection of personal data. This Regulation protects in addition the respect for private life and communications. The provisions of this Regulation particularise and complement the general rules on the protection of personal data laid down in Regulation (EU) 2016/679. This Regulation therefore does not lower the level of protection enjoyed by natural persons under Regulation (EU) 2016/679. The provisions particularise Regulation (EU) 2016/679 as regards personal data by translating its principles into specific rules. If no specific rules are established in this Regulation, Regulation (EU) 2016/679 should apply to any processing of data that qualify as personal data. The provisions complement Regulation (EU) 2016/679 by setting forth rules regarding subject matters that are not within the scope of Regulation (EU) 2016/679, such as the protection of the rights of end-users who are legal persons. Processing of electronic communications data by providers of electronic communications services and networks should only be permitted in accordance with this Regulation. This Regulation does not impose any obligations on the end-user. End-users who are legal persons may have rights conferred by Regulation (EU) 2016/679 to the extent specifically required by this Regulation.
  • While the principles and main provisions of Directive 2002/58/EC of the European Parliament and of the Council remain generally sound, that Directive has not fully kept pace with the evolution of technological and market reality, resulting in an inconsistent or insufficiently effective protection of privacy and confidentiality in relation to electronic communications. Those developments include the entrance on the market of electronic communications services that from a consumer perspective are substitutable to traditional services, but do not have to comply with the same set of rules. Another development concerns new techniques that allow for tracking of online behaviour of end-users, which are not covered by Directive 2002/58/EC. Directive 2002/58/EC should therefore be repealed and replaced by this Regulation.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Guillaume Périgois on Unsplash

UK and EU Defer Decision On Data Flows

The decision on whether to allow the free flow of personal data under the GDPR from the EU to the recently departed UK via an adequacy decision has been punted. And, its recent membership in the EU notwithstanding, the UK might not get an adequacy decision.

In reaching agreement on many aspects of the United Kingdom’s (UK) exit from the European Union (EU), negotiators did not reach agreement on whether the EU would permit the personal data of EU persons to continue flowing to the UK under the easiest means possible. Instead, the EU and UK agreed to let the status quo continue until an adequacy decision is made or six months lapse. Data flows between the UK and EU were valued at more than £100 billion in 2017, according to British estimates, with the majority of this trade flowing from the UK to the EU.

Under the General Data Protection Regulation (GDPR), the personal data of EU persons can be transferred to other nations for most purposes once the European Commission (EC) has found the other nation provides protections equivalent to those granted in the EU. Of course, this has been an ongoing issue with data flows to the United States (U.S.), as two agreements (Safe Harbor and Privacy Shield) and their EC adequacy decisions were ruled illegal, in large part because, according to the EU’s highest court, U.S. law does not provide EU persons with the same rights they have in the EU. Most recently, this occurred in 2020 when the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the EU-United States Privacy Shield (aka Schrems II). It bears note that transfers of personal data may occur through other means under the GDPR that may prove more resource intensive: standard data protection clauses (SCC), binding corporate rules (BCR), and others.

Nevertheless, an adequacy decision is seen as the most desirable means of transfer, and the question of whether the UK’s laws are sufficient has lingered over the Brexit discussions, with some claiming that the nation’s membership in the Five Eyes surveillance alliance with the U.S. and others may disqualify the UK. Given the range of thorny issues the UK and EU punted (e.g. how to handle the border between Northern Ireland and Ireland), it is not surprising that the GDPR and data flows were also punted.

The UK-EU Trade and Cooperation Agreement (TCA) explained the terms of the data flow agreement and, as noted, in the short term, the status quo will continue, with data flows to the UK being treated as if it were still part of the EU. This state will persist until the EC reaches an adequacy decision or for four months, with another two months of the status quo possible in the absence of an adequacy decision so long as neither the UK nor the EU objects. Moreover, these provisions are operative only so long as the UK keeps its GDPR-compliant data protection law (i.e. the UK Data Protection Act 2018) in place and does not exercise specified “designated powers.” The UK has also deemed EU and European Economic Area (EEA) and European Free Trade Association (EFTA) nations to be adequate for purposes of data transfers from the UK on a transitional basis.

Specifically, the TCA provides:

For the duration of the specified period, transmission of personal data from the Union to the United Kingdom shall not be considered as transfer to a third country under Union law, provided that the data protection legislation of the United Kingdom on 31 December 2020, as it is saved and incorporated into United Kingdom law by the European Union (Withdrawal) Act 2018 and as modified by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 (“the applicable data protection regime”), applies and provided that the United Kingdom does not exercise the designated powers without the agreement of the Union within the Partnership Council.

The UK also agreed to notify the EU if it “enters into a new instrument which can be relied on to transfer personal data to a third country under Article 46(2)(a) of the UK GDPR or section 75(1)(a) of the UK Data Protection Act 2018 during the specified period.” However, if the EU were to object, it appears from the terms of the TCA that all the EU could do is force the UK “to discuss the relevant objection.” And yet, should the UK sign a treaty allowing personal data to flow to a nation the EU deems inadequate, this could obviously adversely affect the UK’s prospects of getting an adequacy decision.

Not surprisingly, the agreement also pertains to the continued flow of personal data as part of criminal investigations and law enforcement matters but not national security matters. Moreover, these matters fall outside the scope of the GDPR and would be largely unaffected by an adequacy decision or the lack of one. In a British government summary, it is stated that the TCA

provide[s] for law enforcement and judicial cooperation between the UK, the EU and its Member States in relation to the prevention, investigation, detection and prosecution of criminal offences and the prevention of and fight against money laundering and financing of terrorism.

The text of the TCA makes clear national security matters vis-à-vis data flows and information sharing are not covered:

This Part only applies to law enforcement and judicial cooperation in criminal matters taking place exclusively between the United Kingdom, on the one side, and the Union and the Member States, on the other side. It does not apply to situations arising between the Member States, or between Member States and Union institutions, bodies, offices and agencies, nor does it apply to the activities of authorities with responsibilities for safeguarding national security when acting in that field.

The TCA also affirms:

  • The cooperation provided for in this Part is based on the Parties’ long-standing commitment to ensuring a high level of protection of personal data.
  • To reflect that high level of protection, the Parties shall ensure that personal data processed under this Part is subject to effective safeguards in the Parties’ respective data protection regimes…

The United Kingdom’s data protection authority (DPA), the Information Commissioner’s Office (ICO), issued an explanation of how British law enforcement entities should act in light of the TCA. Regarding law enforcement-related data transfers from the EU to the UK, the ICO explained to British entities:

  • We are now a ‘third country’ for EU data protection purposes. If you receive personal data from a law enforcement partner in the EU, this means the sender will need to comply with the transfer provisions under their national data protection law (which are likely to be similar to those in Part 3 of the DPA 2018).
  • This means the EU sender needs to make sure other appropriate safeguards are in place – probably through a contract or other binding legal instrument, or by making their own assessment of appropriate safeguards. The sender can take into account the protection provided by the DPA 2018 itself when making this assessment.
  • If you receive personal data from other types of organisations in the EU or EEA who are subject to the GDPR, the sender will need to comply with the transfer provisions of the UK GDPR. You may want to consider putting standard contractual clauses (SCCs) in place to ensure adequate safeguards in these cases. We have produced an interactive tool to help you use the SCCs.

The ICO explained for transfers from the UK to the EU (but not the EEA):

  • There is a transitional adequacy decision in place to cover transfers to EU member states and Gibraltar. This will not extend to EEA countries outside the EU, where you should continue to consider other safeguards.
  • This means you can continue to send personal data from the UK to your law enforcement partners in the EU, as long as you can show the transfer is necessary for law enforcement purposes. You can also transfer personal data to non-law enforcement bodies in the EU if you can meet some additional conditions, but you will need to notify the ICO.

Turning back to an adequacy decision and commercial transfers of personal data from the EU to the UK, in what may well be a preview of a world in which there is no adequacy decision between the UK and EU, the European Data Protection Board (EDPB) issued an “information note” in mid-December that spells out how the GDPR would be applied:

  • In the absence of an adequacy decision applicable to the UK as per Article 45 GDPR, such transfers will require appropriate safeguards (e.g., standard data protection clauses, binding corporate rules, codes of conduct…), as well as enforceable data subject rights and effective legal remedies for data subjects, in accordance with Article 46 GDPR.
  • Subject to specific conditions, it may still be possible to transfer personal data to the UK based on a derogation listed in Article 49 GDPR. However, Article 49 GDPR has an exceptional nature and the derogations it contains must be interpreted restrictively and mainly relate to processing activities that are occasional and non-repetitive.
  • Moreover, where personal data are transferred to the UK on the basis of Article 46 GDPR safeguards, supplementary measures might be necessary to bring the level of protection of the data transferred up to the EU standard of essential equivalence, in accordance with the Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data.

Regarding commercial data transfers, the ICO issued a statement urging British entities to start setting up “alternative transfer mechanisms” to ensure data continues to flow from the EU to UK:

  • The Government has announced that the Treaty agreed with the EU will allow personal data to flow freely from the EU (and EEA) to the UK, until adequacy decisions have been adopted, for no more than six months.
  • This will enable businesses and public bodies across all sectors to continue to freely receive data from the EU (and EEA), including law enforcement agencies.
  • As a sensible precaution, before and during this period, the ICO recommends that businesses work with EU and EEA organisations who transfer personal data to them, to put in place alternative transfer mechanisms, to safeguard against any interruption to the free flow of EU to UK personal data.

However, even though these more restrictive means of transferring personal data to the UK exist, there will likely be legal challenges. It bears note that in light of Schrems II, EU DPAs are likely to apply a much higher level of scrutiny to SCCs, and challenges to the legality of using SCCs to transfer personal data to the U.S. have already commenced. It seems certain the legality of using SCCs to transfer data to the UK would be challenged as well.

However, returning to the preliminary issue of whether the EC will give the UK an adequacy decision, there may be a number of obstacles to a finding that the UK’s data protection and surveillance laws are indeed adequate under EU law[1]. Firstly, the UK’s surveillance practices in light of a recent set of CJEU rulings may prove difficult for the EC to stomach. In 2020, the CJEU handed down a pair of rulings (here and here) on the extent to which European Union (EU) nations may engage in bulk, indiscriminate collection of two types of data related to electronic communications. The CJEU found that while EU member nations may conduct these activities to combat crime or national security threats during periods limited by necessity and subject to oversight, nations may not generally require the providers of electronic communications to store and provide indiscriminate location data and traffic data in response to an actual national security danger or a prospective one. The CJEU combined three cases, arising from the UK, France, and Belgium, into two rulings to elucidate the reach of the Privacy and Electronic Communications Directive in relation to foundational EU laws.

The UK is, of course, one of the U.S.’s staunchest allies and partners when it comes to government surveillance of electronic communications. On this point, the CJEU summarized the beginning of the case out of the UK:

  • At the beginning of 2015, the existence of practices for the acquisition and use of bulk communications data by the various security and intelligence agencies of the United Kingdom, namely GCHQ, MI5 and MI6, was made public, including in a report by the Intelligence and Security Committee of Parliament (United Kingdom). On 5 June 2015, Privacy International, a non-governmental organisation, brought an action before the Investigatory Powers Tribunal (United Kingdom) against the Secretary of State for Foreign and Commonwealth Affairs, the Secretary of State for the Home Department and those security and intelligence agencies, challenging the lawfulness of those practices.

Secondly, the government of Prime Minister Boris Johnson may aspire to change data laws in ways the EU does not. In media accounts, unnamed EC officials were critical of the UK’s 2020 “National Data Strategy,” particularly references to “legal barriers (real and perceived)” to accessing data that “must be addressed.”

Thirdly, it may become a matter of politics. The EU has incentives to make the UK’s exit from the EU difficult to dissuade other nations from following the same path. Moreover, having previously been the second largest economy in the EU as measured by GDP, the UK may prove a formidable economic competitor, lending more weight to the view that the EU may not want to help the UK’s businesses compete with the EU’s.


Image by succo from Pixabay


[1] European Union Parliament, “The EU-UK relationship beyond Brexit: options for Police Cooperation and Judicial Cooperation in Criminal Matters,” Page 8: Although the UK legal framework is currently broadly in line with the EU legal framework and the UK is a signatory to the European Convention on Human Rights (ECHR), there are substantial questions over whether the Data Protection Act fully incorporates the data protection elements required by the Charter of Fundamental Rights, concerning the use of the national security exemption from the GDPR used by the UK, the retention of data and bulk powers granted to its security services, and over its onward transfer of this data to third country security partners such as the ‘Five Eyes’ partners (Britain, the USA, Australia, New Zealand and Canada).

Further Reading, Other Developments, and Coming Events (14 December)

Further Reading

  • Russian Hackers Broke Into Federal Agencies, U.S. Officials Suspect” By David Sanger — The New York Times.; “Russian government hackers are behind a broad espionage campaign that has compromised U.S. agencies, including Treasury and Commerce” By Ellen Nakashima and Craig Timberg — The Washington Post; “Suspected Russian hackers spied on U.S. Treasury emails – sources” By Chris Bing — Reuters. Apparently, Sluzhba vneshney razvedki Rossiyskoy Federatsii (SVR), the Russian Federation’s Foreign Intelligence Service, has exploited a vulnerability in SolarWinds’ update system used by many United States (U.S.) government systems, Fortune 500 companies, and the ten largest U.S. telecommunications companies. Reportedly, APT29 (aka Cozy Bear) has had free rein in the email systems of the Departments of the Treasury and Commerce among other possible victims. The hackers may have also accessed a range of other entities around the world using the same SolarWinds software. Moreover, these penetrations may be related to the recently announced theft of hacking tools a private firm, FireEye, used to test clients’ systems.
  • Hackers steal Pfizer/BioNTech COVID-19 vaccine data in Europe, companies say” By Jack Stubbs — Reuters. The European Union’s (EU) agency that oversees and approves medications has been hacked, and documents related to one of the new COVID-19 vaccines may have been stolen. The European Medicines Agency (EMA) was apparently penetrated, and materials related to Pfizer and BioNTech’s vaccine were exfiltrated. The scope of the theft is not yet known, but this is the latest in many attempts to hack into the entities conducting research on the virus and potential vaccines.
  • The AI Girlfriend Seducing China’s Lonely Men” By Zhang Wanqing — Sixth Tone. A chat bot powered by artificial intelligence that some men in the People’s Republic of China (PRC) are using extensively raises all sorts of ethical and privacy issues. Lonely people have turned to this AI technology and have confided their deepest feelings, which are stored by the company. It seems like only a matter of time until these data are mined for commercial value or hacked. Also, the chatbot has run afoul of the PRC’s censorship policies. Finally, is this a preview of the world to come, much like the 2013 film, Her, in which humans have relationships with AI beings?
  • YouTube will now remove videos disputing Joe Biden’s election victory” By Makena Kelly — The Verge. The Google subsidiary announced that because the safe harbor deadline has been reached and a sufficient number of states have certified President-elect Joe Biden, the platform will begin taking down misleading election videos. This change in policy may have come about, in part, because of pressure from Democrats in Congress about what they see as Google’s lackluster efforts to find and remove lies, misinformation, and disinformation about the 2020 election.
  • Lots of people are gunning for Google. Meet the man who might have the best shot.” By Emily Birnbaum — Protocol. Colorado Attorney General Phil Weiser may be uniquely qualified to lead state attorneys general on a second antitrust and anti-competition action against Google given his background as a law professor steeped in antitrust and his background in the Department of Justice and White House during the Obama Administration.

Other Developments

  • Cybersecurity firm FireEye revealed it was “attacked by a highly sophisticated threat actor, one whose discipline, operational security, and techniques lead us to believe it was a state-sponsored attack” according to CEO Kevin Mandia. This hacking may be related to the vast penetration of United States (U.S.) government systems revealed over the weekend. Mandia stated FireEye has “found that the attacker targeted and accessed certain Red Team assessment tools that we use to test our customers’ security…[that] mimic the behavior of many cyber threat actors and enable FireEye to provide essential diagnostic security services to our customers.” Mandia claimed none of these tools were zero-day exploits. FireEye is “proactively releasing methods and means to detect the use of our stolen Red Team tools…[and] out of an abundance of caution, we have developed more than 300 countermeasures for our customers, and the community at large, to use in order to minimize the potential impact of the theft of these tools.”
    • Mandia added:
      • Consistent with a nation-state cyber-espionage effort, the attacker primarily sought information related to certain government customers. While the attacker was able to access some of our internal systems, at this point in our investigation, we have seen no evidence that the attacker exfiltrated data from our primary systems that store customer information from our incident response or consulting engagements, or the metadata collected by our products in our dynamic threat intelligence systems. If we discover that customer information was taken, we will contact them directly.
      • Based on my 25 years in cyber security and responding to incidents, I’ve concluded we are witnessing an attack by a nation with top-tier offensive capabilities. This attack is different from the tens of thousands of incidents we have responded to throughout the years. The attackers tailored their world-class capabilities specifically to target and attack FireEye. They are highly trained in operational security and executed with discipline and focus. They operated clandestinely, using methods that counter security tools and forensic examination. They used a novel combination of techniques not witnessed by us or our partners in the past.
      • We are actively investigating in coordination with the Federal Bureau of Investigation and other key partners, including Microsoft. Their initial analysis supports our conclusion that this was the work of a highly sophisticated state-sponsored attacker utilizing novel techniques.    
  • The United States’ (U.S.) Department of Justice filed suit against Facebook for “tactics that discriminated against U.S. workers and routinely preferred temporary visa holders (including H-1B visa holders) for jobs in connection with the permanent labor certification (PERM) process.” The DOJ is asking for an injunction to stop Facebook from engaging in the alleged conduct, civil penalties, and damages for workers harmed by this conduct.
    • The DOJ contended:
      • The department’s lawsuit alleges that beginning no later than Jan. 1, 2018 and lasting until at least Sept. 18, 2019, Facebook employed tactics that discriminated against U.S. workers and routinely preferred temporary visa holders (including H-1B visa holders) for jobs in connection with the PERM process. Rather than conducting a genuine search for qualified and available U.S. workers for permanent positions sought by these temporary visa holders, Facebook reserved the positions for temporary visa holders because of their immigration status, according to the complaint. The complaint also alleges that Facebook sought to channel jobs to temporary visa holders at the expense of U.S. workers by failing to advertise those vacancies on its careers website, requiring applicants to apply by physical mail only, and refusing to consider any U.S. workers who applied for those positions. In contrast, Facebook’s usual hiring process relies on recruitment methods designed to encourage applications by advertising positions on its careers website, accepting electronic applications, and not pre-selecting candidates to be hired based on a candidate’s immigration status, according to the lawsuit.
      • In its investigation, the department determined that Facebook’s ineffective recruitment methods dissuaded U.S. workers from applying to its PERM positions. The department concluded that, during the relevant period, Facebook received zero or one U.S. worker applicants for 99.7 percent of its PERM positions, while comparable positions at Facebook that were advertised on its careers website during a similar time period typically attracted 100 or more applicants each. These U.S. workers were denied an opportunity to be considered for the jobs Facebook sought to channel to temporary visa holders, according to the lawsuit. 
      • Not only do Facebook’s alleged practices discriminate against U.S. workers, they have adverse consequences on temporary visa holders by creating an employment relationship that is not on equal terms. An employer that engages in the practices alleged in the lawsuit against Facebook can expect more temporary visa holders to apply for positions and increased retention post-hire. Such temporary visa holders often have limited job mobility and thus are likely to remain with their company until they can adjust status, which for some can be decades.
      • The United States’ complaint seeks civil penalties, back pay on behalf of U.S. workers denied employment at Facebook due to the alleged discrimination in favor of temporary visa holders, and other relief to ensure Facebook stops the alleged violations in the future. According to the lawsuit, and based on the department’s nearly two-year investigation, Facebook’s discrimination against U.S. workers was intentional, widespread, and in violation of a provision of the Immigration and Nationality Act (INA), 8 U.S.C. § 1324b(a)(1), that the Department of Justice’s Civil Rights Division enforces. 
  • A trio of consumer protection regulators took the lead in reaching an agreement with Apple to add “a new section to each app’s product page in its App Store, containing key information about the data the app collects and an accessible summary of the most important information from the privacy policy.” The United Kingdom’s (UK) Competition and Markets Authority (CMA), the Netherlands Authority for Consumers and Markets, and the Norwegian Consumer Authority led the effort, which is part of “ongoing work from the International Consumer Protection and Enforcement Network (ICPEN), involving 27 of its consumer authority members across the world.” The three agencies explained:
    • Consumer protection authorities, including the CMA, became concerned that people were not being given clear information on how their personal data would be used before choosing an app, including on whether the app developer would share their personal data with a third party. Without this information, consumers are unable to compare and choose apps based on how they use personal data.
  • Australia’s Council of Financial Regulators (CFR) has released a Cyber Operational Resilience Intelligence-led Exercises (CORIE) framework “to test and demonstrate the cyber maturity and resilience of institutions within the Australian financial services industry.”

Coming Events

  • On 15 December, the Senate Judiciary Committee’s Intellectual Property Subcommittee will hold a hearing titled “The Role of Private Agreements and Existing Technology in Curbing Online Piracy” with these witnesses:
    • Panel I
      • Ms. Ruth Vitale, Chief Executive Officer, CreativeFuture
      • Mr. Probir Mehta, Head of Global Intellectual Property and Trade Policy, Facebook, Inc.
      • Mr. Mitch Glazier, Chairman and CEO, Recording Industry Association of America
      • Mr. Joshua Lamel, Executive Director, Re:Create
    • Panel II
      • Ms. Katherine Oyama, Global Director of Business Public Policy, YouTube
      • Mr. Keith Kupferschmid, Chief Executive Officer, Copyright Alliance
      • Mr. Noah Becker, President and Co-Founder, AdRev
      • Mr. Dean S. Marks, Executive Director and Legal Counsel, Coalition for Online Accountability
  • The Senate Armed Services Committee’s Cybersecurity Subcommittee will hold a closed briefing on Department of Defense Cyber Operations on 15 December with these witnesses:
    • Mr. Thomas C. Wingfield, Deputy Assistant Secretary of Defense for Cyber Policy, Office of the Under Secretary of Defense for Policy
    • Mr. Jeffrey R. Jones, Vice Director, Command, Control, Communications and Computers/Cyber, Joint Staff, J-6
    • Ms. Katherine E. Arrington, Chief Information Security Officer for the Assistant Secretary of Defense for Acquisition, Office of the Under Secretary of Defense for Acquisition and Sustainment
    • Rear Admiral Jeffrey Czerewko, United States Navy, Deputy Director, Global Operations, J39, J3, Joint Staff
  • The Senate Banking, Housing, and Urban Affairs Committee’s Economic Policy Subcommittee will conduct a hearing titled “US-China: Winning the Economic Competition, Part II” on 16 December with these witnesses:
    • The Honorable Will Hurd, Member, United States House of Representatives;
    • Derek Scissors, Resident Scholar, American Enterprise Institute;
    • Melanie M. Hart, Ph.D., Senior Fellow and Director for China Policy, Center for American Progress; and
    • Roy Houseman, Legislative Director, United Steelworkers (USW).
  • On 17 December the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency’s (CISA) Information and Communications Technology (ICT) Supply Chain Risk Management (SCRM) Task Force will convene for a virtual event, “Partnership in Action: Driving Supply Chain Security.”


Photo by stein egil liland from Pexels

Task Force Calls For Enhanced Digital Regulation in UK

The UK may soon reform its competition and consumer laws vis-à-vis digital markets.

A United Kingdom (UK) entity has recommended that Prime Minister Boris Johnson and his Conservative government remake digital regulation in the UK, especially with respect to competition policy. A task force has returned an extensive set of recommendations that would require legislation, increased coordination among regulators, and a new focus for existing ones. The timeline for such action is not clear, and Downing Street would have to agree before anything happens. However, the UK's proposed regulatory scheme and the European Union's ongoing efforts to revamp its regulatory approach to large technology firms will both likely affect United States (U.S.) multinationals such as Facebook and Google. The UK's approach may also serve as a template for the U.S. to remake its regulation of digital competition.

The United Kingdom’s Competition & Markets Authority (CMA) led an effort, consisting of the Office of Communications (Ofcom) and the Information Commissioner’s Office (ICO), in the form of the Digital Markets Taskforce. The Task Force builds on the 2019 “Unlocking digital competition, Report of the Digital Competition Expert Panel,” an effort led by former Obama Administration Council of Economic Advisers Chair Jason Furman, and the more recent July 2020 “Online platforms and digital advertising market study.” In 2019, the CMA issued its “Digital Markets Strategy” that “sets out five strategic aims, and seven priority focus areas.”

The Task Force acknowledged its efforts were not unique to the UK. It referenced similar inquiries into, and plans to reform, the regulation of digital markets in the U.S., the EU, Germany, Japan, and Australia.

The Task Force summarized its findings:

The accumulation and strengthening of market power by a small number of digital firms has the potential to cause significant harm to consumers and businesses that rely on them, to innovative competitors and to the economy and society more widely:

  • A poor deal for consumers and businesses who rely on them. These firms can exploit their powerful positions. For consumers this can mean they get a worse deal than they would in a more competitive market, for example having less protection or control of their data. For businesses this can mean they are, for example, charged higher listing fees or higher prices for advertising online. These higher prices for businesses can then feed through into higher prices for consumers for a wide range of products and services across the economy.
  • Innovative competitors face an unfair disadvantage. A powerful digital firm can extend its strong position in one market into other markets, ultimately giving itself an unfair advantage over its rivals. This means innovative competitors, even if they have a good idea, are likely to find it much harder to compete and grow their businesses. This can result in long-term harmful effects on innovation and the dynamism of UK markets.
  • A less vibrant digital economy. If powerful digital firms act to unfairly disadvantage their innovative competitors, these innovative firms will find it harder to enter and expand in new markets, meaning the ‘unicorns’ of tomorrow that will support jobs and the future digital economy will not emerge.

The Task Force calls for the establishment of a new Digital Markets Unit (DMU) that would be particularly focused on policing potential harm before it occurs. Thus, the Task Force is calling for a regulator that is proactive and nimble enough to address risks to competition and consumers before any harm happens. The DMU would oversee a new “Strategic Market Status” (SMS) regime, and the Task Force is recommending that the government and Parliament revisit and refresh consumer and competition laws. The Task Force stated that the “government should put in place a regulatory framework for the most powerful digital firms, alongside strengthening existing competition and consumer laws…[and] [i]n considering the design of this regulatory framework we have sought to strike the right balance between the following key principles:

  • Evidence driven and effective – regulation must be effective, and that means ensuring it is evidence based, but also that it can react swiftly enough to prevent and address harms. The activities undertaken by the most powerful digital firms are diverse and a ‘one size fits all’ approach could have damaging results.
  • Proportionate and targeted – regulation must be proportionate and targeted at addressing a particular problem, minimising the risk of any possible unintended consequences.
  • Open, transparent and accountable – across all its work the DMU should operate in an open and transparent manner. In reaching decisions it should consult a wide range of parties. It should clearly articulate why it has reached decisions and be held accountable for them.
  • Proactive and forward-looking – the DMU should be focused on preventing harm from occurring, rather than enforcing ex post. It should seek to understand how digital markets might evolve, the risks this poses to competition and innovation, and act proactively to assess and manage those risks.
  • Coherent – the DMU should seek to promote coherence with other regulatory regimes both domestically and internationally, in particular by working through the Digital Regulation Cooperation Forum which is already working to deliver a step change in coordination and cooperation between regulators in digital markets.

The Task Force provided more detail on the new SMS scheme:

The entry point to the SMS regime is an assessment of whether a firm has ‘strategic market status’. This should be an evidence-based economic assessment as to whether a firm has substantial, entrenched market power in at least one digital activity, providing the firm with a strategic position (meaning the effects of its market power are likely to be particularly widespread and/or significant). It is focused on assessing the very factors which may give rise to harm, and which motivate the need for regulatory intervention.

Those firms that are designated with SMS should be subject to the following three pillars of the regime:

  • An enforceable code of conduct that sets out clearly how an SMS firm is expected to behave in relation to the activity motivating its SMS designation. The aim of the code is to manage the effects of market power, for example by preventing practices which exploit consumers and businesses or exclude innovative competitors.
  • Pro-competitive interventions like personal data mobility, interoperability and data access which can be used to address the factors which are the source of an SMS firm’s market power in a particular activity. These interventions seek to drive longer-term dynamic changes in these activities, opening up opportunities for greater competition and innovation.
  • SMS merger rules to ensure closer scrutiny of transactions involving SMS firms, given the particular risks and potential consumer harm arising from these transactions.

The SMS regime should be an ex ante regime, focused on proactively preventing harm. Fostering a compliance culture within SMS firms will be crucial to its overall success. However, a key part of fostering compliance is credible deterrence and the DMU will need to be able to take tough action where harm does occur, requiring firms to change their behaviour, and with the ability to impose substantial penalties. The ability to take tough action sits alongside enabling resolution through a participative approach, whereby the DMU seeks to engage constructively with all affected parties to achieve fast and effective results.

The Task Force sketched its ideal timeline during which Parliament would enact its recommendations, which would be next year at the earliest:

We believe the case for an ex ante regime in digital markets has been made. We therefore welcome the government’s response to the CMA’s online platforms and digital advertising market study, and its commitment to establishing a DMU from April 2021 within the CMA. We also welcome government’s commitment to consult on proposals for a new pro-competition regime in early 2021 and to legislate to put the DMU on a statutory footing when parliamentary time allows. We urge government to move quickly in taking this legislation forward. As government rightly acknowledges, similar action is being pursued across the globe and there is a clear opportunity for the UK to lead the way in championing a modern pro-competition, pro-innovation regime.

The Task Force summarized its recommendations to the government:

A Digital Markets Unit

Recommendation 1: The government should set up a DMU which should seek to further the interests of consumers and citizens in digital markets, by promoting competition and innovation.

  • Recommendation 1a: The DMU should be a centre of expertise and knowledge in relation to competition in digital markets.
  • Recommendation 1b: The DMU should be proactive, seeking to foster compliance with regulatory requirements and taking swift action to prevent harm from occurring.

A pro-competition regime for the most powerful digital firms

Recommendation 2: The government should establish a pro-competition framework, to be overseen by the DMU, to pursue measures in relation to SMS firms which further the interests of consumers and citizens, by promoting competition and innovation.

Recommendation 3: The government should provide the DMU with the power to designate a firm with SMS.

  • Recommendation 3a: SMS should require a finding that the firm has substantial, entrenched market power in at least one digital activity, providing the firm with a strategic position.
  • Recommendation 3b: The DMU should set out in formal guidance its prioritisation rules for designation assessments. These should include the firm’s revenue (globally and within the UK), the activity undertaken by the firm and a consideration of whether a sector regulator is better placed to address the issues of concern.
  • Recommendation 3c: The designation process should be open and transparent with a consultation on the provisional decision and the assessment completed within a statutory deadline.
  • Recommendation 3d: A firm’s SMS designation should be set for a fixed period before being reviewed.
  • Recommendation 3e: When a firm meets the SMS test, the associated remedies should apply only to a subset of the firm’s activities, whilst the status should apply to the firm as a whole.

Recommendation 4: The government should establish the SMS regime such that when the SMS test is met, the DMU can establish an enforceable code of conduct for the firm in relation to its designated activities to prevent it from taking advantage of its power and position.

  • Recommendation 4a: A code should comprise high-level objectives supported by principles and guidance.
  • Recommendation 4b: The objectives of the code should be set out in legislation, with the remainder of the content of each code to be determined by the DMU, tailored to the activity, conduct and harms it is intended to address.
  • Recommendation 4c: The DMU should ensure the code addresses the concerns about the effect of the power and position of SMS firms when dealing with publishers, as identified by the Cairncross Review.
  • Recommendation 4d: The code of conduct should always apply to the activity or activities which are the focus of the SMS designation.
  • Recommendation 4e: The DMU should consult on and establish a code as part of the designation assessment. The DMU should be able to vary the code outside the designation review cycle.

Recommendation 5: SMS firms should have a legal obligation to ensure their conduct is compliant with the requirements of the code at all times and put in place measures to foster compliance.

Recommendation 6: The government should establish the SMS regime such that the DMU can impose pro-competitive interventions on an SMS firm to drive dynamic change as well as to address harms related to the designated activities.

  • Recommendation 6a: With the exception of ownership separation, the DMU should not be limited in the types of remedies it is able to apply.
  • Recommendation 6b: The DMU should be able to implement PCIs anywhere within an SMS firm in order to address a concern related to its substantial entrenched market power and strategic position in a designated activity.
  • Recommendation 6c: In implementing a PCI the DMU should demonstrate that it is an effective and proportionate remedy to an adverse effect on competition or consumers. A PCI investigation should be completed within a fixed statutory deadline.
  • Recommendation 6d: PCIs should be implemented for a limited duration and should be regularly reviewed.

Recommendation 7: The government should establish the SMS regime such that the DMU can undertake monitoring in relation to the conduct of SMS firms and has a range of tools available to resolve concerns.

  • Recommendation 7a: Where appropriate, the DMU should seek to resolve concerns using a participative approach, engaging with parties to deliver fast and effective resolution.
  • Recommendation 7b: The DMU should be able to open formal investigations into breaches of the code and where a breach is found, require an SMS firm to change its behaviour. These investigations should be completed within a fixed statutory deadline.
  • Recommendation 7c: The DMU should be able to impose substantial penalties for breaches of the code and for breaches of code and PCI orders.
  • Recommendation 7d: The DMU should be able to take action quickly on an interim basis where it suspects the code has been breached.
  • Recommendation 7e: The DMU should be able to undertake scoping assessments where it is concerned there is an adverse effect on competition or consumers in relation to a designated activity. The outcome of such assessments could include a code breach investigation, a pro-competitive intervention investigation, or variation to a code principle or guidance.

Recommendation 8: The government should establish the SMS regime such that the DMU can draw information from a wide range of sources, including by using formal information gathering powers, to gather the evidence it needs to inform its work.

Recommendation 9: The government should ensure the DMU’s decisions are made in an open and transparent manner and that it is held accountable for them.

  • Recommendation 9a: The DMU’s decisions should allow for appropriate internal scrutiny.
  • Recommendation 9b: The DMU should consult on its decisions.
  • Recommendation 9c: The DMU’s decisions should be timely, with statutory deadlines used to set expectations and deliver speedy outcomes.
  • Recommendation 9d: The DMU’s decisions should be judicially reviewable on ordinary judicial review principles and the appeals process should deliver robust outcomes at pace.

Recommendation 10: The government should establish the SMS regime such that SMS firms are subject to additional merger control requirements.

Recommendation 11: The government should establish the SMS merger control regime such that SMS firms are required to report all transactions to the CMA. In addition, transactions that meet clear-cut thresholds should be subject to mandatory notification, with completion prohibited prior to clearance. Competition concerns should be assessed using the existing substantive test but a lower and more cautious standard of proof.

A modern competition and consumer regime for digital markets

Recommendation 12: The government should provide the DMU with a duty to monitor digital markets to enable it to build a detailed understanding of how digital businesses operate, and to provide the basis for swifter action to drive competition and innovation and prevent harm.

Recommendation 13: The government should strengthen competition and consumer protection laws and processes to ensure they are better adapted for the digital age.

  • Recommendation 13a: The government should pursue significant reforms to the markets regime to ensure it can be most effectively utilised to promote competition and innovation across digital markets, for example by pursuing measures like data mobility and interoperability.
  • Recommendation 13b: The government should strengthen powers to tackle unlawful or illegal activity or content on digital platforms which could result in economic detriment to consumers and businesses.
  • Recommendation 13c: The government should take action to strengthen powers to enable effective consumer choice in digital markets, including by addressing instances where choice architecture leads to consumer harm.
  • Recommendation 13d: The government should provide for stronger enforcement of the Platform to Business Regulation.

A coherent regulatory landscape

Recommendation 14: The government should ensure the DMU is able to work closely with other regulators with responsibility for digital markets, in particular Ofcom, the ICO and the FCA.

  • Recommendation 14a: The DMU should be able to share information with other regulators and seek reciprocal arrangements.
  • Recommendation 14b: The government should consider, in consultation with Ofcom and the FCA, empowering these agencies with joint powers with the DMU in relation to the SMS regime, with the DMU being the primary authority.

Recommendation 15: The government should enable the DMU to work closely with regulators in other jurisdictions to promote a coherent regulatory landscape.

  • Recommendation 15a: The DMU should be able to share information with regulators in other jurisdictions and should seek reciprocal arrangements.
  • Recommendation 15b: The DMU should explore establishing a network of international competition and consumer agencies to facilitate better monitoring and action in relation to the conduct of SMS firms.


Image by Free-Photos from Pixabay