Further Reading, Other Developments, and Coming Events (12 January 2021)

Further Reading

  • “Biden’s NSC to focus on global health, climate, cyber and human rights, as well as China and Russia” By Karen DeYoung — The Washington Post. Like almost every incoming White House, the Biden team has announced a restructuring of the National Security Council (NSC) to better effectuate the President-elect’s policy priorities. To no one’s surprise, the volume on cybersecurity policy will be turned up. Another notable change is the plan to take “cross-cutting” approaches to issues that will likely meld foreign and domestic as well as national security and civil concerns, meaning there could be a new look at offensive cyber operations, for example. It is possible President Biden decides to put the genie back in the bottle, so to speak, by re-imposing an interagency decision-making process as opposed to the Trump Administration’s approach of delegating discretion to the head of the National Security Agency/Cyber Command. Also, the NSC will focus on emerging technology, a likely response to the technology arms race the United States finds itself in with the People’s Republic of China.
  • “Exclusive: Pandemic relief aid went to media that promoted COVID misinformation” By Caitlin Dickson — yahoo! news. The consulting firm Alethea Group and the nonprofit Global Disinformation Index are claiming the COVID stimulus Paycheck Protection Program (PPP) provided loans and assistance to five firms that “were publishing false or misleading information about the pandemic, thus profiting off the infodemic” according to an Alethea Group vice president. This report follows an NBC News article claiming that 14 white supremacist and racist organizations have also received PPP loans. The Alethea Group and Global Disinformation Index named five entities that took PPP funds and kept spreading pandemic misinformation: Epoch Media Group, Newsmax Media, The Federalist, Liftable Media, and Prager University.
  • “Facebook shuts Uganda accounts ahead of vote” — France24. The social media company shuttered a number of Facebook and Instagram accounts related to government officials in Uganda ahead of an election on account of “Coordinated Inauthentic Behaviour” (CIB). This follows the platform shutting down accounts related to the French Army and Russia seeking to influence events in Africa. These and other actions may indicate the platform is starting to pay closer attention to the non-western world; at least one former employee has argued the platform was negligent at best and reckless at worst in not properly resourcing efforts to police CIB throughout the Third World.
  • “China tried to punish European states for Huawei bans by adding eleventh-hour rule to EU investment deal” By Finbarr Bermingham — South China Morning Post. At nearly the end of talks on a People’s Republic of China (PRC)-European Union (EU) trade deal, PRC negotiators tried slipping in language that would have barred entry to the PRC’s cloud computing market to any country or company from a country that restricts Huawei’s services and products. This is alternately being seen as either standard Chinese negotiating tactics or an attempt to avenge the thwarting of the crown jewel in its telecommunications ambitions.
  • “Chinese regulators to push tech giants to share consumer credit data – sources” By Julie Zhu — Reuters. Ostensibly in a move to better manage the risks of too much unsafe lending, tech giants in the People’s Republic of China (PRC) will soon need to share data on consumer loans. It seems inevitable that such data will be used by Beijing to further crack down on undesirable people and elements within the PRC.
  • “The mafia turns social media influencer to reinforce its brand” By Miles Johnson — The Financial Times. Even Italy’s feared ’Ndrangheta is creating and curating a social media presence.

Other Developments

  • President Donald Trump signed an executive order (EO) that bans eight applications from the People’s Republic of China on much the same grounds as the EOs prohibiting TikTok and WeChat. If this EO is not rescinded by the Biden Administration, federal courts may block its implementation as has happened with the TikTok and WeChat EOs to date. Notably, courts have found that the Trump Administration exceeded its authority under the International Emergency Economic Powers Act (IEEPA), which may also be an issue in the proposed prohibition on Alipay, CamScanner, QQ Wallet, SHAREit, Tencent QQ, VMate, WeChat Pay, and WPS Office. Trump found:
    • that additional steps must be taken to deal with the national emergency with respect to the information and communications technology and services supply chain declared in Executive Order 13873 of May 15, 2019 (Securing the Information and Communications Technology and Services Supply Chain).  Specifically, the pace and pervasiveness of the spread in the United States of certain connected mobile and desktop applications and other software developed or controlled by persons in the People’s Republic of China, to include Hong Kong and Macau (China), continue to threaten the national security, foreign policy, and economy of the United States.  At this time, action must be taken to address the threat posed by these Chinese connected software applications.
    • Trump directed that within 45 days of issuance of the EO, there shall be a prohibition on “any transaction by any person, or with respect to any property, subject to the jurisdiction of the United States, with persons that develop or control the following Chinese connected software applications, or with their subsidiaries, as those transactions and persons are identified by the Secretary of Commerce (Secretary) under subsection (e) of this section: Alipay, CamScanner, QQ Wallet, SHAREit, Tencent QQ, VMate, WeChat Pay, and WPS Office.”
  • The Government Accountability Office (GAO) issued its first statutorily required annual assessment of how well the United States Department of Defense (DOD) is managing its major information technology (IT) procurements. The DOD spent more than $36 billion of the $90 billion the federal government was provided for IT in FY 2020. The GAO was tasked with assessing how well the DOD did in using iterative development, managing costs and schedules, and implementing cybersecurity measures. The GAO found progress in the first two realms but a continued lag in deploying long-recommended best practices to ensure the security of the IT the DOD buys or builds. Specifically, the GAO focused on 15 major IT acquisitions that qualify as either administrative (i.e., “business”) or communications and information security (i.e., “non-business”) systems. While there were no explicit recommendations made, the GAO found:
    • Ten of the 15 selected major IT programs exceeded their planned schedules, with delays ranging from 1 month for the Marine Corps’ CAC2S Inc 1 to 5 years for the Air Force’s Defense Enterprise Accounting and Management System-Increment 1.
    • …eight of the 10 selected major IT programs that had tested their then-current technical performance targets reported having met all of their targets…. As of December 2019, four programs had not yet conducted testing activities—Army’s ACWS, Air Force’s AFIPPS Inc 1, Air Force’s MROi, and Navy ePS. Testing data for one program, Air Force’s ISPAN Inc 4, were classified.
    • …officials from the 15 selected major IT programs we reviewed reported using software development approaches that may help to limit risks to cost and schedule outcomes. For example, major business IT programs reported using COTS software. In addition, most programs reported using an iterative software development approach and using a minimum deployable product. With respect to cybersecurity practices, all the programs reported developing cybersecurity strategies, but programs reported mixed experiences with respect to conducting cybersecurity testing. Most programs reported using operational cybersecurity testing, but less than half reported conducting developmental cybersecurity testing. In addition, programs that reported conducting cybersecurity vulnerability assessments experienced fewer increases in planned program costs and fewer schedule delays. Programs also reported a variety of challenges associated with their software development and cybersecurity staff.
    • 14 of the 15 programs reported using an iterative software development approach which, according to leading practices, may help reduce cost growth and deliver better results to the customer. However, programs also reported using an older approach to software development, known as waterfall, which could introduce risk for program cost growth because of its linear and sequential phases of development that may be implemented over a longer period of time. Specifically, two programs reported using a waterfall approach in conjunction with an iterative approach, while one was solely using a waterfall approach.
    • With respect to cybersecurity, programs reported mixed implementation of specific practices, contributing to program risks that might impact cost and schedule outcomes. For example, all 15 programs reported developing cybersecurity strategies, which are intended to help ensure that programs are planning for and documenting cybersecurity risk management efforts.
    • In contrast, only eight of the 15 programs reported conducting cybersecurity vulnerability assessments—systematic examinations of an information system or product intended to, among other things, determine the adequacy of security measures and identify security deficiencies. These eight programs experienced fewer increases in planned program costs and fewer schedule delays relative to the programs that did not report using cybersecurity vulnerability assessments.
  • The United States (U.S.) Department of Energy gave notice of a “Prohibition Order prohibiting the acquisition, importation, transfer, or installation of specified bulk-power system (BPS) electric equipment that directly serves Critical Defense Facilities (CDFs), pursuant to Executive Order 13920.” (See here for analysis of the executive order.) The Department explained:
    • Executive Order No. 13920 of May 1, 2020, Securing the United States Bulk-Power System (85 FR 26595 (May 4, 2020)) (E.O. 13920) declares that threats by foreign adversaries to the security of the BPS constitute a national emergency. A current list of such adversaries is provided in a Request for Information (RFI), issued by the Department of Energy (Department or DOE) on July 8, 2020 seeking public input to aid in its implementation of E.O. 13920. The Department has reason to believe, as detailed below, that the government of the People’s Republic of China (PRC or China), one of the listed adversaries, is equipped and actively planning to undermine the BPS. The Department has thus determined that certain BPS electric equipment or programmable components subject to China’s ownership, control, or influence, constitute undue risk to the security of the BPS and to U.S. national security. The purpose of this Order is to prohibit the acquisition, importation, transfer, or subsequent installation of such BPS electric equipment or programmable components in certain sections of the BPS.
  • The United States’ (U.S.) Department of Commerce’s Bureau of Industry and Security (BIS) added the People’s Republic of China’s (PRC) Semiconductor Manufacturing International Corporation (SMIC) to its Entity List in a move intended to starve the company of key U.S. technology needed to manufacture high end semiconductors. Therefore, any U.S. entity wishing to do business with SMIC will need a license, which the Trump Administration is unlikely to grant. The Department of Commerce explained in its press release:
    • The Entity List designation limits SMIC’s ability to acquire certain U.S. technology by requiring U.S. exporters to apply for a license to sell to the company.  Items uniquely required to produce semiconductors at advanced technology nodes—10 nanometers or below—will be subject to a presumption of denial to prevent such key enabling technology from supporting China’s military-civil fusion efforts.
    • BIS also added more than sixty other entities to the Entity List for actions deemed contrary to the national security or foreign policy interest of the United States.  These include entities in China that enable human rights abuses, entities that supported the militarization and unlawful maritime claims in the South China Sea, entities that acquired U.S.-origin items in support of the People’s Liberation Army’s programs, and entities and persons that engaged in the theft of U.S. trade secrets.
    • As explained in the Federal Register notice:
      • SMIC is added to the Entity List as a result of China’s military-civil fusion (MCF) doctrine and evidence of activities between SMIC and entities of concern in the Chinese military industrial complex. The Entity List designation limits SMIC’s ability to acquire certain U.S. technology by requiring exporters, reexporters, and in-country transferors of such technology to apply for a license to sell to the company. Items uniquely required to produce semiconductors at advanced technology nodes 10 nanometers or below will be subject to a presumption of denial to prevent such key enabling technology from supporting China’s military modernization efforts. This rule adds SMIC and the following ten entities related to SMIC: Semiconductor Manufacturing International (Beijing) Corporation; Semiconductor Manufacturing International (Tianjin) Corporation; Semiconductor Manufacturing International (Shenzhen) Corporation; SMIC Semiconductor Manufacturing (Shanghai) Co., Ltd.; SMIC Holdings Limited; Semiconductor Manufacturing South China Corporation; SMIC Northern Integrated Circuit Manufacturing (Beijing) Co., Ltd.; SMIC Hong Kong International Company Limited; SJ Semiconductor; and Ningbo Semiconductor International Corporation (NSI).
  • The United States’ (U.S.) Department of Commerce’s Bureau of Industry and Security (BIS) amended its Export Administration Regulations “by adding a new ‘Military End User’ (MEU) List, as well as the first tranche of 103 entities, which includes 58 Chinese and 45 Russian companies” per its press release. The Department asserted:
    • The U.S. Government has determined that these companies are ‘military end users’ for purposes of the ‘military end user’ control in the EAR that applies to specified items for exports, reexports, or transfers (in-country) to the China, Russia, and Venezuela when such items are destined for a prohibited ‘military end user.’
  • The Australia Competition and Consumer Commission (ACCC) rolled out another piece of the Consumer Data Right (CDR) scheme under the Competition and Consumer Act 2010, specifically accreditation guidelines “to provide information and guidance to assist applicants with lodging a valid application to become an accredited person” to whom Australians may direct data holders to share their data. The ACCC explained:
    • The CDR aims to give consumers more access to and control over their personal data.
    • Being able to easily and efficiently share data will improve consumers’ ability to compare and switch between products and services and encourage competition between service providers, leading to more innovative products and services for consumers and the potential for lower prices.
    • Banking is the first sector to be brought into the CDR.
    • Accredited persons may receive a CDR consumer’s data from a data holder at the request and consent of the consumer. Any person, in Australia or overseas, who wishes to receive CDR data to provide products or services to consumers under the CDR regime, must be accredited.
  • Australia’s government has released its “Data Availability and Transparency Bill 2020” that “establishes a new data sharing scheme for federal government data, underpinned by strong safeguards to mitigate risks and simplified processes to make it easier to manage data sharing requests” according to the summary provided in Parliament by the government’s point person. In the accompanying “Explanatory Memorandum,” the following summary was provided:
    • The Bill establishes a new data sharing scheme which will serve as a pathway and regulatory framework for sharing public sector data. ‘Sharing’ involves providing controlled access to data, as distinct from open release to the public.
    • To oversee the scheme and support best practice, the Bill creates a new independent regulator, the National Data Commissioner (the Commissioner). The Commissioner’s role is modelled on other regulators such as the Australian Information Commissioner, with whom the Commissioner will cooperate.
    • The data sharing scheme comprises the Bill and disallowable legislative instruments (regulations, Minister-made rules, and any data codes issued by the Commissioner). The Commissioner may also issue non-legislative guidelines that participating entities must have regard to, and may release other guidance as necessary.
    • Participants in the scheme are known as data scheme entities:
      • Data custodians are Commonwealth bodies that control public sector data, and have the right to deal with that data.
      • Accredited users are entities accredited by the Commissioner to access public sector data. To become accredited, entities must satisfy the security, privacy, infrastructure and governance requirements set out in the accreditation framework.
      • Accredited data service providers (ADSPs) are entities accredited by the Commissioner to perform data services such as data integration. Government agencies and users will be able to draw upon ADSPs’ expertise to help them to share and use data safely.
    • The Bill does not compel sharing. Data custodians are responsible for assessing each sharing request, and deciding whether to share their data if satisfied the risks can be managed.
    • The data sharing scheme contains robust safeguards to ensure sharing occurs in a consistent and transparent manner, in accordance with community expectations. The Bill authorises data custodians to share public sector data with accredited users, directly or through an ADSP, where:
      • Sharing is for a permitted purpose – government service delivery, informing government policy and programs, or research and development;
      • The data sharing principles have been applied to manage the risks of sharing; and
      • The terms of the arrangement are recorded in a data sharing agreement.
    • Where the above requirements are met, the Bill provides limited statutory authority to share public sector data, despite other Commonwealth, State and Territory laws that prevent sharing. This override of non-disclosure laws is ‘limited’ because it occurs only when the Bill’s requirements are met, and only to the extent necessary to facilitate sharing.
  • The United Kingdom’s Competition and Markets Authority (CMA) is asking interested parties to provide input on the proposed acquisition of a British semiconductor company by a United States (U.S.) company before it launches a formal investigation later this year. However, the CMA is limited to competition considerations, and any national security aspects of the proposed deal would need to be investigated by Prime Minister Boris Johnson’s government. The CMA stated:
    • US-based chip designer and producer NVIDIA Corporation (NVIDIA) plans to purchase the Intellectual Property Group business of UK-based Arm Limited (Arm) in a deal worth $40 billion. Arm develops and licenses intellectual property (IP) and software tools for chip designs. The products and services supplied by the companies support a wide range of applications used by businesses and consumers across the UK, including desktop computers and mobile devices, game consoles and vehicle computer systems.
    • CMA added:
      • The CMA will look at the deal’s possible effect on competition in the UK. The CMA is likely to consider whether, following the takeover, Arm has an incentive to withdraw, raise prices or reduce the quality of its IP licensing services to NVIDIA’s rivals.
  • The Israeli firm NSO Group has been accused by an entity associated with a British university of using real-time cell phone data to sell its COVID-19 contact tracing app, Fleming, in ways that may have broken the laws of a handful of nations. Forensic Architecture, a research agency based at Goldsmiths, University of London, argued:
    • In March 2020, with the rise of COVID-19, Israeli cyber-weapons manufacturer NSO Group launched a contact-tracing technology named ‘Fleming’. Two months later, a database belonging to NSO’s Fleming program was found unprotected online. It contained more than five hundred thousand datapoints for more than thirty thousand distinct mobile phones. NSO Group denied there was a security breach. Forensic Architecture received and analysed a sample of the exposed database, which suggested that the data was based on ‘real’ personal data belonging to unsuspecting civilians, putting their private information at risk.
    • Forensic Architecture added:
      • Leaving a database with genuine location data unprotected is a serious violation of the applicable data protection laws. That a surveillance company with access to personal data could have overseen this breach is all the more concerning.
      • This could constitute a violation of the General Data Protection Regulation (GDPR) based on where the database was discovered as well as the laws of the nations where NSO Group allegedly collected personal data.
    • The NSO Group denied the claims and was quoted by Tech Crunch:
      • “We have not seen the supposed examination and have to question how these conclusions were reached. Nevertheless, we stand by our previous response of May 6, 2020. The demo material was not based on real and genuine data related to infected COVID-19 individuals,” said an unnamed spokesperson. (NSO’s earlier statement made no reference to individuals with COVID-19.)
      • “As our last statement details, the data used for the demonstrations did not contain any personally identifiable information (PII). And, also as previously stated, this demo was a simulation based on obfuscated data. The Fleming system is a tool that analyzes data provided by end users to help healthcare decision-makers during this global pandemic. NSO does not collect any data for the system, nor does NSO have any access to collected data.”

Coming Events

  • On 13 January, the Federal Communications Commission (FCC) will hold its monthly open meeting. Per the tentative agenda, “Bureau, Office, and Task Force leaders will summarize the work their teams have done over the last four years in a series of presentations”:
    • Panel One. The Commission will hear presentations from the Wireless Telecommunications Bureau, International Bureau, Office of Engineering and Technology, and Office of Economics and Analytics.
    • Panel Two. The Commission will hear presentations from the Wireline Competition Bureau and the Rural Broadband Auctions Task Force.
    • Panel Three. The Commission will hear presentations from the Media Bureau and the Incentive Auction Task Force.
    • Panel Four. The Commission will hear presentations from the Consumer and Governmental Affairs Bureau, Enforcement Bureau, and Public Safety and Homeland Security Bureau.
    • Panel Five. The Commission will hear presentations from the Office of Communications Business Opportunities, Office of Managing Director, and Office of General Counsel.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Judith Scharnowski from Pixabay

New Cybersecurity Law and Strategy Unveiled

The EU is revising and replacing a 2016 regime to govern cybersecurity across the bloc.

The European Union (EU) is floating a proposal to reform its 2016 cybersecurity law to address gaps the current regime has left unaddressed. This proposal was released in concert with a new cybersecurity strategy and a statutory proposal addressing physical (i.e. non-cyber) infrastructure. These proposals are the latest in a line of policy changes put forth by the EU’s new leadership to make this decade the EU’s Digital Decade. It may, however, take years for these proposals to become law. For example, the successor to the ePrivacy Directive has been held up in negotiations for the last few years.

New European Commission (EC) President Ursula von der Leyen spelled out her vision for the EU for 2019 through 2024, including “A Europe fit for the digital age.” In its February 2020 “Communication: Shaping Europe’s digital future,” the EC spelled out in more detail how von der Leyen’s vision would be effectuated:

A European cybersecurity strategy, including the establishment of a joint Cybersecurity Unit, a Review of the Security of Network and Information Systems (NIS) Directive and giving a push to the single market for cybersecurity.

To this end, in mid-December 2020, the EC and the High Representative of the Union for Foreign Affairs and Security Policy unveiled a new EU Cybersecurity Strategy and “proposals to address both cyber and physical resilience of critical entities and networks: a Directive on measures for high common level of cybersecurity across the Union (revised NIS Directive or ‘NIS 2′), and a new Directive on the resilience of critical entities.”

Let us turn to the NIS 2 first. This proposal would replace the 2016 “Directive on security of network and information systems (NIS Directive)” ((EU) 2016/1148) currently in effect throughout the EU. NIS 2 would impose new obligations and responsibilities on EU member states and on essential and important entities. The nations of the EU would need to draft and implement cybersecurity frameworks/strategies, which include setting up vulnerability disclosure programs, voluntary cybersecurity information sharing programs, a policy to address information and communications technology (ICT) supply chain risk, and cybersecurity standards for publicly bought and used ICT. EU nations would also need to name “competent” national authorities to enforce NIS 2, as the EC identified lax or non-existent enforcement of existing cybersecurity laws as a rationale for the new proposal. Consequently, such authorities must be empowered to issue binding instructions, warnings, or, if necessary, orders to cease certain conduct. These authorities must also work with data protection authorities in the event of data breaches. NIS 2 also provides for administrative fines and penalties to be established in the laws of EU nations.

Additionally, all EU nations should have computer security incident response teams (CSIRTs). NIS 2 would apply to a number of public and private entities in certain sectors deemed “essential”: energy; transport; banking; financial market infrastructures; health; drinking water; waste water; digital infrastructure; public administration; and space. Some public and private entities in the following sectors would be “important” entities subject to NIS 2: postal and courier services; waste management; manufacture, production and distribution of chemicals; food production, processing and distribution; and manufacturing and digital providers. Micro and small entities would largely not be swept up into NIS 2 even if they are part of one of the aforementioned sectors. However, “providers of electronic communications networks or of publicly available electronic communications services, trust service providers, Top-level domain name (TLD) name registries and public administration, and certain other entities” would be governed by NIS 2 regardless of their size.

The EU would also establish a Cooperation Group tasked with helping EU nations work more harmoniously under NIS 2. However, this new body, unlike, say, the European Data Protection Board (EDPB) created by the General Data Protection Regulation, would not have the power to compel its members to comply with NIS 2.

Notably, NIS 2 would require that: “Member States shall ensure that essential and important entities shall take appropriate and proportionate technical and organisational measures to manage the risks posed to the security of network and information systems which those entities use in the provision of their services.” The law lists a number of elements that must go into these measures. Moreover, “essential and important entities notify, without undue delay, the competent authorities or the CSIRT…of any incident having a significant impact on the provision of their services.” The NIS 2 lays out broad criteria as to what constitutes a “significant impact:”

  • the incident has caused or has the potential to cause substantial operational disruption or financial losses for the entity concerned;
  • the incident has affected or has the potential to affect other natural or legal persons by causing considerable material or non-material losses.

In order to address ICT supply chain risk, EU countries may elect to “require essential and important entities to certify certain ICT products, ICT services and ICT processes under specific European cybersecurity certification schemes adopted” under the legislation that created the European Union Agency for Cybersecurity (ENISA).

As noted earlier, EU nations must establish systems for essential and important entities to share information but need not compel them to do so. Article 26 provides that nations “shall ensure that essential and important entities may exchange relevant cybersecurity information among themselves including information relating to cyber threats, vulnerabilities, indicators of compromise, tactics, techniques and procedures, cybersecurity alerts and configuration tools.” EU countries must also have a system through which entities that are neither essential nor important, or that fall outside the sectors covered by NIS 2, can voluntarily submit information.

The EC argued that the NIS Directive is now outdated and is in desperate need of revision to reflect current realities:

Notwithstanding its notable achievements, the NIS Directive, which paved the way for a significant change in mind-set, in relation to the institutional and regulatory approach to cybersecurity in many Member States, has also proven its limitations. The digital transformation of society (intensified by the COVID-19 crisis) has expanded the threat landscape and is bringing about new challenges which require adapted and innovative responses. The number of cyber-attacks continues to rise, with increasingly sophisticated attacks coming from a wide range of sources inside and outside the EU.

The EC highlighted some of the limitations in how the NIS Directive has been implemented by EU member states and its failure to drive the adoption of better cyber practices by EU businesses:

The evaluation on the functioning of the NIS Directive, conducted for the purposes of the Impact Assessment, identified the following issues: (1) the low level of cyber resilience of businesses operating in the EU; (2) the inconsistent resilience across Member States and sectors; and (3) the low level of joint situational awareness and lack of joint crisis response. For example, certain major hospitals in a Member State do not fall within the scope of the NIS Directive and hence are not required to implement the resulting security measures, while in another Member State almost every single healthcare provider in the country is covered by the NIS security requirements.

The EC explained how the NIS 2 relates to a proposal released the same day to address physical infrastructure in the EU:

The proposal is therefore closely aligned with the proposal for a Directive on the resilience of critical entities, which aims at enhancing the resilience of critical entities against physical threats in a large number of sectors. The proposal aims to ensure that competent authorities under both legal acts take complementary measures and exchange information as necessary regarding cyber and non-cyber resilience, and that particularly critical operators in the sectors considered to be ‘essential’ per the proposal at hand are also subject to more general resilience-enhancing obligations with an emphasis on non-cyber risks.

The EC’s impact assessment on how well the NIS Directive is working shows limitations in scope and application, some of which may be attributed to changes in the EU and the world:

  • The scope of the NIS Directive is too limited in terms of the sectors covered, mainly due to: (i) increased digitisation in recent years and a higher degree of interconnectedness, (ii) the scope of the NIS Directive no longer reflecting all digitised sectors providing key services to the economy and society as a whole.
  • The NIS Directive is not sufficiently clear when it comes to the scope for operators of essential services and its provisions do not provide sufficient clarity regarding national competence over digital service providers. This has led to a situation in which certain types of entities have not been identified in all Member States and are therefore not required to put in place security measures and report incidents.
  • The NIS Directive allowed wide discretion to the Member States when laying down security and incident reporting requirements for operators of essential services (hereinafter called ‘OES(s)’). The evaluation shows that in some instances Member States have implemented these requirements in significantly different ways, creating additional burden for companies operating in more than one Member State.
  • The supervision and enforcement regime of the NIS Directive is ineffective. For example, Member States have been very reluctant to apply penalties to entities failing to put in place security requirements or report incidents. This can have negative consequences for the cyber resilience of individual entities.
  • The financial and human resources set aside by Member States for fulfilling their tasks (such as OES identification or supervision), and consequently the different levels of maturity in dealing with cybersecurity risks, vary greatly. This further exacerbates the differences in cyber resilience between Member States.
  • Member States do not share information systematically with one another, with negative consequences in particular for the effectiveness of the cybersecurity measures and for the level of joint situational awareness at EU level. This is also the case for information sharing among private entities, and for the engagement between the EU level cooperation structures and private entities.

The EC’s proposal contains a summary of what the new law would do:

  • The Directive, in particular: (a) lays down obligations for the Member States to adopt a national cybersecurity strategy, designate competent national authorities, single points of contact and CSIRTs; (b) provides that Member States shall lay down cybersecurity risk management and reporting obligations for entities referred to as essential entities in Annex I and important entities in Annex II; (c) provides that Member States shall lay down obligations on cybersecurity information sharing.
  • It applies to certain public or private essential entities operating in the sectors listed in Annex I (energy; transport; banking; financial market infrastructures; health, drinking water; waste water; digital infrastructure; public administration and space) and certain important entities operating in the sectors listed in Annex II (postal and courier services; waste management; manufacture, production and distribution of chemicals; food production, processing and distribution; manufacturing and digital providers). Micro and small entities within the meaning of Commission Recommendation 2003/361/EC of 6 May 2003 are excluded from the scope of the Directive, except for providers of electronic communications networks or of publicly available electronic communications services, trust service providers, Top-level domain name (TLD) name registries and public administration, and certain other entities, such as the sole provider of a service in a Member State.
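The scope rules just quoted can be read as a simple membership test: Annex I sectors yield essential entities, Annex II important ones, and micro and small enterprises fall outside the Directive unless they belong to one of the named exception categories. The function below is a simplification for intuition only; the sector names and exception list are abbreviated from the proposal, and the real text contains further nuances.

```python
ANNEX_I = {"energy", "transport", "banking", "financial market infrastructures",
           "health", "drinking water", "waste water", "digital infrastructure",
           "public administration", "space"}
ANNEX_II = {"postal and courier services", "waste management", "chemicals",
            "food", "manufacturing", "digital providers"}
# Micro/small entities remain in scope if they fall in these categories
# (abbreviated from the proposal's list of exceptions).
SIZE_CAP_EXCEPTIONS = {"electronic communications provider",
                       "trust service provider", "TLD name registry",
                       "public administration", "sole provider"}

def nis2_status(sector, is_micro_or_small=False, exception=None):
    """Simplified sketch of the NIS 2 scope test described above."""
    if is_micro_or_small and exception not in SIZE_CAP_EXCEPTIONS:
        return "out of scope (micro/small exclusion)"
    if sector in ANNEX_I:
        return "essential entity"
    if sector in ANNEX_II:
        return "important entity"
    return "out of scope"
```

For example, a large energy utility would be essential, a waste management firm important, and a small manufacturer out of scope unless it meets an exception.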

The EC also released “The EU’s Cybersecurity Strategy for the Digital Decade” alongside the NIS 2 “to ensure a global and open Internet with strong guardrails to address the risks to the security and fundamental rights and freedoms of people in Europe.” The EC spelled out its dramatic plan to remake how the bloc regulates, invests in, and structures policies around cybersecurity. The EC claimed “[a]s a key component of Shaping Europe’s Digital Future, the Recovery Plan for Europe  and the EU Security Union Strategy, the Strategy will bolster Europe’s collective resilience against cyber threats and help to ensure that all citizens and businesses can fully benefit from trustworthy and reliable services and digital tools.” If the EU follows through, this strategy may have significant effects in the EU and around the world.

The EC further explained:

  • Following the progress achieved under the previous strategies, it contains concrete proposals for deploying three principal instruments –regulatory, investment and policy instruments – to address three areas of EU action – (1) resilience, technological sovereignty and leadership, (2) building operational capacity to prevent, deter and respond, and (3) advancing a global and open cyberspace. The EU is committed to supporting this strategy through an unprecedented level of investment in the EU’s digital transition over the next seven years – potentially quadrupling previous levels – as part of new technological and industrial policies and the recovery agenda
  • Cybersecurity must be integrated into all these digital investments, particularly key technologies like Artificial Intelligence (AI), encryption and quantum computing, using incentives, obligations and benchmarks. This can stimulate the growth of the European cybersecurity industry and provide the certainty needed to ease the phasing out of legacy systems. The European Defence Fund (EDF) will support European cyber defence solutions, as part of the European defence technological and industrial base. Cybersecurity is included in external financial instruments to support our partners, notably the Neighbourhood, Development and International Cooperation Instrument. Preventing the misuse of technologies, protecting critical infrastructure and ensuring the integrity of supply chains also enables the EU’s adherence to the UN norms, rules and principles of responsible state behavior.

Per the EC’s press release, the “Directive on the resilience of critical entities” “expands both the scope and depth of the 2008 European Critical Infrastructure directive.” The EC added:

Ten sectors are now covered: energy, transport, banking, financial market infrastructures, health, drinking water, waste water, digital infrastructure, public administration and space. Under the proposed directive, Member States would each adopt a national strategy for ensuring the resilience of critical entities and carry out regular risk assessments. These assessments would also help identify a smaller subset of critical entities that would be subject to obligations intended to enhance their resilience in the face of non-cyber risks, including entity-level risk assessments, taking technical and organisational measures, and incident notification. The Commission, in turn, would provide complementary support to Member States and critical entities, for instance by developing a Union-level overview of cross-border and cross-sectoral risks, best practice, methodologies, cross-border training activities and exercises to test the resilience of critical entities.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Prawny from Pixabay

UK and EU Defer Decision On Data Flows

The question of whether there will be an adequacy decision allowing the free flow of personal data under the GDPR from the EU to the recently departed UK has been punted. And, its recent status as a member of the EU notwithstanding, the UK might not get an adequacy decision.

In settling many aspects of the United Kingdom’s (UK) exit from the European Union (EU), negotiators did not reach agreement on whether the EU would permit the personal data of EU persons to continue flowing to the UK under the easiest means possible. Instead, the EU and UK agreed to let the status quo continue until an adequacy decision is made or six months lapse. Data flows between the UK and EU were valued at more than £100 billion in 2017, according to British estimates, with the majority of this trade flowing from the UK to the EU.

Under the General Data Protection Regulation (GDPR), the personal data of EU persons can be transferred to other nations for most purposes once the European Commission (EC) has found that the other nation provides protections essentially equivalent to those granted in the EU. Of course, this has been an ongoing issue with data flows to the United States (U.S.), as two agreements (Safe Harbor and Privacy Shield) and their EC adequacy decisions were ruled illegal, in large part because, according to the EU’s highest court, U.S. law does not provide EU persons with the same rights they have in the EU. Most recently, this occurred in 2020 when the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the EU-U.S. Privacy Shield (aka Schrems II). It bears noting that transfers of personal data may occur through other, more resource-intensive means under the GDPR: standard data protection clauses (SCC), binding corporate rules (BCR), and others.

Nevertheless, an adequacy decision is seen as the most desirable means of transfer, and the question of whether the UK’s laws are sufficient has lingered over the Brexit discussions, with some claiming that the nation’s membership in the Five Eyes surveillance alliance with the U.S. and others may disqualify the UK. Given the range of thorny issues the UK and EU punted (e.g. how to handle the border between Northern Ireland and Ireland), it is not surprising that the GDPR and data flows were also punted.

The UK-EU Trade and Cooperation Agreement (TCA) spells out the terms of the data flow agreement and, as noted, in the short term the status quo will continue, with data flows to the UK being treated as if the UK were still part of the EU. This state will persist until the EC reaches an adequacy decision, or for four months, with another two months of the status quo possible in the absence of an adequacy decision so long as neither the UK nor the EU objects. Moreover, these provisions are operative only so long as the UK keeps its GDPR-compliant data protection law (i.e. the UK Data Protection Act 2018) in place and does not exercise specified “designated powers.” The UK has also deemed EU, European Economic Area (EEA), and European Free Trade Association (EFTA) nations adequate for purposes of data transfers from the UK on a transitional basis.
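The interim "bridge" just described boils down to a handful of conditions, which can be sketched as a simple function. This is an illustration of the logic as summarized above, not legal advice, and the parameter names are my own shorthand.

```python
def bridge_applies(months_elapsed, adequacy_decision_made, extension_objected,
                   uk_regime_intact, designated_powers_exercised):
    """Illustrative sketch (not legal advice) of the TCA's interim
    'bridge' for EU-to-UK data flows: the status quo runs until an
    adequacy decision, for at most four months, plus a further two
    months unless either party objects, and only while the UK keeps
    its GDPR-derived regime and does not exercise the specified
    'designated powers'."""
    if adequacy_decision_made:
        return False  # an adequacy decision supersedes the bridge
    if not uk_regime_intact or designated_powers_exercised:
        return False  # the bridge's conditions no longer hold
    limit = 4 if extension_objected else 6  # months
    return months_elapsed < limit
```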

Specifically, the TCA provides:

For the duration of the specified period, transmission of personal data from the Union to the United Kingdom shall not be considered as transfer to a third country under Union law, provided that the data protection legislation of the United Kingdom on 31 December 2020, as it is saved and incorporated into United Kingdom law by the European Union (Withdrawal) Act 2018 and as modified by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 (“the applicable data protection regime”), applies and provided that the United Kingdom does not exercise the designated powers without the agreement of the Union within the Partnership Council.

The UK also agreed to notify the EU if it “enters into a new instrument which can be relied on to transfer personal data to a third country under Article 46(2)(a) of the UK GDPR or section 75(1)(a) of the UK Data Protection Act 2018 during the specified period.” However, if the EU were to object, it appears from the terms of the TCA, all the EU could do is force the UK “to discuss the relevant object.” And yet, should the UK sign a treaty allowing personal data to flow to a nation the EU deems inadequate, this could obviously adversely affect the UK’s prospects of getting an adequacy decision.

Not surprisingly, the agreement also pertains to the continued flow of personal data as part of criminal investigations and law enforcement matters but not national security matters. Moreover, these matters fall outside the scope of the GDPR and would not be affected in many ways by an adequacy decision or a lack of one. In a British government summary, it is stated that the TCA

provide[s] for law enforcement and judicial cooperation between the UK, the EU and its Member States in relation to the prevention, investigation, detection and prosecution of criminal offences and the prevention of and fight against money laundering and financing of terrorism.

The text of the TCA makes clear that national security matters vis-à-vis data flows and information sharing are not covered:

This Part only applies to law enforcement and judicial cooperation in criminal matters taking place exclusively between the United Kingdom, on the one side, and the Union and the Member States, on the other side. It does not apply to situations arising between the Member States, or between Member States and Union institutions, bodies, offices and agencies, nor does it apply to the activities of authorities with responsibilities for safeguarding national security when acting in that field.

The TCA also affirms:

  • The cooperation provided for in this Part is based on the Parties’ long-standing commitment to ensuring a high level of protection of personal data.
  • To reflect that high level of protection, the Parties shall ensure that personal data processed under this Part is subject to effective safeguards in the Parties’ respective data protection regimes…

The United Kingdom’s data protection authority (DPA), the Information Commissioner’s Office (ICO), issued an explanation of how British law enforcement entities should act in light of the TCA. The ICO advised British entities receiving law enforcement-related data transfers from the EU:

  • We are now a ‘third country’ for EU data protection purposes. If you receive personal data from a law enforcement partner in the EU, this means the sender will need to comply with the transfer provisions under their national data protection law (which are likely to be similar to those in Part 3 of the DPA 2018).
  • This means the EU sender needs to make sure other appropriate safeguards are in place – probably through a contract or other binding legal instrument, or by making their own assessment of appropriate safeguards. The sender can take into account the protection provided by the DPA 2018 itself when making this assessment.
  • If you receive personal data from other types of organisations in the EU or EEA who are subject to the GDPR, the sender will need to comply with the transfer provisions of the UK GDPR. You may want to consider putting standard contractual clauses (SCCs) in place to ensure adequate safeguards in these cases. We have produced an interactive tool to help you use the SCCs.

The ICO explained for transfers from the UK to the EU (but not the EEA):

  • There is a transitional adequacy decision in place to cover transfers to EU member states and Gibraltar. This will not extend to EEA countries outside the EU, where you should continue to consider other safeguards.
  • This means you can continue to send personal data from the UK to your law enforcement partners in the EU, as long as you can show the transfer is necessary for law enforcement purposes. You can also transfer personal data to non-law enforcement bodies in the EU if you can meet some additional conditions, but you will need to notify the ICO.

Turning back to an adequacy decision and commercial transfers of personal data from the EU to the UK, in what may well be a preview of a world in which there is no adequacy decision between the UK and EU, the European Data Protection Board (EDPB) issued an “information note” in mid-December that spells out how the GDPR would be applied:

  • In the absence of an adequacy decision applicable to the UK as per Article 45 GDPR, such transfers will require appropriate safeguards (e.g., standard data protection clauses, binding corporate rules, codes of conduct…), as well as enforceable data subject rights and effective legal remedies for data subjects, in accordance with Article 46 GDPR.
  • Subject to specific conditions, it may still be possible to transfer personal data to the UK based on a derogation listed in Article 49 GDPR. However, Article 49 GDPR has an exceptional nature and the derogations it contains must be interpreted restrictively and mainly relate to processing activities that are occasional and non-repetitive.
  • Moreover, where personal data are transferred to the UK on the basis of Article 46 GDPR safeguards, supplementary measures might be necessary to bring the level of protection of the data transferred up to the EU standard of essential equivalence, in accordance with the Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data.
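The EDPB's order of analysis amounts to a short decision sequence, sketched below. This is an illustration of that sequence only, not a compliance tool; real transfers require case-by-case legal assessment, and the return strings are my own paraphrases.

```python
def eu_uk_transfer_basis(adequacy_decision, art46_safeguards,
                         occasional_and_nonrepetitive):
    """Hedged sketch of the EDPB's sequence for EU-to-UK transfers:
    Article 45 adequacy first; failing that, Article 46 appropriate
    safeguards (SCCs, BCRs, codes of conduct), possibly with the
    supplementary measures of Recommendations 01/2020; failing that,
    the Article 49 derogations, which are exceptional and construed
    restrictively."""
    if adequacy_decision:
        return "Article 45 adequacy decision"
    if art46_safeguards:
        return "Article 46 safeguards (plus supplementary measures if needed)"
    if occasional_and_nonrepetitive:
        return "Article 49 derogation (interpreted restrictively)"
    return "no lawful basis for the transfer"
```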

Regarding commercial data transfers, the ICO issued a statement urging British entities to start setting up “alternative transfer mechanisms” to ensure data continues to flow from the EU to UK:

  • The Government has announced that the Treaty agreed with the EU will allow personal data to flow freely from the EU (and EEA) to the UK, until adequacy decisions have been adopted, for no more than six months.
  • This will enable businesses and public bodies across all sectors to continue to freely receive data from the EU (and EEA), including law enforcement agencies.
  • As a sensible precaution, before and during this period, the ICO recommends that businesses work with EU and EEA organisations who transfer personal data to them, to put in place alternative transfer mechanisms, to safeguard against any interruption to the free flow of EU to UK personal data.

However, even though these more restrictive means of transferring personal data to the UK exist, there will likely be legal challenges. It bears noting that, in light of Schrems II, EU DPAs are likely to apply a much higher level of scrutiny to SCCs, and challenges to the legality of using SCCs to transfer personal data to the U.S. have already commenced. It seems all but certain the legality of using SCCs to transfer data to the UK would be challenged as well.

However, returning to the preliminary issue of whether the EC will give the UK an adequacy decision, there may be a number of obstacles to a finding that the UK’s data protection and surveillance laws are indeed adequate under EU law.[1] Firstly, the UK’s surveillance practices in light of a recent set of CJEU rulings may prove difficult for the EC to stomach. In 2020, the CJEU handed down a pair of rulings (here and here) on the extent to which European Union (EU) nations may engage in bulk, indiscriminate collection of two types of data related to electronic communications. The CJEU found that while EU member nations may conduct these activities to combat crime or national security threats during periods limited by necessity and subject to oversight, they may not impose general, indiscriminate obligations on providers of electronic communications to retain and provide location and traffic data absent a genuine, present or foreseeable national security threat. The CJEU combined three cases from the UK, France, and Belgium into two rulings to elucidate the reach of the Privacy and Electronic Communications Directive in relation to foundational EU laws.

The UK is, of course, one of the U.S.’s staunchest allies and partners when it comes to government surveillance of electronic communications. On this point, the CJEU summarized the beginning of the case out of the UK:

  • At the beginning of 2015, the existence of practices for the acquisition and use of bulk communications data by the various security and intelligence agencies of the United Kingdom, namely GCHQ, MI5 and MI6, was made public, including in a report by the Intelligence and Security Committee of Parliament (United Kingdom). On 5 June 2015, Privacy International, a non-governmental organisation, brought an action before the Investigatory Powers Tribunal (United Kingdom) against the Secretary of State for Foreign and Commonwealth Affairs, the Secretary of State for the Home Department and those security and intelligence agencies, challenging the lawfulness of those practices.

Secondly, the government of Prime Minister Boris Johnson may aspire to change data laws in ways the EU does not. In media accounts, unnamed EC officials were critical of the UK’s 2020 “National Data Strategy,” particularly references to “legal barriers (real and perceived)” to accessing data that “must be addressed.”

Thirdly, it may become a matter of politics. The EU has incentives to make the UK’s exit from the EU difficult in order to dissuade other nations from following the same path. Moreover, having previously been the second largest economy in the EU as measured by GDP, the UK may prove a formidable economic competitor, lending more weight to the view that the EU may not want to help the UK’s businesses compete with the EU’s.


Image by succo from Pixabay


[1] European Union Parliament, “The EU-UK relationship beyond Brexit: options for Police Cooperation and Judicial Cooperation in Criminal Matters,” Page 8: Although the UK legal framework is currently broadly in line with the EU legal framework and the UK is a signatory to the European Convention on Human Rights (ECHR), there are substantial questions over whether the Data Protection Act fully incorporates the data protection elements required by the Charter of Fundamental Rights, concerning the use of the national security exemption from the GDPR used by the UK, the retention of data and bulk powers granted to its security services, and over its onward transfer of this data to third country security partners such as the ‘Five Eyes’ partners (Britain, the USA, Australia, New Zealand and Canada).

Further Reading, Other Developments, and Coming Events (5 January 2021)

Further Reading

  • “China Used Stolen Data To Expose CIA Operatives In Africa And Europe;” “Beijing Ransacked Data as U.S. Sources Went Dark in China;” “Tech Giants Are Giving China A Vital Edge In Espionage” By Zach Dorfman — Foreign Policy. This terrifying trio of articles lays bare the 180-degree change in espionage advantage the People’s Republic of China (PRC) seems to hold over the United States (U.S.). Hacking, big data, processing, algorithms, and other technological issues play prominent roles in the PRC’s seeming advantage. It remains to be seen how the U.S. responds to the new status quo.
  • Singapore police can access COVID-19 contact tracing data for criminal investigations” By Eileen Yu — ZDNet. During questioning in Singapore’s Parliament, it was revealed the police can use existing authority to access the data on a person’s smartphone collected by the nation’s TraceTogether app. Technically, this would entail a person being asked by the police to upload their data, which is stored on devices and encrypted. Nonetheless, this is the very scenario privacy advocates have been saying is all but inevitable with COVID-19 tracing apps on phones.
  • As Understanding of Russian Hacking Grows, So Does Alarm” By David Sanger, Nicole Perlroth, and Julian Barnes — The New York Times. Like a detonated bomb, the Russian hack of United States (U.S.) public and private systems keeps getting worse in terms of damage and fallout. The scope continues to widen as it may come to pass that thousands of U.S. entities have been compromised in ways that leave them vulnerable to future attacks. Incidentally, the massive hack has tarnished somewhat the triumph of the U.S. intelligence agencies in fending off interference with the 2020 election.
  • Google workers launch unconventional union with help of Communications Workers of America” By Nitasha Tiku — The Washington Post. A new union formed in Google stopped short of seeking certification by the National Labor Relations Board (NLRB), which will block it from collective bargaining. Nonetheless, the new union will collect dues and have a board of directors. This may lead to additional unionizing efforts in union-averse Silicon Valley and throughout the tech world.
  • “‘Break up the groupthink’: Democrats press Biden to diversify his tech picks” By Cristiano Lima — Politico. Key Democratic groups in the House are pushing the Biden team to appoint people of color to key technology positions at agencies such as the Federal Trade Commission (FTC), the Federal Communications Commission (FCC), and the Office of Science and Technology Policy (OSTP).

Other Developments

  • The Congress overrode President Donald Trump’s veto of the FY 2021 National Defense Authorization Act (NDAA), thus enacting the annual defense and national security policy bill, which includes a number of technology provisions that will have effects in the public and private sectors. (See here and here for analysis of these provisions in the “William M. “Mac” Thornberry National Defense Authorization Act for Fiscal Year 2021” (H.R. 6395).)
  • A federal court dismissed a lawsuit brought by a civil liberties and privacy advocacy group to stop implementation of President Donald Trump’s executive order aimed at social media companies and their liability protection under 47 USC 230 (aka Section 230). In June, the Center for Democracy and Technology (CDT) filed suit in federal court to block enforcement of the “Executive Order (EO) on Preventing Online Censorship.” However, the United States District Court for the District of Columbia ruled that CDT is not injured by the executive order (EO) and that any such lawsuit is premature. The court dismissed the lawsuit for lack of jurisdiction.
    • In its complaint, CDT argued the EO “violates the First Amendment in two fundamental respects:
      • First, the Order is plainly retaliatory: it attacks a private company, Twitter, for exercising its First Amendment right to comment on the President’s statements.
      • Second, and more fundamentally, the Order seeks to curtail and chill the constitutionally protected speech of all online platforms and individuals— by demonstrating the willingness to use government authority to retaliate against those who criticize the government.”
  • The Federal Trade Commission (FTC) reached a settlement with a company that sells emergency travel and medical services for failing “to take reasonable steps to secure sensitive consumer information such as health records,” including having an unsecured cloud database, which a security researcher stumbled upon, containing the sensitive data of more than 130,000 people. Moreover, the company claimed a certification of compliance with the Health Insurance Portability and Accountability Act (HIPAA), which turned out to be untrue. In the complaint, the FTC alleged that these and other practices “constitute unfair and/or deceptive acts or practices, in or affecting commerce in violation of Section 5(a) of the Federal Trade Commission Act.” The FTC and the company reached agreement on a consent order that will require the company’s compliance for at least 20 years.
    • In the complaint, the FTC stated that SkyMed “advertises, offers for sale, and sells nationwide a wide array of emergency travel membership plans that cover up to eighteen different emergency travel and medical evacuation services for members who sustain serious illnesses or injuries during travel in certain geographic areas.”
    • The FTC asserted a security researcher discovered SkyMed’s “database, which could be located and accessed by anyone on the internet, contained approximately 130,000 membership records with consumers’ personal information stored in plain text, including information populated in certain fields for names, dates of birth, gender, home addresses, email addresses, phone numbers, membership information and account numbers, and health information.”
    • The FTC noted the company told affected customers that it had investigated and “[t]here was no medical or payment-related information visible and no indication that the information has been misused.” This turns out to be completely false, and the company’s “investigation did not determine that consumers’ health information was neither stored on the cloud database, nor improperly accessed by an unauthorized third party.”
    • The FTC summarized the terms of the consent order and SkyMed’s obligations:
      • Under the proposed settlement, SkyMed is prohibited from misrepresenting how it secures personal data, the circumstances of and response to a data breach, and whether the company has been endorsed by or participates in any government-sponsored privacy or security program. The company also will be required to send a notice to affected consumers detailing the data that was exposed by the data breach.
      • As part of the mandated information security program, the company must identify and document potential internal and external risks and design, implement, and maintain safeguards to protect personal information it collects from those risks. In addition, SkyMed must obtain biennial assessments of its information security program by a third party, which the FTC has authority to approve, to examine the effectiveness of SkyMed’s information security program, identify any gaps or weaknesses, and monitor efforts to address these problems. The settlement also requires a senior SkyMed executive to certify annually that the company is complying with the requirements of the settlement.
  • The European Commission (EC) has communicated its vision for a new cybersecurity strategy to the European Parliament and European Council “to ensure a global and open Internet with strong guardrails to address the risks to the security and fundamental rights and freedoms of people in Europe.” The EC spelled out its dramatic plan to remake how the bloc regulates, invests in, and structures policies around cybersecurity. The EC claimed “[a]s a key component of Shaping Europe’s Digital Future, the Recovery Plan for Europe  and the EU Security Union Strategy, the Strategy will bolster Europe’s collective resilience against cyber threats and help to ensure that all citizens and businesses can fully benefit from trustworthy and reliable services and digital tools.” If the European Union (EU) follows through, this strategy may have significant effects in the EU and around the world. The EC further explained:
    • Following the progress achieved under the previous strategies, it contains concrete proposals for deploying three principal instruments – regulatory, investment and policy instruments – to address three areas of EU action – (1) resilience, technological sovereignty and leadership, (2) building operational capacity to prevent, deter and respond, and (3) advancing a global and open cyberspace. The EU is committed to supporting this strategy through an unprecedented level of investment in the EU’s digital transition over the next seven years – potentially quadrupling previous levels – as part of new technological and industrial policies and the recovery agenda.
    • Cybersecurity must be integrated into all these digital investments, particularly key technologies like Artificial Intelligence (AI), encryption and quantum computing, using incentives, obligations and benchmarks. This can stimulate the growth of the European cybersecurity industry and provide the certainty needed to ease the phasing out of legacy systems. The European Defence Fund (EDF) will support European cyber defence solutions, as part of the European defence technological and industrial base. Cybersecurity is included in external financial instruments to support our partners, notably the Neighbourhood, Development and International Cooperation Instrument. Preventing the misuse of technologies, protecting critical infrastructure and ensuring the integrity of supply chains also enables the EU’s adherence to the UN norms, rules and principles of responsible state behavior.
    • With respect to actions that might be taken, the EC stated that “[t]he EU should ensure:
      • Adoption of revised NIS Directive;
      • Regulatory measures for an Internet of Secure Things;
      • Through the CCCN investment in cybersecurity (notably through the Digital Europe Programme, Horizon Europe and recovery facility) to reach up to €4.5 billion in public and private investments over 2021-2027;
      • An EU network of AI-enabled Security Operation Centres and an ultra-secure communication infrastructure harnessing quantum technologies;
      • Widespread adoption of cybersecurity technologies through dedicated support to SMEs under the Digital Innovation Hubs;
      • Development of an EU DNS resolver service as a safe and open alternative for EU citizens, businesses and public administration to access the Internet; and
      • Completion of the implementation of the 5G Toolbox by the second quarter of 2021;
      • Complete the European cybersecurity crisis management framework and determine the process, milestones and timeline for establishing the Joint Cyber Unit;
      • Continue implementation of cybercrime agenda under the Security Union Strategy;
      • Encourage and facilitate the establishment of a Member States’ cyber intelligence working group residing within the EU INTCEN;
      • Advance the EU’s cyber deterrence posture to prevent, discourage, deter and respond to malicious cyber activities;
      • Review the Cyber Defence Policy Framework;
      • Facilitate the development of an EU “Military Vision and Strategy on Cyberspace as a Domain of Operations” for CSDP military missions and operations;
      • Support synergies between civil, defence and space industries; and
      • Reinforce cybersecurity of critical space infrastructures under the Space Programme.
      • Define a set of objectives in international standardisation processes, and promote these at international level;
      • Advance international security and stability in cyberspace, notably through the proposal by the EU and its Member States for a Programme of Action to Advance Responsible State Behaviour in Cyberspace (PoA) in the United Nations;
      • Offer practical guidance on the application of human rights and fundamental freedoms in cyberspace;
      • Better protect children against child sexual abuse and exploitation, as well as a Strategy on the Rights of the Child;
      • Strengthen and promote the Budapest Convention on Cybercrime, including through the work on the Second Additional Protocol to the Budapest Convention;
      • Expand EU cyber dialogue with third countries, regional and international organisations, including through an informal EU Cyber Diplomacy Network;
      • Reinforce the exchanges with the multi-stakeholder community, notably by regular and structured exchanges with the private sector, academia and civil society; and
      • Propose an EU External Cyber Capacity Building Agenda and an EU Cyber Capacity Building Board.
  • The U.S.-China Economic and Security Review Commission released its annual report on the People’s Republic of China (PRC) per its mandate “to monitor, investigate, and report to Congress on the national security implications of the bilateral trade and economic relationship between the United States and the People’s Republic of China.” The Commission argued:
    • Left unchecked, the PRC will continue building a new global order anathema to the interests and values that have underpinned unprecedented economic growth and stability among nations in the post-Cold War era. The past 20 years are littered with the Chinese Communist Party’s (CCP) broken promises. In China’s intended new order, there is little reason to believe CCP promises of “win-win” solutions, mutual respect, and peaceful coexistence. A clear understanding of the CCP’s adversarial national security and economic ambitions is essential as U.S. and allied leaders develop the policies and programs that will define the conditions of global freedom and shape our future.
    • The Commission made ten “Key Recommendations:”
      • Congress adopt the principle of reciprocity as foundational in all legislation bearing on U.S.-China relations.
      • Congress expand the authority of the Federal Trade Commission (FTC) to monitor and take foreign government subsidies into account in premerger notification processes.
      • Congress direct the U.S. Department of State to produce an annual report detailing China’s actions in the United Nations and its subordinate agencies that subvert the principles and purposes of the United Nations.
      • Congress hold hearings to consider the creation of an interagency executive Committee on Technical Standards that would be responsible for coordinating U.S. government policy and priorities on international standards.
      • Congress consider establishing a “Manhattan Project”-like effort to ensure that the American public has access to safe and secure supplies of critical lifesaving and life-sustaining drugs and medical equipment, and to ensure that these supplies are available from domestic sources or, where necessary, trusted allies.
      • Congress enact legislation establishing a China Economic Data Coordination Center (CEDCC) at the Bureau of Economic Analysis at the U.S. Department of Commerce.
      • Congress direct the Administration, when sanctioning an entity in the People’s Republic of China for actions contrary to the economic and national security interests of the United States or for violations of human rights, to also sanction the parent entity.
      • Congress consider enacting legislation to make the Director of the American Institute in Taiwan a presidential nomination subject to the advice and consent of the United States Senate.
      • Congress amend the Immigration and Nationality Act to clarify that association with a foreign government’s technology transfer programs may be considered grounds to deny a nonimmigrant visa if the foreign government in question is deemed a strategic competitor of the United States, or if the applicant has engaged in violations of U.S. laws relating to espionage, sabotage, or export controls.
      • Congress direct the Administration to identify and remove barriers to receiving United States visas for Hong Kong residents attempting to exit Hong Kong for fear of political persecution.
  • The Electronic Privacy Information Center, the Center for Digital Democracy, the Campaign for a Commercial-Free Childhood, the Parent Coalition for Student Privacy, and Consumer Federation of America asked the Federal Trade Commission (FTC) “to recommend specific changes to the proposed Consent Order to safeguard the privacy interests of Zoom users” in their comments submitted regarding the FTC’s settlement with Zoom. In November, the FTC split along party lines to approve a settlement with Zoom to resolve allegations that the video messaging platform violated the FTC Act’s ban on unfair and deceptive practices in commerce. Zoom agreed to a consent order mandating a new information security program, third-party assessments, prompt reporting of covered incidents, and other requirements over a period of 20 years. The two Democratic Commissioners, Rohit Chopra and Rebecca Kelly Slaughter, dissented for reasons that may be summed up as: the FTC let Zoom off with a slap on the wrist, neither punishing the abundant wrongdoing nor dissuading future offenders. Slaughter focused on the majority’s choice to ignore the privacy implications of Zoom’s misdeeds, especially by not including any requirements that Zoom improve its faulty privacy practices.
    • The groups “recommend that the FTC modify the proposed Consent Order and require Zoom to (1) implement a comprehensive privacy program; (2) obtain regular independent privacy assessments and make those assessments available to the public; (3) provide meaningful redress for victims of Zoom’s unfair and deceptive trade practices; and (4) ensure the adequate protection and limits on the collection of children’s data.”

Coming Events

  • On 13 January, the Federal Communications Commission (FCC) will hold its monthly open meeting, and the agency has placed the following items on its tentative agenda: “Bureau, Office, and Task Force leaders will summarize the work their teams have done over the last four years in a series of presentations:”
    • Panel One. The Commission will hear presentations from the Wireless Telecommunications Bureau, International Bureau, Office of Engineering and Technology, and Office of Economics and Analytics.
    • Panel Two. The Commission will hear presentations from the Wireline Competition Bureau and the Rural Broadband Auctions Task Force.
    • Panel Three. The Commission will hear presentations from the Media Bureau and the Incentive Auction Task Force.
    • Panel Four. The Commission will hear presentations from the Consumer and Governmental Affairs Bureau, Enforcement Bureau, and Public Safety and Homeland Security Bureau.
    • Panel Five. The Commission will hear presentations from the Office of Communications Business Opportunities, Office of Managing Director, and Office of General Counsel.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


EU Regulators Settle Dispute Over Proper Punishment of Twitter For Breach

The EDPB uses its GDPR powers to manage a dispute between DPAs.

The European Data Protection Board (EDPB) concluded its first use of powers granted under the General Data Protection Regulation (GDPR) (Regulation (EU) 2016/679 of the European Parliament and of the Council) to resolve a dispute among EU regulators on how to apply the GDPR in punishing a violator. In this case, the EDPB had to referee how Twitter should be punished for a data breach arising from a bug in its Android app. Ireland’s Data Protection Commission (DPC) and a number of concerned supervisory authorities (CSAs) disagreed over how Twitter should be fined for the breach, triggering a previously unused article of the GDPR that put the EDPB in charge of resolving the dispute. The EDPB considered the objections raised by the other EU agencies and found that the DPC needed to recalculate its proposed fine, which had been set at a maximum of $300,000 out of a possible $69.2 million. Thereafter, the DPC revised its decision and concluded that “an administrative fine of €450,000 on Twitter” is “an effective, proportionate and dissuasive measure.”

The DPC issued a revised decision that incorporates the EDPB’s decision on the case, which arose from a glitch that changed a person’s protected tweets to unprotected. Twitter users may protect their tweets, meaning only certain people, usually just followers, can see this content. However, a bug in Twitter’s Android app could thwart a person’s choice to protect their tweets, as the DPC explained:

The bug that resulted in this data breach meant that, if a user operating an Android device changed the email address associated with that Twitter account, their tweets became unprotected and consequently were accessible to the wider public without the user’s knowledge.

The DPC said this breach occurred between September 2017 and January 2019, affecting 88,726 EU and European Economic Area (EEA) users, and on 8 January 2019, Twitter alerted the DPC, triggering an investigation. Twitter revealed:

On 26 December 2018, we received a bug report through our bug bounty program that if a Twitter user with a protected account, using Twitter for Android, changed their email address the bug would result in their account being unprotected.

Article 33(1) of the GDPR requires breaches to be reported to a DPA within 72 hours in most cases:

In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay.

However, Twitter conceded it had not reported the breach within the 72-hour window, explaining:

The severity of the issue – and that it was reportable – was not appreciated until 3 January 2019 at which point Twitter’s incident response process was put into action.

Additionally, Article 33(5) would become relevant during the DPC investigation:

The controller shall document any personal data breaches, comprising the facts relating to the personal data breach, its effects and the remedial action taken. That documentation shall enable the supervisory authority to verify compliance with this Article.

Consequently, Twitter had a responsibility as the controller to document all the relevant facts about the data breach and then to report the breach within 72 hours of becoming aware of the breach subject to a range of exceptions.
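As a rough illustration, the 72-hour clock in this case can be checked with a few lines of Python (a sketch only; the dates are those reported above, and since the exact hours are not public, midnight is assumed for both):

```python
from datetime import datetime, timedelta

# Dates from the case: Twitter treated the breach as reportable on
# 3 January 2019; the DPC was notified on 8 January 2019.
# Exact times are not reported, so midnight is assumed (hypothetical).
aware = datetime(2019, 1, 3)
notified = datetime(2019, 1, 8)

# Article 33(1): notify the supervisory authority within 72 hours of awareness.
deadline = aware + timedelta(hours=72)

print(deadline)              # 2019-01-06 00:00:00
print(notified <= deadline)  # False: the notification missed the window
```

Even on these generous assumptions about timing, the five-day gap between awareness and notification falls outside the 72-hour window, which is why TIC’s explanation for the delay became central to the DPC’s Article 33(1) analysis.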

Shortly thereafter, the DPC named itself the lead supervisory authority (LSA), investigated, and reached its proposed decision in late April, which it then shared with the concerned supervisory authorities as the GDPR requires. And this is where the need for the EDPB to step in began.

Irish Data Protection Commissioner Helen Dixon explained the scope of the subsequent investigation:

  1. Whether Twitter International Company (TIC) complied with its obligations, in accordance with Article 33(1) GDPR, to notify the Commission of the Breach without undue delay and, where feasible, not later than 72 hours after having become aware of it; and
  2. Whether TIC complied with its obligation under Article 33(5) to document the Breach.

Dixon found that TIC did not comply with Article 33(1) and found unpersuasive TIC’s main claim that, because Twitter, Inc., its processor under EU law, did not alert TIC in a timely fashion, it need not meet the 72-hour window. Moreover, Dixon found TIC did not meet its Article 33(5) obligation to document the breach in a manner that would allow its compliance with Article 33 to be verified. However, the size of the fine became the issue necessitating that the EDPB step in, because the Austrian Supervisory Authority (Österreichische Datenschutzbehörde), the German Supervisory Authority (Der Hamburgische Beauftragte für Datenschutz und Informationsfreiheit) and the Italian Supervisory Authority (Garante per la protezione dei dati personali) made “relevant and reasoned” objections.

Per the GDPR, the EDPB intervened. Article 60 of the GDPR provides that if a CSA “expresses a relevant and reasoned objection to the draft decision [of the LSA], the lead supervisory authority shall, if it does not follow the relevant and reasoned objection or is of the opinion that the objection is not relevant or reasoned, submit the matter to the consistency mechanism.” Article 65 also provides that where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned,” the EDPB must step in and work towards a final binding decision. This process was designed so that enforcement of the GDPR would be uniform throughout the EU and to forestall the possibility that one DPA or a small group of DPAs would construe the data protection regime in ways contrary to its intention. As it is, there have already been allegations that some DPAs have been ineffective or lenient towards alleged offenders.

In its mid-November statement, the EDPB said it “adopted by a 2/3 majority of its members its first dispute resolution decision on the basis of Art. 65 GDPR.” The EDPB stated:

The Irish supervisory authority (SA) issued the draft decision following an own-volition inquiry and investigations into Twitter International Company, after the company notified the Irish SA of a personal data breach on 8 January 2019. In May 2020, the Irish SA shared its draft decision with the concerned supervisory authorities (CSAs) in accordance with Art. 60 (3) GDPR. The CSAs then had four weeks to submit their relevant and reasoned objections (RROs). Among others, the CSAs issued RROs on the infringements of the GDPR identified by the lead supervisory authority (LSA), the role of Twitter International Company as the (sole) data controller, and the quantification of the proposed fine.

It appears from the EDPB’s statement that other DPAs/SAs had objected to the size of the fine (which can be as high as 2% of annual revenue), to how Twitter violated the GDPR, and to Twitter’s culpability, specifically whether it was the sole controller of the personal data or whether other controllers might also be held responsible.

According to the DPC, the EDPB ultimately decided that

…the [DPC] is required to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC, and to amend its Draft Decision by increasing the level of the fine in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.

Dixon went back and reasoned through the breach and compliance. She stressed that the GDPR infringements were largely separate from the substance of the breach, which is why the administrative fine was low. Nonetheless, Dixon reexamined the evidence in light of the EDPB’s decision and concluded in relevant part:

  • I therefore consider that the nature of the obligations arising under Article 33(1) and Article 33(5) are such that, compliance is central to the overall functioning of the supervision and enforcement regime performed by supervisory authorities in relation to both the specific issue of personal data breaches but also the identification and assessment of wider issues of non-compliance by controllers. As such, non-compliance with these obligations has serious consequences in that it risks undermining the effective exercise by supervisory authorities of their functions under the GDPR. With regard to the nature of the specific infringements in these circumstances, it is clear, having regard to the foregoing, that in the circumstances of this case, the delayed notification under Article 33(1) inevitably delayed the Commission’s assessment of the Breach. With regard to Article 33(5), the deficiencies in the “documenting” of the Breach by TIC impacted on the Commission’s overall efficient assessment of the Breach, necessitating the raising of multiple queries concerning the facts and sequencing surrounding the notification of the Breach.
  • Accordingly, having regard to the potential for damage to data subjects caused by the delayed notification to the Commission (which I have set out above in the context of Article 83(2)(a)), the corollary of this is that any category of personal data could have been affected by the delayed notification. Whilst, as stated above, there was no direct evidence of damage, at the same time, it cannot be definitively said that there was no damage to data subjects or no affected categories of personal data.

Dixon also recalculated the fine, which she noted was capped at the higher of €10 million or 2% of annual worldwide revenue, after once again turning aside TIC’s argument that it is independent of Twitter, Inc. for purposes of determining a fine. Dixon determined the appropriate administrative fine would be about $500,000; Twitter’s worldwide revenue was $3.46 billion in 2019, meaning a maximum penalty of $69.2 million. Dixon explained:

Having regard to all of the foregoing, and, in particular, having had due regard to all of the factors which I am required to consider under Articles 83(2)(a) to (k), as applicable, and in the interests of effectiveness, proportionality and deterrence, and in light of the re-assessment of the elements I have implemented and documented above in accordance with the EDPB Decision, I have decided to impose an administrative fine of $500,000, which equates (in my estimation for this purpose) to €450,000. In deciding to impose a fine in this amount, I have had regard to the previous range of the fine, set out in the Draft Decision (of $150,000 – $300,000), and to the binding direction in the EDPB Decision, at paragraph 207 thereof, that the level of the fine should be increased “..in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.”
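The arithmetic behind the cap and the final fine can be reproduced in a few lines (a sketch using only the figures reported in the decision):

```python
# Figures as reported: Twitter's 2019 worldwide revenue and the GDPR
# Article 83(4) cap of EUR 10 million or 2% of worldwide annual turnover,
# whichever is higher (2% of turnover is operative here).
revenue_usd = 3.46e9           # Twitter's 2019 worldwide revenue
cap_usd = 0.02 * revenue_usd   # 2% of turnover

fine_usd = 500_000             # the fine the DPC ultimately imposed

print(cap_usd)                             # 69200000.0 -> the $69.2 million maximum
print(round(100 * fine_usd / cap_usd, 2))  # 0.72 -> under 1% of the maximum
```

The fine’s small size relative to the cap reflects Dixon’s view that the infringements concerned notification and documentation rather than the underlying breach itself.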

In its Article 65 decision, the EDPB judged the various objections to the DPC’s proposed decision against Article 4(24) of the GDPR:

‘relevant and reasoned objection’ means an objection to a draft decision as to whether there is an infringement of this Regulation, or whether envisaged action in relation to the controller or processor complies with this Regulation, which clearly demonstrates the significance of the risks posed by the draft decision as regards the fundamental rights and freedoms of data subjects and, where applicable, the free flow of personal data within the Union;

The EDPB ultimately decided “the fine proposed in the Draft Decision is too low and therefore does not fulfil its purpose as a corrective measure, in particular it does not meet the requirements of Article 83(1) GDPR of being effective, dissuasive and proportionate.” The EDPB directed the DPC “to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC so as to ensure it is appropriate to the facts of the case.” However, the EDPB turned aside a number of other objections raised by EU DPAs as failing to meet the standard of review in Article 4(24):

  • the competence of the LSA;
  • the qualification of the roles of TIC and Twitter, Inc., respectively;
  • the infringements of the GDPR identified by the LSA;
  • the existence of possible additional (or alternative) infringements of the GDPR;
  • the lack of a reprimand.

However, the EDPB stressed:

Regarding the objections deemed not to meet the requirements stipulated by Art 4(24) GDPR, the EDPB does not take any position on the merit of any substantial issues raised by these objections. The EDPB reiterates that its current decision is without any prejudice to any assessments the EDPB may be called upon to make in other cases, including with the same parties, taking into account the contents of the relevant draft decision and the objections raised by the CSAs.



Further Reading, Other Developments, and Coming Events (4 January 2021)

Further Reading

  • “Microsoft Says Russian Hackers Viewed Some of Its Source Code” By Nicole Perlroth — The New York Times. The Sluzhba vneshney razvedki Rossiyskoy Federatsii’s (SVR) hack keeps growing and growing with Microsoft admitting its source code was viewed through an employee account. It may be that authorized Microsoft resellers were one of the vectors by which the SVR accessed SolarWinds, FireEye, and ultimately a number of United States (U.S.) government agencies. Expect more revelations to come about the scope and breadth of entities and systems the SVR compromised.
  • “In 2020, we reached peak Internet. Here’s what worked — and what flopped.” By Geoffrey Fowler — The Washington Post. The newspaper’s tech columnist reviews the technology used during the pandemic and what is likely to stay with us when life returns to some semblance of normal.
  • “Facebook Says It’s Standing Up Against Apple For Small Businesses. Some Of Its Employees Don’t Believe It.” By Craig Silverman and Ryan Mac — BuzzFeed News. Again, two of the best-sourced journalists when it comes to Facebook have exposed employee dissent within the social media and advertising giant, this time over the company’s advertising blitz positioning it as the champion of small businesses that allegedly stand to be hurt when Apple rolls out iOS 14, which will allow users to block the kind of tracking across apps and the internet that Facebook thrives on. The company’s PR campaign stands in contrast to anecdotal stories about errors that harmed and impeded small companies using Facebook to advertise and sell products and services to customers.
  • “SolarWinds hack spotlights a thorny legal problem: Who to blame for espionage?” By Tim Starks — cyberscoop. This piece previews possible and likely inevitable litigation to follow from the SolarWinds hack, including possible securities actions based on fishy dumps of stock by executives, breach of contract claims, and negligence claims for failing to patch and address vulnerabilities in a timely fashion. Federal and state regulators will probably get on the field, too. But this will likely take years to play out; Home Depot settled claims arising from its 2014 breach with state attorneys general only in November 2020.
  • “The Tech Policies the Trump Administration Leaves Behind” By Aaron Boyd — Nextgov. A look back at the good, the bad, and the ugly of the Trump Administration’s technology policies, some of which will live on in the Biden Administration.

Other Developments

  • In response to the SolarWinds hack, the Federal Bureau of Investigation (FBI), the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), and the Office of the Director of National Intelligence (ODNI) issued a joint statement indicating that the process established pursuant to Presidential Policy Directive (PPD) 41, an Obama Administration policy, has been activated and a Cyber Unified Coordination Group (UCG) has been formed “to coordinate a whole-of-government response to this significant cyber incident.” The agencies explained “[t]he UCG is intended to unify the individual efforts of these agencies as they focus on their separate responsibilities.”
    • In PPD-41 it is explained that a UCG “shall serve as the primary method for coordinating between and among Federal agencies in response to a significant cyber incident as well as for integrating private sector partners into incident response efforts, as appropriate.” Moreover, “[t]he Cyber UCG is intended to result in unity of effort and not to alter agency authorities or leadership, oversight, or command responsibilities.”
  • Following the completion of its “in-depth” investigation, the European Commission (EC) cleared Google’s acquisition of Fitbit with certain conditions, removing a significant hurdle for the American multinational in buying the wearable fitness tracker company. In its press release, the EC explained that after its investigation, “the Commission had concerns that the transaction, as initially notified, would have harmed competition in several markets.” To address and allay concerns, Google bound itself for ten years to a set of commitments that can be unilaterally extended by the EC and will be enforced, in part, by the appointment of a trustee to oversee compliance.
    • The EC was particularly concerned about:
      • Advertising: By acquiring Fitbit, Google would acquire (i) the database maintained by Fitbit about its users’ health and fitness; and (ii) the technology to develop a database similar to that of Fitbit. By increasing the already vast amount of data that Google could use for the personalisation of ads, it would be more difficult for rivals to match Google’s services in the markets for online search advertising, online display advertising, and the entire “ad tech” ecosystem. The transaction would therefore raise barriers to entry and expansion for Google’s competitors for these services to the detriment of advertisers, who would ultimately face higher prices and have less choice.
      • Access to Web Application Programming Interface (‘API’) in the market for digital healthcare: A number of players in this market currently access health and fitness data provided by Fitbit through a Web API, in order to provide services to Fitbit users and obtain their data in return. The Commission found that following the transaction, Google might restrict competitors’ access to the Fitbit Web API. Such a strategy would come especially at the detriment of start-ups in the nascent European digital healthcare space.
      • Wrist-worn wearable devices: The Commission is concerned that following the transaction, Google could put competing manufacturers of wrist-worn wearable devices at a disadvantage by degrading their interoperability with Android smartphones.
    • As noted, Google made a number of commitments to address competition concerns:
      • Ads Commitment:
        • Google will not use for Google Ads the health and wellness data collected from wrist-worn wearable devices and other Fitbit devices of users in the EEA, including search advertising, display advertising, and advertising intermediation products. This refers also to data collected via sensors (including GPS) as well as manually inserted data.
        • Google will maintain a technical separation of the relevant Fitbit’s user data. The data will be stored in a “data silo” which will be separate from any other Google data that is used for advertising.
        • Google will ensure that European Economic Area (‘EEA’) users will have an effective choice to grant or deny the use of health and wellness data stored in their Google Account or Fitbit Account by other Google services (such as Google Search, Google Maps, Google Assistant, and YouTube).
      • Web API Access Commitment:
        • Google will maintain access to users’ health and fitness data to software applications through the Fitbit Web API, without charging for access and subject to user consent.
      • Android APIs Commitment:
        • Google will continue to license for free to Android original equipment manufacturers (OEMs) those public APIs covering all current core functionalities that wrist-worn devices need to interoperate with an Android smartphone. Such core functionalities include but are not limited to, connecting via Bluetooth to an Android smartphone, accessing the smartphone’s camera or its GPS. To ensure that this commitment is future-proof, any improvements of those functionalities and relevant updates are also covered.
        • It is not possible for Google to circumvent the Android API commitment by duplicating the core interoperability APIs outside the Android Open Source Project (AOSP). This is because, according to the commitments, Google has to keep the functionalities afforded by the core interoperability APIs, including any improvements related to the functionalities, in open-source code in the future. Any improvements to the functionalities of these core interoperability APIs (including if ever they were made available to Fitbit via a private API) also need to be developed in AOSP and offered in open-source code to Fitbit’s competitors.
        • To ensure that wearable device OEMs have also access to future functionalities, Google will grant these OEMs access to all Android APIs that it will make available to Android smartphone app developers including those APIs that are part of Google Mobile Services (GMS), a collection of proprietary Google apps that is not a part of the Android Open Source Project.
        • Google also will not circumvent the Android API commitment by degrading users experience with third party wrist-worn devices through the display of warnings, error messages or permission requests in a discriminatory way or by imposing on wrist-worn devices OEMs discriminatory conditions on the access of their companion app to the Google Play Store.
  • The United States (U.S.) Department of Health and Human Services’ (HHS) Office of Civil Rights (OCR) has proposed a major rewrite of the regulations governing medical privacy in the U.S. As the U.S. lacks a unified privacy regime, the proposed changes would affect only those entities in the medical sector subject to the regime, which is admittedly many such entities. Nevertheless, it is almost certain the Biden Administration will pause this rulemaking and quite possibly withdraw it should it prove crosswise with the new White House’s policy goals.
    • HHS issued a notice of proposed rulemaking “to modify the Standards for the Privacy of Individually Identifiable Health Information (Privacy Rule) under the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH Act).”
      • HHS continued:
        • The Privacy Rule is one of several rules, collectively known as the HIPAA Rules, that protect the privacy and security of individuals’ medical records and other protected health information (PHI), i.e., individually identifiable health information maintained or transmitted by or on behalf of HIPAA covered entities (i.e., health care providers who conduct covered health care transactions electronically, health plans, and health care clearinghouses).
        • The proposals in this NPRM support the Department’s Regulatory Sprint to Coordinated Care (Regulatory Sprint), described in detail below. Specifically, the proposals in this NPRM would amend provisions of the Privacy Rule that could present barriers to coordinated care and case management –or impose other regulatory burdens without sufficiently compensating for, or offsetting, such burdens through privacy protections. These regulatory barriers may impede the transformation of the health care system from a system that pays for procedures and services to a system of value-based health care that pays for quality care.
    • In a press release, OCR asserted:
      • The proposed changes to the HIPAA Privacy Rule include strengthening individuals’ rights to access their own health information, including electronic information; improving information sharing for care coordination and case management for individuals; facilitating greater family and caregiver involvement in the care of individuals experiencing emergencies or health crises; enhancing flexibilities for disclosures in emergency or threatening circumstances, such as the Opioid and COVID-19 public health emergencies; and reducing administrative burdens on HIPAA covered health care providers and health plans, while continuing to protect individuals’ health information privacy interests.
  • The Federal Trade Commission (FTC) has used its powers to compel selected regulated entities to provide requested information in asking that “nine social media and video streaming companies…provide data on how they collect, use, and present personal information, their advertising and user engagement practices, and how their practices affect children and teens.” The FTC is using its Section 6(b) authority to compel the information from Amazon.com, Inc., ByteDance Ltd., which operates the short video service TikTok, Discord Inc., Facebook, Inc., Reddit, Inc., Snap Inc., Twitter, Inc., WhatsApp Inc., and YouTube LLC. Failure to respond can result in the FTC fining a non-compliant entity.
    • The FTC claimed in its press release it “is seeking information specifically related to:
      • how social media and video streaming services collect, use, track, estimate, or derive personal and demographic information;
      • how they determine which ads and other content are shown to consumers;
      • whether they apply algorithms or data analytics to personal information;
      • how they measure, promote, and research user engagement; and
      • how their practices affect children and teens.
    • The FTC explained in its sample order:
      • The Commission is seeking information concerning the privacy policies, procedures, and practices of Social Media and Video Streaming Service providers, Including the method and manner in which they collect, use, store, and disclose Personal Information about consumers and their devices. The Special Report will assist the Commission in conducting a study of such policies, practices, and procedures.
  • The United States (U.S.) Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) supplemented its Emergency Directive 21-01 to federal civilian agencies in response to the Sluzhba vneshney razvedki Rossiyskoy Federatsii’s (SVR) hack via SolarWinds. In an 18 December update, CISA explained:
    • This section provides additional guidance on the implementation of CISA Emergency Directive (ED) 21-01, to include an update on affected versions, guidance for agencies using third-party service providers, and additional clarity on required actions.
    • In a 30 December update, CISA stated:
      • Specifically, all federal agencies operating versions of the SolarWinds Orion platform other than those identified as “affected versions” below are required to use at least SolarWinds Orion Platform version 2020.2.1HF2. The National Security Agency (NSA) has examined this version and verified that it eliminates the previously identified malicious code. Given the number and nature of disclosed and undisclosed vulnerabilities in SolarWinds Orion, all instances that remain connected to federal networks must be updated to 2020.2.1 HF2 by COB December 31, 2020. CISA will follow up with additional supplemental guidance, to include further clarifications and hardening requirements.
  • Australia’s Attorney-General’s Department published an unclassified version of the four volumes of the “Report of the Comprehensive Review of the Legal Framework of the National Intelligence Community,” an “examination of the legislative framework underpinning the National Intelligence Community (NIC)…the first and largest since the Hope Royal Commissions considered the Australian Intelligence Community (AIC) in the 1970s and 1980s.” Ultimately, the authors of the report concluded:
    • We do not consider the introduction of a common legislative framework, in the form of a single Act governing all or some NIC agencies, to be a practical, pragmatic or proportionate reform. It would be unlikely that the intended benefits of streamlining and simplifying NIC legislation could be achieved due to the diversity of NIC agency functions—from intelligence to law enforcement, regulatory and policy—and the need to maintain differences in powers, immunities and authorising frameworks. The Review estimates that reform of this scale would cost over $200 million and take up to 10 years to complete. This would be an impractical and disproportionate undertaking for no substantial gain. In our view, the significant costs and risks of moving to a single, consolidated Act clearly outweigh the limited potential benefits.
    • While not recommending a common legislative framework for the entire NIC, some areas of NIC legislation would benefit from simplification and modernisation. We recommend the repeal of the TIA Act, Surveillance Devices Act 2004 (SD Act) and parts of the Australian Security Intelligence Organisation Act 1979 (ASIO Act), and their replacement with a single new Act governing the use of electronic surveillance powers—telecommunications interception, covert access to stored communications, computers and telecommunications data, and the use of optical, listening and tracking devices—under Commonwealth law.
  • The National Institute of Standards and Technology (NIST) released additional materials to supplement a major rewrite of a foundational security guidance document. NIST explained “[n]ew supplemental materials for NIST Special Publication (SP) 800-53 Revision 5, Security and Privacy Controls for Information Systems and Organizations, are available for download to support the December 10, 2020 errata release of SP 800-53 and SP 800-53B, Control Baselines for Information Systems and Organizations.” These supplemental materials include:
    • A comparison of the NIST SP 800-53 Revision 5 controls and control enhancements to Revision 4. The spreadsheet describes the changes to each control and control enhancement, provides a brief summary of the changes, and includes an assessment of the significance of the changes.  Note that this comparison was authored by The MITRE Corporation for the Director of National Intelligence (DNI) and is being shared with permission by DNI.
    • Mapping of the Appendix J Privacy Controls (Revision 4) to Revision 5. The spreadsheet supports organizations using the privacy controls in Appendix J of SP 800-53 Revision 4 that are transitioning to the integrated control catalog in Revision 5.
    • Mappings between NIST SP 800-53 and other frameworks and standards. The mappings provide organizations a general indication of SP 800-53 control coverage with respect to other frameworks and standards. When leveraging the mappings, it is important to consider the intended scope of each publication and how each publication is used; organizations should not assume equivalency based solely on the mapping tables because mappings are not always one-to-one and there is a degree of subjectivity in the mapping analysis.
  • Via a final rule, the Department of Defense (DOD) codified “the National Industrial Security Program Operating Manual (NISPOM) in regulation…[that] establishes requirements for the protection of classified information disclosed to or developed by contractors, licensees, grantees, or certificate holders (hereinafter referred to as contractors) to prevent unauthorized disclosure.” The DOD stated “[i]n addition to adding the NISPOM to the Code of Federal Regulations (CFR), this rule incorporates the requirements of Security Executive Agent Directive (SEAD) 3, “Reporting Requirements for Personnel with Access to Classified Information or Who Hold a Sensitive Position.” The DOD stated “SEAD 3 requires reporting by all contractor cleared personnel who have been granted eligibility for access to classified information.”
    • The DOD added “[t]his NISPOM rule provides for a single nation-wide implementation plan which will, with this rule, include SEAD 3 reporting by all contractor cleared personnel to report specific activities that may adversely impact their continued national security eligibility, such as reporting of foreign travel and foreign contacts.”
    • The DOD explained “NISP Cognizant Security Agencies (CSAs) shall conduct an analysis of such reported activities to determine whether they pose a potential threat to national security and take appropriate action.”
    • The DOD added that “the rule also implements the provisions of Section 842 of Public Law 115-232, which removes the requirement for a covered National Technology and Industrial Base (NTIB) entity operating under a special security agreement pursuant to the NISP to obtain a national interest determination as a condition for access to proscribed information.”
  • An advisory committee housed at the United States (U.S.) Department of Homeland Security (DHS) is calling for the White House to quickly “operationalize intelligence in a classified space with senior executives and cyber experts from most critical entities in the energy, financial services, and communications sectors working directly with intelligence analysts and other government staff.” In their report, the President’s National Infrastructure Advisory Council (NIAC) proposed the creation of a Critical Infrastructure Command Center (CICC) to “provid[e] real-time collaboration between government and industry…[and] take direct action and provide tactical solutions to mitigate, remediate,  and deter threats.” NIAC urged the President to “direct relevant federal agencies to support the private sector in executing the concept, including identifying the required government staff…[and] work with Congress to ensure the appropriate authorities are established to allow the CICC to fully realize its operational functionality.” NIAC recommended “near-term actions to implement the CICC concept:
    • 1. The President should direct the relevant federal agencies to support the private sector in rapidly standing up the CICC concept with the energy, financial services, and communications sectors:
      • a. Within 90 days the private sector will identify the executives who will lead execution of the CICC concept and establish governing criteria (including membership, staffing and rotation, and other logistics).
      • b. Within 120 days the CICC sector executives will identify and assign the necessary CICC staff from the private sector.
      • c. Within 90 days an appropriate venue to house the operational component will be identified and the necessary agreements put in place.
    • 2. The President should direct the Intelligence Community and other relevant government agencies to identify and co-locate the required government staff counterparts to enable the direct coordination required by the CICC. This staff should be pulled from the IC, SSAs, and law enforcement.
    • 3. The President, working with Congress, should establish the appropriate authorities and mission for federal agencies to directly share intelligence with critical infrastructure companies, along with any other authorities required for the CICC concept to be fully successful (identified in Appendix A).
    • 4. Once the CICC concept is fully operational (within 180 days), the responsible executives should deliver a report to the NSC and the NIAC demonstrating how the distinct capabilities of the CICC have been achieved and the impact of the capabilities to date. The report should identify remaining gaps in resources, direction, or authorities.

Coming Events

  • On 13 January, the Federal Communications Commission (FCC) will hold its monthly open meeting, and the agency has placed the following items on its tentative agenda: “Bureau, Office, and Task Force leaders will summarize the work their teams have done over the last four years in a series of presentations:
    • Panel One. The Commission will hear presentations from the Wireless Telecommunications Bureau, International Bureau, Office of Engineering and Technology, and Office of Economics and Analytics.
    • Panel Two. The Commission will hear presentations from the Wireline Competition Bureau and the Rural Broadband Auctions Task Force.
    • Panel Three. The Commission will hear presentations from the Media Bureau and the Incentive Auction Task Force.
    • Panel Four. The Commission will hear presentations from the Consumer and Governmental Affairs Bureau, Enforcement Bureau, and Public Safety and Homeland Security Bureau.
    • Panel Five. The Commission will hear presentations from the Office of Communications Business Opportunities, Office of Managing Director, and Office of General Counsel.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by opsa from Pixabay

EC Finally Unveils Digital Services Act and Digital Markets Act

The EU releases its proposals to remake digital markets.

The European Commission (EC) has released its draft proposals to remake how the European Union (EU) regulates digital markets and digital services, the latest in the bloc’s attempts to rein in what it sees as harms and abuses to people and competition in Europe and the world. At the earliest, these proposals would take effect in 2022 and are sure to be vigorously opposed by large United States (U.S.) multinationals like Google and Facebook; they will also likely face more restrained pushback from the U.S. government.

The Digital Markets Act would allow the EU to designate certain core platform services as gatekeepers based on certain quantitative metrics or on a case-by-case basis. Once a company is deemed a gatekeeper, it would be subject to much greater regulation by the EU, and violations of the new act could result in fines of up to 10% of worldwide revenue.

In its press release, the EC asserted:

European values are at the heart of both proposals. The new rules will better protect consumers and their fundamental rights online, and will lead to fairer and more open digital markets for everyone. A modern rulebook across the single market will foster innovation, growth and competitiveness and will provide users with new, better and reliable online services. It will also support the scaling up of smaller platforms, small and medium-sized enterprises, and start-ups, providing them with easy access to customers across the whole single market while lowering compliance costs. Furthermore, the new rules will prohibit unfair conditions imposed by online platforms that have become or are expected to become gatekeepers to the single market. The two proposals are at the core of the Commission’s ambition to make this Europe’s Digital Decade.

In the Digital Markets Act, the EC explained the problem with large platforms dominating certain digital markets. The EC discussed the harm to people and medium and small businesses as some large companies control certain markets and use their size and dominance to extract unfair prices for inferior services and products. The EC listed the core platform services that might be regulated:

  • online intermediation services (incl. for example marketplaces, app stores and online intermediation services in other sectors like mobility, transport or energy)
  • online search engines,
  • social networking
  • video sharing platform services,
  • number-independent interpersonal electronic communication services,
  • operating systems,
  • cloud services and
  • advertising services, including advertising networks, advertising exchanges and any other advertising intermediation services, where these advertising services are being related to one or more of the other core platform services mentioned above.

Clearly, a number of major American firms could easily be considered “core platform services” including Amazon, Apple, Google, Facebook, Instagram, YouTube, WhatsApp, Microsoft, and others. Whether they would be deemed gatekeepers would hinge on whether they meet the quantitative metrics the EU will put in place, and this will be a rebuttable presumption such that if a firm meets the standards, it may present evidence to the contrary and argue it is not a gatekeeper.

The EC detailed the quantitative metrics in Article 3. A company may qualify if it meets all three of the following criteria subject to further metrics:

A provider of core platform services shall be designated as gatekeeper if:

(a) it has a significant impact on the internal market;

(b) it operates a core platform service which serves as an important gateway for business users to reach end users; and

(c) it enjoys an entrenched and durable position in its operations or it is foreseeable that it will enjoy such a position in the near future.

The other metrics include €6.5 billion in revenue over the last three years or a €65 billion market capitalization, plus the provision of core platform services in at least three member states, to show a “significant impact on the internal market.” For the second category listed above, a company would need to provide a core platform service to 45 million or more people in the EU and 10,000 or more businesses in the EU. And, for the last category, passing the 45 million user and 10,000 business thresholds for three consecutive years would suffice. The act reads:

A provider of core platform services shall be presumed to satisfy:

(a) the requirement in paragraph 1 point (a) where the undertaking to which it belongs achieves an annual EEA turnover equal to or above EUR 6.5 billion in the last three financial years, or where the average market capitalisation or the equivalent fair market value of the undertaking to which it belongs amounted to at least EUR 65 billion in the last financial year, and it provides a core platform service in at least three Member States;

(b) the requirement in paragraph 1 point (b) where it provides a core platform service that has more than 45 million monthly active end users established or located in the Union and more than 10,000 yearly active business users established in the Union in the last financial year; for the purpose of the first subparagraph, monthly active end users shall refer to the average number of monthly active end users throughout the largest part of the last financial year;

(c) the requirement in paragraph 1 point (c) where the thresholds in point (b) were met in each of the last three financial years.

The EU would also be able to label a provider of core platform services a gatekeeper on a case-by-case basis:

Provision should also be made for the assessment of the gatekeeper role of providers of core platform services which do not satisfy all of the quantitative thresholds, in light of the overall objective requirements that they have a significant impact on the internal market, act as an important gateway for business users to reach end users and benefit from a durable and entrenched position in their operations or it is foreseeable that it will do so in the near future.

It bears note that a company would be found to be a gatekeeper if it is merely foreseeable that it will satisfy these criteria soon. This flexibility could allow the EU to track companies and flag them as gatekeepers before they, in fact, achieve the sort of market dominance this regulation is intended to stop.
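The cumulative thresholds in Article 3 can be sketched as a simple eligibility check. This is an illustrative sketch only — the class, field, and function names are hypothetical, and the actual designation process is a rebuttable presumption supplemented by qualitative, case-by-case assessment, not a mechanical test:

```python
from dataclasses import dataclass

# Hypothetical model of a core platform service provider; field names are
# illustrative and do not come from the regulation itself.
@dataclass
class PlatformProvider:
    eea_turnover_eur: list[float]      # annual EEA turnover, last 3 financial years
    market_cap_eur: float              # market capitalisation, last financial year
    member_states_served: int          # EU member states where the service is provided
    monthly_active_end_users: int      # monthly active end users in the Union
    yearly_active_business_users: int  # yearly active business users in the Union
    years_over_user_thresholds: int    # consecutive years meeting the user thresholds

def presumed_gatekeeper(p: PlatformProvider) -> bool:
    """Rough check of the Article 3 quantitative presumption thresholds."""
    # (a) significant impact on the internal market: EUR 6.5bn annual EEA
    # turnover in each of the last three years, or EUR 65bn market cap,
    # plus a core platform service in at least three member states
    significant_impact = (
        (all(t >= 6.5e9 for t in p.eea_turnover_eur) or p.market_cap_eur >= 65e9)
        and p.member_states_served >= 3
    )
    # (b) important gateway: >45m monthly end users and >10,000 business users
    important_gateway = (
        p.monthly_active_end_users > 45_000_000
        and p.yearly_active_business_users > 10_000
    )
    # (c) entrenched and durable position: (b) met in each of the last 3 years
    entrenched = p.years_over_user_thresholds >= 3
    return significant_impact and important_gateway and entrenched
```

Note that the presumption is cumulative: a provider must satisfy all three limbs, and even then it may present evidence to rebut the gatekeeper designation.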

Among the relevant excerpts from the “Reasons for and objectives of the proposal” section of the act are:

  • Large platforms have emerged benefitting from characteristics of the sector such as strong network effects, often embedded in their own platform ecosystems, and these platforms represent key structuring elements of today’s digital economy, intermediating the majority of transactions between end users and business users. Many of these undertakings are also comprehensively tracking and profiling end users. A few large platforms increasingly act as gateways or gatekeepers between business users and end users and enjoy an entrenched and durable position, often as a result of the creation of conglomerate ecosystems around their core platform services, which reinforces existing entry barriers.
  • As such, these gatekeepers have a major impact on, have substantial control over the access to, and are entrenched in digital markets, leading to significant dependencies of many business users on these gatekeepers, which leads, in certain cases, to unfair behaviour vis-à-vis these business users. It also leads to negative effects on the contestability of the core platform services concerned. Regulatory initiatives by Member States cannot fully address these effects; without action at EU level, they could lead to a fragmentation of the Internal Market.
  • Unfair practices and lack of contestability lead to inefficient outcomes in the digital sector in terms of higher prices, lower quality, as well as less choice and innovation to the detriment of European consumers. Addressing these problems is of utmost importance in view of the size of the digital economy (estimated at between 4.5% to 15.5% of global GDP in 2019 with a growing trend) and the important role of online platforms in digital markets with its societal and economic implications.
  • Weak contestability and unfair practices in the digital sector are more frequent and pronounced in certain digital services than others. This is the case in particular for widespread and commonly used digital services and infrastructures that mostly directly intermediate between business users and end users.
  • The enforcement experience under EU competition rules, numerous expert reports and studies and the results of the OPC show that there are a number of digital services that have the following features: (i) highly concentrated multi-sided platform services, where usually one or very few large digital platforms set the commercial conditions with considerable autonomy; (ii) a few large digital platforms act as gateways for business users to reach their customers and vice-versa; and (iii) gatekeeper power of these large digital platforms is often misused by means of unfair behaviour vis-à-vis economically dependent business users and customers.
  • The proposal is therefore further limited to a number of ‘core platform services’ where the identified problems are most evident and prominent and where the presence of a limited number of large online platforms that serve as gateways for business users and end users has led or is likely to lead to weak contestability of these services and of the markets in which these intervene. These core platform services include: (i) online intermediation services (incl. for example marketplaces, app stores and online intermediation services in other sectors like mobility, transport or energy) (ii) online search engines, (iii) social networking (iv) video sharing platform services, (v) number-independent interpersonal electronic communication services, (vi) operating systems, (vii) cloud services and (viii) advertising services, including advertising networks, advertising exchanges and any other advertising intermediation services, where these advertising services are being related to one or more of the other core platform services mentioned above.
  • The fact that a digital service qualifies as a core platform service does not mean that issues of contestability and unfair practices arise in relation to every provider of these core platform services. Rather, these concerns appear to be particularly strong when the core platform service is operated by a gatekeeper. Providers of core platform services can be deemed to be gatekeepers if they: (i) have a significant impact on the internal market, (ii) operate one or more important gateways to customers and (iii) enjoy or are expected to enjoy an entrenched and durable position in their operations.
  • Such gatekeeper status can be determined either with reference to clearly circumscribed and appropriate quantitative metrics, which can serve as rebuttable presumptions to determine the status of specific providers as a gatekeeper, or based on a case-by-case qualitative assessment by means of a market investigation.

The Digital Services Act would add new regulation on top of Directive 2000/31/EC (aka the e-Commerce Directive) by “[b]uilding on the key principles set out in the e-Commerce Directive, which remain valid today.” This new scheme “seeks to ensure the best conditions for the provision of innovative digital services in the internal market, to contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services.”

The Digital Services Act is focused mostly on the information and misinformation present all over the online world and the harms it wreaks on EU citizens. However, the EC is also seeking to balance fundamental EU rights in more tightly regulating online platforms. Like the Digital Markets Act, this regulation would focus on the largest online content, product, and services providers, which, as a practical matter, would likely be Facebook, Amazon, Google, Spotify, and a handful of other companies. Once a company has 10% or more of the EU’s population using its offerings, the requirements of the Digital Services Act would be triggered.

Additionally, the Digital Services Act unites two online issues not usually considered together in the United States (U.S.): harmful online content and harmful online products. Even though it seems logical to consider these online offerings in tandem, there is a clear bifurcation in the U.S. in how these two issues are regulated, to the extent they are regulated at all, at the federal and state levels.

The Digital Services Act “will introduce a series of new, harmonised EU-wide obligations for digital services, carefully graduated on the basis of those services’ size and impact, such as:

  • Rules for the removal of illegal goods, services or content online;
  • Safeguards for users whose content has been erroneously deleted by platforms;
  • New obligations for very large platforms to take risk-based action to prevent abuse of their systems;
  • Wide-ranging transparency measures, including on online advertising and on the algorithms used to recommend content to users;
  • New powers to scrutinize how platforms work, including by facilitating access by researchers to key platform data;
  • New rules on traceability of business users in online market places, to help track down sellers of illegal goods or services;
  • An innovative cooperation process among public authorities to ensure effective enforcement across the single market.”

The EC explained:

new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole.

The EC spelled out what the Digital Services Act would do:

This Regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes:

(a) a framework for the conditional exemption from liability of providers of intermediary services;

(b) rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services;

(c) rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities.

The EC explained the purpose of the act:

  • this proposal seeks to ensure the best conditions for the provision of innovative digital services in the internal market, to contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services.
  • The proposal defines clear responsibilities and accountability for providers of intermediary services, and in particular online platforms, such as social media and marketplaces. By setting out clear due-diligence obligations for certain intermediary services, including notice-and-action procedures for illegal content and the possibility to challenge the platforms’ content moderation decisions, the proposal seeks to improve users’ safety online across the entire Union and improve the protection of their fundamental rights. Furthermore, an obligation for certain online platforms to receive, store and partially verify and publish information on traders using their services will ensure a safer and more transparent online environment for consumers.
  • Recognising the particular impact of very large online platforms on our economy and society, the proposal sets a higher standard of transparency and accountability on how the providers of such platforms moderate content, on advertising and on algorithmic processes. It sets obligations to assess the risks their systems pose to develop appropriate risk management tools to protect the integrity of their services against the use of manipulative techniques.

The EC summarized how the act will work:

  • The operational threshold for service providers in scope of these obligations includes those online platforms with a significant reach in the Union, currently estimated to be amounting to more than 45 million recipients of the service. This threshold is proportionate to the risks brought by the reach of the platforms in the Union; where the Union’s population changes by a certain percentage, the Commission will adjust the number of recipients considered for the threshold, so that it consistently corresponds to 10 % of the Union’s population. Additionally, the Digital Services Act will set out a co-regulatory backstop, including building on existing voluntary initiatives.
  • This proposal should constitute the appropriate basis for the development of robust technologies to prevent the reappearance of illegal information, accompanied with the highest safeguards to avoid that lawful content is taken down erroneously; such tools could be developed on the basis of voluntary agreements between all parties concerned and should be encouraged by Member States; it is in the interest of all parties involved in the provision of intermediary services to adopt and implement such procedures; the provisions of this Regulation relating to liability should not preclude the development and effective operation, by the different interested parties, of technical systems of protection and identification and of automated recognition made possible by digital technology within the limits laid down by Regulation 2016/679.
  • Union citizens and others are exposed to ever-increasing risks and harms online – from the spread of illegal content and activities, to limitations to express themselves and other societal harms. The envisaged policy measures in this legislative proposal will substantially improve this situation by providing a modern, future-proof governance framework, effectively safeguarding the rights and legitimate interests of all parties involved, most of all Union citizens. The proposal introduces important safeguards to allow citizens to freely express themselves, while enhancing user agency in the online environment, as well as the exercise of other fundamental rights such as the right to an effective remedy, non-discrimination, rights of the child as well as the protection of personal data and privacy online.
  • The proposed Regulation will mitigate risks of erroneous or unjustified blocking of speech, address the chilling effects on speech, stimulate the freedom to receive information and hold opinions, as well as reinforce users’ redress possibilities. Specific groups or persons may be vulnerable or disadvantaged in their use of online services because of their gender, race or ethnic origin, religion or belief, disability, age or sexual orientation. They can be disproportionately affected by restrictions and removal measures following from (unconscious or conscious) biases potentially embedded in the notification systems by users and third parties, as well as replicated in automated content moderation tools used by platforms. The proposal will mitigate discriminatory risks, particularly for those groups or persons and will contribute to the protection of the rights of the child and the right to human dignity online. The proposal will only require removal of illegal content and will impose mandatory safeguards when users’ information is removed, including the provision of explanatory information to the user, complaint mechanisms supported by the service providers as well as external out-of-court dispute resolution mechanisms. Furthermore, it will ensure EU citizens are also protected when using services provided by providers not established in the Union but active on the internal market, since those providers are covered too.
  • With regard to service providers’ freedom to conduct a business, the costs incurred on businesses are offset by reducing fragmentation across the internal market. The proposal introduces safeguards to alleviate the burden on service providers, including measures against repeated unjustified notices and prior vetting of trusted flaggers by public authorities. Furthermore, certain obligations are targeted to very large online platforms, where the most serious risks often occur and which have the capacity to absorb the additional burden.
  • The proposed legislation will preserve the prohibition of general monitoring obligations of the e-Commerce Directive, which in itself is crucial to the required fair balance of fundamental rights in the online world. The new Regulation prohibits general monitoring obligations, as they could disproportionately limit users’ freedom of expression and freedom to receive information, and could burden service providers excessively and thus unduly interfere with their freedom to conduct a business. The prohibition also limits incentives for online surveillance and has positive implications for the protection of personal data and privacy.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (15 December)

Further Reading

  • DHS, State and NIH join list of federal agencies — now five — hacked in major Russian cyberespionage campaign” By Ellen Nakashima and Craig Timberg — The Washington Post; “Scope of Russian Hack Becomes Clear: Multiple U.S. Agencies Were Hit” By David E. Sanger, Nicole Perlroth and Eric Schmitt — The New York Times; The list of United States (U.S.) government agencies breached by Sluzhba vneshney razvedki Rossiyskoy Federatsii (SVR), the Russian Federation’s Foreign Intelligence Service, has grown. Now the Departments of Homeland Security, Defense, and State and the National Institutes of Health are reporting they have been breached. It is unclear if Fortune 500 companies in the U.S. and elsewhere and U.S. nuclear laboratories were also breached in this huge, sophisticated espionage exploit. It appears the Russians were selective and careful, and these hackers may have only accessed information held on U.S. government systems. And yet, the Trump Administration continues to issue equivocal statements neither denying nor acknowledging the hack, leaving the public to depend on quotes from anonymous officials. Perhaps admitting the Russians hacked U.S. government systems would throw light on Russian interference four years ago, and the President is loath to even contemplate that attack. In contrast, President Donald Trump has made all sorts of wild, untrue claims about vote totals being hacked despite no evidence supporting his assertions. It appears that the declaration of mission accomplished by some agencies of the Trump Administration over no Russian hacking of or interference with the 2020 election will be overshadowed by what may prove the most damaging hack of U.S. government systems ever.
  • Revealed: China suspected of spying on Americans via Caribbean phone networks” By Stephanie Kirchgaessner — The Guardian. This story depends on one source, so take it for what it is worth, but allegedly the People’s Republic of China (PRC) is using vulnerabilities in mobile communications networks to hack into the phones of Americans travelling in the Caribbean. If so, the PRC may be exploiting the same Signaling System 7 (SS7) weaknesses an Israeli firm, Circles, is using to sell access to phones, at least according to a report published recently by the University of Toronto’s Citizen Lab.
  • The Cartel Project | Revealed: The Israelis Making Millions Selling Cyberweapons to Latin America” By Amitai Ziv — Haaretz. Speaking of Israeli companies, the NSO Group, among others, is actively selling offensive cyber and surveillance capabilities to Central American nations, often through practices that may be corrupt.
  • U.S. Schools Are Buying Phone-Hacking Tech That the FBI Uses to Investigate Terrorists” By Tom McKay and Dhruv Mehrotra — Gizmodo. Israeli firm Cellebrite and its competitors are being used in school systems across the United States (U.S.) to access communications on students’ phones. U.S. Supreme Court caselaw gives schools very wide discretion for searches, and the Fourth Amendment is largely null and void on school grounds.
  • ‘It’s Hard to Prove’: Why Antitrust Suits Against Facebook Face Hurdles” By Mike Isaac and Cecilia Kang — The New York Times. The development of antitrust law over the last few decades may have laid an uphill path for the Federal Trade Commission (FTC) and state attorneys general in securing a breakup of Facebook, something that has not happened on a large scale since the historic splintering of AT&T in the early 1980s.
  • Exclusive: Israeli Surveillance Companies Are Siphoning Masses Of Location Data From Smartphone Apps” By Thomas Brewster — Forbes. Turns out Israeli firms are using a feature (or what many would call a bug) in the online advertising system that allows those looking to buy ads to get close to real-time location data from application developers looking to sell advertising space. By putting out a shingle as a Demand Side Platform, it is possible to access reams of location data, and two Israeli companies are doing just that and offering the service of locating and tracking people using this quirk in online advertising. And this is not just companies in Israel. There is a company under scrutiny in the United States (U.S.) that may have used these practices and then provided location data to federal agencies.
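The quirk the Forbes story describes is easy to see in the shape of a real-time bidding request. Bid requests following the OpenRTB convention can carry a device's advertising ID and coordinates, so anyone operating as a demand-side platform passively receives location observations whether or not it ever wins an auction. The request below is a hedged sketch: the field layout follows OpenRTB, but all values are invented:

```python
# A demand-side platform (DSP) receives a bid request for every ad
# opportunity it is offered; the device object may include geolocation,
# so merely receiving requests yields (device, lat, lon) observations.

import json

# Illustrative OpenRTB-style bid request; the IDs and coordinates are made up.
bid_request = json.loads("""{
  "id": "req-123",
  "device": {
    "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",
    "geo": {"lat": 32.0853, "lon": 34.7818}
  }
}""")

def extract_location(request: dict):
    """Pull the advertising ID and coordinates a DSP passively receives."""
    device = request.get("device", {})
    geo = device.get("geo", {})
    if "lat" in geo and "lon" in geo:
        return device.get("ifa"), geo["lat"], geo["lon"]
    return None  # request carried no usable location
```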

Other Developments

  • The Government Accountability Office (GAO) evaluated the United States (U.S.) Department of Defense’s (DOD) electromagnetic spectrum (EMS) operations and found gaps in the DOD’s efforts to maintain EMS superiority over the Russian Federation and the People’s Republic of China (PRC). The GAO concluded:
    • Studies have shown that adversaries of the United States, such as China and Russia, are developing capabilities and strategies that could affect DOD superiority in the information environment, including the EMS. DOD has also reported that loss of EMS superiority could result in the department losing control of the battlefield, as its Electromagnetic Spectrum Operations (EMSO) supports many warfighting functions across all domains. DOD recognizes the importance of EMSO to military operations in actual conflicts and in operations short of open conflict that involve the broad information environment. However, gaps we identified in DOD’s ability to develop and implement EMS-related strategies have impeded progress in meeting DOD’s goals. By addressing gaps we found in five areas—(1) the processes and procedures to integrate EMSO throughout the department, (2) governance reforms to correct diffuse organization, (3) responsibility by an official with appropriate authority, (4) a strategy implementation plan, and (5) activities that monitor and assess the department’s progress in implementing the strategy—DOD can capitalize on progress that it has already made and better support ensuring EMS superiority.
    • The GAO recommended:
      • The Secretary of Defense should ensure that the Vice Chairman of the Joint Chiefs of Staff, as Senior Designated Official of the Electromagnetic Spectrum Operations Cross-Functional Team (CFT), identifies the procedures and processes necessary to provide for integrated defense-wide strategy, planning, and budgeting with respect to joint electromagnetic spectrum operations, as required by the FY19 NDAA. (Recommendation 1)
      • The Secretary of Defense should ensure that the Vice Chairman of the Joint Chiefs of Staff as Senior Designated Official of the CFT proposes EMS governance, management, organizational, and operational reforms to the Secretary. (Recommendation 2)
      • The Secretary of Defense should assign clear responsibility to a senior official with authority and resources necessary to compel action for the long-term implementation of the 2020 strategy in time to oversee the execution of the 2020 strategy implementation plan. (Recommendation 3)
      • The Secretary of Defense should ensure that the designated senior official for long-term strategy implementation issues an actionable implementation plan within 180 days following issuance of the 2020 strategy. (Recommendation 4)
      • The Secretary of Defense should ensure that the designated senior official for long-term strategy implementation creates oversight processes that would facilitate the department’s implementation of the 2020 strategy. (Recommendation 5)
  • The company behind Cydia, a forerunner to Apple’s App Store, has sued Apple, claiming it has monopolized application distribution on its operating system to the detriment of other parties and done the same with respect to its payment system. Cydia’s developer argues that it conceived of and created the first application store for the iPhone, offering a range of programs Apple did not. Cydia claims that once Apple understood how lucrative an app store would be, it blocked Cydia and established its own store as the exclusive means through which programs can be installed and used on iOS. Furthermore, this has enabled Apple to levy a 30% fee on all in-application purchases, which is allegedly a $50 billion market annually. This is the second high-profile suit this year against Apple. Epic Games, the maker of the popular game Fortnite, sued Apple earlier this year on many of the same grounds after Epic started allowing users to buy directly from it at a 30% discount. Apple responded by removing the game from the App Store, which has blocked players from downloading updated versions. That litigation has just begun. In its complaint, Cydia asserts:
    • Historically, distribution of apps for a specific operating system (“OS”) occurred in a separate and robustly competitive market. Apple, however, began coercing users to utilize no other iOS app distribution service but the App Store, coupling it closer and closer to the iPhone itself in order to crowd out all competition. But Apple did not come up with this idea initially—it only saw the economic promise that iOS app distribution represented after others, like [Cydia], demonstrated that value with their own iOS app distribution products/services. Faced with this realization, Apple then decided to take that separate market (as well as the additional iOS app payment processing market described herein) for itself.
    • Cydia became hugely popular by offering a marketplace to find and obtain third party iOS applications that greatly expanded the capabilities of the stock iPhone, including games, productivity applications, and audio/visual applications such as a video recorder (whereas the original iPhone only allowed still camera photos). Apple subsequently took many of these early third party applications’ innovations, incorporating them into the iPhone directly or through apps.
    • But far worse than simply copying others’ innovations, Apple also recognized that it could reap enormous profits if it cornered this fledgling market for iOS app distribution, because that would give Apple complete power over iOS apps, regardless of the developer. Apple therefore initiated a campaign to eliminate competition for iOS app distribution altogether. That campaign has been successful and continues to this day. Apple did (and continues to do) so by, inter alia, tying the App Store app to iPhone purchases by preinstalling it on all iOS devices and then requiring it as the default method to obtain iOS apps, regardless of user preference for other alternatives; technologically locking down the iPhone to prevent App Store competitors like Cydia from even operating on the device; and imposing contractual terms on users that coerce and prevent them from using App Store competitors. Apple has also mandated that iOS app developers use it as their sole option for app payment processing (such as in-app purchases), thus preventing other competitors, such as Cydia, from offering the same service to those developers.
    • Through these and other anticompetitive acts, Apple has wrongfully acquired and maintained monopoly power in the market (or aftermarket) for iOS app distribution, and in the market (or aftermarket) for iOS app payment processing. Apple has frozen Cydia and all other competitors out of both markets, depriving them of the ability to compete with the App Store and to offer developers and consumers better prices, better service, and more choice. This anticompetitive conduct has unsurprisingly generated massive profits and unprecedented market capitalization for Apple, as well as incredible market power.
  • California is asking to join the antitrust suit against Google filed by the United States Department of Justice (DOJ) and eleven state attorneys general. This antitrust action centers on Google’s practices of making Google the default search engine on Android devices and paying browsers and other technology entities to make Google the default search engine. However, a number of states that had initially joined the joint state investigation of Google have opted not to join this action and will instead continue to investigate, signaling a much broader case than the one filed in the United States District Court for the District of Columbia. In any event, if the suit does proceed, and a change in Administration could result in a swift change in course, it may take years to be resolved. Of course, given the legion of leaks from the DOJ and state attorneys general offices about the pressure U.S. Attorney General William Barr placed on staff and attorneys to bring a case before the election, there is criticism that rushing the case may result in a weaker, less comprehensive action that Google may ultimately fend off.
    • And, there is likely to be another lawsuit against Google filed by other state attorneys general. A number of attorneys general who had originally joined the effort led by Texas Attorney General Ken Paxton in investigating Google released a statement at the time the DOJ suit was filed, indicating their investigation would continue, presaging a different, possibly broader lawsuit that might also address Google’s role in other markets. The attorneys general of New York, Colorado, Iowa, Nebraska, North Carolina, Tennessee, and Utah did not join the case that was filed but may soon file a related but parallel case. They stated:
      • Over the last year, both the U.S. DOJ and state attorneys general have conducted separate but parallel investigations into Google’s anticompetitive market behavior. We appreciate the strong bipartisan cooperation among the states and the good working relationship with the DOJ on these serious issues. This is a historic time for both federal and state antitrust authorities, as we work to protect competition and innovation in our technology markets. We plan to conclude parts of our investigation of Google in the coming weeks. If we decide to file a complaint, we would file a motion to consolidate our case with the DOJ’s. We would then litigate the consolidated case cooperatively, much as we did in the Microsoft case.
  • France’s Commission nationale de l’informatique et des libertés (CNIL) handed down multi-million Euro fines on Google and Amazon for putting cookies on users’ devices. CNIL fined Google a total of €100 million and Amazon €35 million because its investigation of both entities determined “when a user visited [their] website, cookies were automatically placed on his or her computer, without any action required on his or her part…[and] [s]everal of these cookies were used for advertising purposes.”
    • CNIL explained the decision against Google:
      • [CNIL] noticed three breaches of Article 82 of the French Data Protection Act:
      • Deposit of cookies without obtaining the prior consent of the user
        • When a user visited the website google.fr, several cookies used for advertising purposes were automatically placed on his or her computer, without any action required on his or her part.
        • Since this type of cookies can only be placed after the user has expressed his or her consent, the restricted committee considered that the companies had not complied with the requirement provided for in Article 82 of the French Data Protection Act regarding the collection of prior consent before placing cookies that are not essential to the service.
      • Lack of information provided to the users of the search engine google.fr
        • When a user visited the page google.fr, an information banner displayed at the bottom of the page, with the following note “Privacy reminder from Google”, in front of which were two buttons: “Remind me later” and “Access now”.
        • This banner did not provide the user with any information regarding cookies that had however already been placed on his or her computer when arriving on the site. The information was also not provided when he or she clicked on the button “Access now”.
        • Therefore, the restricted committee considered that the information provided by the companies did not enable the users living in France either to be previously and clearly informed regarding the deposit of cookies on their computer or, therefore, to be informed of the purposes of these cookies and the available means enabling to refuse them.
      • Partial failure of the « opposition » mechanism
        • When a user deactivated the ad personalization on the Google search by using the available mechanism from the button “Access now”, one of the advertising cookies was still stored on his or her computer and kept reading information aimed at the server to which it is attached.
        • Therefore, the restricted committee considered that the “opposition” mechanism set up by the companies was partially defective, breaching Article 82 of the French Data Protection Act.
    • CNIL explained the case against Amazon:
      • [CNIL] noticed two breaches of Article 82 of the French Data Protection Act:
      • Deposit of cookies without obtaining the prior consent of the user
        • The restricted committee noted that when a user visited one of the pages of the website amazon.fr, a large number of cookies used for advertising purposes was automatically placed on his or her computer, before any action required on his or her part. Yet, the restricted committee recalled that this type of cookies, which are not essential to the service, can only be placed after the user has expressed his or her consent. It considered that the deposit of cookies at the same time as arriving on the site was a practice which, by its nature, was incompatible with a prior consent.
      • Lack of information provided to the users of the website amazon.fr
        • First, the restricted committee noted that, in the case of a user visiting the website amazon.fr, the information provided was neither clear, nor complete.
        • It considered that the information banner displayed by the company, which was “By using this website, you accept our use of cookies allowing to offer and improve our services. Read More.”, only contained a general and approximate information regarding the purposes of all the cookies placed. In particular, it considered that, by reading the banner, the user could not understand that cookies placed on his or her computer were mainly used to display personalized ads. It also noted that the banner did not explain to the user that it could refuse these cookies and how to do it.
        • Then, the restricted committee noticed that the company’s failure to comply with its obligation was even more obvious regarding the case of users that visited the website amazon.fr after they had clicked on an advertisement published on another website. It underlined that in this case, the same cookies were placed but no information was provided to the users about that.
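CNIL's core objection in both decisions was the same: non-essential (advertising) cookies were placed before the user expressed any consent. That rule maps onto a simple server-side pattern, sketched below; this is an illustrative sketch of the consent-first logic, not code endorsed by CNIL, and the cookie names are invented:

```python
# Minimal sketch of the consent-first rule behind CNIL's Google and Amazon
# decisions: cookies essential to the service may always be set, but
# advertising cookies may be placed only after prior, affirmative consent.

ESSENTIAL = {"session_id", "csrf_token"}  # hypothetical essential cookies

def cookies_to_set(requested: dict, user_has_consented: bool) -> dict:
    """Withhold non-essential cookies until the user has consented."""
    if user_has_consented:
        return dict(requested)
    # No consent yet: only cookies essential to the service survive.
    return {name: value for name, value in requested.items()
            if name in ESSENTIAL}
```

On a first visit, before any banner interaction, only the essential cookies would be sent; CNIL's finding was precisely that Google and Amazon set advertising cookies at this stage.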
  • Senator Amy Klobuchar (D-MN) wrote the Secretary of Health and Human Services (HHS), to express “serious concerns regarding recent reports on the data collection practices of Amazon’s health-tracking bracelet (Halo) and to request information on the actions [HHS] is taking to ensure users’ health data is secure.” Klobuchar stated:
    • The Halo is a fitness tracker that users wear on their wrists. The tracker’s smartphone application (app) provides users with a wide-ranging analysis of their health by tracking a range of biological metrics including heartbeat patterns, exercise habits, sleep patterns, and skin temperature. The fitness tracker also enters into uncharted territory by collecting body photos and voice recordings and transmitting this data for analysis. To calculate the user’s body fat percentage, the Halo requires users to take scans of their body using a smartphone app. These photos are then temporarily sent to Amazon’s servers for analysis while the app returns a three-dimensional image of the user’s body, allowing the user to adjust the image to see what they would look like with different percentages of body fat. The Halo also offers a tone analysis feature that examines the nuances of a user’s voice to indicate how the user sounds to others. To accomplish this task, the device has built-in microphones that listen to and record a user’s voice by taking periodic samples of speech throughout the day if users opt in to the feature.
    • Recent reports have raised concerns about the Halo’s access to this extensive personal and private health information. Among publicly available consumer health devices, the Halo appears to collect an unprecedented level of personal information. This raises questions about the extent to which the tracker’s transmission of biological data may reveal private information regarding the user’s health conditions and how this information can be used. Last year, a study by BMJ (formerly the British Medical Journal) found that 79 percent of health apps studied by researchers were found to share user data in a manner that failed to provide transparency about the data being shared. The study concluded that health app developers routinely share consumer data with third-parties and that little transparency exists around such data sharing.
    • Klobuchar asked the Secretary of Health and Human Services Alex Azar II to “respond to the following questions:
      • What actions is HHS taking to ensure that fitness trackers like Halo safeguard users’ private health information?
      • What authority does HHS have to ensure the security and privacy of consumer data collected and analyzed by health tracking devices like Amazon’s Halo?
      • Are additional regulations required to help strengthen privacy and security protections for consumers’ personal health data given the rise of health tracking devices? Why or why not?
      • Please describe in detail what additional authority or resources that the HHS could use to help ensure the security and protection of consumer health data obtained through health tracking devices like the Halo.

Coming Events

  • On 15 December, the Senate Judiciary Committee’s Intellectual Property Subcommittee will hold a hearing titled “The Role of Private Agreements and Existing Technology in Curbing Online Piracy” with these witnesses:
    • Panel I
      • Ms. Ruth Vitale, Chief Executive Officer, CreativeFuture
      • Mr. Probir Mehta, Head of Global Intellectual Property and Trade Policy, Facebook, Inc.
      • Mr. Mitch Glazier, Chairman and CEO, Recording Industry Association of America
      • Mr. Joshua Lamel, Executive Director, Re:Create
    • Panel II
      • Ms. Katherine Oyama, Global Director of Business Public Policy, YouTube
      • Mr. Keith Kupferschmid, Chief Executive Officer, Copyright Alliance
      • Mr. Noah Becker, President and Co-Founder, AdRev
      • Mr. Dean S. Marks, Executive Director and Legal Counsel, Coalition for Online Accountability
  • The Senate Armed Services Committee’s Cybersecurity Subcommittee will hold a closed briefing on Department of Defense Cyber Operations on 15 December with these witnesses:
    • Mr. Thomas C. Wingfield, Deputy Assistant Secretary of Defense for Cyber Policy, Office of the Under Secretary of Defense for Policy
    • Mr. Jeffrey R. Jones, Vice Director, Command, Control, Communications and Computers/Cyber, Joint Staff, J-6
    • Ms. Katherine E. Arrington, Chief Information Security Officer for the Assistant Secretary of Defense for Acquisition, Office of the Under Secretary of Defense for Acquisition and Sustainment
    • Rear Admiral Jeffrey Czerewko, United States Navy, Deputy Director, Global Operations, J39, J3, Joint Staff
  • The Senate Banking, Housing, and Urban Affairs Committee’s Economic Policy Subcommittee will conduct a hearing titled “US-China: Winning the Economic Competition, Part II” on 16 December with these witnesses:
    • The Honorable Will Hurd, Member, United States House of Representatives;
    • Derek Scissors, Resident Scholar, American Enterprise Institute;
    • Melanie M. Hart, Ph.D., Senior Fellow and Director for China Policy, Center for American Progress; and
    • Roy Houseman, Legislative Director, United Steelworkers (USW).
  • On 17 December the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency’s (CISA) Information and Communications Technology (ICT) Supply Chain Risk Management (SCRM) Task Force will convene for a virtual event, “Partnership in Action: Driving Supply Chain Security.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Naya Shaw from Pexels

Privacy Shield Hearing

The focus was on how the U.S. and EU can reach agreement on an arrangement that will not be struck down by the EU’s highest court.

Last week, the Senate Commerce, Science, and Transportation Committee held a hearing on the now invalidated European Union (EU)-United States (U.S.) Privacy Shield, a mechanism that allowed companies to transfer the personal data of EU residents to the U.S. The EU’s highest court struck down the adequacy decision that underpinned the system on the basis of U.S. surveillance activities and a lack of redress that violated EU law. This is the second time in a decade the EU’s top court has invalidated a transfer arrangement, the first being the Safe Harbor system. Given the estimated billions, or even trillions, of dollars in value realized from data flows between the EU and U.S., there is keen interest on both sides of the Atlantic in finding a legal path forward. However, absent significant curtailment of U.S. surveillance and/or a significant expansion of the means by which EU nationals could have violations of their rights rectified, it would appear a third agreement may not withstand the inevitable legal challenges. Moreover, there are questions as to the legality of other transfer tools in light of the Court of Justice of the European Union’s (CJEU) decision in the case known as Schrems II, and some Standard Contractual Clauses (SCC) and Binding Corporate Rules (BCR) may soon be found in violation of EU law, too.

Consequently, a legislative fix, in whole or in part, could be attached to federal privacy legislation, and the striking down of Privacy Shield may provide additional impetus for Congress and the next Administration to reach a deal on privacy. Moreover, the lapsed reauthorization of some Foreign Intelligence Surveillance Act (FISA) authorities may present another legislative opportunity for the U.S. to craft an approach amenable to the EU in order to obtain either an adequacy decision or a successor agreement to Privacy Shield.

Chair Roger Wicker (R-MS) approached the issue from the perspective of international trade and the economic benefits accruing to businesses on both sides of the Atlantic; his opening remarks dwelt less on the privacy and surveillance aspects of the CJEU’s ruling. Wicker made the case that the EU misunderstands U.S. redress rights, which he deems more than adequate, and that the U.S. surveillance regime is similar to those of some EU nations. One wonders if the CJEU is inclined to agree with this position. Nonetheless, Wicker expressed hope that the EU and U.S. can reach “a durable and lasting data transfer framework…that provides meaningful data protections to consumers, sustains the free flow of information across the Atlantic, and encourages continued economic and strategic partnership with our European allies – a tall order but an essential order.” He worried about the effect of the CJEU’s ruling on SCCs. Wicker made the case that the EU and U.S. share democratic values and hinted that the ongoing talks in the committee to reach a federal data privacy law might include augmented redress rights that could satisfy the CJEU.

Ranking Member Maria Cantwell (D-WA) spoke very broadly about a range of issues related to data transfers and privacy. She stressed the importance of data flows in the context of larger trade relations. Cantwell also stressed the shared values between the U.S. and the EU and her hope that the two entities work “together on these very important national concerns, trade and technology, so that we can continue to improve economic opportunities and avoid moves towards protectionism.” She also called for federal privacy legislation but hinted that states should still be able to regulate privacy, suggesting her commitment to having a federal law be a floor for state laws. Cantwell also asserted that bulk surveillance, the likes of which the National Security Agency has engaged in, may simply not be legal under EU law.

Deputy Assistant Secretary of Commerce for Services James Sullivan blurred the issues presented by Schrems II much like Cantwell did. The CJEU’s decision that focused on U.S. surveillance practices and the lack of meaningful recourse in the U.S. if an EU resident’s rights were violated was merged into a call for like-minded nations to unite against authoritarian nations. Sullivan distinguished between U.S. surveillance and the surveillance conducted by the People’s Republic of China (without naming the nation) and other regimes as if this should satisfy the EU as to the legality and propriety of U.S. treatment of EU personal data. Sullivan stated:

  • The Schrems II decision has created enormous uncertainties for U.S. companies and the transatlantic economy at a particularly precarious time. Immediately upon issuance of the ruling, the 5,400 Privacy Shield participants and their business partners in the EU could no longer rely on the Framework as a lawful basis for transferring personal data from Europe to the United States. Because neither the Court nor European data protection authorities provided for any enforcement grace period, Privacy Shield companies were left with three choices: (1) risk facing potentially huge fines (of up to 4 percent of total global turnover in the preceding year) for violating GDPR, (2) withdraw from the European market, or (3) switch right away to another more expensive data transfer mechanism.
  • Unfortunately, because of the Court’s ruling in the Privacy Shield context that U.S. laws relating to government access to data do not confer adequate protections for EU personal data, the use of other mechanisms like SCCs and BCRs to transfer EU personal data to the United States is now in question as well.
  • The objective of any potential agreement between the United States and the European Commission to address Schrems II is to restore the continuity of transatlantic data flows and the Framework’s privacy protections by negotiating targeted enhancements to Privacy Shield that address the Court’s concerns in Schrems II. Any such enhancements must respect the U.S. Government’s security responsibilities to our citizens and allies.
  • To be clear, we expect that any enhancements to the Privacy Shield Framework would also cover transfers under all other EU-approved data transfer mechanisms like SCCs and BCRs as well.
  • The Schrems II decision has underscored the need for a broader discussion among likeminded democracies on the issue of government access to data. Especially as a result of the extensive U.S. surveillance reforms since 2015, the United States affords privacy protections relating to national security data access that are equivalent to or greater than those provided by many other democracies in Europe and elsewhere.
  • To minimize future disruptions to data transfers, we have engaged with the European Union and other democratic nations in a multilateral discussion to develop principles based on common practices for addressing how best to reconcile law enforcement and national security needs for data with protection of individual rights.
  • It is our view that democracies should come together to articulate shared principles regarding government access to personal data—to help make clear the distinction between democratic societies that respect civil liberties and the rule of law and authoritarian governments that engage in the unbridled collection of personal data to surveil, manipulate, and control their citizens and other individuals without regard to personal privacy and human rights. Such principles would allow us to work with like-minded partners in preserving and promoting a free and open Internet enabled by the seamless flow of data.
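
The “huge fines” Sullivan references are capped by GDPR Article 83(5) at the greater of €20 million or 4% of total worldwide annual turnover for the preceding financial year. A minimal sketch of that exposure ceiling (illustrative only; the turnover figures are hypothetical inputs, and actual fines are set case by case by data protection authorities):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound on a GDPR Article 83(5) administrative fine:
    the greater of EUR 20 million or 4% of total worldwide annual
    turnover for the preceding financial year."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)


# A hypothetical firm with EUR 2 billion in annual turnover faces
# exposure of up to EUR 80 million; a small firm is still exposed
# to the EUR 20 million floor.
print(f"EUR {max_gdpr_fine(2_000_000_000):,.0f}")  # EUR 80,000,000
print(f"EUR {max_gdpr_fine(100_000_000):,.0f}")    # EUR 20,000,000
```

The statute fixes only the ceiling; Article 83(2) lists the factors (nature, gravity, duration, intent, mitigation, and so on) regulators weigh in setting the actual amount.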

Federal Trade Commission (FTC) Commissioner Noah Joshua Phillips stressed he was speaking in a personal capacity and not for the FTC. He extolled the virtues of the “free and open” internet model in the U.S., with the double implication that it is superior both to the models of nations like the PRC and Russia and to the EU model. Phillips seemed to be advocating for persuading the EU that the U.S.’s privacy regime and civil liberties protections are stronger than any other nation’s. He also made the case, like other witnesses, that U.S. data privacy and protection regulation is more similar to the EU’s than to that of the PRC, Russia, and others. Phillips also sought to blur the issues and recast Privacy Shield in the context of the global struggle between democracies and authoritarian regimes. Phillips asserted:

  • First, we need to find a path forward after Schrems II, to permit transfers between the U.S. and EU. I want to recognize the efforts of U.S. and EU negotiators to find a replacement for Privacy Shield. While no doubt challenging, I have confidence in the good faith and commitment of public servants like Jim Sullivan, with whom I have the honor of appearing today, and our partners across the Atlantic. I have every hope and expectation that protecting cross-border data flows will be a priority for the incoming Administration, and I ask for your help in ensuring it is.
  • Second, we must actively engage with nations evaluating their approach to digital governance, something we at the FTC have done, to share and promote the benefits of a free and open Internet. There is an active conversation ongoing internationally, and at every opportunity—whether in public forums or via private assistance—we must ensure our voice and view is heard.
  • Third, we should be vocal in our defense of American values and policies. While we as Americans always look to improve our laws—and I commend the members of this committee on their important work on privacy legislation and other critical matters—we do not need to apologize to the world. When it comes to civil liberties or the enforcement of privacy laws, we are second to none. Indeed, in my view, the overall U.S. privacy framework—especially with the additional protections built into Privacy Shield—should certainly qualify as adequate under EU standards.
  • Fourth, as European leaders call to strengthen ties with the U.S., we should prioritize making our regimes compatible for the free flow of data. This extends to the data governance regimes of like-minded countries outside of Europe as well. Different nations will have different rules, but relatively minor differences need not impede mutually-beneficial commerce. We need not and should not purport to aim for a single, identical system of data governance. And we should remind our allies, and remind ourselves, that far more unites liberal democracies than divides us.
  • Fifth and finally, if we must draw lines, those lines should be drawn between allies with shared values—the U.S., Europe, Japan, Australia, and others—and those, like China and Russia, that offer a starkly different vision. I am certainly encouraged when I hear recognition of this distinction from Europe. European Data Protection Supervisor Wojciech Wiewiórowski recently noted that the U.S. is much closer to Europe than is China and that he has a preference for data being processed by countries that share values with Europe. Some here in the U.S. are even proposing agreements to solidify the relationships among technologically advanced democracies, an idea worth exploring in more detail.

Washington University Professor of Law Neil Richards stressed that the Schrems II decision spells out how the U.S. would achieve adequacy: reforming surveillance and providing meaningful redress for alleged privacy violations. Consequently, FISA would need to be rewritten and narrowed and a means for EU residents to seek relief beyond the current Ombudsman system is needed, possibly a statutory right to sue. Moreover, he asserted strong data protection and privacy laws are needed and some of the bills introduced in this Congress could fit the bill. Richards asserted:

In sum, the Schrems litigation is a creature of distrust, and while it has created problems for American law and commerce, it has also created a great opportunity. That opportunity lies before this Committee –the chance to regain American leadership in global privacy and data protection by passing a comprehensive law that provides appropriate safeguards, enforceable rights, and effective legal remedies for consumers. I believe that the way forward can not only safeguard the ability to share personal data across the Atlantic, but it can do so in a way that builds trust between the United States and our European trading partners and between American companies and their American and European customers. I believe that there is a way forward, but it requires us to recognize that strong, clear, trust-building rules are not hostile to business interest, that we need to push past the failed system of “notice and choice,” that we need to preserve effective consumer remedies and state-level regulatory innovation, and seriously consider a duty of loyalty. In that direction, I believe, lies not just consumer protection, but international cooperation and economic prosperity.

Georgia Tech University Professor Peter Swire explained that the current circumstances make the next Congress the best possibility in memory to enact privacy legislation because of the need for a Privacy Shield replacement, passage of the new California Privacy Rights Act (Proposition 24), and the Biden Administration’s likely support for such legislation. Swire made the following points:

  1. The European Data Protection Board in November issued draft guidance with an extremely strict interpretation of how to implement the Schrems II case.
  2. The decision in Schrems II is based on EU constitutional law. There are varying current interpretations in Europe of what is required by Schrems II, but constitutional requirements may restrict the range of options available to EU and U.S. policymakers.
  3. Strict EU rules about data transfers, such as the draft EDPB guidance, would appear to result in strict data localization, creating numerous major issues for EU- and U.S.-based businesses, as well as affecting many online activities of EU individuals.
  4. Along with concerns about lack of individual redress, the CJEU found that the EU Commission had not established that U.S. surveillance was “proportionate” in its scope and operation. Appendix 2 to this testimony seeks to contribute to an informed judgment on proportionality, by cataloguing developments in U.S. surveillance safeguards since the Commission’s issuance of its Privacy Shield decision in 2016.
  5. Negotiating an EU/U.S. adequacy agreement is important in the short term.
  6. A short-run agreement would assist in creating a better overall long-run agreement or agreements.
  7. As the U.S. considers its own possible legal reforms in the aftermath of Schrems II, it is prudent and a normal part of negotiations to seek to understand where the other party – the EU – may have flexibility to reform its own laws.
  8. Issues related to Schrems II have largely been bipartisan in the U.S., with substantial continuity across the Obama and Trump administrations, and expected as well for a Biden administration.
  9. Passing comprehensive privacy legislation would help considerably in EU/U.S. negotiations.
  10. This Congress may have a unique opportunity to enact comprehensive commercial privacy legislation for the United States.


Image by Dooffy Design from Pixabay

Further Reading, Other Developments, and Coming Events (7 December)

Further Reading

  • Facebook steps up campaign to ban false information about coronavirus vaccines” By Elizabeth Dwoskin — The Washington Post. In its latest step to find and remove lies, misinformation, and disinformation, the social media giant is now committing to removing and blocking untrue material about COVID-19 vaccines, especially from the anti-vaccine community. Will the next step be to take on anti-vaccination proponents generally?
  • Comcast’s 1.2 TB data cap seems like a ton of data—until you factor in remote work” By Rob Pegoraro — Fast Company. Despite many people and children working and learning from home, Comcast is reimposing a 1.2 terabyte limit on data for homes. Sounds like quite a lot until you factor in video meetings, streaming, etc. So far, other providers have not set a cap.
  • Google’s star AI ethics researcher, one of a few Black women in the field, says she was fired for a critical email” By Drew Harwell and Nitasha Tiku — The Washington Post. Timnit Gebru, a top-flight artificial intelligence (AI) computer scientist, was fired after questioning Google’s review of a paper she wanted to present at an AI conference, a paper likely critical of the company’s AI projects. Google claims she resigned, but Gebru says she was fired. She has long been an advocate for women and minorities in tech and AI, and her ouster will likely only increase scrutiny of, and questions about, Google’s commitment to diversity and an ethical approach to the development and deployment of AI. It will also probably deepen employee disenchantment with the company, following the protests over Google’s involvement with the United States Department of Defense’s Project Maven and its hiring of former United States Department of Homeland Security chief of staff Miles Taylor, who was involved with the policies that resulted in caging children and separating families at the southern border of the United States.
  • Humans Can Help Clean Up Facebook and Twitter” By Greg Bensinger — The New York Times. In this opinion piece, the argument is made that if social media platforms redeployed their human monitors to the accounts that violate terms of service most frequently (e.g., President Donald Trump) and more aggressively labeled and removed untrue or inflammatory content, they would have a greater impact on lies, misinformation, and disinformation.
  • Showdown looms over digital services tax” By Ashley Gold — Axios. Because the Organization for Economic Cooperation and Development (OECD) has not reached a deal on digital services taxes, a number of United States (U.S.) allies could move forward with taxes on U.S. multinationals like Amazon, Google, and Apple. The Trump Administration has taken an adversarial position, threatening to retaliate against countries like France that have enacted such a tax but suspended collection during the OECD negotiations. The U.S. also withdrew from talks. It is probable the Biden Administration will be more willing to work in a multilateral fashion and may strike a deal on an issue that is not going away, as the United Kingdom, Italy, and Canada also have plans for a digital tax.
  • Trump’s threat to veto defense bill over social-media protections is heading to a showdown with Congress” By Karoun Demirjian and Tony Romm — The Washington Post. I suppose I should mention the President’s demand that the FY 2021 National Defense Authorization Act (NDAA) contain a repeal of 47 U.S.C. 230 (Section 230 of the Communications Act), which came at the eleventh hour and fifty-ninth minute of negotiations on a final version of the bill. Via Twitter, Donald Trump threatened to veto the bill, which has been passed annually for decades. Republicans were not having it, however, even those who agreed with Trump’s desire to remove liability protection for technology companies. And yet, if Trump continues to insist on a repeal, Republicans may find themselves in a bind, and the bill could conceivably be pulled until President-elect Joe Biden is sworn in. On the other hand, Trump’s veto threats over renaming military bases currently bearing the names of Confederate figures have not been renewed even though the final version of the bill contains language instituting a process to do just that.

Other Developments

  • The Senate Judiciary Committee held over its most recent bill to narrow 47 U.S.C. 230 (Section 230 of the Communications Act), which provides liability protection for technology companies for third-party material posted on their platforms and for decisions to edit, alter, or remove such content. The committee opted to hold the “Online Content Policy Modernization Act” (S.4632), which may mean the bill’s chances of making it to the Senate floor are low. What’s more, even if the Senate passes Section 230 legislation, it is not clear there will be sufficient agreement with Democrats in the House to get a final bill to the President before the end of this Congress. On 1 October, the committee also decided to hold over the bill to try to reconcile the fifteen amendments submitted for consideration. The Committee could soon meet again to formally mark up and report out this legislation.
    • At the earlier hearing, Chair Lindsey Graham (R-SC) submitted an amendment revising the bill’s reforms to Section 230 that incorporates some of the amendments below but also includes new language. For example, the amendment includes a definition of “good faith,” a term not currently defined in Section 230. The term would be construed as a platform taking down or restricting content only according to its publicly available terms of service, not as a pretext, and equally as to all similarly situated content. Moreover, good faith would require alerting the user and giving him or her an opportunity to respond, subject to certain exceptions. The amendment also makes clear that certain existing causes of action remain available to users (e.g., suing for breach of contract).
    • Senator Mike Lee (R-UT) offered a host of amendments:
      • EHF20913 would remove “user[s]” from the reduced liability shield online platforms would receive under the bill; consequently, users would retain the current protection and still not be legally liable for content posted by other users.
      • EHF20914 would revise the language regarding the type of content platforms could take down with legal protection, making clear it would be not just “unlawful” content but content “in violation of a duly enacted law of the United States,” possibly meaning federal laws and not state laws. Or, more likely, the intent is to foreclose the possibility that a platform could claim it was acting in accordance with a foreign law and still assert immunity.
      • EHF20920 would add language making clear that taking down material that violates terms of service or use according to an objectively reasonable belief would be shielded from liability.
      • OLL20928 would expand legal protection to platforms for removing or restricting spam.
      • OLL20929 would bar the Federal Communications Commission (FCC) from a rulemaking on Section 230.
      • OLL20930 adds language making clear if part of the revised Section 230 is found unconstitutional, the rest of the law would still be applicable.
      • OLL20938 revises the definition of an “information content provider,” the Section 230 term of art for a party responsible for the creation or development of content, to expand the circumstances in which platforms may themselves be deemed content providers and consequently liable to suit.
    • Senator Josh Hawley (R-MO) offered an amendment that would create a new right of action for people to sue large platforms for taking down their content if not done in “good faith.” The amendment limits this right to “edge providers,” platforms with more than 30 million users in the U.S., 300 million users worldwide, and revenues of more than $1.5 billion. This would likely exclude all platforms except Twitter, Facebook, Instagram, TikTok, Snapchat, and a few others.
    • Senator John Kennedy (R-LA) offered an amendment that would remove all Section 230 legal immunity from platforms that collect personal data and then use an “automated function” to deliver targeted or tailored content to a user unless the user “knowingly and intentionally elect[s]” to receive such content.
  • The Massachusetts Institute of Technology’s (MIT) Work of the Future Task Force issued its final report and drew the following conclusions:
    • Technological change is simultaneously replacing existing work and creating new work. It is not eliminating work altogether.
    • Momentous impacts of technological change are unfolding gradually.
    • Rising labor productivity has not translated into broad increases in incomes because labor market institutions and policies have fallen into disrepair.
    • Improving the quality of jobs requires innovation in labor market institutions.
    • Fostering opportunity and economic mobility necessitates cultivating and refreshing worker skills.
    • Investing in innovation will drive new job creation, speed growth, and meet rising competitive challenges.
    • The Task Force stated:
      • In the two-and-a-half years since the Task Force set to work, autonomous vehicles, robotics, and AI have advanced remarkably. But the world has not been turned on its head by automation, nor has the labor market. Despite massive private investment, technology deadlines have been pushed back, part of a normal evolution as breathless promises turn into pilot trials, business plans, and early deployments — the diligent, if prosaic, work of making real technologies work in real settings to meet the demands of hard-nosed customers and managers.
      • Yet, if our research did not confirm the dystopian vision of robots ushering workers off of factory floors or artificial intelligence rendering superfluous human expertise and judgment, it did uncover something equally pernicious: Amidst a technological ecosystem delivering rising productivity, and an economy generating plenty of jobs (at least until the COVID-19 crisis), we found a labor market in which the fruits are so unequally distributed, so skewed towards the top, that the majority of workers have tasted only a tiny morsel of a vast harvest.
      • As this report documents, the labor market impacts of technologies like AI and robotics are taking years to unfold. But we have no time to spare in preparing for them. If those technologies deploy into the labor institutions of today, which were designed for the last century, we will see similar effects to recent decades: downward pressure on wages, skills, and benefits, and an increasingly bifurcated labor market. This report, and the MIT Work of the Future Task Force, suggest a better alternative: building a future for work that harvests the dividends of rapidly advancing automation and ever-more powerful computers to deliver opportunity and economic security for workers. To channel the rising productivity stemming from technological innovations into broadly shared gains, we must foster institutional innovations that complement technological change.
  • The European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski published his “preliminary opinion on the European Commission’s (EC) Communication on “A European strategy for data” and the creation of a common space in the area of health, namely the European Health Data Space (EHDS).” The EDPS lauded the goal of the EHDS: “the prevention, detection and cure of diseases, as well as for evidence-based decisions in order to enhance effectiveness, accessibility and sustainability of the healthcare systems.” However, Wiewiórowski articulated his concern that the EC needs to think through the applicability of the General Data Protection Regulation (GDPR), among other European Union (EU) laws, before it can legally move forward. The EDPS stated:
    • The EDPS calls for the establishment of a thought-through legal basis for the processing operations under the EHDS in line with Article 6(1) GDPR and also recalls that such processing must comply with Article 9 GDPR for the processing of special categories of data.
    • Moreover, the EDPS highlights that due to the sensitivity of the data to be processed within the EHDS, the boundaries of what constitutes a lawful processing and a compatible further processing of the data must be crystal-clear for all the stakeholders involved. Therefore, the transparency and the public availability of the information relating to the processing on the EHDS will be key to enhance public trust in the EHDS.
    • The EDPS also calls on the Commission to clarify the roles and responsibilities of the parties involved and to clearly identify the precise categories of data to be made available to the EHDS. Additionally, he calls on the Member States to establish mechanisms to assess the validity and quality of the sources of the data.
    • The EDPS underlines the importance of vesting the EHDS with a comprehensive security infrastructure, including both organisational and state-of-the-art technical security measures to protect the data fed into the EHDS. In this context, he recalls that Data Protection Impact Assessments may be a very useful tool to determine the risks of the processing operations and the mitigation measures that should be adopted.
    • The EDPS recommends paying special attention to the ethical use of data within the EHDS framework, for which he suggests taking into account existing ethics committees and their role in the context of national legislation.
    • The EDPS is convinced that the success of the EHDS will depend on the establishment of a strong data governance mechanism that provides for sufficient assurances of a lawful, responsible, ethical management anchored in EU values, including respect for fundamental rights. The governance mechanism should regulate, at least, the entities that will be allowed to make data available to the EHDS, the EHDS users, the Member States’ national contact points/ permit authorities, and the role of DPAs within this context.
    • The EDPS is interested in policy initiatives to achieve ‘digital sovereignty’ and has a preference for data being processed by entities sharing European values, including privacy and data protection. Moreover, the EDPS calls on the Commission to ensure that the stakeholders taking part in the EHDS, and in particular, the controllers, do not transfer personal data unless data subjects whose personal data are transferred to a third country are afforded a level of protection essentially equivalent to that guaranteed within the European Union.
    • The EDPS calls on Member States to guarantee the effective implementation of the right to data portability specifically in the EHDS, together with the development of the necessary technical requirements. In this regard, he considers that a gap analysis might be required regarding the need to integrate the GDPR safeguards with other regulatory safeguards, provided e.g. by competition law or ethical guidelines.
  • The Office of Management and Budget (OMB) extended a guidance memorandum directing agencies to consolidate data centers after Congress pushed back the sunset date for the program. OMB extended OMB Memorandum M-19-19, Update to Data Center Optimization Initiative (DCOI), through 30 September 2022; the memorandum applies “to the 24 Federal agencies covered by the Chief Financial Officers (CFO) Act of 1990, which includes the Department of Defense.” The DCOI was codified in the “Federal Information Technology Acquisition Reform Act” (FITARA) (P.L. 113-291) and extended in 2018 until 1 October 2020, and this sunset was pushed back another two years in the FY 2020 National Defense Authorization Act (NDAA) (P.L. 116-92).
    • In March 2020, the Government Accountability Office (GAO) issued another of its periodic assessments of the DCOI, started in 2012 by the Obama Administration to shrink the federal government’s footprint of data centers, increase efficiency and security, save money, and reduce energy usage.
    • The GAO found that 23 of the 24 agencies participating in the DCOI met or planned to meet their FY 2019 goals to close 286 of the 2,727 data centers considered part of the DCOI. This latter figure deserves some discussion, for the Trump Administration changed the definition of what counts as a data center to exclude smaller ones (so-called non-tiered data centers). GAO asserted that “recent OMB DCOI policy changes will reduce the number of data centers covered by the policy and both OMB and agencies may lose important visibility over the security risks posed by these facilities.” Nonetheless, these agencies are projecting savings of $241.5 million once all 286 data centers planned for closure in FY 2019 actually close. It bears noting that the GAO admitted in a footnote it “did not independently validate agencies’ reported cost savings figures,” so these numbers may not be reliable.
    • In terms of how to improve the DCOI, the GAO stated that “[i]n addition to reiterating our prior open recommendations to the agencies in our review regarding their need to meet DCOI’s closure and savings goals and optimization metrics, we are making a total of eight new recommendations—four to OMB and four to three of the 24 agencies. Specifically:
      • The Director of the Office of Management and Budget should (1) require that agencies explicitly document annual data center closure goals in their DCOI strategic plans and (2) track those goals on the IT Dashboard. (Recommendation 1)
      • The Director of the Office of Management and Budget should require agencies to report in their quarterly inventory submissions those facilities previously reported as data centers, even if those facilities are not subject to the closure and optimization requirements of DCOI. (Recommendation 2)
      • The Director of the Office of Management and Budget should document OMB’s decisions on whether to approve individual data centers when designated by agencies as either a mission critical facility or as a facility not subject to DCOI. (Recommendation 3)
      • The Director of the Office of Management and Budget should take action to address the key performance measurement characteristics missing from the DCOI optimization metrics, as identified in this report. (Recommendation 4)
  • Australia’s Inspector-General of Intelligence and Security (IGIS) released its first report on how well the nation’s security services did in observing the law with respect to COVID app data. The IGIS “is satisfied that the relevant agencies have policies and procedures in place and are taking reasonable steps to avoid intentional collection of COVID app data.” The IGIS revealed that “[i]ncidental collection in the course of the lawful collection of other data has occurred (and is permitted by the Privacy Act); however, there is no evidence that any agency within IGIS jurisdiction has decrypted, accessed or used any COVID app data.” The IGIS is also “satisfied that the intelligence agencies within IGIS jurisdiction which have the capability to incidentally collect at least some types of COVID app data:
    • Are aware of their responsibilities under Part VIIIA of the Privacy Act and are taking active steps to minimise the risk that they may collect COVID app data.
    • Have appropriate policies and procedures in place to respond to any incidental collection of COVID app data that they become aware of.
    • Are taking steps to ensure any COVID app data is not accessed, used or disclosed.
    • Are taking steps to ensure any COVID app data is deleted as soon as practicable.
    • Have not decrypted any COVID app data.
    • Are applying the usual security measures in place in intelligence agencies such that a ‘spill’ of any data, including COVID app data, is unlikely.
  • New Zealand’s Government Communications Security Bureau’s National Cyber Security Centre (NCSC) has released its annual Cyber Threat Report, which found that “nationally significant organisations continue to be frequently targeted by malicious cyber actors of all types…[and] state-sponsored and non-state actors targeted public and private sector organisations to steal information, generate revenue, or disrupt networks and services.” The NCSC added:
    • Malicious cyber actors have shown their willingness to target New Zealand organisations in all sectors using a range of increasingly advanced tools and techniques. Newly disclosed vulnerabilities in products and services, alongside the adoption of new services and working arrangements, are rapidly exploited by state-sponsored actors and cyber criminals alike. A common theme this year, which emerged prior to the COVID-19 pandemic, was the exploitation of known vulnerabilities in internet-facing applications, including corporate security products, remote desktop services and virtual private network applications.
  • The former Director of the United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) wrote an opinion piece disputing President Donald Trump’s claims that the 2020 Presidential Election was fraudulent. Christopher Krebs asserted:
    • While I no longer regularly speak to election officials, my understanding is that in the 2020 results no significant discrepancies attributed to manipulation have been discovered in the post-election canvassing, audit and recount processes.
    • This point cannot be emphasized enough: The secretaries of state in Georgia, Michigan, Arizona, Nevada and Pennsylvania, as well as officials in Wisconsin, all worked overtime to ensure there was a paper trail that could be audited or recounted by hand, independent of any allegedly hacked software or hardware.
    • That’s why Americans’ confidence in the security of the 2020 election is entirely justified. Paper ballots and post-election checks ensured the accuracy of the count. Consider Georgia: The state conducted a full hand recount of the presidential election, a first of its kind, and the outcome of the manual count was consistent with the computer-based count. Clearly, the Georgia count was not manipulated, resoundingly debunking claims by the president and his allies about the involvement of CIA supercomputers, malicious software programs or corporate rigging aided by long-gone foreign dictators.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Daniel Schludi on Unsplash