Further Reading, Other Developments, and Coming Events (10 February 2021)

Further Reading

  • “A Hacker Tried to Poison a Florida City’s Water Supply, Officials Say” By Andy Greenberg — WIRED. Given that most water and sewage systems, including their operational systems, are linked to the internet, it is surprising these sorts of incidents do not occur more frequently.
  • “UK regulator to write to WhatsApp over Facebook data sharing” By Alex Hern — The Guardian. The United Kingdom’s (UK) Information Commissioner Elizabeth Denham said her agency, the Information Commissioner’s Office (ICO), will be pressing Facebook to keep WhatsApp user data separate. Now that the UK has exited the European Union (EU), it is no longer bound by the EU’s system that made Ireland’s Data Protection Commission the lead regulator for Facebook and WhatsApp. And so, WhatsApp’s 2017 commitment not to hand over user data to Facebook until it was compliant with the General Data Protection Regulation (GDPR) falls to the ICO to oversee in the UK.
  • “Telegram, Pro-Democracy Tool, Struggles Over New Fans From Far Right” By Michael Schwirtz — The New York Times. The same features that make the messaging app Telegram ideal for warding off attempts by authoritarian regimes to shut down communication make the platform ideal for right-wing extremists in the United States (U.S.). Federal and state authorities may see their attempts to track and monitor domestic terrorism hit the same roadblocks that foiled Moscow’s and Tehran’s attempts to crack down on Telegram. The platform uses end-to-end encrypted communications and has servers all over the world.
  • “Exclusive: The end of the Maher era at Wikipedia” By Felix Salmon — Axios. The CEO who revitalized Wikimedia is leaving the organization stronger than she found it.
  • “After Defending Its Low-Cost Internet Offering, Comcast Agrees To Increase Speeds” By Caroline O’Donovan — BuzzFeed News. The bad publicity seems to have worked on Comcast, as the company is now meeting most of the demands of activists, students, and officials by increasing the speed of its low-cost broadband option. Comcast said the changes will take effect on 1 March.

Other Developments

  • The Federal Communications Commission (FCC) announced that it is “seeking comment on several petitions requesting permission to use E-Rate program funds to support remote learning during the pandemic.” Comments are due by 16 February and reply comments are due by 23 February. The FCC explained:
    • Today’s Public Notice from the FCC’s Wireline Competition Bureau highlights three petitions that cover the bulk of issues presented in other petitions filed with the Commission.  These include petitions filed by a coalition of E-Rate stakeholders led by the Schools, Health & Libraries Broadband (SHLB) Coalition; a petition filed on behalf of the State of Colorado; and a petition filed by the State of Nevada, Nevada Board of Education and Nevada Department of Education. 
    • The FCC noted:
      • The E-Rate program was authorized by Congress as part of the Telecommunications Act of 1996 (the Telecommunications Act), and created by the Commission in 1997 to, among other things, enhance, to the extent technically feasible and economically reasonable, access to advanced telecommunications and information services for all public and nonprofit elementary and secondary schools and libraries. Under the E-Rate program, eligible schools, libraries, and consortia (comprised of eligible schools and libraries) may request universal service discounts for eligible services and/or equipment (collectively, eligible services), including connections necessary to support broadband connectivity to eligible schools and libraries. Eligible services must be used “primarily for educational purposes.” In the case of schools, “educational purposes” is defined as “activities that are integral, immediate, and proximate to the education of students.” In the case of libraries, “educational purposes” is defined as activities that are “integral, immediate, and proximate to the provision of library services to library patrons.”
      • As the pandemic continues to force schools and libraries across the country to remain closed and rely on remote learning and virtual services, either in whole or in part, the need for broadband connections—particularly for those students, teachers, staff, and patrons that lack an adequate connection at home—is more critical than ever.  Eligible schools and libraries explain that they are hampered in their ability to address the connectivity needs brought on, and in many cases exacerbated, by COVID-19 because of the restrictions on off-campus use of E-Rate-funded services and facilities.   Last spring, as the COVID-19 pandemic forced schools and libraries to grapple with the challenges of transitioning to remote learning, the FCC began to receive requests for emergency relief aimed at ensuring that all students have sufficient connectivity at home.
  • The European Commission’s (EC) President appealed to the United States (U.S.) to join the European Union (EU) in jointly regulating technology. At the Davos Agenda, EC President Ursula von der Leyen made remarks, a significant portion of which focused on technological issues and the EU’s proposals, the Digital Services Act and the Digital Markets Act. It is unclear to what extent the new administration in Washington will be willing to work with the EU. Undoubtedly, the Biden Administration will interpret a number of EU policies and decisions as being implicitly aimed at the U.S. technology sector, but there may be common ground. Von der Leyen stated:
    • A year ago at Davos, we talked also intensively about digitalisation. The pandemic has massively accelerated the process. The European Union will dedicate 20% of NextGenerationEU to digital projects. To nurture innovative ecosystems, for example where universities, companies, innovators can access data and cooperate. To boost the vibrant start-up scene we have in cities like Sofia and Lisbon and to become a global hub for Artificial Intelligence. So that the 2020s can finally be Europe’s Digital Decade.
    • But for this to be a success, we must also address the darker sides of the digital world. Like for so many of us, the storming of the Capitol came as a shock to me. We are always quick to say: Democracy and values, they are part of our DNA. And that is true. But we must nurture our democracy every day, and defend our institutions against the corrosive power of hate speech, of disinformation, fake news and incitement to violence. In a world where polarising opinions are the loudest, it is a short step from crude conspiracy theories to the death of a police officer. Unfortunately, the storming of the Capitol Hill showed us just how true that is.
    • The business model of online platforms has an impact – and not only on free and fair competition, but also on our democracies, our security and on the quality of our information. That is why we need to contain this immense power of the big digital companies. Because we want the values we cherish in the offline world also to be respected online. At its most basic, this means that what is illegal offline should be illegal online too. And we want the platforms to be transparent about how their algorithms work. Because we cannot accept that decisions, that have a far-reaching impact on our democracy, are taken by computer programmes alone.
    • Right after von der Leyen addressed the unease she and others felt about the U.S. President’s freedom of expression being abridged because of a company’s rules outside of any controlling legal framework, she stated:
      • I want to invite our friends in the United States to join our initiatives. Together, we could create a digital economy rulebook that is valid worldwide: It goes from data protection and privacy to the security of critical infrastructure. A body of rules based on our values: Human rights and pluralism, inclusion and the protection of privacy. So Europe stands ready.
      • The challenges to our democracy, the pandemic, climate change – in his inauguration speech President Joe Biden so aptly spoke of a Cascade of Crises. And indeed, we face an outstanding set of challenges. But we can meet them – if we work together. That is what we all have to learn again after four long years. That it is not a sign of weakness, to reach out and help each other, but a signal of strength.
  • Consumer Reports tried to serve as an authorized agent under the “California Consumer Privacy Act” (CCPA) (AB 375) to submit do-not-sell and opt-out requests on consumers’ behalf. The CCPA was designed to allow California residents to use services that would handle these preferences across many companies at once. In its report on the pilot program, Consumer Reports concluded:
    • Unfortunately, too many companies have made it difficult, if not impossible, for agents and consumers to submit opt-out requests. The AG should enforce companies’ compliance with the law so that the authorized agent provisions work as intended. Moreover, the AG should promulgate additional common-sense rules to make sure that opt outs are simple and effective, even when submitted by an authorized agent.
    • Consumer Reports made these recommendations:
      • The AG should hold companies accountable when they violate the law. The AG needs to hold companies accountable for failure to comply with the CCPA’s authorized agent provisions. Without a viable authorized agent option, consumers could be left to navigate complicated processes or interfaces in order to exercise their California privacy rights themselves. Enforcement will help ensure that companies work harder to make sure that they have appropriate agent flows. The AG should also step in when customer service isn’t effective, and should consider directing enforcement resources to encourage better training in this area.
      • The AG should clarify that data shared for cross-context targeted advertising is a sale, and tighten the restrictions on service providers. Many companies have exploited ambiguities in the definition of sale and the rules surrounding service providers to ignore consumers’ requests to opt out of behavioral advertising. While the newly-passed California Privacy Rights Act will largely address these loopholes, these provisions will not go into effect until January 1, 2023. Thus, the AG should exercise its broad authority to issue rules to clarify that the transfer of data between unrelated companies for any commercial purpose falls under the definition of sale. Another common way for companies to avoid honoring consumers’ right to opt out of behavioral advertising is by claiming a service provider exemption. For example, the Interactive Advertising Bureau (IAB), a trade group that represents the ad tech industry, developed a framework for companies to evade the opt out by abusing a provision in the CCPA meant to permit a company to perform certain limited services on its behalf. To address this problem, the AG should clarify that companies cannot transfer data to service providers for behavioral advertising if the consumer has opted out of sale.
      • The AG should prohibit dark patterns as outlined in the Third Set of Proposed Modifications. We appreciate that the AG has proposed to “require minimal steps to allow the consumer to opt-out” and to prohibit dark patterns, “a method that is designed with the purpose or has the substantial effect of subverting or impairing a consumer’s choice to opt-out[,]” in the Third Set of Proposed Modifications to the CCPA Regulations. This proposal should be finalized as quickly as possible. This is essential, given the difficulties that authorized agents and consumers have experienced in attempting to stop the sale of their information, as demonstrated in the study.
      • The AG should require companies to notify agents when the opt-out request has been received and when it has been honored. Too often, the company provided no information on whether or not the opt-out request had been honored. While the CCPA rules require companies to notify consumers if an opt-out request has been rejected, there is no requirement to provide notice of receipt, or notice of confirmation—nor is there guidance on how to respond to opt-out requests when the company does not possess the consumer’s data. The authorized agent was, in some cases, unable to explain to the consumer whether or not the opt-out process had been completed. To ensure that the authorized agent service is effective, companies must be required to provide notification upon receipt and completion of the opt-out request. Required notification is also important for compliance purposes. For example, the regulations require companies to comply with opt outs within 15 business days. Without providing adequate notification, there’s no way to judge whether or not the company has honored the law and to hold them accountable if not. Further, if the company does sell consumers’ personal information, but does not have personal information about the consumer who is the subject of the request, the company should be required to notify the agent that the request has been received, and that the company will honor the opt out if and when they do collect the consumer’s data. In the case of an agent opt out, the notification should go to the agent. Otherwise, the consumer could end up getting emails from hundreds, if not thousands, of different companies.
      • The AG should clarify that if an agent inadvertently submits a request incorrectly, the company should either accept it or inform the agent how to submit it appropriately. The regulations provide helpful guidance with respect to consumer access and deletion requests, which ensures that even if a consumer inadvertently submits a request incorrectly, there is a process in place to help them submit it properly. If a consumer submits a request in a manner that is not one of the designated methods of submission, or is deficient in some manner unrelated to the verification process, the business shall either: (1) Treat the request as if it had been submitted in accordance with the business’s designated manner, or (2) Provide the consumer with information on how to submit the request or remedy any deficiencies with the request, if applicable. The AG should clarify that this guidance applies to all authorized agent-submitted requests as well.
  • The Government Accountability Office (GAO) assessed the Department of Defense’s (DOD) efforts to transition to a more secure version of the Global Positioning System (GPS), an initiative that spans back to the administration of former President George W. Bush. The GAO stated “due to the complexity of the technology, M-code remains years away from being widely fielded across DOD. M-code-capable receiver equipment includes different components, and the development and manufacture of each is key to the modernization effort. These include:
    • special M-code application-specific integrated circuit chips,
    • special M-code receiver cards, being developed under the Air Force Military GPS User Equipment (MGUE) programs, and
    • the next generation of GPS receivers capable of using M-code signals from GPS satellites.”
    • The GAO added:
      • DOD will need to integrate all of these components into different types of weapon systems… Integration across DOD will be a considerable effort involving hundreds of different weapon systems, including some with complex and unique integration needs or configurations.
    • The GAO further asserted:
      • The Air Force is almost finished—approximately one year behind schedule—developing and testing one M-code card for testing on the Marine Corps Joint Light Tactical Vehicle and the Army Stryker vehicle. However, one card intended for use in aircraft and ships is significantly delayed and missed key program deadlines. The Air Force is revising its schedule for testing this card.
      • The M-code card development delays have had ripple effects on GPS receiver modernization efforts and the weapon systems that intend to use them.
  • The advocate whose cases brought down both the Safe Harbor and Privacy Shield agreements between the United States (U.S.) and European Union (EU) announced that Ireland’s Data Protection Commission (DPC) has agreed to finally decide on the legality of Facebook’s data transfers to the U.S., the very transfers that gave rise to both lawsuits. His organization, none of your business (noyb), detailed the development in a press release. Last fall, noyb announced “[t]he Irish High Court has granted leave for a “Judicial Review” against the Irish DPC today…[and] [t]he legal action by noyb aims to swiftly implement the [Court of Justice for the European Union (CJEU)] Decision prohibiting Facebook’s” transfer of personal data from the EU to the U.S. In September 2020, after the DPC directed Facebook to stop transferring the personal data of EU citizens to the U.S., the company filed suit in Ireland’s courts to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge.
    • In explaining the most recent development, noyb further asserted:
      • The DPC has agreed with Max Schrems’ demand to swiftly end a 7.5 year battle over EU-US data transfers by Facebook and come to a decision on Facebook’s EU-US data flows. This only came after a Judicial Review against the DPC was filed by Mr Schrems. The case would have been heard by the Irish High Court today.
      • New “own volition” procedure blocked pending complaint from 2013. The Irish DPC oversees the European operations of Facebook. In Summer 2020 the European Court of Justice (CJEU) ruled on a complaint by Mr Schrems that had been pending since 2013 and came before the CJEU for the second time (“Schrems II”): Under the CJEU judgment the DPC must stop Facebook’s EU-US data flows over extreme US Surveillance Laws (like FISA 702). Instead of implementing this ruling, the DPC started a new “own volition” case and paused the original procedure for an indefinite time. Mr Schrems and Facebook brought two Judicial Review procedures against the DPC: While Facebook argued in December that the “own volition” procedure should not go ahead, Mr Schrems argued that his complaints procedure should be heard independently of the “own volition” case.
      • Walls are closing in on Facebook’s EU-US data transfers. The DPC has now settled the second Judicial Review with Mr Schrems just a day before the hearing was to take place, and pledged to finalize his complaints procedure swiftly.
      • As part of the settlement, Mr Schrems will also be heard in the “own volition” procedure and get access to all submissions made by Facebook, should the Court allow the “own volition” investigation to go ahead. Mr Schrems and the DPC further agreed that the case will be dealt with under the GDPR, not the Irish Data Protection Act that was applicable before 2018. The DPC may await the High Court judgement in Facebook’s Judicial Review before investigating the original complaint.
      • This agreement could in essence make the original complaints procedure from 2013 the case that ultimately determines the destiny of Facebook’s EU-US transfers in the wake of the Snowden disclosures. Under the GDPR the DPC has every liberty to issue fines of up to 4% of Facebook’s global turnover and transfer prohibitions, even on the basis of this individual case.
  • The Information Technology Industry Council (ITI), BSA | The Software Alliance, Internet Association, Computer and Communications Industry Association, and the National Foreign Trade Council made recommendations to the Biden Administration on technology policy and asserted in their press release:
    • Prioritize strategic engagement with U.S. trading partners by ensuring continued protected transatlantic data flows, establishing a U.S.-EU Trade & Technology Council, engaging China through prioritization of digital and technology issues, broadening U.S. engagement and leadership in the Asia-Pacific region, addressing key barriers to digital trade with India, and providing capacity building assistance to the African Union;
    • Promote U.S. competitiveness through leadership on digital trade by countering unilateral, targeted digital taxes, building acceptance of state-of-the-art digital trade commitments, promoting workforce development initiatives globally, and more; and
    • Reassert U.S. multilateral leadership by strengthening and leveraging engagement in global fora such as the WTO, OECD, United Nations, G20, G7, APEC, and others, and by expanding existing plurilateral trade agreements.
  • A group of civil rights organizations and public interest organizations issued “Civil Rights, Privacy, and Technology: Recommended 2021 Oversight Priorities for the 117th Congress” that builds upon the October 2020 Civil Rights Principles for the Era of Big Data. These groups stated:
    • The 117th Congress must take action to ensure that technology serves all people in the United States, rather than facilitating discrimination or reinforcing existing inequities.
    • They cited the following areas of policy that need to be addressed:
      • Broadband Internet
      • Democracy: Voting, the Census, and Hateful Content Online
      • Policing and Justice
      • Immigration Surveillance Technology
      • Commercial Data Practices and Privacy
      • Workers, Labor, and Hiring
  • The United Kingdom’s (UK) Information Commissioner Elizabeth Denham sketched out how she is approaching her final year in office in a blog post. Denham stated:
    • The ICO’s immediate focus remains supporting organisations through the impacts of COVID-19. We have prioritised providing advice and support on data protection related aspects of the pandemic since the start, and will continue to do so, adjusting and responding to the new challenges the country will face until, well, ‘all this is finished’. That work includes protecting people’s rights, and making sure data protection is considered at the earliest stage of any innovations.
    • The Age Appropriate Design Code will start to have a real impact, as the transition period around its introduction comes to an end, and we will be working hard to support organisations to make the necessary changes to comply with the law.
    • We’ll also be focused on supporting organisations around data sharing, following the publication of our guidance last month. The guidance is accompanied by practical resources to help organisations share data in line with the law. As I discussed with the House of Lords Public Services Committee this month, data sharing is an important area of focus, and we will also be supporting broader work to encourage the necessary culture change to remove obstacles to data sharing.
    • Other support for organisations planned for this year includes guidance on political campaigning, facial recognition, and codes of conduct and certification schemes, as well as a digital version of our Data Protection Practitioners’ Conference in April. We’ll also have the latest phases of our grants scheme and sandbox programme. Both are an effective way of the ICO supporting original thinking around privacy, illustrated by the innovative data sharing projects we’ve recently worked with.
    • Our operational work will also continue, including the latest phases of our work looking at data broking, the use of sexual crime victims’ personal information, and adtech, including audits focused on digital marketing platforms.

Coming Events

  • On 10 February, the House Homeland Security Committee will hold a hearing titled “Homeland Cybersecurity: Assessing Cyber Threats and Building Resilience” with these witnesses:
    • Mr. Chris Krebs, Former Director, Cybersecurity and Infrastructure Security Agency, U.S. Department of Homeland Security
    • Ms. Sue Gordon, Former Principal Deputy Director of National Intelligence, Office of the Director of National Intelligence
    • Mr. Michael Daniel, President & CEO, Cyber Threat Alliance
    • Mr. Dmitri Alperovitch, Executive Chairman, Silverado Policy Accelerator
  • The House Judiciary Committee’s Antitrust, Commercial, and Administrative Law Subcommittee will hold a hearing titled “Justice Restored: Ending Forced Arbitration and Protecting Fundamental Rights” on 11 February.
  • The Federal Communications Commission’s (FCC) acting Chair Jessica Rosenworcel will hold a virtual Roundtable on the Emergency Broadband Benefit Program on 12 February, “a new program that would enable eligible households to receive a discount on the cost of broadband service and certain connected devices during the COVID-19 pandemic.” The FCC also noted “[i]n the Consolidated Appropriations Act of 2021, Congress appropriated $3.2 billion” for the program.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


EDPB and EDPS Issue Opinions On EC’s Draft SCCs

The EU’s two bloc-wide data protection entities weighed in on the EC’s proposed changes to SCCs, meant to satisfy the Schrems II ruling.

The European Union’s (EU) data protection authorities have rendered their joint opinions on the European Commission’s (EC) draft revisions of the Standard Contractual Clauses (SCC) permissible under the General Data Protection Regulation (GDPR). At present, SCCs are the primary means by which companies transfer the personal data of EU residents for processing to nations without adequacy decisions, especially the United States (U.S.). Since the adequacy decision on the U.S. was struck down, companies have been left only with SCCs, and there are efforts afoot to have the EU’s top court strike down SCCs governing the transfer of personal data to the U.S. on account of what critics call inadequate redress and protection from U.S. surveillance.

Before I turn to the European Data Protection Board (EDPB) and European Data Protection Supervisor’s (EDPS) joint opinions, some background would be helpful. In mid-2020, in a much-anticipated decision, the EU’s top court struck down the adequacy decision underpinning the U.S.-EU Privacy Shield agreement. Under the GDPR, the easiest way for a controller to transfer the personal data of EU residents for processing outside the EU is through such a decision, which essentially says the laws of the other nation are basically equivalent to the EU’s with respect to the rights they provide. The U.S. is the EU’s biggest trading partner with respect to these data flows, with companies like Facebook and Google generating billions, maybe even trillions, of dollars in economic activity. Consequently, both Washington and Brussels have many reasons to favor the easiest route to making data flows happen. However, the forerunner to Privacy Shield (i.e., Safe Harbor) was also struck down, largely because of the inadequacy of U.S. privacy rights and mass surveillance, and so the U.S. made some changes, but these, too, proved inadequate, and litigation brought by Austrian activist and privacy advocate Maximillian Schrems against Facebook finally made its way to the Court of Justice for the European Union (CJEU).

In a summary of its decision in Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems, Case C-311/18 (Schrems II), the CJEU explained:

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

Ultimately, the CJEU found the U.S. lacks the requisite safeguards needed under EU law, and so the general means of transferring the data of EU citizens from the EU to the U.S. was essentially struck down. This marked the second time in the last five years such an agreement had been found to violate EU law. However, the CJEU left open the question of whether SCCs may permit the continued exporting of EU personal data into the U.S. for companies like Facebook, Google, and many, many others. Consequently, there has been no small amount of interpreting and questioning of whether this may be a way for the trans-Atlantic data flow to continue. And yet, the CJEU seemed clear that additional measures would likely be necessary. Indeed, the CJEU asserted “[c]ontrollers and processors should be encouraged to provide additional safeguards via contractual commitments that supplement standard protection clauses” and “[i]n so far as those standard data protection clauses cannot, having regard to their very nature, provide guarantees beyond a contractual obligation to ensure compliance with the level of protection required under EU law, they may require, depending on the prevailing position in a particular third country, the adoption of supplementary measures by the controller in order to ensure compliance with that level of protection.”

Thereafter, the EC stepped into the breach to shore up SCCs and protect them from the same fate as Privacy Shield, for it seems like only a matter of time before the legality of SCCs is challenged. In mid-November 2020, the EC released for comment a draft revision of SCCs for transfers of personal data to countries outside the EU, with input due by 10 December. The EC had last revised EU law on SCCs in 2010, some years before the GDPR came into force. The EC released draft legislative language and, in an Annex, actual contract language for use by controllers and processors in the form of modules designed for a variety of common circumstances (e.g., transfers from controllers to other controllers or from a controller to a processor). However, the EC stressed that SCCs form a floor, and controllers, processors, and other parties are free to add language so long as it does not contradict or denigrate the rights protected by SCCs.

In the implementing decision, the EC asserted

the standard contractual clauses needed to be updated in light of new requirements in Regulation (EU) 2016/679. Moreover, since the adoption of these decisions, important developments have taken place in the digital economy, with the widespread use of new and more complex processing operations often involving multiple data importers and exporters, long and complex processing chains as well as evolving business relationships. This calls for a modernisation of the standard contractual clauses to better reflect those realities, by covering additional processing and transfer situations and to use a more flexible approach, for example with respect to the number of parties able to join the contract.

The EC continued:

The standard contractual clauses set out in the Annex to this Decision may be used by a controller or a processor in order to provide appropriate safeguards within the meaning of Article 46(1) of Regulation (EU) 2016/679 for the transfer of personal data to a processor or a controller established in a third country. This also includes the transfer of personal data by a controller or processor not established in the Union, to the extent that the processing is subject to Regulation (EU) 2016/679 pursuant to Article 3(2) thereof, because it relates to the offering of goods or services to data subjects in the Union or the monitoring of their behaviour as far as their behaviour takes place within the Union.

The EC explained the design and intent of the SCC language in the Annex:

  • The standard contractual clauses set out in the Annex to this Decision combine general clauses with a modular approach to cater for various transfer scenarios and the complexity of modern processing chains. In addition to the general clauses, controllers and processors should select the module applicable to their situation, which makes it possible to tailor their obligations under the standard contractual clauses to their corresponding role and responsibilities in relation to the data processing at issue. It should be possible for more than two parties to adhere to the standard contractual clauses. Moreover, additional controllers and processors should be allowed to accede to the standard contractual clauses as data exporters or importers throughout the life cycle of the contract of which those clauses form a part.
  • These Clauses set out appropriate safeguards, including enforceable data subject rights and effective legal remedies, pursuant to Article 46(1), and Article 46 (2)(c) of Regulation (EU) 2016/679 and, with respect to data transfers from controllers to processors and/or processors to processors, standard contractual clauses pursuant to Article 28(7) of Regulation (EU) 2016/679, provided they are not modified, except to add or update information in the Annexes. This does not prevent the Parties from including the standard contractual clauses laid down in this Clauses in a wider contract, and to add other clauses or additional safeguards provided that they do not contradict, directly or indirectly, the standard contractual clauses or prejudice the fundamental rights or freedoms of data subjects. These Clauses are without prejudice to obligations to which the data exporter is subject by virtue of the Regulation (EU) 2016/679

On the same day the EC released its SCC proposals, the EDPB issued guidance documents, which was surely not coincidental. In “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data,” the EDPB explained the genesis and rationale for the document:

  • The GDPR or the [CJEU] do not define or specify the “additional safeguards”, “additional measures” or “supplementary measures” to the safeguards of the transfer tools listed under Article 46.2 of the GDPR that controllers and processors may adopt to ensure compliance with the level of protection required under EU law in a particular third country.
  • The EDPB has decided, on its own initiative, to examine this question and to provide controllers and processors, acting as exporters, with recommendations on the process they may follow to identify and adopt supplementary measures. These recommendations aim at providing a methodology for the exporters to determine whether and which additional measures would need to be put in place for their transfers. It is the primary responsibility of exporters to ensure that the data transferred is afforded in the third country of a level of protection essentially equivalent to that guaranteed within the EU. With these recommendations, the EDPB seeks to encourage consistent application of the GDPR and the Court’s ruling, pursuant to the EDPB’s mandate

Broadly speaking, whether SCCs and supplemental measures will pass muster under the GDPR will be determined on a case-by-case basis. The EDPB did not offer much in the way of bright-line rules. Indeed, it will be up to supervisory authorities (SA) to determine if transfers to nations like the U.S. are possible under the GDPR, meaning these recommendations may shed more light on this central question without deciding it. One wonders, as a practical matter, if the SAs will have the capacity, resources, and will to police SCCs to ensure the GDPR and Charter are being met.

Nonetheless, the EDPB stressed the principle of accountability, under which controllers that export personal data must ensure that, whatever mechanism and supplemental measures govern a data transfer, the data receive the same protection they would in the EU. The EDPB made the point that EU protections travel with the data, and should EU personal data make its way to a country where appropriate protection is not possible, the transfer violates the GDPR. Moreover, these recommendations pertain to both public and private transfers of EU data to private sector entities outside the EU.

These recommendations work like a decision tree, with exporters needing to ask themselves a series of questions to determine whether they must use supplemental measures. This may prove a resource-intensive process, for exporters will need to map all transfers (i.e., know exactly where the data are going). The exporter must understand the laws and practices of the third nation in order to put in place appropriate measures, if that is even possible, to meet the EU’s data protection standards.

Reading between the lines leads one to conclude that data exporters may not send personal data to the U.S. because its federal surveillance regime is not “necessary and proportionate,” at least from the EU’s view. The U.S. lacks judicial redress should a U.S. national, let alone a foreign national, object to the sweeping surveillance. The U.S. also has neither a national data protection law nor a dedicated data protection authority. These hints seem also to convey the EDPB’s view on the sorts of legal reforms needed in the U.S. before an adequacy decision would pass muster with the CJEU.

The EDPB said it was still evaluating how Schrems II affects the use of Binding Corporate Rules (BCR) and ad hoc contractual clauses, two other alternative means of transferring EU personal data in the absence of an adequacy decision.

Nevertheless, in an annex, the EDPB provided examples of supplementary measures that may be used depending on the circumstances, such as “flawlessly implemented” encryption and pseudonymized data. However, the EDPB discusses these in the context of different scenarios and calls for more conditions than those two measures alone. Moreover, the EDPB rules out two scenarios categorically as inadequate: “Transfer to cloud services providers or other processors which require access to data in the clear” and “Remote access to data for business purposes.”

The EDPB also issued an update to guidance published after the first lawsuit brought by Maximilian Schrems resulted in the striking down of the Safe Harbor transfer agreement. The forerunner to the EDPB, the Working Party 29, had drafted and released the European Essential Guarantees, and so, in light of Schrems II, the EDPB updated and published “Recommendations 02/2020 on the European Essential Guarantees for surveillance measures” “to provide elements to examine, whether surveillance measures allowing access to personal data by public authorities in a third country, being national security agencies or law enforcement authorities, can be regarded as a justifiable interference or not” with fundamental EU rights and protections. As the EDPB explains, these recommendations are intended to help data controllers and exporters determine whether other nations have protections and processes in place equivalent to those of the EU vis-à-vis their surveillance programs. The EDPB stressed that these are the essential guarantees, and other features and processes may be needed for a determination of lawfulness under EU law.

The EDPB formulated the four European Essential Guarantees:

A. Processing should be based on clear, precise and accessible rules

B. Necessity and proportionality with regard to the legitimate objectives pursued need to be demonstrated

C. An independent oversight mechanism should exist

D. Effective remedies need to be available to the individual

The new joint opinions of the EDPB and EDPS fit into this process because the EC asked for a joint opinion on its drafts, as noted at the beginning of one of the opinions:

On 12 November 2020, the European Commission requested a joint opinion of the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) on the basis of Article 42(1), (2) of Regulation (EU) 2018/1725 (EU DPR) on these two sets of draft standard contractual clauses and the respective implementing acts.

Consequently, the EDPB and EDPS issued two joint opinions: Joint Opinion 1/2021 on the draft SCCs between controllers and processors and Joint Opinion 2/2021 on the draft SCCs for the transfer of personal data to third countries.

In Joint Opinion 1/2021, the two bodies explained:

The EDPB and the EDPS are of the opinion that clauses which merely restate the provisions of Article 28(3) and (4) GDPR and Article 29 (3) and (4) EUDPR are inadequate to constitute standard contractual clauses. The Board and EDPS have therefore decided to analyse the document in its entirety, including the appendices. In the opinion of the Board and the EDPS, a contract under Article 28 GDPR or Article 29 EUDPR should further stipulate and clarify how the provisions will be fulfilled. It is in this light that the Draft SCCs submitted to the Board and EDPS for opinion are analysed.

The EDPB and EDPS go on to ask the EC to better clarify the difference between the legislation on transfers between controllers and processors, which is meant to apply only inside the EU, and transfers to third countries. They asked for clarity on the scope of the language. The EDPB and EDPS also asked that the EC expand the intra-EU SCC decision to include those nations that have been found adequate (e.g., Israel, Japan, and New Zealand).

The EDPB and EDPS did find much to like, however:

  • Adopted standard contractual clauses constitute a set of guarantees to be used as is, as they are intended to protect data subjects and mitigate specific risks associated with the fundamental principles of data protection.
  • The EDPB and the EDPS welcome in general the adoption of standard contractual clauses as a strong accountability tool that facilitates compliance by controllers and processors to their obligations under the GDPR and the EUDPR.
  • The EDPB already issued opinions on standard contractual clauses prepared by the Danish Supervisory Authority and the Slovenian Supervisory Authority.
  • To ensure a coherent approach to personal data protection throughout the Union, the EDPB and the EDPS strongly welcome the envisaged adoption of SCCs having an EU-wide effect by the Commission.
  • The same set of SCCs will indeed apply irrespective of whether this relationship involves private entities, public authorities of the Member States or EU institutions or bodies. These EU-wide SCCs will ensure further harmonisation and legal certainty.
  • The EDPB and the EDPS also welcome the fact that the same set of SCCs should apply in respect of the relationship between controllers and processors subject to GDPR and EUDPR respectively.

In Joint Opinion 2/2021, the EDPB and EDPS stated:

The Draft SCCs combine general clauses with a modular approach to cater for various transfer scenarios. In addition to the general clauses, controllers and processors should select the module applicable to their situation among the four following modules:

  • Module One: transfer controller to controller;
  • Module Two: transfer controller to processor;
  • Module Three: transfer processor to processor;
  • Module Four: transfer processor to controller.

Again, the EDPB and EDPS wanted greater clarity on the language in this decision, especially regarding SCCs governing EU institutions subject not to the GDPR but to Regulation (EU) 2018/1725 (aka the EUDPR). In general, the EDPB and EDPS had this comment on the actual draft SCCs:

The EDPB and the EDPS welcome the introduction of specific modules for each transfer scenarios. However, the EDPB and the EDPS note that it is not clear whether one set of the SCCs can include several modules in practice to address different situations, or whether this should amount to the signing of several sets of the SCCs. In order to achieve maximum readability and easiness in the practical application of the SCCs, the EDPB and the EDPS suggest that the European Commission provides additional guidance (e.g. in the form of flowcharts, publication of Frequently Asked Questions (FAQs), etc.). In particular, it should be made clear that the combination of different modules in a single set of SCCs cannot lead to the blurring of roles and responsibilities among the parties.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Dimitris Vetsikas from Pixabay

Preview of Senate Democratic Chairs

It’s not clear who will end up where, but new Senate chairs will change the focus and agendas of committees and the terms of debate over the next two years.

With the victories of Senators-elect Raphael Warnock (D-GA) and Jon Ossoff (D-GA), control of the United States Senate will tip to the Democrats once Vice President-elect Kamala Harris (D) is sworn in and can break the 50-50 tie in the chamber in favor of the Democrats. With the shift in control, new chairs will take over committees key to setting the agenda over the next two years in the Senate. However, given the filibuster, and the fact that Senate Republicans will exert maximum leverage through its continued use, Democrats will be hamstrung and forced to work with Republicans on matters such as federal privacy legislation, artificial intelligence (AI), the Internet of Things (IoT), cybersecurity, data flows, surveillance, etc., just as Republicans had to work with Democrats over the six years they controlled the chamber. Having said that, Democrats will be in a stronger position than they had been and will have the power to set the agenda in committee hearings, being empowered to call the lion’s share of witnesses and to control the floor agenda. What’s more, Democrats will be poised to confirm President-elect Joe Biden’s nominees at agencies like the Federal Communications Commission (FCC), Federal Trade Commission (FTC), the Department of Justice (DOJ), and others, giving the Biden Administration a free hand in many areas of technology policy.

All of that being said, this is not meant to be an exhaustive look at all the committees of jurisdiction and possible chairs. Rather, it seeks to survey likely chairs on selected committees and some of their priorities for the next two years. Subcommittee chairs will also be important, but until the cards get shuffled among the chairs, it will not be possible to see where they land at the subcommittee level.

When considering the possible Democratic chairs of committees, one must keep in mind it is often a matter of musical chairs with the most senior members getting first choice. And so, with Senator Patrick Leahy (D-VT) as the senior-most Democratic Senator, he may well choose to leave the Appropriations Committee and move back to assume the gavel of the Judiciary Committee. Leahy has long been a stakeholder on antitrust, data security, privacy, and surveillance legislation and would be in a position to influence what bills on those and other matters before the Senate look like. If Leahy does not move to the chair on Judiciary, he may still be entitled to chair a subcommittee and exert influence.

If Leahy stays put, then current Senate Minority Whip Dick Durbin (D-IL) would be poised to leapfrog Senator Dianne Feinstein (D-CA) to chair Judiciary after Feinstein was persuaded to step aside on account of her lackluster performance in a number of high-profile hearings in 2020. Durbin has also been active on privacy, data security, and surveillance issues. The Judiciary Committee will be central to a number of technology policies, including Foreign Intelligence Surveillance Act reauthorization, privacy legislation, Section 230 reform, antitrust, and others. On the Republican side of the dais, Senator Lindsey Graham (R-SC) is leaving the top post because of term-limit restrictions imposed by Republicans, and Senator Charles Grassley (R-IA) is set to replace him. How this changes the 47 USC 230 (Section 230) debate is not immediately clear. And yet, Grassley and three colleagues recently urged the Trump Administration in a letter to omit language in a trade agreement with the United Kingdom (UK) that mirrors the liability protection of Section 230. Senators Rob Portman (R-OH), Mark R. Warner (D-VA), Richard Blumenthal (D-CT), and Grassley argued to U.S. Trade Representative Ambassador Robert Lighthizer that a “safe harbor” like the one provided to technology companies for hosting or moderating third party content is outdated, not needed in a free trade agreement, contrary to the will of both the Congress and UK Parliament, and likely to be changed legislatively in the near future. It is likely, however, Grassley will fall in with other Republicans propagating the narrative that social media is unfairly biased against conservatives, particularly in light of the recent purge of President Donald Trump for his many, repeated violations of policy.

The Senate Judiciary Committee will be central in any policy discussions of antitrust and anticompetition in the technology realm. But it bears note the filibuster (and the very low chances Senate Democrats would “go nuclear” and remove all vestiges of the functional supermajority requirement to pass legislation) will give Republicans leverage to block some of the more ambitious reforms Democrats might like to enact (e.g. the House Judiciary Committee’s October 2020 final report that calls for nothing less than a complete remaking of United States (U.S.) antitrust policy and law; see here for more analysis.)

It seems Senator Sherrod Brown (D-OH) will be the next chair of the Senate Banking, Housing, and Urban Affairs Committee, which has jurisdiction over cybersecurity, data security, privacy, and other issues in the financial services sector, making it a player on any legislation designed to encompass the whole of the United States economy. Having said that, it may again be the case that sponsors of, say, privacy legislation decide to cut the Gordian knot of jurisdictional turf battles by cutting out certain committees. For example, many of the privacy bills had provisions making clear they would deem financial services entities in compliance with the Financial Services Modernization Act of 1999 (P.L. 106-102) (aka Gramm-Leach-Bliley) to be in compliance with the new privacy regime. I suppose these provisions may have been included on the basis of the very high privacy and data security standards Gramm-Leach-Bliley has brought about (e.g. the Experian hack), or sponsors of federal privacy legislation made the strategic calculation to circumvent the Senate Banking Committee as much as they can. Nonetheless, this committee sought to insert itself into the policymaking process on privacy when Brown and outgoing Chair Mike Crapo (R-ID) requested “feedback” in February 2019 “from interested stakeholders on the collection, use and protection of sensitive information by financial regulators and private companies.” Additionally, Brown released what may be the most expansive privacy bill from the perspective of privacy and civil liberties advocates, the “Data Accountability and Transparency Act of 2020,” in June 2020 (see here for my analysis). Therefore, Brown may continue to push for a role in federal privacy legislation with a gavel in his hands.

In a similar vein, Senator Patty Murray (D-WA) will likely take over the Senate Health, Education, Labor, and Pensions (HELP) Committee, which has jurisdiction over health information privacy and data security through the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH Act). Again, as with the Senate Banking Committee and Gramm-Leach-Bliley, most of the privacy bills exempt HIPAA-compliant entities. And yet, even if her committee is cut out of a direct role in privacy legislation, Murray will still likely exert influence through oversight of and possible legislation changing HIPAA regulations and the Department of Health and Human Services (HHS) enforcement and rewriting of these standards for most of the healthcare industry. For example, HHS is rushing a rewrite of the HIPAA regulations at the tail end of the Trump Administration, and Murray could be in a position to inform how the Biden Administration and Secretary of Health and Human Services-designate Xavier Becerra handle this rulemaking. Additionally, Murray may push the Office for Civil Rights (OCR), the arm of HHS that writes and enforces these regulations, to prioritize matters differently.

Senator Maria Cantwell (D-WA) appears set to chair the Senate Commerce, Science, and Transportation Committee, which has arguably the largest technology portfolio in the Senate. It is the primary committee of jurisdiction for the FCC, FTC, National Telecommunications and Information Administration (NTIA), the National Institute of Standards and Technology (NIST), and the Department of Commerce. Cantwell may exert influence on which people are nominated to head and staff those agencies and others. Her committee is also the primary committee of jurisdiction for domestic and international privacy and data protection matters. And so, federal privacy legislation will likely be drafted by this committee, and legislative changes so the U.S. can enter into a new personal data sharing agreement with the European Union (EU) would also likely involve her and her committee.

Cantwell and likely next Ranking Member Roger Wicker (R-MS) agree on many elements of a federal privacy law but were at odds last year over federal preemption and whether people could sue companies for privacy violations. Between them, they circulated three privacy bills. In September 2020, Wicker and three Republican colleagues introduced the “Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act” (S.4626) (see here for more analysis). Wicker had put out for comment a discussion draft, the “Consumer Data Privacy Act of 2019” (CDPA) (see here for analysis), in November 2019, shortly after Cantwell and other Democrats had introduced their privacy bill, the “Consumer Online Privacy Rights Act“ (COPRA) (S.2968) (see here for more analysis).

Cantwell could also take a leading role on Section 230, but her focus, of late, seems to be on how technology companies are wreaking havoc on traditional media. She released a report, which she mentioned in her opening statement at the 23 September hearing aimed at trying to revive data privacy legislation. She and her staff investigated the decline and financial troubles of local media outlets, which are facing a cumulative loss in advertising revenue of up to 70% since 2000. And since advertising revenue has long been the lifeblood of print journalism, this has devastated local media, with many outlets shutting their doors or radically cutting their staff. This trend has been exacerbated by consolidation in the industry, often in concert with private equity or hedge funds looking to wring the last dollars of value from bargain-basement-priced newspapers. Cantwell also claimed that the overwhelming online advertising dominance of Google and Facebook has further diminished advertising revenue and other possible sources of funding through a variety of means. She intimated that much of this conduct may be illegal under U.S. law, and the FTC may well be able to use its Section 5 powers against unfair and deceptive acts and its antitrust authority to take action (see here for more analysis and context). In this vein, Cantwell will want her committee to play a role in any antitrust policy changes, likely knowing massive changes in U.S. law are not possible in a split Senate with entrenched party positions and discipline.

Senator Jack Reed (D-RI) will take over the Senate Armed Services Committee and its portfolio over national security technology policy, which includes the cybersecurity, data protection, and supply chains of national security agencies and their contractors, AI, offensive and defensive U.S. cyber operations, and other realms. Many of the changes Reed and his committee will seek to make will come through the annual National Defense Authorization Act (NDAA) (see here and here for the many technology provisions in the FY 2021 NDAA). Reed may also prod the Department of Defense (DOD) to implement or enforce the Cybersecurity Maturity Model Certification (CMMC) Framework differently than envisioned and designed by the Trump Administration. In December 2020, a new rule took effect designed to drive better cybersecurity among U.S. defense contractors. This rule brings together two different lines of effort to require the Defense Industrial Base (DIB) to employ better cybersecurity given the risks contractors face by holding and using classified information, Federal Contract Information (FCI), and Controlled Unclassified Information (CUI). The Executive Branch has long wrestled with how best to push contractors to secure their systems, and Congress and the White House have opted for federal contract requirements under which contractors must certify compliance. However, the most recent initiative, the CMMC Framework, will require contractors to be certified by third party assessors. And yet, it is not clear the DOD has wrestled with the often-misaligned incentives present in third party certification schemes.

Reed’s committee will undoubtedly delve deep into the recent SolarWinds hack and implement policy changes to avoid a recurrence. Doing so may lead the Senate Armed Services Committee back to reconsidering the Cyberspace Solarium Commission’s (CSC) March 2020 final report and follow-up white papers, especially the views embodied in “Building a Trusted ICT Supply Chain.”

Senator Mark Warner (D-VA) will likely take over the Senate Intelligence Committee. Warner has long been a stakeholder on a number of technology issues and would be able to exert influence on the national security components of such issues. He and his committee will almost certainly play a role in the Congressional oversight of and response to the SolarWinds hack. Likewise, his committee shares jurisdiction over FISA with the Senate Judiciary Committee and over national security technology policy with the Armed Services Committee.

Senator Amy Klobuchar (D-MN) would be the Senate Democratic point person on election security from her perch atop the Senate Rules and Administration Committee, which may enable her to push more forcefully for the legislative changes she has long advocated. In May 2019, Klobuchar and other Senate Democrats introduced the “Election Security Act” (S. 1540), the Senate version of a stand-alone measure introduced in the House that was carved out of the larger package, the “For the People Act” (H.R. 1), which the House passed.

In August 2018, the Senate Rules and Administration Committee indefinitely postponed a markup of a compromise bill to provide states additional assistance in securing elections from interference, the “Secure Elections Act” (S.2593). Reportedly, there was concern among state officials that a provision requiring audits of election results would in effect be an unfunded mandate, even though this provision had been softened at the insistence of Senate Republican leadership. A Trump White House spokesperson also indicated in a statement that the Administration opposed the bill, which may have posed an additional obstacle to Committee action. In any event, even if the Senate had passed its bill, it was unlikely that the Republican-controlled House would have considered companion legislation (H.R. 6663).

Senator Gary Peters (D-MI) may be the next chair of the Senate Homeland Security and Governmental Affairs Committee, and if so, he will continue to face the rock on which many a bark of cybersecurity legislation has been dashed: Senator Ron Johnson (R-WI). So significant has Johnson’s opposition to bipartisan cybersecurity legislation from the House been that some House Republican stakeholders have said as much in media accounts without bothering to hide behind anonymity. And so, whatever Peters’ ambitions to shore up the cybersecurity of the federal government, and even though his committee will play a role in investigating and responding to the Russian hack of SolarWinds and many federal agencies, he will be limited by whatever Johnson and other Republicans allow to move through the committee and through the Senate. Of course, Peters’ purview would include the Department of Homeland Security and the Cybersecurity and Infrastructure Security Agency (CISA) and its remit to police the cybersecurity practices of the federal government. Peters would also have in his portfolio the information technology (IT) practices of the federal government, some $90 billion annually across all agencies.

Finally, whether it is Leahy or Durbin atop the Senate Appropriations Committee, the post confers immense influence over funding and programmatic changes in all federal programs through Congress’s power of the purse.

Privacy Shield Hearing

The focus was on how the U.S. and EU can reach agreement on an arrangement that will not be struck down by the EU’s highest court.

Last week, the Senate Commerce, Science, and Transportation Committee held a hearing on the now invalidated European Union (EU)-United States (U.S.) Privacy Shield, a mechanism that allowed companies to transfer the personal data of EU residents to the U.S. The EU’s highest court struck down the adequacy decision that underpinned the system on the basis of U.S. surveillance activities and a lack of redress that violated EU law. This is the second time in a decade the EU’s top court has invalidated a transfer arrangement, the first being the Safe Harbor system. Given the estimated billions, or even trillions, of dollars in value realized from data flows between the EU and U.S., there is keen interest on both sides of the Atlantic in finding a legal path forward. However, absent significant curtailment of U.S. surveillance and/or a significant expansion of the means by which EU nationals could have violations of their rights rectified, it would appear a third agreement may not withstand the inevitable legal challenges. Moreover, there are questions as to the legality of other transfer tools in light of the Court of Justice of the European Union’s decision in the case known as Schrems II, and the legality of some Standard Contractual Clauses (SCC) and Binding Corporate Rules (BCR) may soon be found wanting, too.

Consequently, a legislative fix, or some portion thereof, could be attached to federal privacy legislation. Hence, the striking down of Privacy Shield may provide additional impetus for Congress and the next Administration to reach a deal on privacy. Moreover, the lapsed reauthorization of some Foreign Intelligence Surveillance Act (FISA) authorities may be another legislative opportunity for the U.S. to craft an approach amenable to the EU in order to either obtain an adequacy decision or a successor agreement to the Privacy Shield.

Chair Roger Wicker (R-MS) approached the issue from the perspective of international trade and the economic benefit accruing to businesses on both sides of the Atlantic. His opening remarks pertained less to the privacy and surveillance aspects of the CJEU’s ruling. Wicker appeared to be making the case that the EU misunderstands that redress rights in the U.S. are more than adequate and that the U.S. surveillance regime is similar to those of some EU nations. One wonders whether the CJEU would be inclined to agree with this position. Nonetheless, Wicker expressed hope that the EU and U.S. can reach “a durable and lasting data transfer framework…that provides meaningful data protections to consumers, sustains the free flow of information across the Atlantic, and encourages continued economic and strategic partnership with our European allies – a tall order but an essential order.” He worried about the effect of the CJEU’s ruling on SCCs. Wicker made the case that the EU and U.S. share democratic values and hinted that the ongoing talks in the committee to reach a federal data privacy law might include augmented redress rights that might satisfy the CJEU.

Ranking Member Maria Cantwell (D-WA) spoke broadly about a range of issues related to data transfers and privacy. She stressed the importance of data flows in the context of larger trade relations. Cantwell also stressed the shared values between the U.S. and the EU and her hope that the two work “together on these very important national concerns, trade and technology, so that we can continue to improve economic opportunities and avoid moves towards protectionism.” She also called for federal privacy legislation but hinted that states should still be able to regulate privacy, suggesting her commitment to having a federal law be a floor for state laws. Cantwell also asserted that bulk surveillance, the likes of which the National Security Agency has engaged in, may simply not be legal under EU law.

Deputy Assistant Secretary of Commerce for Services James Sullivan blurred the issues presented by Schrems II much as Cantwell did. The CJEU’s decision, which focused on U.S. surveillance practices and the lack of meaningful recourse in the U.S. if an EU resident’s rights are violated, was merged into a call for like-minded nations to unite against authoritarian nations. Sullivan distinguished between U.S. surveillance and the surveillance conducted by the People’s Republic of China (PRC) (without naming the nation) and other regimes, as if this should satisfy the EU as to the legality and propriety of U.S. treatment of EU personal data. Sullivan stated:

  • The Schrems II decision has created enormous uncertainties for U.S. companies and the transatlantic economy at a particularly precarious time. Immediately upon issuance of the ruling, the 5,400 Privacy Shield participants and their business partners in the EU could no longer rely on the Framework as a lawful basis for transferring personal data from Europe to the United States. Because neither the Court nor European data protection authorities provided for any enforcement grace period, Privacy Shield companies were left with three choices: (1) risk facing potentially huge fines (of up to 4 percent of total global turnover in the preceding year) for violating GDPR, (2) withdraw from the European market, or (3) switch right away to another more expensive data transfer mechanism.
  • Unfortunately, because of the Court’s ruling in the Privacy Shield context that U.S. laws relating to government access to data do not confer adequate protections for EU personal data, the use of other mechanisms like SCCs and BCRs to transfer EU personal data to the United States is now in question as well.
  • The objective of any potential agreement between the United States and the European Commission to address Schrems II is to restore the continuity of transatlantic data flows and the Framework’s privacy protections by negotiating targeted enhancements to Privacy Shield that address the Court’s concerns in Schrems II. Any such enhancements must respect the U.S. Government’s security responsibilities to our citizens and allies.
  • To be clear, we expect that any enhancements to the Privacy Shield Framework would also cover transfers under all other EU-approved data transfer mechanisms like SCCs and BCRs as well.
  • The Schrems II decision has underscored the need for a broader discussion among likeminded democracies on the issue of government access to data. Especially as a result of the extensive U.S. surveillance reforms since 2015, the United States affords privacy protections relating to national security data access that are equivalent to or greater than those provided by many other democracies in Europe and elsewhere.
  • To minimize future disruptions to data transfers, we have engaged with the European Union and other democratic nations in a multilateral discussion to develop principles based on common practices for addressing how best to reconcile law enforcement and national security needs for data with protection of individual rights.
  • It is our view that democracies should come together to articulate shared principles regarding government access to personal data—to help make clear the distinction between democratic societies that respect civil liberties and the rule of law and authoritarian governments that engage in the unbridled collection of personal data to surveil, manipulate, and control their citizens and other individuals without regard to personal privacy and human rights. Such principles would allow us to work with like-minded partners in preserving and promoting a free and open Internet enabled by the seamless flow of data.
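The fine exposure Sullivan cited in his first point (up to 4 percent of total global turnover in the preceding year) can be made concrete with a toy calculation. This is only an illustrative sketch: the turnover figure below is hypothetical, and it reflects the turnover-based ceiling in GDPR Article 83(5), which is the greater of €20 million or 4 percent of worldwide annual turnover.

```python
# Toy illustration of the GDPR fine ceiling cited in Sullivan's testimony.
# Under Article 83(5), the maximum fine is the greater of EUR 20 million
# or 4% of total worldwide annual turnover in the preceding year.
# The turnover figures used below are hypothetical.

def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound of an Article 83(5) fine, in euros."""
    return max(20_000_000, 0.04 * annual_global_turnover_eur)

# A hypothetical firm with EUR 2 billion in turnover faces a ceiling of EUR 80 million.
print(max_gdpr_fine(2_000_000_000))  # 80000000.0
# For a smaller firm, 4% of turnover can fall below the EUR 20 million floor.
print(max_gdpr_fine(100_000_000))    # 20000000 (4% would be only EUR 4 million)
```

The point of the `max` is that the ceiling never drops below €20 million, which is why even modestly sized Privacy Shield participants faced meaningful exposure once the Framework was invalidated.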

Federal Trade Commission (FTC) Commissioner Noah Joshua Phillips stressed he was speaking in a personal capacity and not for the FTC. He extolled the virtues of the “free and open” internet model in the U.S., with the double implication that it is superior both to the models of nations like the PRC and Russia and to the EU model. Phillips seemed to be advocating for talking the EU into accepting that the U.S.’s privacy regime and civil liberties protections are stronger than any other nation’s. He also made the case, like other witnesses, that U.S. data privacy and protection regulation is more similar to the EU’s than to that of the PRC, Russia, and others. Phillips also sought to blur the issues and recast Privacy Shield in the context of the global struggle between democracies and authoritarian regimes. Phillips asserted:

  • First, we need to find a path forward after Schrems II, to permit transfers between the U.S. and EU. I want to recognize the efforts of U.S. and EU negotiators to find a replacement for Privacy Shield. While no doubt challenging, I have confidence in the good faith and commitment of public servants like Jim Sullivan, with whom I have the honor of appearing today, and our partners across the Atlantic. I have every hope and expectation that protecting cross-border data flows will be a priority for the incoming Administration, and I ask for your help in ensuring it is.
  • Second, we must actively engage with nations evaluating their approach to digital governance, something we at the FTC have done, to share and promote the benefits of a free and open Internet. There is an active conversation ongoing internationally, and at every opportunity—whether in public forums or via private assistance—we must ensure our voice and view is heard.
  • Third, we should be vocal in our defense of American values and policies. While we as Americans always look to improve our laws—and I commend the members of this committee on their important work on privacy legislation and other critical matters—we do not need to apologize to the world. When it comes to civil liberties or the enforcement of privacy laws, we are second to none. Indeed, in my view, the overall U.S. privacy framework—especially with the additional protections built into Privacy Shield—should certainly qualify as adequate under EU standards.
  • Fourth, as European leaders call to strengthen ties with the U.S., we should prioritize making our regimes compatible for the free flow of data. This extends to the data governance regimes of like-minded countries outside of Europe as well. Different nations will have different rules, but relatively minor differences need not impede mutually-beneficial commerce. We need not and should not purport to aim for a single, identical system of data governance. And we should remind our allies, and remind ourselves, that far more unites liberal democracies than divides us.
  • Fifth and finally, if we must draw lines, those lines should be drawn between allies with shared values—the U.S., Europe, Japan, Australia, and others—and those, like China and Russia, that offer a starkly different vision. I am certainly encouraged when I hear recognition of this distinction from Europe. European Data Protection Supervisor Wojciech Wiewiórowski recently noted that the U.S. is much closer to Europe than is China and that he has a preference for data being processed by countries that share values with Europe. Some here in the U.S. are even proposing agreements to solidify the relationships among technologically advanced democracies, an idea worth exploring in more detail.

Washington University Professor of Law Neil Richards stressed that the Schrems II decision spells out how the U.S. would achieve adequacy: reforming surveillance and providing meaningful redress for alleged privacy violations. Consequently, FISA would need to be rewritten and narrowed and a means for EU residents to seek relief beyond the current Ombudsman system is needed, possibly a statutory right to sue. Moreover, he asserted strong data protection and privacy laws are needed and some of the bills introduced in this Congress could fit the bill. Richards asserted:

In sum, the Schrems litigation is a creature of distrust, and while it has created problems for American law and commerce, it has also created a great opportunity. That opportunity lies before this Committee –the chance to regain American leadership in global privacy and data protection by passing a comprehensive law that provides appropriate safeguards, enforceable rights, and effective legal remedies for consumers. I believe that the way forward can not only safeguard the ability to share personal data across the Atlantic, but it can do so in a way that builds trust between the United States and our European trading partners and between American companies and their American and European customers. I believe that there is a way forward, but it requires us to recognize that strong, clear, trust-building rules are not hostile to business interest, that we need to push past the failed system of “notice and choice,” that we need to preserve effective consumer remedies and state-level regulatory innovation, and seriously consider a duty of loyalty. In that direction, I believe, lies not just consumer protection, but international cooperation and economic prosperity.

Georgia Tech University Professor Peter Swire explained that the current circumstances make the next Congress the best possibility in memory to enact privacy legislation because of the need for a Privacy Shield replacement, passage of the new California Privacy Rights Act (Proposition 24), and the Biden Administration’s likely support for such legislation. Swire made the following points:

  1. The European Data Protection Board (EDPB) in November issued draft guidance with an extremely strict interpretation of how to implement the Schrems II case.
  2. The decision in Schrems II is based on EU constitutional law. There are varying current interpretations in Europe of what is required by Schrems II, but constitutional requirements may restrict the range of options available to EU and U.S. policymakers.
  3. Strict EU rules about data transfers, such as the draft EDPB guidance, would appear to result in strict data localization, creating numerous major issues for EU- and U.S.-based businesses, as well as affecting many online activities of EU individuals.
  4. Along with concerns about lack of individual redress, the CJEU found that the EU Commission had not established that U.S. surveillance was “proportionate” in its scope and operation. Appendix 2 to this testimony seeks to contribute to an informed judgment on proportionality, by cataloguing developments in U.S. surveillance safeguards since the Commission’s issuance of its Privacy Shield decision in 2016.
  5. Negotiating an EU/U.S. adequacy agreement is important in the short term.
  6. A short-run agreement would assist in creating a better overall long-run agreement or agreements.
  7. As the U.S. considers its own possible legal reforms in the aftermath of Schrems II, it is prudent and a normal part of negotiations to seek to understand where the other party – the EU – may have flexibility to reform its own laws.
  8. Issues related to Schrems II have largely been bipartisan in the U.S., with substantial continuity across the Obama and Trump administrations, and expected as well for a Biden administration.
  9. Passing comprehensive privacy legislation would help considerably in EU/U.S. negotiations.
  10. This Congress may have a unique opportunity to enact comprehensive commercial privacy legislation for the United States.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Dooffy Design from Pixabay

Further Reading, Other Developments, and Coming Events (8 December)

Further Reading

  • “Facebook failed to put fact-check labels on 60% of the most viral posts containing Georgia election misinformation that its own fact-checkers had debunked, a new report says” By Tyler Sonnemaker — Business Insider. Despite its vows to improve its management of untrue and false content, the platform is not consistently taking down such material related to the runoffs for the Georgia Senate seats. The group behind this finding argues it is because Facebook does not want to. What is left unsaid is that engagement drives revenue, and so Facebook’s incentive is not to police all violations but rather to take down just enough to be able to say it is doing something.
  • Federal Labor Agency Says Google Wrongly Fired 2 Employees” By Kate Conger and Noam Scheiber — The New York Times. The National Labor Relations Board (NLRB) has reportedly sided with two employees Google fired for activities that are traditionally considered labor organizing. The two engineers had been dismissed for allegedly violating the company’s data security practices when they researched the company’s retention of a union-busting firm and sought to alert others about organizing. Even though Google is vowing to fight the action, which has not been finalized, it may well settle given the view of Big Tech in Washington these days. This action could also foretell how a Biden Administration NLRB may look at the labor practices of these companies.
  • “U.S. states plan to sue Facebook next week: sources” By Diane Bartz — Reuters. We could see state and federal antitrust suits against Facebook this week. One investigation led by New York Attorney General Tish James could include 40 states, although the grounds for alleged violations have not been leaked at this point. It may be Facebook’s acquisition of potential rivals Instagram and WhatsApp that has allowed it to dominate the social messaging market. The Federal Trade Commission (FTC) may also file suit, and, again, the grounds are unknown. The European Commission (EC) is also investigating Facebook for possible violations of European Union (EU) antitrust law over the company’s use of the personal data it holds and its operation of its online marketplace.
  • “The Children of Pornhub” By Nicholas Kristof — The New York Times. This column comprehensively traces the reprehensible recent history of Mindgeek, a Canadian conglomerate that owns Pornhub, where one can find reams of child and non-consensual pornography. Why Ottawa has not cracked down on this firm is a mystery. The passage and implementation of the “Allow States and Victims to Fight Online Sex Trafficking Act of 2017” (P.L. 115-164), which narrowed the liability shield under 47 USC 230, has forced the company to remove content, a significant change from its indifference before the statutory change in law. Kristof suggests some easy, common sense changes Mindgeek could implement to combat the presence of this illegal material, but it seems likely the company will do just enough to say it is acting without seriously reforming its platform. Why would it? There is too much money to be made. Additionally, those fighting against this sort of material have been pressuring payment platforms to stop doing business with Mindgeek. PayPal has forsworn any interaction, and due to pressure Visa and Mastercard are “reviewing” their relationship with Mindgeek and Pornhub. In a statement to a different news outlet, Pornhub claimed it is “unequivocally committed to combating child sexual abuse material (CSAM), and has instituted a comprehensive, industry-leading trust and safety policy to identify and eradicate illegal material from our community.” The company further claimed “[a]ny assertion that we allow CSAM is irresponsible and flagrantly untrue….[w]e have zero tolerance for CSAM.”
  • “Amazon and Apple Are Powering a Shift Away From Intel’s Chips” By Don Clark — The New York Times. Two tech giants have chosen new, faster, cheaper chips, signaling a possible industry shift away from Intel, the firm that has been a dominant player for decades. Intel will not go quietly, of course, and a key variable is whether must-have software and applications are rewritten to accommodate the new chips based on designs from the British firm Arm.

Other Developments

  • The Government Accountability Office (GAO) and the National Academy of Medicine (NAM) have released a joint report on artificial intelligence in healthcare, consisting of GAO’s Technology Assessment: Artificial Intelligence in Health Care: Benefits and Challenges of Technologies to Augment Patient Care and NAM’s Special Publication: Advancing Artificial Intelligence in Health Settings Outside the Hospital and Clinic. GAO’s report “discusses three topics: (1) current and emerging AI tools available for augmenting patient care and their potential benefits, (2) challenges to the development and adoption of these tools, and (3) policy options to maximize benefits and mitigate challenges to the use of AI tools to augment patient care.” NAM’s “paper aims to provide an analysis of: 1) current technologies and future applications of AI in HSOHC, 2) the logistical steps and challenges involved in integrating AI-HSOHC applications into existing provider workflows, and 3) the ethical and legal considerations of such AI tools, followed by a brief proposal of potential key initiatives to guide the development and adoption of AI in health settings outside the hospital and clinic (HSOHC).”
    • The GAO “identified five categories of clinical applications where AI tools have shown promise to augment patient care: predicting health trajectories, recommending treatments, guiding surgical care, monitoring patients, and supporting population health management.” The GAO “also identified three categories of administrative applications where AI tools have shown promise to reduce provider burden and increase the efficiency of patient care: recording digital clinical notes, optimizing operational processes, and automating laborious tasks.” The GAO stated:
      • This technology assessment also identifies challenges that hinder the adoption and impact of AI tools to augment patient care, according to stakeholders, experts, and the literature. Difficulties accessing sufficient high-quality data may hamper innovation in this space. Further, some available data may be biased, which can reduce the effectiveness and accuracy of the tools for some people. Addressing bias can be difficult because the electronic health data do not currently represent the general population. It can also be challenging to scale tools up to multiple locations and integrate them into new settings because of differences in institutions and the patient populations they serve. The limited transparency of AI tools used in health care can make it difficult for providers, regulators, and others to determine whether an AI tool is safe and effective. A greater dispersion of data across providers and institutions can make securing patient data difficult. Finally, one expert described how existing case law does not specifically address AI tools, which can make providers and patients reticent to adopt them. Some of these challenges are similar to those identified previously by GAO in its first publication in this series, such as the lack of high-quality, structured data, and others are more specific to patient care, such as liability concerns.
    • The GAO “described six policy options:”
      • Collaboration. Policymakers could encourage interdisciplinary collaboration between developers and health care providers. This could result in AI tools that are easier to implement and use within an existing workflow.
      • Data Access. Policymakers could develop or expand high-quality data access mechanisms. This could help developers address bias concerns by ensuring data are representative, transparent, and equitable.
      • Best Practices. Policymakers could encourage relevant stakeholders and experts to establish best practices (such as standards) for development, implementation, and use of AI technologies. This could help with deployment and scalability of AI tools by providing guidance on data, interoperability, bias, and formatting issues.
      • Interdisciplinary Education. Policymakers could create opportunities for more workers to develop interdisciplinary skills. This could allow providers to use AI tools more effectively, and could be accomplished through a variety of methods, including changing medical curricula or grants.
      • Oversight Clarity. Policymakers could collaborate with relevant stakeholders to clarify appropriate oversight mechanisms. Predictable oversight could help ensure that AI tools remain safe and effective after deployment and throughout their lifecycle.
      • Status Quo. Policymakers could allow current efforts to proceed without intervention.
    • NAM claimed:
      • Numerous AI-powered health applications designed for personal use have been shown to improve patient outcomes, building predictions based on large volumes of granular, real-time, and individualized behavioral and medical data. For instance, some forms of telehealth, a technology that has been critical during the COVID-19 pandemic, benefit considerably from AI software focused on natural language processing, which enables efficient triaging of patients based on urgency and type of illness. Beyond patient-provider communication, AI algorithms relevant to diabetic and cardiac care have demonstrated remarkable efficacy in helping patients manage their blood glucose levels in their day-to-day lives and in detecting cases of atrial fibrillation. AI tools that monitor and synthesize longitudinal patient behaviors are also particularly useful in psychiatric care, where the exact timing of interventions is often critical. For example, smartphone-embedded sensors that track location and proximity of individuals can alert clinicians of possible substance use, prompting immediate intervention. On the population health level, these individual indicators of activity and health can be combined with environmental- and system-level data to generate predictive insight into local and global health trends. The most salient example of this may be the earliest warnings of the COVID-19 outbreak, issued in December 2019 by two private AI technology firms.
      • Successful implementation and widespread adoption of AI applications in HSOHC requires careful consideration of several key issues related to personal data, algorithm development, and health care insurance and payment. Chief among them are data interoperability, standardization, privacy, ameliorating systemic biases in algorithms, reimbursement of AI-assisted services, quality improvement, and integration of AI tools into provider workflows. Overcoming these challenges and optimizing the impact of AI tools on clinical outcomes will involve engaging diverse stakeholders, deliberately designing AI tools and interfaces, rigorously evaluating clinical and economic utility, and diffusing and scaling algorithms across different health settings. In addition to these potential logistical and technical hurdles, it is imperative to consider the legal and ethical issues surrounding AI, particularly as it relates to the fair and humanistic deployment of AI applications in HSOHC. Important legal considerations include the appropriate designation of accountability and liability of medical errors resulting from AI-assisted decisions for ensuring the safety of patients and consumers. Key ethical challenges include upholding the privacy of patients and their data—particularly with regard to non-HIPAA covered entities involved in the development of algorithms—building unbiased AI algorithms based on high-quality data from representative populations, and ensuring equitable access to AI technologies across diverse communities.
  • The National Institute of Standards and Technology (NIST) published a “new study of face recognition technology created after the onset of the COVID-19 pandemic [that] shows that some software developers have made demonstrable progress at recognizing masked faces.” In Ongoing Face Recognition Vendor Test (FRVT) Part 6B: Face Recognition Accuracy with Face Masks Using Post-COVID-19 Algorithms (NISTIR 8331), NIST stated the “report augments its predecessor with results for more recent algorithms provided to NIST after mid-March 2020.” NIST said that “[w]hile we do not have information on whether or not a particular algorithm was designed with face coverings in mind, the results show evidence that a number of developers have adapted their algorithms to support face recognition on subjects potentially wearing face masks.” NIST stated that
    • The following results represent observations on algorithms provided to NIST both before and after the COVID-19 pandemic to date. We do not have information on whether or not a particular algorithm was designed with face coverings in mind. The results documented capture a snapshot of algorithms submitted to the FRVT 1:1 in face recognition on subjects potentially wearing face masks.
      • False rejection performance: All algorithms submitted after the pandemic continue to give increased false non-match rates (FNMR) when the probes are masked. While a few pre-pandemic algorithms still remain within the most accurate on masked photos, some developers have submitted algorithms after the pandemic showing significantly improved accuracy and are now among the most accurate in our test.
      • Evolution of algorithms on face masks: We observe that a number of algorithms submitted since mid-March 2020 show notable reductions in error rates with face masks over their pre-pandemic predecessors. When comparing error rates for unmasked versus masked faces, the median FNMR across algorithms submitted since mid-March 2020 has been reduced by around 25% from the median pre-pandemic results. The figure below presents examples of developer evolution on both masked and unmasked datasets. For some developers, false rejection rates in their algorithms submitted since mid-March 2020 decreased by as much as a factor of 10 over their pre-pandemic algorithms, which is evidence that some providers are adapting their algorithms to handle facemasks. However, in the best cases, when comparing results for unmasked images to masked images, false rejection rates have increased from 0.3%-0.5% (unmasked) to 2.4%-5% (masked).
      • False acceptance performance: As most systems are configured with a fixed threshold, it is necessary to report both false negative and false positive rates for each group at that threshold. When comparing a masked probe to an unmasked enrollment photo, in most cases, false match rates (FMR) are reduced by masks. The effect is generally modest with reductions in FMR usually being smaller than a factor of two. This property is valuable in that masked probes do not impart adverse false match security consequences for verification.
      • Mask-agnostic face recognition: All 1:1 verification algorithms submitted to the FRVT test since the start of the pandemic are evaluated on both masked and unmasked datasets. The test is designed this way to mimic operational reality: some images will have masks, some will not (especially enrollment samples from a database or ID card). And to the extent that the use of protective masks will exist for some time, our test will continue to evaluate algorithmic capability on verifying all combinations of masked and unmasked faces.
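The verification error metrics NIST reports can be made concrete with a small sketch. This is not NIST’s code: the scores, names, and threshold below are invented for illustration, and the actual FRVT evaluation is far more involved. The sketch simply shows what a false non-match rate (FNMR) and false match rate (FMR) are at a fixed decision threshold, and why a masked probe that lowers a genuine comparison score pushes FNMR up.

```python
# Illustrative sketch of the error rates in NIST's report: FNMR and FMR
# computed at a fixed decision threshold over similarity scores in [0, 1].
# All scores below are synthetic examples, not FRVT data.

def fnmr(genuine_scores, threshold):
    """False non-match rate: fraction of same-person comparisons
    wrongly rejected (score below the threshold)."""
    return sum(s < threshold for s in genuine_scores) / len(genuine_scores)

def fmr(impostor_scores, threshold):
    """False match rate: fraction of different-person comparisons
    wrongly accepted (score at or above the threshold)."""
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

genuine = [0.91, 0.88, 0.42, 0.95, 0.87]   # same-person pairs; 0.42 could be a masked probe
impostor = [0.10, 0.35, 0.62, 0.05, 0.20]  # different-person pairs

t = 0.6  # fixed operational threshold, as in deployed verification systems
print(fnmr(genuine, t))   # 0.2 -> one genuine pair (the masked one) falls below t
print(fmr(impostor, t))   # 0.2 -> one impostor pair happens to clear t
```

Because deployed systems keep the threshold fixed, both rates must be reported together, which is why NIST notes that masks tend to raise FNMR while modestly lowering FMR.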
  • The government in London has issued a progress report on its current cybersecurity strategy that has another year to run. The Paymaster General assessed how well the United Kingdom (UK) has implemented the National Cyber Security Strategy 2016 to 2021 and pointed to goals yet to be achieved. This assessment comes in the shadow of the pending exit of the UK from the European Union (EU) and Prime Minister Boris Johnson’s plans to increase the UK’s role in select defense issues, including cyber operations. The Paymaster General stated:
    • The global landscape has changed significantly since the publication of the National Cyber Security Strategy Progress Report in May 2019. We have seen unprecedented levels of disruption to our way of life that few would have predicted. The COVID-19 pandemic has increased our reliance on digital technologies – for our personal communications with friends and family and our ability to work remotely, as well as for businesses and government to continue to operate effectively, including in support of the national response.
    • These new ways of living and working highlight the importance of cyber security, which is also underlined by wider trends. An ever greater reliance on digital networks and systems, more rapid advances in new technologies, a wider range of threats, and increasing international competition on underlying technologies and standards in cyberspace, emphasise the need for good cyber security practices for individuals, businesses and government.
    • Although the scale and international nature of these changes present challenges, there are also opportunities. With the UK’s departure from the European Union in January 2020, we can define and strengthen Britain’s place in the world as a global leader in cyber security, as an independent, sovereign nation.
    • The sustained, strategic investment and whole of society approach delivered so far through the National Cyber Security Strategy has ensured we are well placed to respond to this changing environment and seize new opportunities.
    • The Paymaster General asserted:
      • [The] report has highlighted growing risks, some accelerated by the COVID-19 pandemic, and longer-term trends that will shape the environment over the next decade:
      • Ever greater reliance on digital networks and systems as daily life moves online, bringing huge benefits but also creating new systemic and individual risks.
      • Rapid technological change and greater global competition, challenging our ability to shape the technologies that will underpin our future security and prosperity.
      • A wider range of adversaries as criminals gain easier access to commoditised attack capabilities and cyber techniques form a growing part of states’ toolsets.
      • Competing visions for the future of the internet and the risk of fragmentation, making consensus on norms and ethics in cyberspace harder to achieve.
      • In February 2020 the Prime Minister announced the Integrated Review of Security, Defence, Development and Foreign Policy. This will define the government’s ambition for the UK’s role in the world and the long-term strategic aims of our national security and foreign policy. It will set out the way in which the UK will be a problem-solving and burden-sharing nation, and a strong direction for recovery from COVID-19, at home and overseas.
      • This will help to shape our national approach and priorities on cyber security beyond 2021. Cyber security is a key element of our international, defence and security posture, as well as a driving force for our economic prosperity.
  • The University of Toronto’s Citizen Lab published a report on an Israeli surveillance firm that uses “[o]ne of the widest-used—but least appreciated” means of surveilling people (i.e., “leveraging of weaknesses in the global mobile telecommunications infrastructure to monitor and intercept phone calls and traffic.”). Citizen Lab explained that an affiliate of the NSO Group, “Circles is known for selling systems to exploit Signaling System 7 (SS7) vulnerabilities, and claims to sell this technology exclusively to nation-states.” Citizen Lab noted that “[u]nlike NSO Group’s Pegasus spyware, the SS7 mechanism by which Circles’ product reportedly operates does not have an obvious signature on a target’s phone, such as the telltale targeting SMS bearing a malicious link that is sometimes present on a phone targeted with Pegasus.” Citizen Lab found that
    • Circles is a surveillance firm that reportedly exploits weaknesses in the global mobile phone system to snoop on calls, texts, and the location of phones around the globe. Circles is affiliated with NSO Group, which develops the oft-abused Pegasus spyware.
    • Circles, whose products work without hacking the phone itself, says they sell only to nation-states. According to leaked documents, Circles customers can purchase a system that they connect to their local telecommunications companies’ infrastructure, or can use a separate system called the “Circles Cloud,” which interconnects with telecommunications companies around the world.
    • According to the U.S. Department of Homeland Security, all U.S. wireless networks are vulnerable to the types of weaknesses reportedly exploited by Circles. A majority of networks around the globe are similarly vulnerable.
    • Using Internet scanning, we found a unique signature associated with the hostnames of Check Point firewalls used in Circles deployments. This scanning enabled us to identify Circles deployments in at least 25 countries.
    • We determine that the governments of the following countries are likely Circles customers: Australia, Belgium, Botswana, Chile, Denmark, Ecuador, El Salvador, Estonia, Equatorial Guinea, Guatemala, Honduras, Indonesia, Israel, Kenya, Malaysia, Mexico, Morocco, Nigeria, Peru, Serbia, Thailand, the United Arab Emirates (UAE), Vietnam, Zambia, and Zimbabwe.
    • Some of the specific government branches we identify with varying degrees of confidence as being Circles customers have a history of leveraging digital technology for human rights abuses. In a few specific cases, we were able to attribute the deployment to a particular customer, such as the Security Operations Command (ISOC) of the Royal Thai Army, which has allegedly tortured detainees.
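The scanning methodology Citizen Lab describes above (matching a distinctive signature in the hostnames of firewalls observed in internet-wide scan data, then grouping the hits by country to infer likely deployments) can be sketched roughly as follows. Note that the scan records and the signature pattern below are hypothetical stand-ins for illustration, not the actual fingerprint from the report:

```python
import re
from collections import defaultdict

# Hypothetical signature: a hostname pattern of the kind a firewall
# deployment might embed in its TLS certificates. The real fingerprint
# Citizen Lab used is described in its report; this regex is an
# illustrative stand-in only.
SIGNATURE = re.compile(r"\.example-deployment\.net$")

# Hypothetical scan records: (ip, hostname, country) tuples as an
# internet-wide scanner might return them.
scan_records = [
    ("203.0.113.10", "gw1.example-deployment.net", "TH"),
    ("198.51.100.7", "mail.unrelated.org", "US"),
    ("192.0.2.55", "node3.example-deployment.net", "AE"),
]

def matching_deployments(records):
    """Group scan hits on the signature by country."""
    hits = defaultdict(list)
    for ip, hostname, country in records:
        if SIGNATURE.search(hostname):
            hits[country].append((ip, hostname))
    return dict(hits)

print(matching_deployments(scan_records))
```

On this toy input, only the two hosts matching the signature survive, grouped under their respective countries; scaling the same filter over a full internet-wide scan is what lets a pattern in firewall hostnames become a map of likely customer countries.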
  • Senators Ron Wyden (D-OR), Elizabeth Warren (D-MA), Edward J. Markey (D-MA), and Brian Schatz (D-HI) “announced that the Department of Homeland Security (DHS) will launch an inspector general investigation into Customs and Border Protection’s (CBP) warrantless tracking of phones in the United States following an inquiry from the senators earlier this year” per their press release.
    • The Senators added:
      • As revealed by public contracts, CBP has paid a government contractor named Venntel nearly half a million dollars for access to a commercial database containing location data mined from applications on millions of Americans’ mobile phones. CBP officials also confirmed the agency’s warrantless tracking of phones in the United States using Venntel’s product in a September 16, 2020 call with Senate staff.
      • In 2018, the Supreme Court held in Carpenter v. United States that the collection of significant quantities of historical location data from Americans’ cell phones is a search under the Fourth Amendment and therefore requires a warrant.
      • In September 2020, Wyden and Warren successfully pressed for an inspector general investigation into the Internal Revenue Service’s use of Venntel’s commercial location tracking service without a court order.
    • In a letter, the DHS Office of the Inspector General (OIG) explained:
      • We have reviewed your request and plan to initiate an audit that we believe will address your concerns. The objective of our audit is to determine if the Department of Homeland Security (DHS) and it [sic] components have developed, updated, and adhered to policies related to cell-phone surveillance devices. In addition, you may be interested in our audit to review DHS’ use and protection of open source intelligence. Open source intelligence, while different from cell phone surveillance, includes the Department’s use of information provided by the public via cellular devices, such as social media status updates, geo-tagged photos, and specific location check-ins.
    • In an October letter, these Senators plus Senator Sherrod Brown (D-OH) argued:
      • CBP is not above the law and it should not be able to buy its way around the Fourth Amendment. Accordingly, we urge you to investigate CBP’s warrantless use of commercial databases containing Americans’ information, including but not limited to Venntel’s location database. We urge you to examine what legal analysis, if any, CBP’s lawyers performed before the agency started to use this surveillance tool. We also request that you determine how CBP was able to begin operational use of Venntel’s location database without the Department of Homeland Security Privacy Office first publishing a Privacy Impact Assessment.
  • The American Civil Liberties Union (ACLU) has filed a lawsuit in a federal court in New York City, seeking an order to compel the United States (U.S.) Department of Homeland Security (DHS), U.S. Customs and Border Protection (CBP), and U.S. Immigration and Customs Enforcement (ICE) “to release records about their purchases of cell phone location data for immigration enforcement and other purposes.” The ACLU made these information requests after numerous media accounts showed that these and other U.S. agencies were buying location data and other sensitive information in ways intended to evade the Fourth Amendment’s bar against unreasonable searches.
    • In its press release, the ACLU asserted:
      • In February, The Wall Street Journal reported that this sensitive location data isn’t just for sale to commercial entities, but is also being purchased by U.S. government agencies, including by U.S. Immigrations and Customs Enforcement to locate and arrest immigrants. The Journal identified one company, Venntel, that was selling access to a massive database to the U.S. Department of Homeland Security, U.S. Customs and Border Protection, and ICE. Subsequent reporting has identified other companies selling access to similar databases to DHS and other agencies, including the U.S. military.
      • These practices raise serious concerns that federal immigration authorities are evading Fourth Amendment protections for cell phone location information by paying for access instead of obtaining a warrant. There’s even more reason for alarm when those agencies evade requests for information — including from U.S. senators — about such practices. That’s why today we asked a federal court to intervene and order DHS, CBP, and ICE to release information about their purchase and use of precise cell phone location information. Transparency is the first step to accountability.
    • The ACLU explained in the suit:
      • Multiple news sources have confirmed these agencies’ purchase of access to databases containing precise location information for millions of people—information gathered by applications (apps) running on their smartphones. The agencies’ purchases raise serious concerns that they are evading Fourth Amendment protections for cell phone location information by paying for access instead of obtaining a warrant. Yet, more than nine months after the ACLU submitted its FOIA request (“the Request”), these agencies have produced no responsive records. The information sought is of immense public significance, not only to shine a light on the government’s use of powerful location-tracking data in the immigration context, but also to assess whether the government’s purchase of this sensitive data complies with constitutional and legal limitations and is subject to appropriate oversight and control.
  • Facebook’s new Oversight Board announced “the first cases it will be deliberating and the opening of the public comment process” and “the appointment of five new trustees.” Almost all of the cases were referred by Facebook users, and the new board is asking for comments on the right way to manage content that may be objectionable. The Oversight Board explained it is “prioritizing cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.”
    • The new trustees are:
      • Kristina Arriaga is a globally recognized advocate for freedom of expression, with a focus on freedom of religion and belief. Kristina is president of the advisory firm Intrinsic.
      • Cherine Chalaby is an expert on internet governance, international finance and technology, with extensive board experience. As Chairman of ICANN, he led development of the organization’s five-year strategic plan for 2021 to 2025.
      • Wanda Felton has over 30 years of experience in the financial services industry, including serving as Vice Chair of the Board and First Vice President of the Export-Import Bank of the United States.
      • Kate O’Regan is a former judge of the Constitutional Court of South Africa and commissioner of the Khayelitsha Commission. She is the inaugural director of the Bonavero Institute of Human Rights at the University of Oxford.
      • Robert Post is an American legal scholar and Professor of Law at Yale Law School, where he formerly served as Dean. He is a leading scholar of the First Amendment and freedom of speech.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • The Senate Judiciary Committee will hold an executive session at which the “Online Content Policy Modernization Act” (S.4632), a bill to narrow the liability shield in 47 USC 230, may be marked up.
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Gerd Altmann from Pixabay

Further Reading, Other Developments, and Coming Events (7 December)

Further Reading

  • “Facebook steps up campaign to ban false information about coronavirus vaccines” By Elizabeth Dwoskin — The Washington Post. In its latest step to find and remove lies, misinformation, and disinformation, the social media giant is now committing to removing and blocking untrue material about COVID-19 vaccines, especially from the anti-vaccine community. Will the next step be to take on anti-vaccination proponents generally?
  • “Comcast’s 1.2 TB data cap seems like a ton of data—until you factor in remote work” By Rob Pegoraro — Fast Company. Despite many people and children working and learning from home, Comcast is reimposing a 1.2 terabyte limit on data for homes. Sounds like quite a lot until you factor in video meetings, streaming, etc. So far, other providers have not set a cap.
  • “Google’s star AI ethics researcher, one of a few Black women in the field, says she was fired for a critical email” By Drew Harwell and Nitasha Tiku — The Washington Post. Timnit Gebru, a top-flight artificial intelligence (AI) computer scientist, was fired after questioning Google’s review of a paper she wanted to present at an AI conference, a paper likely critical of the company’s AI projects. Google claims she resigned, but Gebru says she was fired. She has long been an advocate for women and minorities in tech and AI, and her ouster will likely only increase scrutiny of, and questions about, Google’s commitment to diversity and to an ethical approach to the development and deployment of AI. It will also probably deepen employee disenchantment with the company, following protests over Google’s involvement with the United States Department of Defense’s Project Maven and its hiring of former United States Department of Homeland Security chief of staff Miles Taylor, who was involved with the policies that resulted in caging children and separating families at the southern border of the United States.
  • “Humans Can Help Clean Up Facebook and Twitter” By Greg Bensinger — The New York Times. This opinion piece argues that if social media platforms redeployed their human monitors to the accounts that violate terms of service most frequently (e.g., President Donald Trump) and more aggressively labeled and removed untrue or inflammatory content, they would have a greater impact on lies, misinformation, and disinformation.
  • “Showdown looms over digital services tax” By Ashley Gold — Axios. Because the Organization for Economic Cooperation and Development (OECD) has not reached a deal on digital services taxes, a number of United States (U.S.) allies could move forward with taxes on U.S. multinationals like Amazon, Google, and Apple. The Trump Administration has taken an adversarial position, threatening to retaliate against countries like France that have enacted a tax, which has not been collected during the OECD negotiations. The U.S. also withdrew from talks. The Biden Administration will probably be more willing to work in a multilateral fashion and may strike a deal on an issue that is not going away, as the United Kingdom, Italy, and Canada also have plans for a digital tax.
  • “Trump’s threat to veto defense bill over social-media protections is heading to a showdown with Congress” By Karoun Demirjian and Tony Romm — The Washington Post. I suppose I should mention the President’s demand that the FY 2021 National Defense Authorization Act (NDAA) contain a repeal of 47 U.S.C. 230 (Section 230 of the Communications Act), which came at the eleventh hour and fifty-ninth minute of negotiations on a final version of the bill. Via Twitter, Donald Trump threatened to veto the bill, which has been passed annually for decades. Republicans were not having it, however, even those who agreed with Trump’s desire to remove liability protection for technology companies. And yet, if Trump continues to insist on a repeal, Republicans may find themselves in a bind, and the bill could conceivably get pulled until President-elect Joe Biden is sworn in. On the other hand, Trump has not renewed his veto threats over renaming military bases currently bearing the names of Confederate figures, even though the final version of the bill contains language instituting a process to do just that.

Other Developments

  • The Senate Judiciary Committee held over its most recent bill to narrow 47 U.S.C. 230 (Section 230 of the Communications Act), which provides liability protection for technology companies for third-party material posted on their platforms and for any decisions to edit, alter, or remove such content. The committee opted to hold the “Online Content Policy Modernization Act” (S.4632), which may mean the bill’s chances of making it to the Senate floor are low. What’s more, even if the Senate passes Section 230 legislation, it is not clear there will be sufficient agreement with Democrats in the House to get a final bill to the President before the end of this Congress. On 1 October, the committee also decided to hold over the bill to try to reconcile the fifteen amendments submitted for consideration. The Committee could soon meet again to formally mark up and report out this legislation.
    • At the earlier hearing, Chair Lindsey Graham (R-SC) submitted an amendment revising the bill’s reforms to Section 230 that incorporates some of the amendments below but includes new language. For example, the bill includes a definition of “good faith,” a term not currently defined in Section 230. This term would be construed as a platform taking down or restricting content only according to its publicly available terms of service, not as a pretext, and equally as to all similarly situated content. Moreover, good faith would require alerting the user and giving him or her an opportunity to respond, subject to certain exceptions. The amendment also makes clear that certain existing means of suing remain available to users (e.g., suing for breach of contract).
    • Senator Mike Lee (R-UT) offered a host of amendments:
      • EHF20913 would remove “user[s]” from the reduced liability shield that online platforms would receive under the bill. Consequently, users would still not be legally liable for the content posted by another user.
      • EHF20914 would revise the language regarding the type of content platforms could take down with legal protection to make clear it would not just be “unlawful” content but rather content “in violation of a duly enacted law of the United States,” possibly meaning federal laws and not state laws. Or, more likely, the intent would be to foreclose the possibility that a platform could claim it is acting in concert with a foreign law and still assert immunity.
      • EHF20920 would add language making clear that taking down material that violates terms of service or use according to an objectively reasonable belief would be shielded from liability.
      • OLL20928 would expand legal protection to platforms for removing or restricting spam.
      • OLL20929 would bar the Federal Communications Commission (FCC) from a rulemaking on Section 230.
      • OLL20930 adds language making clear if part of the revised Section 230 is found unconstitutional, the rest of the law would still be applicable.
      • OLL20938 revises the definition of an “information content provider,” the term of art in Section 230 that identifies a platform, to expand when platforms may be responsible for the creation or development of information and consequently liable for a lawsuit.
    • Senator Josh Hawley (R-MO) offered an amendment that would create a new right of action for people to sue large platforms for taking down his or her content if not done in “good faith.” The amendment limits this right to “edge providers,” platforms with more than 30 million users in the U.S., 300 million users worldwide, and revenues of more than $1.5 billion. This would likely exclude all platforms except Twitter, Facebook, Instagram, TikTok, Snapchat, and a select few others.
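The thresholds in the Hawley amendment read as a conjunction: a platform must exceed all three figures to count as an “edge provider.” A minimal sketch of that eligibility test follows, with the figures taken from the summary above; the function name and the conjunctive reading are my own interpretation, not language from the amendment:

```python
def is_edge_provider(us_users: int, global_users: int, revenue_usd: float) -> bool:
    """Return True if a platform exceeds all three thresholds described
    in the amendment summary: more than 30 million U.S. users, more than
    300 million worldwide users, and more than $1.5 billion in revenue."""
    return (us_users > 30_000_000
            and global_users > 300_000_000
            and revenue_usd > 1_500_000_000)

# A platform large on every axis qualifies; falling under any one
# threshold (here, U.S. users) is enough to fall outside the definition.
print(is_edge_provider(200_000_000, 2_500_000_000, 70e9))
print(is_edge_provider(25_000_000, 400_000_000, 2e9))
```

Because the test is conjunctive rather than disjunctive, a service with enormous worldwide reach but a modest U.S. user base would still escape the new right of action, which is what confines it to the handful of major platforms named above.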
    • Senator John Kennedy (R-LA) offered an amendment that removes all Section 230 legal immunity from platforms that collect personal data and then use an “automated function” to deliver targeted or tailored content to a user unless the user “knowingly and intentionally elect[s]” to receive such content.
  • The Massachusetts Institute of Technology’s (MIT) Work of the Future Task Force issued its final report and drew the following conclusions:
    • Technological change is simultaneously replacing existing work and creating new work. It is not eliminating work altogether.
    • Momentous impacts of technological change are unfolding gradually.
    • Rising labor productivity has not translated into broad increases in incomes because labor market institutions and policies have fallen into disrepair.
    • Improving the quality of jobs requires innovation in labor market institutions.
    • Fostering opportunity and economic mobility necessitates cultivating and refreshing worker skills.
    • Investing in innovation will drive new job creation, speed growth, and meet rising competitive challenges.
    • The Task Force stated:
      • In the two-and-a-half years since the Task Force set to work, autonomous vehicles, robotics, and AI have advanced remarkably. But the world has not been turned on its head by automation, nor has the labor market. Despite massive private investment, technology deadlines have been pushed back, part of a normal evolution as breathless promises turn into pilot trials, business plans, and early deployments — the diligent, if prosaic, work of making real technologies work in real settings to meet the demands of hard-nosed customers and managers.
      • Yet, if our research did not confirm the dystopian vision of robots ushering workers off of factory floors or artificial intelligence rendering superfluous human expertise and judgment, it did uncover something equally pernicious: Amidst a technological ecosystem delivering rising productivity, and an economy generating plenty of jobs (at least until the COVID-19 crisis), we found a labor market in which the fruits are so unequally distributed, so skewed towards the top, that the majority of workers have tasted only a tiny morsel of a vast harvest.
      • As this report documents, the labor market impacts of technologies like AI and robotics are taking years to unfold. But we have no time to spare in preparing for them. If those technologies deploy into the labor institutions of today, which were designed for the last century, we will see similar effects to recent decades: downward pressure on wages, skills, and benefits, and an increasingly bifurcated labor market. This report, and the MIT Work of the Future Task Force, suggest a better alternative: building a future for work that harvests the dividends of rapidly advancing automation and ever-more powerful computers to deliver opportunity and economic security for workers. To channel the rising productivity stemming from technological innovations into broadly shared gains, we must foster institutional innovations that complement technological change.
  • The European Data Protection Supervisor (EDPS) Wojciech Wiewiorówski published his “preliminary opinion on the European Commission’s (EC) Communication on ‘A European strategy for data’ and the creation of a common space in the area of health, namely the European Health Data Space (EHDS).” The EDPS lauded the goal of the EHDS, “the prevention, detection and cure of diseases, as well as for evidence-based decisions in order to enhance effectiveness, accessibility and sustainability of the healthcare systems.” However, Wiewiorówski articulated his concern that the EC needs to think through the applicability of the General Data Protection Regulation (GDPR), among other European Union (EU) laws, before it can legally move forward. The EDPS stated:
    • The EDPS calls for the establishment of a thought-through legal basis for the processing operations under the EHDS in line with Article 6(1) GDPR and also recalls that such processing must comply with Article 9 GDPR for the processing of special categories of data.
    • Moreover, the EDPS highlights that due to the sensitivity of the data to be processed within the EHDS, the boundaries of what constitutes a lawful processing and a compatible further processing of the data must be crystal-clear for all the stakeholders involved. Therefore, the transparency and the public availability of the information relating to the processing on the EHDS will be key to enhance public trust in the EHDS.
    • The EDPS also calls on the Commission to clarify the roles and responsibilities of the parties involved and to clearly identify the precise categories of data to be made available to the EHDS. Additionally, he calls on the Member States to establish mechanisms to assess the validity and quality of the sources of the data.
    • The EDPS underlines the importance of vesting the EHDS with a comprehensive security infrastructure, including both organisational and state-of-the-art technical security measures to protect the data fed into the EHDS. In this context, he recalls that Data Protection Impact Assessments may be a very useful tool to determine the risks of the processing operations and the mitigation measures that should be adopted.
    • The EDPS recommends paying special attention to the ethical use of data within the EHDS framework, for which he suggests taking into account existing ethics committees and their role in the context of national legislation.
    • The EDPS is convinced that the success of the EHDS will depend on the establishment of a strong data governance mechanism that provides for sufficient assurances of a lawful, responsible, ethical management anchored in EU values, including respect for fundamental rights. The governance mechanism should regulate, at least, the entities that will be allowed to make data available to the EHDS, the EHDS users, the Member States’ national contact points/ permit authorities, and the role of DPAs within this context.
    • The EDPS is interested in policy initiatives to achieve ‘digital sovereignty’ and has a preference for data being processed by entities sharing European values, including privacy and data protection. Moreover, the EDPS calls on the Commission to ensure that the stakeholders taking part in the EHDS, and in particular, the controllers, do not transfer personal data unless data subjects whose personal data are transferred to a third country are afforded a level of protection essentially equivalent to that guaranteed within the European Union.
    • The EDPS calls on Member States to guarantee the effective implementation of the right to data portability specifically in the EHDS, together with the development of the necessary technical requirements. In this regard, he considers that a gap analysis might be required regarding the need to integrate the GDPR safeguards with other regulatory safeguards, provided e.g. by competition law or ethical guidelines.
  • The Office of Management and Budget (OMB) extended a guidance memorandum directing agencies to consolidate data centers after Congress pushed back the sunset date for the program. OMB extended OMB Memorandum M-19-19, Update to Data Center Optimization Initiative (DCOI) through 30 September 2022, which applies “to the 24 Federal agencies covered by the Chief Financial Officers (CFO) Act of 1990, which includes the Department of Defense.” The DCOI was codified in the “Federal Information Technology Acquisition Reform” (FITARA) (P.L. 113-291) and extended in 2018 until October 1, 2020. And this sunset was pushed back another two years in the FY 2020 National Defense Authorization Act (NDAA) (P.L. 116-92).
    • In March 2020, the Government Accountability Office (GAO) issued another of its periodic assessments of the DCOI, started in 2012 by the Obama Administration to shrink the federal government’s footprint of data centers, increase efficiency and security, save money, and reduce energy usage.
    • The GAO found that 23 of the 24 agencies participating in the DCOI met or planned to meet their FY 2019 goals to close 286 of the 2,727 data centers considered part of the DCOI. This latter figure deserves some discussion, for the Trump Administration changed the definition of what is a data center to exclude smaller ones (so-called non-tiered data centers). GAO asserted that “recent OMB DCOI policy changes will reduce the number of data centers covered by the policy and both OMB and agencies may lose important visibility over the security risks posed by these facilities.” Nonetheless, these agencies are projecting savings of $241.5 million when all the 286 data centers planned for closure in FY 2019 actually close. It bears note that the GAO admitted in a footnote it “did not independently validate agencies’ reported cost savings figures,” so these numbers may not be reliable.
    • In terms of how to improve the DCOI, the GAO stated that “[i]n addition to reiterating our prior open recommendations to the agencies in our review regarding their need to meet DCOI’s closure and savings goals and optimization metrics, we are making a total of eight new recommendations—four to OMB and four to three of the 24 agencies. Specifically:
      • The Director of the Office of Management and Budget should (1) require that agencies explicitly document annual data center closure goals in their DCOI strategic plans and (2) track those goals on the IT Dashboard. (Recommendation 1)
      • The Director of the Office of Management and Budget should require agencies to report in their quarterly inventory submissions those facilities previously reported as data centers, even if those facilities are not subject to the closure and optimization requirements of DCOI. (Recommendation 2)
      • The Director of the Office of Management and Budget should document OMB’s decisions on whether to approve individual data centers when designated by agencies as either a mission critical facility or as a facility not subject to DCOI. (Recommendation 3)
      • The Director of the Office of Management and Budget should take action to address the key performance measurement characteristics missing from the DCOI optimization metrics, as identified in this report. (Recommendation 4)
  • Australia’s Inspector-General of Intelligence and Security (IGIS) released its first report on how well the nation’s security services observed the law with respect to COVID app data. The IGIS “is satisfied that the relevant agencies have policies and procedures in place and are taking reasonable steps to avoid intentional collection of COVID app data.” The IGIS revealed that “[i]ncidental collection in the course of the lawful collection of other data has occurred (and is permitted by the Privacy Act); however, there is no evidence that any agency within IGIS jurisdiction has decrypted, accessed or used any COVID app data.” The IGIS is also “satisfied that the intelligence agencies within IGIS jurisdiction which have the capability to incidentally collect at least some types of COVID app data:
    • Are aware of their responsibilities under Part VIIIA of the Privacy Act and are taking active steps to minimise the risk that they may collect COVID app data.
    • Have appropriate policies and procedures in place to respond to any incidental collection of COVID app data that they become aware of.
    • Are taking steps to ensure any COVID app data is not accessed, used or disclosed.
    • Are taking steps to ensure any COVID app data is deleted as soon as practicable.
    • Have not decrypted any COVID app data.
    • Are applying the usual security measures in place in intelligence agencies such that a ‘spill’ of any data, including COVID app data, is unlikely.
  • New Zealand’s Government Communications Security Bureau’s National Cyber Security Centre (NCSC) has released its annual Cyber Threat Report that found that “nationally significant organisations continue to be frequently targeted by malicious cyber actors of all types…[and] state-sponsored and non-state actors targeted public and private sector organisations to steal information, generate revenue, or disrupt networks and services.” The NCSC added:
    • Malicious cyber actors have shown their willingness to target New Zealand organisations in all sectors using a range of increasingly advanced tools and techniques. Newly disclosed vulnerabilities in products and services, alongside the adoption of new services and working arrangements, are rapidly exploited by state-sponsored actors and cyber criminals alike. A common theme this year, which emerged prior to the COVID-19 pandemic, was the exploitation of known vulnerabilities in internet-facing applications, including corporate security products, remote desktop services and virtual private network applications.
  • The former Director of the United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) wrote an opinion piece disputing President Donald Trump’s claims that the 2020 Presidential Election was fraudulent. Christopher Krebs asserted:
    • While I no longer regularly speak to election officials, my understanding is that in the 2020 results no significant discrepancies attributed to manipulation have been discovered in the post-election canvassing, audit and recount processes.
    • This point cannot be emphasized enough: The secretaries of state in Georgia, Michigan, Arizona, Nevada and Pennsylvania, as well as officials in Wisconsin, all worked overtime to ensure there was a paper trail that could be audited or recounted by hand, independent of any allegedly hacked software or hardware.
    • That’s why Americans’ confidence in the security of the 2020 election is entirely justified. Paper ballots and post-election checks ensured the accuracy of the count. Consider Georgia: The state conducted a full hand recount of the presidential election, a first of its kind, and the outcome of the manual count was consistent with the computer-based count. Clearly, the Georgia count was not manipulated, resoundingly debunking claims by the president and his allies about the involvement of CIA supercomputers, malicious software programs or corporate rigging aided by long-gone foreign dictators.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Daniel Schludi on Unsplash

Further Reading, Other Developments, and Coming Events (4 December)

Further Reading

  • “How Misinformation ‘Superspreaders’ Seed False Election Theories” By Sheera Frenkel — The New York Times. A significant percentage of the lies, misinformation, and disinformation about the legitimacy of the election has been disseminated by a small number of right-wing figures and then repeated, reposted, and retweeted. The Times draws on research into how much engagement people like President Donald Trump and Dan Bongino get on Facebook after posting untrue claims about the election; it turns out that such trends and rumors do not start spontaneously.
  • “Facebook Said It Would Ban Holocaust Deniers. Instead, Its Algorithm Provided a Network for Them” By Aaron Sankin — The Markup. This news organization still found Holocaust denial material promoted by Facebook’s algorithm even though the platform recently said it was taking down such material. This result may point to the difficulty of policing objectionable material that uses coded language and/or the platform’s lack of sufficient resources to weed out this sort of content.
  • “What Facebook Fed the Baby Boomers” By Charlie Warzel — The New York Times. A dispiriting trip inside two people’s Facebook feeds. This article makes the very good point that comments are not moderated, and these tend to be significant sources of vitriol and disinformation.
  • “How to ‘disappear’ on Happiness Avenue in Beijing” By Vincent Ni and Yitsing Wang — BBC. By next year, the People’s Republic of China (PRC) may have as many as 560 million security cameras, and one artist ran an experiment of sorts to see whether a group of people could walk down a major street in the capital without their faces being caught on camera.
  • “Patients of a Vermont Hospital Are Left ‘in the Dark’ After a Cyberattack” By Ellen Barry and Nicole Perlroth — The New York Times. A Russian hacking outfit may have struck back after the Department of Defense’s (DOD) Cyber Command and Microsoft disrupted its operations. A number of hospitals were hacked, and care was significantly disrupted. This dynamic may lend support to arguments that the United States (U.S.) would be wise to curtail its offensive operations.
  • “EU seeks anti-China alliance on tech with Biden” By Jakob Hanke Vela and David M. Herszenhorn — Politico. The European Union (EU) is hoping the United States (U.S.) will be more amenable to working together on future technology policy, especially against the People’s Republic of China (PRC), which has made a concerted effort to drive the adoption of standards that favor its companies (e.g., the PRC pushed for and obtained 5G standards that favor Huawei). Diplomatically speaking, this is considered low-hanging fruit, and a Biden Administration will undoubtedly be more multilateral than the Trump Administration.
  • “Can We Make Our Robots Less Biased Than We Are?” By David Berreby — The New York Times. The bias present in facial recognition technology and artificial intelligence is making its way into robotics, raising the question of how to change this. Many African American and other minority scientists are calling for the inclusion of people of color in designing such systems as a countermeasure to the usual bias toward white men.

Other Developments

  • Senator Gary Peters (D-MI), the top Democrat on the Senate Homeland Security and Governmental Affairs Committee, wrote President Donald Trump and “slammed the Trump Administration for their lack of action against foreign adversaries, including Russia, China, and North Korea, that have sponsored cyber-attacks against American hospitals and research institutions in an effort to steal information related to development of Coronavirus vaccines.” Peters used unusually strong language, as Members of Congress typically tone down the rhetoric and deploy coded language to signal their level of displeasure about administration action or inaction. Peters may well feel strongly about what he perceives to be Trump Administration indifference to the cyber threats facing institutions researching and developing COVID-19 vaccines, but this is also an issue on which he may be trying to split Republicans, placing them in the difficult position of either lining up behind a president disinclined to prioritize some cyber issues or breaking ranks with him.
    • Peters stated:
      • I urge you, again, to send a strong message to any foreign government attempting to hack into our medical institutions that this behavior is unacceptable. The Administration should use the tools at its disposal, including the threat of sanctions, to deter future attacks against research institutions. In the event that any foreign government directly threatens the lives of Americans through attacks on medical facilities, other Department of Defense capabilities should be considered to make it clear that there will be consequences for these actions.
  • A United States federal court ruled against Trump Administration appointee Michael Pack and the United States Agency for Global Media (USAGM) over their attempts to interfere illegally with the independence of government-funded news organizations such as the Voice of America (VOA). The District Court for the District of Columbia enjoined Pack and the USAGM from a list of actions that VOA and USAGM officials claim are contrary to the First Amendment and the organization’s mission.
  • The Federal Trade Commission (FTC) is asking a United States federal court to compel former Trump White House advisor Steve Bannon to appear for questioning per a Civil Investigative Demand (CID) as part of its ongoing probe of Cambridge Analytica’s role in misusing personal data of Facebook users in the 2016 Presidential Election. The FTC noted it “issued the CID to determine, among other things, whether Bannon may be held individually liable for the deceptive conduct of Cambridge Analytica, LLC—the subject of an administrative law enforcement action brought by the Commission.” There had been an interview scheduled in September but the day before it was to take place, Bannon’s lawyers informed the FTC he would not be attending.
    • In 2019, the FTC settled with former Cambridge Analytica CEO Alexander Nix and app developer Aleksandr Kogan in “administrative orders restricting how they conduct any business in the future, and requiring them to delete or destroy any personal information they collected.” The FTC did not, however, settle with the company itself. The agency alleged “that Cambridge Analytica, Nix, and Kogan deceived consumers by falsely claiming they did not collect any personally identifiable information from Facebook users who were asked to answer survey questions and share some of their Facebook profile data.” Facebook settled with the FTC for a record $5 billion for its role in the Cambridge Analytica scandal and for how it violated its 2012 consent order with the agency.
  • Apple responded to a group of human rights and civil liberties organizations about its plans to deploy technology on its operating system that gives users greater control over their privacy. Apple confirmed that its App Tracking Transparency (ATT) would be made part of its iOS early next year and would present users of Apple products with a prompt warning them about how their information may be used by the app developer. ATT would stop app developers from tracking users when they use other apps on a device. Companies like Facebook have objected, claiming that the change is a direct shot at them and their revenue. Apple does not reap a significant revenue stream from collecting, combining, and processing user data, whereas Facebook does. Facebook also tracks users across devices and across apps on a device through a variety of means.
    • Apple stated:
      • We delayed the release of ATT to early next year to give developers the time they indicated they needed to properly update their systems and data practices, but we remain fully committed to ATT and to our expansive approach to privacy protections. We developed ATT for a single reason: because we share your concerns about users being tracked without their consent and the bundling and reselling of data by advertising networks and data brokers.
      • ATT doesn’t ban the reasonable collection of user data for app functionality or even for advertising. Just as with the other data-access permissions we have added over many software releases, developers will be able to explain why they want to track users both before the ATT prompt is shown and in the prompt itself. At that point, users will have the freedom to make their own choice about whether to proceed. This privacy innovation empowers consumers — not Apple — by simply making it clear what their options are, and giving them the information and power to choose.
    • As mentioned, a number of groups wrote Apple in October “to express our disappointment that Apple is delaying the full implementation of iOS 14’s anti-tracking features until early 2021.” They argued:
      • These features will constitute a vital policy improvement with the potential to strengthen respect for privacy across the industry. Apple should implement these features as expeditiously as possible.
      • We were heartened by Apple’s announcement that starting with the iOS 14 update, all app developers will be required to provide information that will help users understand the privacy implications of an app before they install it, within the App Store interface.
      • We were also pleased that iOS 14 users would be required to affirmatively opt in to app tracking, on an app-by-app basis. Along with these changes, we urge Apple to verify the accuracy of app policies, and to publish transparency reports showing the number of apps that are rejected and/or removed from the App Store due to inadequate or inaccurate policies.
  • The United States (U.S.) Government Accountability Office (GAO) sent its assessment of the privacy notices and practices of U.S. banks and credit unions to the chair of the Senate committee that oversees this issue. Senate Banking, Housing, and Urban Affairs Committee Chair Mike Crapo (R-ID) had asked the GAO “to examine the types of personal information that financial institutions collect, use, and share; how they make consumers aware of their information-sharing practices; and federal regulatory oversight of these activities.” The GAO found that a ten-year-old model privacy disclosure form used across these industries may comply with the prevailing federal requirements but no longer encompasses the breadth and scope of how the personal information of people is collected, processed, and used. The GAO called on the Consumer Financial Protection Bureau (CFPB) to update this form. The GAO explained:
    • Banks and credit unions collect, use, and share consumers’ personal information—such as income level and credit card transactions—to conduct everyday business and market products and services. They share this information with a variety of third parties, such as service providers and retailers.
    • The Gramm-Leach-Bliley Act (GLBA) requires financial institutions to provide consumers with a privacy notice describing their information-sharing practices. Many banks and credit unions elect to use a model form—issued by regulators in 2009—which provides a safe harbor for complying with the law (see figure). GAO found the form gives a limited view of what information is collected and with whom it is shared. Consumer and privacy groups GAO interviewed cited similar limitations. The model form was issued over 10 years ago. The proliferation of data-sharing since then suggests a reassessment of the form is warranted. Federal guidance states that notices about information collection and usage are central to providing privacy protections and transparency.
    • Since Congress transferred authority to the CFPB for implementing GLBA privacy provisions, the agency has not reassessed whether the form meets consumer expectations for disclosures of information-sharing. CFPB officials said they had not considered a reevaluation because they had not heard concerns from industry or consumer groups about privacy notices. Improvements to the model form could help ensure that consumers are better informed about all the ways banks and credit unions collect and share personal information.
    • The increasing amounts of and changing ways in which industry collects and shares consumer personal information—including from online activities—highlights the importance of clearly disclosing practices for collection, sharing, and use. However, our work shows that banks and credit unions generally used the model form, which was created more than 10 years ago, to make disclosures required under GLBA. As a result, the disclosures often provided a limited view of how banks and credit unions collect, use, and share personal information.
    • We recognize that the model form is required to be succinct, comprehensible to consumers, and allow for comparability across institutions. But, as information practices continue to change or expand, consumer insights into those practices may become even more limited. Improvements and updates to the model privacy form could help ensure that consumers are better informed about all the ways that banks and credit unions collect, use, and share personal information. For instance, in online versions of privacy notices, there may be opportunities for readers to access additional details—such as through hyperlinks—in a manner consistent with statutory requirements.
  • The Australian Competition & Consumer Commission (ACCC) is asking for feedback on Google’s proposed $2.1 billion acquisition of Fitbit. In a rather pointed statement, the chair of the ACCC, Rod Sims, made clear “[o]ur decision to begin consultation should not be interpreted as a signal that the ACCC will ultimately accept the undertaking and approve the transaction.” The buyout is also under scrutiny in the European Union (EU) and may be affected by the suit the United States Department of Justice (DOJ) and some states have brought against the company for anti-competitive behavior. The ACCC released a Statement of Issues in June about the proposed deal.
    • The ACCC explained “[t]he proposed undertaking would require Google to:
      • not use certain user data collected through Fitbit and Google wearables for Google’s advertising purposes for 10 years, with an option for the ACCC to extend this obligation by up to a further 10 years;
      • maintain access for third parties, such as health and fitness apps, to certain user data collected through Fitbit and Google wearable devices for 10 years; and
      • maintain levels of interoperability between third party wearables and Android smartphones for 10 years.
    • In August, the EU “opened an in-depth investigation to assess the proposed acquisition of Fitbit by Google under the EU Merger Regulation.” The European Commission (EC) expressed its concerns “that the proposed transaction would further entrench Google’s market position in the online advertising markets by increasing the already vast amount of data that Google could use for personalisation of the ads it serves and displays.” The EC stated “[a]t this stage of the investigation, the Commission considers that Google:
      • is dominant in the supply of online search advertising services in the EEA countries (with the exception of Portugal for which market shares are not available);
      • holds a strong market position in the supply of online display advertising services at least in Austria, Belgium, Bulgaria, Croatia, Denmark, France, Germany, Greece, Hungary, Ireland, Italy, Netherlands, Norway, Poland, Romania, Slovakia, Slovenia, Spain, Sweden and the United Kingdom, in particular in relation to off-social networks display ads;
      • holds a strong market position in the supply of ad tech services in the EEA.
    • The EC explained that it “will now carry out an in-depth investigation into the effects of the transaction to determine whether its initial competition concerns regarding the online advertising markets are confirmed…[and] will also further examine:
      • the effects of the combination of Fitbit’s and Google’s databases and capabilities in the digital healthcare sector, which is still at a nascent stage in Europe; and
      • whether Google would have the ability and incentive to degrade the interoperability of rivals’ wearables with Google’s Android operating system for smartphones once it owns Fitbit.
    • Amnesty International (AI) sent EC Executive Vice-President Margrethe Vestager a letter, arguing “[t]he merger risks further extending the dominance of Google and its surveillance-based business model, the nature and scale of which already represent a systemic threat to human rights.” AI asserted “[t]he deal is particularly troubling given the sensitive nature of the health data that Fitbit holds that would be acquired by Google.” AI argued “[t]he Commission must ensure that the merger does not proceed unless the two business enterprises can demonstrate that they have taken adequate account of the human rights risks and implemented strong and meaningful safeguards that prevent and mitigate these risks in the future.”
  • Europol, the United Nations Interregional Crime and Justice Research Institute (UNICRI) and Trend Micro have cooperated on a report that looks “into current and predicted criminal uses of artificial intelligence (AI).”
    • The organizations argued “AI could be used to support:
      • convincing social engineering attacks at scale;
      • document-scraping malware to make attacks more efficient;
      • evasion of image recognition and voice biometrics;
      • ransomware attacks, through intelligent targeting and evasion;
      • data pollution, by identifying blind spots in detection rules.
    • The organizations concluded:
      • Based on available insights, research, and a structured open-source analysis, this report covered the present state of malicious uses and abuses of AI, including AI malware, AI-supported password guessing, and AI-aided encryption and social engineering attacks. It also described concrete future scenarios ranging from automated content generation and parsing, AI-aided reconnaissance, smart and connected technologies such as drones and autonomous cars, to AI-enabled stock market manipulation, as well as methods for AI-based detection and defense systems.
      • Using one of the most visible malicious uses of AI — the phenomenon of so-called deepfakes — the report further detailed a case study on the use of AI techniques to manipulate or generate visual and audio content that would be difficult for humans or even technological solutions to immediately distinguish from authentic ones.
      • As speculated on in this paper, criminals are likely to make use of AI to facilitate and improve their attacks by maximizing opportunities for profit within a shorter period, exploiting more victims, and creating new, innovative criminal business models — all the while reducing their chances of being caught. Consequently, as “AI-as-a-Service” becomes more widespread, it will also lower the barrier to entry by reducing the skills and technical expertise required to facilitate attacks. In short, this further exacerbates the potential for AI to be abused by criminals and for it to become a driver of future crimes.
      • Although the attacks detailed here are mostly theoretical, crafted as proofs of concept at this stage, and although the use of AI to improve the effectiveness of malware is still in its infancy, it is plausible that malware developers are already using AI in more obfuscated ways without being detected by researchers and analysts. For instance, malware developers could already be relying on AI-based methods to bypass spam filters, escape the detection features of antivirus software, and frustrate the analysis of malware. In fact, DeepLocker, a tool recently introduced by IBM and discussed in this paper, already demonstrates these attack abilities that would be difficult for a defender to stop.
      • To add, AI could also enhance traditional hacking techniques by introducing new ways of performing attacks that would be difficult for humans to predict. These could include fully automated penetration testing, improved password-guessing methods, tools to break CAPTCHA security systems, or improved social engineering attacks. With respect to open-source tools providing such functionalities, the paper discussed some that have already been introduced, such as DeepHack, DeepExploit, and XEvil.
      • The widespread use of AI assistants, meanwhile, also creates opportunities for criminals who could exploit the presence of these assistants in households. For instance, criminals could break into a smart home by hijacking an automation system through exposed audio devices.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)


EDPB Publishes Schrems II Recommendations; EU Parliament Issues Draft SCC Revisions

The EU takes steps to respond to the CJEU’s striking down of the EU-US Privacy Shield by augmenting SCCs and other transfer mechanisms.

The European Data Protection Board (EDPB) published recommendations for entities exporting and importing the personal data of European Union (EU) residents in light of the court decision striking down the adequacy decision that allowed transfers to the United States (U.S.). The EDPB noted that alternate mechanisms like standard contractual clauses (SCC) may still be used for transfers to nations without adequate protections of EU rights provided that supplemental measures are used. It should be noted that the EDPB said supplemental measures will be needed for any transfers to nations that do not guarantee the same level of rights as the EU, which would include Binding Corporate Rules (BCR). While the EDPB’s recommendations will undoubtedly prove persuasive with the Supervisory Authorities (SA), each SA will ultimately assess whether the mechanisms and supplementary measures used by entities comport with the General Data Protection Regulation (GDPR) and the EU’s Charter of Fundamental Rights.

In a summary of its decision in Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems, Case C-311/18 (Schrems II), the Court of Justice of the European Union (CJEU) explained:

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

Ultimately, the CJEU found the U.S. lacks the requisite safeguards needed under EU law, and so the general means of transferring the data of EU citizens from the EU to the U.S. was essentially struck down. This marked the second time in the last five years such an agreement had been found to violate EU law. However, the CJEU left open the question of whether SCCs may permit the continued exporting of EU personal data into the U.S. for companies like Facebook, Google, and many, many others. Consequently, there has been no small amount of interpreting and questioning of whether this may be a way for the trans-Atlantic data flow worth billions, perhaps even trillions, of dollars to continue. And yet, the CJEU seemed clear that additional measures would likely be necessary. Indeed, the CJEU asserted “[c]ontrollers and processors should be encouraged to provide additional safeguards via contractual commitments that supplement standard protection clauses” and “[i]n so far as those standard data protection clauses cannot, having regard to their very nature, provide guarantees beyond a contractual obligation to ensure compliance with the level of protection required under EU law, they may require, depending on the prevailing position in a particular third country, the adoption of supplementary measures by the controller in order to ensure compliance with that level of protection.”

In “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data,” the EDPB explained the genesis and rationale for the document:

  • The GDPR or the [CJEU] do not define or specify the “additional safeguards”, “additional measures” or “supplementary measures” to the safeguards of the transfer tools listed under Article 46.2 of the GDPR that controllers and processors may adopt to ensure compliance with the level of protection required under EU law in a particular third country.
  • The EDPB has decided, on its own initiative, to examine this question and to provide controllers and processors, acting as exporters, with recommendations on the process they may follow to identify and adopt supplementary measures. These recommendations aim at providing a methodology for the exporters to determine whether and which additional measures would need to be put in place for their transfers. It is the primary responsibility of exporters to ensure that the data transferred is afforded in the third country of a level of protection essentially equivalent to that guaranteed within the EU. With these recommendations, the EDPB seeks to encourage consistent application of the GDPR and the Court’s ruling, pursuant to the EDPB’s mandate

Broadly speaking, whether SCCs and supplemental measures will pass muster under the GDPR will be determined on a case-by-case basis. The EDPB did not offer much in the way of bright line rules. Indeed, it will be up to SAs to determine if transfers to nations like the U.S. are possible under the GDPR, meaning these recommendations may shed more light on this central question without deciding it. One wonders, as a practical matter, if the SAs will have the capacity, resources, and will to police SCCs to ensure the GDPR and Charter are being met.

Nonetheless, the EDPB stressed the principle of accountability, under which controllers that export personal data must ensure that, whatever mechanism and supplemental measures govern a data transfer, the data receive the same protection they would in the EU. The EDPB made the point that EU protections travel with the data, and should EU personal data make its way to a country where appropriate protection is not possible, the transfer violates the GDPR. Moreover, these recommendations pertain to both public and private transfers of EU data to private sector entities outside the EU.

These recommendations work like a decision tree, with exporters needing to ask themselves a series of questions to determine whether they must use supplemental measures. This may prove a resource-intensive process, for exporters will need to map all transfers (i.e., know exactly where the data are going). The exporter must understand the laws and practices of the third nation in order to put appropriate measures in place, if possible, to meet the EU’s data protection standards.
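The roadmap the EDPB describes can be caricatured as a short decision function. The step order and outcomes below paraphrase Recommendations 01/2020; the function name, its boolean parameters, and the outcome strings are illustrative assumptions for the sketch, not anything the EDPB publishes.

```python
# Hypothetical sketch of the EDPB's transfer-assessment roadmap as a
# decision tree. All names and outcome strings are illustrative.

def assess_transfer(has_adequacy_decision: bool,
                    uses_article_46_tool: bool,
                    third_country_law_adequate: bool,
                    supplementary_measures_effective: bool) -> str:
    """Return the outcome of a (much simplified) Schrems II assessment."""
    # An adequacy decision for the destination country ends the analysis.
    if has_adequacy_decision:
        return "transfer may proceed"
    # Without an Article 46 transfer tool (SCCs, BCRs, etc.) there is
    # no lawful basis for a routine transfer.
    if not uses_article_46_tool:
        return "transfer prohibited"
    # Assess the destination country's law and practice against the
    # "essentially equivalent" standard.
    if third_country_law_adequate:
        return "transfer may proceed"
    # Supplementary measures must close the gap; otherwise the exporter
    # must suspend or terminate the transfer.
    if supplementary_measures_effective:
        return "transfer may proceed with supplementary measures"
    return "suspend or terminate transfer"
```

The point of the exercise is that the branches before the last one are all fact-intensive legal judgments, which is why the mapping and third-country analysis dominate the cost of compliance.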

Reading between the lines leads one to conclude that data exporters may not send personal data to the U.S. because its federal surveillance regime is not “necessary and proportionate,” at least from the EU’s view. The U.S. lacks judicial redress in cases where a U.S. national, let alone a foreign national, objects to the sweeping surveillance. The U.S. also has neither a national data protection law nor a dedicated data protection authority. These hints also seem to convey the EDPB’s view on the sorts of legal reforms needed in the U.S. before an adequacy decision would pass muster with the CJEU.

The EDPB said it was still evaluating how Schrems II affects the use of BCR and ad hoc contractual clauses, two of the other alternate means of transferring EU personal data in the absence of an adequacy agreement.

Nevertheless, in an annex, the EDPB provided examples of supplementary measures that may be used, depending on the circumstances, such as “flawlessly implemented” encryption and pseudonymizing data. However, the EDPB discusses these in the context of different scenarios and calls for more conditions than just the two aforementioned. Moreover, the EDPB categorically rules out two scenarios as inadequate: “Transfer to cloud services providers or other processors which require access to data in the clear” and “Remote access to data for business purposes.”

The EDPB also issued an update to guidance published after the first lawsuit brought by Maximilian Schrems resulted in the striking down of the Safe Harbor transfer agreement. The forerunner to the EDPB, the Article 29 Working Party, had drafted and released the European Essential Guarantees, and so, in light of Schrems II, the EDPB updated and published “Recommendations 02/2020 on the European Essential Guarantees for surveillance measures” “to provide elements to examine, whether surveillance measures allowing access to personal data by public authorities in a third country, being national security agencies or law enforcement authorities, can be regarded as a justifiable interference or not” with fundamental EU rights and protections. As the EDPB explains, these recommendations are intended to help data controllers and exporters determine whether other nations have protections and processes in place equivalent to those of the EU vis-à-vis their surveillance programs. The EDPB stressed that these are the essential guarantees and that other features and processes may be needed for a determination of lawfulness under EU law.

The EDPB formulated the four European Essential Guarantees:

A. Processing should be based on clear, precise and accessible rules

B. Necessity and proportionality with regard to the legitimate objectives pursued need to be demonstrated

C. An independent oversight mechanism should exist

D. Effective remedies need to be available to the individual

The European Commission (EC) has also released for comment a draft revision of the SCCs for transfers of personal data to countries outside the EU. The EC is accepting comments and input until 10 December. It may be no accident that the EDPB and EC more or less acted in unison to address the practical and statutory changes necessary to effectuate the CJEU’s striking down of the EU-US Privacy Shield. Whatever the case, the EC released draft legislative language and, in an Annex, actual contract language for use by controllers and processors in the form of modules designed to cover a variety of common circumstances (e.g., transfers by one controller to another controller or by a controller to a processor). However, as the EDPB did, the EC stressed that SCCs form a floor, and controllers, processors, and other parties are free to add additional language so long as it does not contradict or derogate from the rights protected by the SCCs.

In the implementing decision, the EC asserted:

the standard contractual clauses needed to be updated in light of new requirements in Regulation (EU) 2016/679. Moreover, since the adoption of these decisions, important developments have taken place in the digital economy, with the widespread use of new and more complex processing operations often involving multiple data importers and exporters, long and complex processing chains as well as evolving business relationships. This calls for a modernisation of the standard contractual clauses to better reflect those realities, by covering additional processing and transfer situations and to use a more flexible approach, for example with respect to the number of parties able to join the contract.

The EC continued:

The standard contractual clauses set out in the Annex to this Decision may be used by a controller or a processor in order to provide appropriate safeguards within the meaning of Article 46(1) of Regulation (EU) 2016/679 for the transfer of personal data to a processor or a controller established in a third country. This also includes the transfer of personal data by a controller or processor not established in the Union, to the extent that the processing is subject to Regulation (EU) 2016/679 pursuant to Article 3(2) thereof, because it relates to the offering of goods or services to data subjects in the Union or the monitoring of their behaviour as far as their behaviour takes place within the Union.

The EC explained the design and intent of the SCC language in the Annex:

  • The standard contractual clauses set out in the Annex to this Decision combine general clauses with a modular approach to cater for various transfer scenarios and the complexity of modern processing chains. In addition to the general clauses, controllers and processors should select the module applicable to their situation, which makes it possible to tailor their obligations under the standard contractual clauses to their corresponding role and responsibilities in relation to the data processing at issue. It should be possible for more than two parties to adhere to the standard contractual clauses. Moreover, additional controllers and processors should be allowed to accede to the standard contractual clauses as data exporters or importers throughout the life cycle of the contract of which those clauses form a part.
  • These Clauses set out appropriate safeguards, including enforceable data subject rights and effective legal remedies, pursuant to Article 46(1), and Article 46 (2)(c) of Regulation (EU) 2016/679 and, with respect to data transfers from controllers to processors and/or processors to processors, standard contractual clauses pursuant to Article 28(7) of Regulation (EU) 2016/679, provided they are not modified, except to add or update information in the Annexes. This does not prevent the Parties from including the standard contractual clauses laid down in this Clauses in a wider contract, and to add other clauses or additional safeguards provided that they do not contradict, directly or indirectly, the standard contractual clauses or prejudice the fundamental rights or freedoms of data subjects. These Clauses are without prejudice to obligations to which the data exporter is subject by virtue of the Regulation (EU) 2016/679

In October, the Trump Administration released a crib sheet it hopes U.S. multinationals can use to argue to SAs that SCCs, BCRs, and U.S. law satisfy the CJEU’s ruling that struck down the EU-U.S. Privacy Shield. The Trump Administration is basically arguing: sure, we spy, but most EU citizens’ data is not surveilled, and EU governments themselves often share in the proceeds of the surveillance we conduct. Moreover, there are plenty of safeguards and means of redress in the U.S. system because, you know, we say so. It is unlikely this analysis will be very persuasive in the EU, especially since these broad arguments speak neither to the criticisms the EU has leveled under Privacy Shield about U.S. surveillance and privacy rights nor to the basis for the CJEU’s ruling.

Earlier this month, the European Data Protection Supervisor (EDPS) published a strategy detailing how EU agencies and bodies should comply with the CJEU ruling that struck down the EU-US Privacy Shield and threw into question the compliance of SCCs with EU law and the GDPR. The EDPS has already started working with EU institutions, bodies, offices and agencies (EUIs) on the process of determining whether their transfers of the personal data of people in the EU to the U.S. meet the CJEU’s judgment.


Photo by Anthony Beck from Pexels

Schrems II Guidance

The agency that oversees the data protection of EU agencies has laid out its view on how they should comply with the GDPR after the EU-US Privacy Shield.

The European Data Protection Supervisor (EDPS) has published a strategy detailing how European Union (EU) agencies and bodies should comply with the Court of Justice of the European Union’s (CJEU) ruling that struck down the EU-United States (U.S.) Privacy Shield (aka Schrems II) and threw into question the compliance of Standard Contractual Clauses (SCC) with EU law and the General Data Protection Regulation (GDPR). The EDPS has already started working with EU institutions, bodies, offices and agencies (EUIs) on the process of determining whether their transfers of the personal data of people in the EU to the U.S. meet the CJEU’s judgment.

The EDPS makes clear most of the transfers by EUIs to the U.S. are on account of using U.S. information and communications technology (ICT) products and services, meaning U.S. multinationals like Microsoft, Google, and others. The EDPS has proposed a strategy that would first identify risks and then move to address them. It bears stressing that this strategy applies only to EUIs and not private sector controllers, but it is likely the European Data Protection Board (EDPB) and EU DPAs will take notice of the EDPS’ strategy on how to comply with Schrems II. However, the EDPS acknowledges that it is obliged to follow the EDPB’s lead and vows to change its strategy upon issuance of EDPB guidance on Schrems II and SCC. And yet, the EDPS explained that EUIs will need to report back on how they are implementing the steps in the strategy, particularly on those ongoing transfers to countries like the U.S. that have inadequate data protection laws, those transfers that have been suspended, and any transfers being conducted per derogations in the GDPR. On the basis of this feedback, the EDPS will “establish long-term compliance” in 2021.

It seems a bit backwards for the EDPS to task the EUIs with determining which transfers under SCCs may proceed under the GDPR when it might be more efficient for the EDPS to take on this job directly and rule on the ICT services and providers, permitting all EUIs to understand which comply with EU law and which do not. However, the EDPS is exploring the possibility of determining the sufficiency of data protection in other nations, most likely starting with the U.S., and then working with EU stakeholders to coordinate compliance with the CJEU’s ruling and the GDPR.

The EDPS claimed the CJEU “clarified the roles and responsibilities of controllers, recipients of data outside of the European Economic Area (EEA) (data importers) and supervisory authorities…[and] ruled the following”:

  • The Court invalidated the Privacy Shield adequacy Decision and confirmed that the SCCs were valid providing that they include effective mechanisms to ensure compliance in practice with the “essentially equivalent” level of protection guaranteed within the EU by the General Data Protection Regulation (GDPR). Transfers of personal data pursuant to the SCCs are suspended or prohibited in the event of a breach of such clauses, or in case it is impossible to honour them.
  • The SCCs for transfers may then require, depending on the prevailing position of a particular third country, the adoption of supplementary measures by the controller in order to ensure compliance with the level of protection guaranteed within the EU.
  • In order to continue these data transfers, the Court stresses that before transferring personal data to a third country, it is the data exporters’ and data importers’ responsibility to assess whether the legislation of the third country of destination enables the data importer to comply with the guarantees provided through the transfer tools in place. If this is not the case, it is also the exporter and the importer’s duty to assess whether they can implement supplementary measures to ensure an essentially equivalent level of protection as provided by EU law. Should data exporters, after taking into account the circumstances of the transfer and possible supplementary measures, conclude that appropriate safeguards cannot be ensured, they are required to suspend or terminate the transfer of personal data. In case the exporter intends nevertheless to continue the transfer of personal data, they must notify their competent SA.
  • The competent supervisory authority is required to suspend or prohibit a transfer of personal data to a third country pursuant to the SCCs if, when considering the circumstances of that transfer, those clauses are not or cannot be complied with in the third country of destination and the protection of the data transferred under EU law cannot be ensured by other means.

The EDPS explained:

The EDPS’ report on the 2017 survey entitled, Measuring compliance with data protection rules in EU institutions, provides evidence that there has been a significant rise in the number of transfers related to the core business of EUIs in recent years. This number is even higher now, due to the increased use of ICT services and social media. The EDPS’ own-initiative investigation into the use of Microsoft products and services by EUIs and subsequent recommendations in that regard confirms the importance to ensure a level of protection that is essentially equivalent as the one guaranteed within the EU, as provided by relevant data protection laws, to be interpreted in accordance with the EU Charter. In this context, the EDPS has already flagged a number of linked issues concerning sub-processors, data location, international transfers and the risk of unlawful disclosure of data – issues that the EUIs were unable to control and ensure proper safeguards to protect data that left the EU/EEA. The issues we raised in our investigation report are consistent with the concerns expressed in the Court’s Judgment, which we are assessing in relation to any processor agreed to by EUIs.

Regarding data flows to the U.S. quite possibly in violation of the GDPR and Schrems II, the EDPS asserted:

  • Moreover, a majority of data flows to processors most probably happen because EUIs use service providers that are either based in the U.S. or that use sub-processors based in the U.S., in particular for ICT services, which fall under the scope of U.S. surveillance laws. Such companies have primarily relied on the Privacy Shield adequacy Decision to transfer personal data to the U.S. and the use of SCCs as a secondary measure.
  • Therefore, the present Strategy emphasizes the priority to address transfers of data by EUIs or on their behalf in the context of controller to processor contracts and/or processor to sub-processor contracts, in particular towards the United States.

The EDPS is calling for “a twofold approach as the most appropriate:

(1) Identify urgent compliance and/or enforcement actions through a risk based approach for transfers towards the U.S. presenting high risks for data subjects and in parallel

(2) provide guidance and pursue mid-term case-by-case EDPS compliance and/or enforcement actions for all transfers towards the U.S. or other third countries.”


Image by Pete Linforth from Pixabay

Further Reading, Other Developments, and Coming Events (5 November)

Further Reading

  • “Confusion and conflict stir online as Trump claims victory, questions states’ efforts to count ballots” By Craig Timberg, Tony Romm, Isaac Stanley-Becker and Drew Harwell — Washington Post. When the post-mortem on the 2020 Election is written, it is likely to be the case that foreign disinformation was not the primary threat. Rather, it may be domestic interference given the misinformation, disinformation, and lies circulating online despite the best efforts of social media platforms to label, take down, and block such material. However, if this article is accurate, much of it is coming from the right wing, including the President.
  • “Polls close on Election Day with no apparent cyber interference” By Kevin Collier and Ken Dilanian — NBC News. Despite crowing from officials like the United States (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) Director Christopher Krebs and U.S. Cyber Command head General Paul Nakasone, it is not altogether clear that U.S. efforts, especially publicized offensive operations, are the reason there were no significant cyber attacks on Election Day. However, officials are cautioning the country is not out of the woods as vote counting is ongoing and opportunities for interference and mischief remain.
  • “Russian hackers targeted California, Indiana Democratic parties” By Raphael Satter, Christopher Bing, Joel Schectman — Reuters. Apparently, Microsoft helped foil Russian efforts to hack two state Democratic parties and think tanks, some of which are allied with the Democratic party. However, it appears none of the attempts, which occurred earlier this year, were successful. The article suggests but does not claim that increased cyber awareness and defenses foiled most of the attempts by the hacking group Fancy Bear.
  • “LexisNexis to Pay $5 Million Class Action Settlement for Selling DMV Data” By Joseph Cox — Vice. Data broker LexisNexis is settling a suit alleging it violated the Drivers’ Privacy Protection Act (DPPA) by obtaining Department of Motor Vehicles (DMV) records on people for a purpose not authorized under the law. Vice has written a number of articles on the practices of DMVs selling people’s data, which has caught the attention of at least two Democratic Members of Congress who have said they will introduce legislation to tighten the circumstances under which these data may be shared or sold.
  • “Spy agency ducks questions about ‘back doors’ in tech products” By Joseph Menn — Reuters. Senator Ron Wyden (D-OR) is demanding that the National Security Agency (NSA) reveal the guidelines put in place after former NSA contractor Edward Snowden revealed the agency’s practice of getting backdoors in United States (U.S.) technology it could use in the future. This practice allowed the NSA to sidestep warrant requirements, but it also may have weakened technology that was later exploited by other governments, as the People’s Republic of China (PRC) allegedly did to Juniper in 2015. After Snowden divulged the NSA’s practice, reforms were supposedly put in place but never shared with Congress.

Other Developments

  • Australia’s Joint Committee on Intelligence and Security issued a new report into Australia’s mandatory data retention regime that makes 22 recommendations to “increase transparency around the use of the mandatory data retention and increase the threshold for when data can be accessed…[and] reduce the currently very broad access to telecommunications data under the Telecommunications Act.” The committee stated “[t]he report’s 22 recommendations include:
    • access to data kept under the mandatory data retention regime will only be available under specific circumstances
    • the Department of Home Affairs develop guidelines for data collection including an ability for enforcement agencies and Home Affairs to produce reports to oversight agencies or Parliament when requested
    • the repeal of section 280(1)(b) of the Telecommunications Act which allows for access where ‘disclosure or use is required or authorised by or under law.’ It is the broad language in this subsection that has allowed the access that concerned the committee
    • The committee explained:
      • The Parliamentary Joint Committee on Intelligence and Security (the Committee) is required by Part 5-1A of the Telecommunications (Interception and Access) Act 1979 (TIA Act) to undertake a review of the mandatory data retention regime (MDRR).
      • The mandatory data retention regime is a legislative framework which requires carriers, carriage service providers and internet service providers to retain a defined set of telecommunications data for two years, ensuring that such data remains available for law enforcement and national security investigations.
  • Senators Ron Wyden (D-OR) and Sherrod Brown (D-OH) wrote a letter “to trade associations urging them to take immediate action to ensure their members are not complicit in China’s state-directed human rights abuses, including by relocating production from the Xinjiang Uyghur Autonomous Region.” They stated:
    • We write to express our concerns over reports that the industries and companies that the U.S. Chamber of Commerce represents have supply chains that have been implicated in the state-sanctioned forced labor of Uyghurs and other Muslim groups in the Xinjiang Uyghur Autonomous Region of China (XUAR) and in sites where Uyghurs have been relocated. The decision to operate or contract with production facilities overseas must be accompanied by high standards of supply chain accountability and transparency to ensure that no company’s products are made with forced labor. We urge your members to take immediate action to ensure goods manufactured for them are not complicit in China’s state-directed human rights abuses, including by relocating production from the XUAR. In addition, we ask your members to take critical, comprehensive steps to achieve the supply chain integrity and transparency American consumers and workers deserve. It is past time for American multinational companies to be part of the solution, not part of the problem, on efforts to eradicate forced labor and end human rights abuses against workers in China.
  • The Federal Trade Commission (FTC) finalized a settlement alleging violations of the now struck down European Union-United States Privacy Shield. In its press release, the agency explained it had “alleged that NTT Global Data Centers Americas, Inc. (NTT), formerly known as RagingWire Data Centers, Inc., claimed in its online privacy policy and marketing materials that the company participated in the Privacy Shield framework and complied with the program’s requirements.” The FTC noted “the company’s certification lapsed in January 2018 and it failed to comply with certain Privacy Shield requirements while it was a participant in the framework.” The FTC stated:
    • Under the settlement, the company, among other things, is prohibited not just from misrepresenting its compliance with or participation in the Privacy Shield framework, but also any other privacy or data security program sponsored by the government or any self-regulatory or standard-setting organization. The company also must continue to apply the Privacy Shield requirements or equivalent protections to personal information it collected while participating in the framework or return or delete the information.
    • Although the European Court of Justice invalidated the Privacy Shield framework in July 2020, that decision does not affect the validity of the FTC’s decision and order relating to NTT’s misrepresentations about its participation in and compliance with the framework. The framework allowed participants to transfer data legally from the European Union to the United States.
  • The Commission nationale de l’informatique et des libertés (CNIL) issued a press release, explaining that France’s “Council of State acknowledges the existence of a risk of data transfer from the Health Data Hub to the United States and requests additional safeguards.” CNIL stated it “will advise the public authorities on appropriate measures and will ensure, for research authorization related to the health crisis, that there is a real need to use the platform.” This announcement follows from the Court of Justice of the European Union (CJEU) striking down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). CNIL summarized the “essentials:”
    • Fearing that some data might be transferred to the United States, some claimants lodged an appeal with the Council of State requesting the suspension of the “Health Data Hub”, the new platform designed to ultimately host all the health data of people who receive medical care in France.
    • The Court considers that a risk cannot be excluded with regard to the transfer of health data hosted on the Health Data Hub platform to US intelligence services.
    • Because of the usefulness of the Health Data Hub in managing the health crisis, it refuses to suspend the operation of the platform.
    • However, it requires the Health Data Hub to strengthen its contract with Microsoft on a number of points and to seek additional safeguards to better protect the data it hosts.
    • It is the responsibility of the CNIL to ensure, for authorization of research projects on the Health Data Hub in the context of the health crisis, that the use of the platform is technically necessary, and to advise public authorities on the appropriate safeguards.
    • These measures will have to be taken while awaiting a lasting solution that will eliminate any risk of access to personal data by the American authorities, as announced by the French Secretary of State for the Digital Agenda.
  • The United Kingdom’s (UK) National Cyber Security Centre (NCSC) has published its annual review that “looks back at some of the key developments and highlights from the NCSC’s work between 1 September 2019 and 31 August 2020.” In the foreword, new NCSC Chief Executive Officer Lindy Cameron provided an overview:
    • Expertise from across the NCSC has been surged to assist the UK’s response to the pandemic. More than 200 of the 723 incidents the NCSC handled this year related to coronavirus and we have deployed experts to support the health sector, including NHS Trusts, through cyber incidents they have faced. We scanned more than one million NHS IP addresses for vulnerabilities and our cyber expertise underpinned the creation of the UK’s coronavirus tracing app.
    • An innovative approach to removing online threats was created through the ‘Suspicious Email Reporting Service’ – leading to more than 2.3 million reports of malicious emails being flagged by the British public. Many of the 22,000 malicious URLs taken down as a result related to coronavirus scams, such as pretending to sell PPE equipment to hide a cyber attack. The NCSC has often been described as world-leading, and that has been evident over the last 12 months. Our innovative ‘Exercise in a Box’ tool, which supports businesses and individuals to test their cyber defences against realistic scenarios, was used in 125 countries in the last year.
    • Recognising the change in working cultures due to the pandemic, our team even devised a specific exercise on remote working, which has helped organisations to understand where current working practices may be presenting alternative cyber risks. Proving that cyber really is a team sport, none of this would be possible without strong partnerships internationally and domestically. We worked closely with law enforcement – particularly the National Crime Agency – and across government, industry, academia and, of course, the UK public.
    • The NCSC is also looking firmly ahead to the future of cyber security, as our teams work to understand both the risks and opportunities to the UK presented by emerging technologies. A prominent area of work this year was the NCSC’s reviews of high-risk vendors such as Huawei – and in particular the swift and thorough review of US sanctions against Huawei. The NCSC gave advice on the impact these changes would have in the UK, publishing a summary of the advice given to government as well as timely guidance for operators and the public.
  • Australia’s Department of Industry, Science, Energy and Resources has put out for comment a discussion paper titled “An AI Action Plan for all Australians” to “shape Australia’s vision for artificial intelligence (AI).” The department said it “is now consulting on the development of a whole-of-government AI Action Plan…[that] will help us maximise the benefits of AI for all Australians and manage the potential challenges.” The agency said “[t]he plan will help to:
    • ensure the development and use of AI in Australia is responsible
    • coordinate government policy and national capability under a clear, common vision for AI in Australia
    • explore the actions needed for our AI future
  • The department explained:
    • Building on Australia’s AI Ethics Framework, the Australian Government is developing an AI Action Plan. It is a key component of the government’s vision to be a leading digital economy by 2030. It builds on almost $800 million invested in the 2020-21 Budget to enable businesses to take advantage of digital technologies to grow their businesses and create jobs. It is an opportunity to leverage AI as part of the Australian Government’s economic recovery plan. We must work together to ensure all Australians can benefit from advances in AI.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including that of Nathan Simington to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.