ePrivacy Exception Proposed

A broad exception to the EU’s privacy regulations has been proposed but has not yet taken effect.

My apologies. The first version of this post erroneously asserted the derogation to the ePrivacy Directive had been enacted. It has not, and this post has been re-titled and updated to reflect this fact.

As the European Union (EU) continues working to enact a modernized ePrivacy Directive (Directive 2002/58/EC) to complement the General Data Protection Regulation (GDPR), it has proposed an exemption to manage a change in another EU law that sweeps “number-independent interpersonal communications services” into the current regulatory structure for electronic communications. The policy justification for this categorical exemption to the ePrivacy Directive is combatting child sexual abuse online. The derogation is limited to at most five years, and quite possibly less if the EU can enact a successor to the ePrivacy Directive, an ePrivacy Regulation. However, it is unclear when this derogation will be agreed upon and enacted.

In September 2020, the European Commission (EC) issued “a Proposal for a Regulation on a temporary derogation from certain provisions of the ePrivacy Directive 2002/58/EC as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online.” The change in definitions that prompted the proposal took effect on 21 December 2020, but the derogation itself has not been enacted. The EC has also issued a draft compromise ePrivacy Regulation, the result of extensive negotiations. The GDPR was enacted with an update of the ePrivacy Directive in mind.

In early December, an EU Parliament committee approved the proposed derogation, but the full Parliament has not yet acted upon the measure. The Parliament needs to reach agreement with the Presidency of the Council and the European Commission. In its press release, the Committee on Civil Liberties, Justice and Home Affairs explained:

The proposed regulation will provide for limited and temporary changes to the rules governing the privacy of electronic communications so that over-the-top (“OTT”) interpersonal communication services, such as web messaging, voice over Internet Protocol (VoIP), chat and web-based email services, can continue to detect, report and remove child sexual abuse online on a voluntary basis.

Article 1 sets out the scope and aim of the temporary regulation:

This Regulation lays down temporary and strictly limited rules derogating from certain obligations laid down in Directive 2002/58/EC, with the sole objective of enabling providers of number-independent interpersonal communications services to continue the use of technologies for the processing of personal and other data to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services.

The EC explained the legal and policy background for the exemption to the ePrivacy Directive:

  • On 21 December 2020, with the entry into application of the European Electronic Communications Code (EECC), the definition of electronic communications services will be replaced by a new definition, which includes number-independent interpersonal communications services. From that date on, these services will, therefore, be covered by the ePrivacy Directive, which relies on the definition of the EECC. This change concerns communications services like webmail messaging services and internet telephony.
  • Certain providers of number-independent interpersonal communications services are already using specific technologies to detect child sexual abuse on their services and report it to law enforcement authorities and to organisations acting in the public interest against child sexual abuse, and/or to remove child sexual abuse material. These organisations refer to national hotlines for reporting child sexual abuse material, as well as organisations whose purpose is to reduce child sexual exploitation, and prevent child victimisation, located both within the EU and in third countries.
  • Child sexual abuse is a particularly serious crime that has wide-ranging and serious life-long consequences for victims. In hurting children, these crimes also cause significant and long-term social harm. The fight against child sexual abuse is a priority for the EU. On 24 July 2020, the European Commission adopted an EU strategy for a more effective fight against child sexual abuse, which aims to provide an effective response, at EU level, to the crime of child sexual abuse. The Commission announced that it will propose the necessary legislation to tackle child sexual abuse online effectively including by requiring relevant online services providers to detect known child sexual abuse material and oblige them to report that material to public authorities by the second quarter of 2021. The announced legislation will be intended to replace this Regulation, by putting in place mandatory measures to detect and report child sexual abuse, in order to bring more clarity and certainty to the work of both law enforcement and relevant actors in the private sector to tackle online abuse, while ensuring respect of the fundamental rights of the users, including in particular the right to freedom of expression and opinion, protection of personal data and privacy, and providing for mechanisms to ensure accountability and transparency.
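The detection technologies the EC references generally work by comparing uploaded content against databases of hashes of previously identified material. A minimal sketch of that general approach follows; the hash set and function name are illustrative, and an exact cryptographic hash stands in for the proprietary perceptual hashes (e.g., Microsoft's PhotoDNA) that deployed systems actually use:

```python
import hashlib

# Hypothetical database of hashes of previously identified material.
# NOTE: real systems use *perceptual* hashes that survive resizing and
# re-encoding; SHA-256 below only matches byte-identical copies.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-sample").hexdigest(),  # illustrative entry
}

def matches_known_material(content: bytes) -> bool:
    """Return True if the content's SHA-256 appears in the known-hash set."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES
```

Exact hashing is trivially evaded by altering a single byte, which is why production systems rely on perceptual hashing; the lookup-against-a-known-list structure, however, is the same.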

The EC baldly asserts that the problem of online child sexual abuse justifies a loophole to the broad prohibition on violating the privacy of EU persons. The EC did note that the fight against this sort of crime is a political priority for the EC, one that ostensibly puts the EU close to the views of the Five Eyes nations that have been pressuring technology companies to end the practice of making apps and hardware encrypted by default.

The EC explained:

The present proposal therefore presents a narrow and targeted legislative interim solution with the sole objective of creating a temporary and strictly limited derogation from the applicability of Articles 5(1) and 6 of the ePrivacy Directive, which protect the confidentiality of communications and traffic data. This proposal respects the fundamental rights, including the rights to privacy and protection of personal data, while enabling providers of number-independent interpersonal communications services to continue using specific technologies and continue their current activities to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services, pending the adoption of the announced long-term legislation. Voluntary efforts to detect solicitation of children for sexual purposes (“grooming”) also must be limited to the use of existing, state-of-the-art technology that corresponds to the safeguards set out. This Regulation should cease to apply in December 2025.

The EC added “[i]n case the announced long-term legislation is adopted and enters into force prior to this date, that legislation should repeal the present Regulation.”

In November, European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski published his opinion on the temporary, limited derogation from the EU’s regulation of electronic communications and privacy. Wiewiórowski cautioned that a short-term exception, however well-intended, could lead to future loopholes that would ultimately undermine the purpose of the legislation. Moreover, he found that the derogation lacked sufficiently specific guidance and safeguards and was not proportional. Wiewiórowski argued:

  • In particular, he notes that the measures envisaged by the Proposal would constitute an interference with the fundamental rights to respect for private life and data protection of all users of very popular electronic communications services, such as instant messaging platforms and applications. Confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life. Even voluntary measures by private companies constitute an interference with these rights when the measures involve the monitoring and analysis of the content of communications and processing of personal data.
  • The EDPS wishes to underline that the issues at stake are not specific to the fight against child abuse but to any initiative aiming at collaboration of the private sector for law enforcement purposes. If adopted, the Proposal will inevitably serve as a precedent for future legislation in this field. The EDPS therefore considers it essential that the Proposal is not adopted, even in the form of a temporary derogation, until all the necessary safeguards set out in this Opinion are integrated.
  • In particular, in the interest of legal certainty, the EDPS considers that it is necessary to clarify whether the Proposal itself is intended to provide a legal basis for the processing within the meaning of the GDPR, or not. If not, the EDPS recommends clarifying explicitly in the Proposal which legal basis under the GDPR would be applicable in this particular case.
  • In this regard, the EDPS stresses that guidance by data protection authorities cannot substitute compliance with the requirement of legality. It is insufficient to provide that the temporary derogation is “without prejudice” to the GDPR and to mandate prior consultation of data protection authorities. The co-legislature must take its responsibility and ensure that the proposed derogation complies with the requirements of Article 15(1), as interpreted by the CJEU.
  • In order to satisfy the requirement of proportionality, the legislation must lay down clear and precise rules governing the scope and application of the measures in question and imposing minimum safeguards, so that the persons whose personal data is affected have sufficient guarantees that data will be effectively protected against the risk of abuse.
  • Finally, the EDPS is of the view that the five-year period as proposed does not appear proportional given the absence of (a) a prior demonstration of the proportionality of the envisaged measure and (b) the inclusion of sufficient safeguards within the text of the legislation. He considers that the validity of any transitional measure should not exceed 2 years.

The Five Eyes nations (Australia, Canada, New Zealand, the United Kingdom, and the United States) issued a joint statement in which their ministers called for quick action.

In this statement, we highlight how from 21 December 2020, the ePrivacy Directive, applied without derogation, will make it easier for children to be sexually exploited and abused without detection – and how the ePrivacy Directive could make it impossible both for providers of internet communications services, and for law enforcement, to investigate and prevent such exploitation and abuse. It is accordingly essential that the European Union adopt urgently the derogation to the ePrivacy Directive as proposed by the European Commission in order for the essential work carried out by service providers to shield endangered children in Europe and around the world to continue.

Without decisive action, from 21 December 2020 internet-based messaging services and e-mail services captured by the European Electronic Communications Code’s (EECC) new, broader definition of ‘electronic communications services’ are covered by the ePrivacy Directive. The providers of electronic communications services must comply with the obligation to respect the confidentiality of communications and the conditions for processing communications data in accordance with the ePrivacy Directive. In the absence of any relevant national measures made under Article 15 of that Directive, this will have the effect of making it illegal for service providers operating within the EU to use their current tools to protect children, with the impact on victims felt worldwide.

As mentioned, this derogation comes at a time when the EC and the EU nations are trying to finalize and enact an ePrivacy Regulation. In the original 2017 proposal, the EC stated:

The ePrivacy Directive ensures the protection of fundamental rights and freedoms, in particular the respect for private life, confidentiality of communications and the protection of personal data in the electronic communications sector. It also guarantees the free movement of electronic communications data, equipment and services in the Union.

The ePrivacy Regulation is intended to work in concert with the GDPR, and the draft 2020 regulation contains the following passages explaining the intended interplay of the two regulatory schemes:

  • Regulation (EU) 2016/679 regulates the protection of personal data. This Regulation protects in addition the respect for private life and communications. The provisions of this Regulation particularise and complement the general rules on the protection of personal data laid down in Regulation (EU) 2016/679. This Regulation therefore does not lower the level of protection enjoyed by natural persons under Regulation (EU) 2016/679. The provisions particularise Regulation (EU) 2016/679 as regards personal data by translating its principles into specific rules. If no specific rules are established in this Regulation, Regulation (EU) 2016/679 should apply to any processing of data that qualify as personal data. The provisions complement Regulation (EU) 2016/679 by setting forth rules regarding subject matters that are not within the scope of Regulation (EU) 2016/679, such as the protection of the rights of end-users who are legal persons. Processing of electronic communications data by providers of electronic communications services and networks should only be permitted in accordance with this Regulation. This Regulation does not impose any obligations on the end-user. End-users who are legal persons may have rights conferred by Regulation (EU) 2016/679 to the extent specifically required by this Regulation.
  • While the principles and main provisions of Directive 2002/58/EC of the European Parliament and of the Council remain generally sound, that Directive has not fully kept pace with the evolution of technological and market reality, resulting in an inconsistent or insufficiently effective protection of privacy and confidentiality in relation to electronic communications. Those developments include the entrance on the market of electronic communications services that from a consumer perspective are substitutable to traditional services, but do not have to comply with the same set of rules. Another development concerns new techniques that allow for tracking of online behaviour of end-users, which are not covered by Directive 2002/58/EC. Directive 2002/58/EC should therefore be repealed and replaced by this Regulation.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Guillaume Périgois on Unsplash

Five Eyes Again Lean On Tech About Encryption

In the latest demand, the usual suspects are joined by two new nations in urging tech to stop using default encryption and to essentially build backdoors.

The Five Eyes (FVEY) intelligence alliance plus two Asian nations have released an “International Statement: End-To-End Encryption and Public Safety,” which represents the latest FVEY salvo in their campaign against technology companies using default end-to-end encryption. Again, the FVEY nations are casting the issues presented by encryption through the prism of child sexual abuse, terrorism, and other horrible crimes in order to keep technology companies on their proverbial policy backfoot. For, after all, how can the reasonable tech CEO argue for encryption when it is being used to commit and cover up unspeakable crimes?

However, in a sign that technology companies may be facing a growing playing field, India and Japan joined the FVEY in this statement; whether this is a result of the recent Quadrilateral Security Dialogue is unclear, but it seems a fair assumption given that two of the FVEY nations, the United States and Australia, make up the other two members of the Quad. And, of course, the United Kingdom, Canada, and New Zealand are the three other members of the FVEY.

In the body of the statement, FVEY, Japan, and India asserted:

  • We, the undersigned, support strong encryption, which plays a crucial role in protecting personal data, privacy, intellectual property, trade secrets and cyber security.  It also serves a vital purpose in repressive states to protect journalists, human rights defenders and other vulnerable people, as stated in the 2017 resolution of the UN Human Rights Council.  Encryption is an existential anchor of trust in the digital world and we do not support counter-productive and dangerous approaches that would materially weaken or limit security systems. 
  • Particular implementations of encryption technology, however, pose significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children. We urge industry to address our serious concerns where encryption is applied in a way that wholly precludes any legal access to content.  We call on technology companies to work with governments to take the following steps, focused on reasonable, technically feasible solutions:
    • Embed the safety of the public in system designs, thereby enabling companies to act against illegal content and activity effectively with no reduction to safety, and facilitating the investigation and prosecution of offences and safeguarding the vulnerable;
    • Enable law enforcement access to content in a readable and usable format where an authorisation is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight; and
    • Engage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.

So, on the one hand, these nations recognize the indispensable role encryption plays in modern communications and in the fight against authoritarian regimes and “do not support counter-productive and dangerous approaches that would materially weaken or limit security systems.” But, on the other hand, “[p]articular implementations of encryption technology” are putting children at risk and letting terrorism thrive. Elsewhere in the statement we learn that the implementation in question is “[e]nd-to-end encryption that precludes lawful access to the content of communications in any circumstances.”

And so these nations want companies like Facebook, Apple, Google, and others to take certain steps that would presumably maintain strong encryption but allow access to certain communications for law enforcement purposes. These nations propose “[e]mbed[ding] the safety of the public in system designs,” which is a nice phrase and wonderful rhetoric, but what does this mean practically? That companies should not use encryption by default? Perhaps. But let’s be honest about the second-order effects if American tech companies dispensed with default encryption. Sophisticated criminals and terrorists understand encryption and would still choose to encrypt their devices, apps, and communications; they would simply have to go to the time and trouble of enabling it themselves. To be fair, neophyte and careless criminals and terrorists may not know to do so, and their communications would be fairly easy to acquire.

Another likely second-order effect is that apps and software offering very hard-to-break encryption would no longer be made or legally offered in FVEY nations. Consequently, the enterprising individual interested in encryption that cannot be broken or tapped by governments will seek out, and likely find, such technology produced in other countries through a variety of means. It is unlikely encryption will be put back in the bottle because the FVEY and friends want it so.

Moreover, given the current technological landscape, the larger point here is that building backdoors into encryption or weakening encryption puts legitimate, desirable communications, activities, and transactions at greater risk of interception. Why would this be so? Because it takes less effort and computing power to crack a weaker encryption key.
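That claim is simple arithmetic: each bit shaved off an effective key length halves the brute-force search space, so weakening keys shrinks the attacker's cost exponentially. A back-of-the-envelope sketch, assuming a hypothetical attacker who can test 10^12 keys per second (the rate is an assumption for illustration, not a real-world benchmark):

```python
SECONDS_PER_YEAR = 31_557_600   # Julian year
GUESSES_PER_SECOND = 10**12     # assumed attacker capability (illustrative)

def worst_case_years(key_bits: int) -> float:
    """Years to exhaust a keyspace of 2**key_bits at the assumed guess rate."""
    return (2 ** key_bits) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

# A DES-era 56-bit key falls within a day; a 128-bit key does not fall at all.
for bits in (56, 64, 128):
    print(f"{bits:3d}-bit key: {worst_case_years(bits):.2e} years (worst case)")
```

At this assumed rate a 56-bit keyspace takes roughly 20 hours to exhaust, while a 128-bit keyspace takes on the order of 10^19 years; each bit of weakening moves a system one halving closer to the feasible end of that spectrum.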

But, sure, a world in which my midnight snacking does not lead to weight gain would be amazing. And so it is with the FVEY’s call for strong encryption they could essentially defeat as needed. Eventually, the keys, technology, or means would be leaked or stolen as has happened time and time again. Most recently, there was a massive exfiltration of the Central Intelligence Agency’s (CIA) Vault 7 hacking tools and sources and methods. It would only be a matter of time before the tools to defeat encryption were stolen or compromised.

Perhaps there is a conceptual framework or technology that would achieve the FVEY’s goal, but, at present, it will entail tradeoffs that will make people less secure in their online communications. And, in the defense of the FVEY, they are proposing to “[e]ngage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.” Again, very nice phraseology that does not tell us much.

Of course, the FVEY nations are calling for access under proper authorization. However, in the U.S. that might not even entail an adversarial process in a court, for under the Foreign Intelligence Surveillance Act (FISA), there is no such process in the secret proceedings. Additionally, in the same vein, the phrase “subject to strong safeguards and oversight” is downright comical if the U.S. system is to be the template given the range of shortcomings and failures of national security agencies in meeting U.S. law relating to surveillance.

The FVEY, Japan, and India conclude with:

We are committed to working with industry to develop reasonable proposals that will allow technology companies and governments to protect the public and their privacy, defend cyber security and human rights and support technological innovation.  While this statement focuses on the challenges posed by end-to-end encryption, that commitment applies across the range of encrypted services available, including device encryption, custom encrypted applications and encryption across integrated platforms.  We reiterate that data protection, respect for privacy and the importance of encryption as technology changes and global Internet standards are developed remain at the forefront of each state’s legal framework.  However, we challenge the assertion that public safety cannot be protected without compromising privacy or cyber security.  We strongly believe that approaches protecting each of these important values are possible and strive to work with industry to collaborate on mutually agreeable solutions.

More having one’s cake and eating it, too. They think strong encryption is possible with the means of accessing encrypted communications related to crimes. This seems to be contrary to expert opinion on the matter.

As mentioned, this is not the FVEY’s first attempt to press technology companies. In October 2019, the U.S., the UK, and Australia sent a letter to Facebook CEO Mark Zuckerberg “to request that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services without ensuring that there is no reduction to user safety and without including a means for lawful access to the content of communications to protect our citizens.” These governments claimed “[w]e support strong encryption…[and] respect promises made by technology companies to protect users’ data…[but] [w]e must find a way to balance the need to secure data with public safety and the need for law enforcement to access the information they need to safeguard the public, investigate crimes, and prevent future criminal activity.” The officials asserted that “[c]ompanies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes.”

In summer 2019, the FVEY issued a communique in which it urged technology companies “to include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can obtain access to data in a readable and usable format.” Interestingly, at that time, these nations lauded Facebook for “approaches like Mark Zuckerberg’s public commitment to consulting Governments on Facebook’s recent proposals to apply end-to-end encryption to its messaging services…[and] [t]hese engagements must be substantive and genuinely influence design decisions.” This raises the question of what, if anything, changed between this communique and the recent letter to Zuckerberg. In any event, this communique followed the Five Eyes’ 2018 “Statement of Principles on Access to Evidence and Encryption,” which articulated these nations’ commitment to working with technology companies to address encryption and the need for law enforcement agencies to meet their public safety and protection obligations.

In Facebook’s December 2019 response, Facebook Vice President and WhatsApp Head Will Cathcart and Facebook Vice President and Messenger Head Stan Chudnovsky stated “[c]ybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere…[and] [t]he ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.”

Moreover, one of the FVEY nations has enacted a law that could result in orders to technology companies to decrypt encrypted communications. In December 2018, Australia enacted the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). As the Office of the Australian Information Commissioner (OAIC) wrote of TOLA, “[t]he powers permitted under the Act have the potential to significantly weaken important privacy rights and protections under the Privacy Act…[and] [t]he encryption technology that can obscure criminal communications and pose a threat to national security is the same technology used by ordinary citizens to exercise their legitimate rights to privacy.”

This past summer, Australia’s Independent National Security Legislation Monitor (INSLM) issued its report on the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). The Parliamentary Joint Committee on Intelligence and Security had requested that the INSLM review the statute, and so the INSLM engaged in a lengthy review, including input from the public. As explained in the report’s preface, the “INSLM independently reviews the operation, effectiveness and implications of national security and counter-terrorism laws; and considers whether the laws contain appropriate protections for individual rights, remain proportionate to terrorism or national security threats, and remain necessary.”

The INSLM claimed:

In this report I reject the notion that there is a binary choice that must be made between the effectiveness of agencies’ surveillance powers in the digital age on the one hand and the security of the internet on the other. Rather, I conclude that what is necessary is a law which allows agencies to meet technological challenges, such as those caused by encryption, but in a proportionate way and with proper rights protection. Essentially this can be done by updating traditional safeguards to meet those same technological challenges – notably, those who are trusted to authorise intrusive search and surveillance powers must be able to understand the technological context in which those powers operate, and their consequences. If, but only if, the key recommendations I set out in this report in this regard are adopted, TOLA will be such a law.

The European Union may have a different view, however. In a response to a Member of the European Parliament’s letter, the European Data Protection Board (EDPB) articulated its view that any nation that implements an “encryption ban” would endanger its compliance with the General Data Protection Regulation (GDPR) and possibly result in companies domiciled in those countries not being able to transfer and process the personal data of EU citizens. However, as always, it bears noting the EDPB’s view may not carry the day with the European Commission, Parliament, and courts.

The EDPB stated:

Any ban on encryption or provisions weakening encryption would undermine the GDPR obligations on the concerned controllers and processors for an effective implementation of both data protection principles and the appropriate technical and organisational measures. Similar considerations apply to transfers to controllers or processors in any third countries adopting such bans or provisions. Security measures are therefore specifically mentioned among the elements the European Commission must take into account when assessing the adequacy of the level of protection in a third country. In the absence of such a decision, transfers are subject to appropriate safeguards or may be based on derogations; in any case the security of the personal data has to be ensured at all times.

The EDPB opined “that any encryption ban would seriously undermine compliance with the GDPR.” The EDPB continued, “[m]ore specifically, whatever the instrument used, it would represent a major obstacle in recognising a level of protection essentially equivalent to that ensured by the applicable data protection law in the EU, and would seriously question the ability of the concerned controllers and processors to comply with the security obligation of the regulation.”


Image by OpenClipart-Vectors from Pixabay

Further Reading, Other Developments, and Coming Events (4 September)

Here is today’s Further Reading, Other Developments, and Coming Events.

Coming Events

  • The United States-China Economic and Security Review Commission will hold a hearing on 9 September on “U.S.-China Relations in 2020: Enduring Problems and Emerging Challenges” to “evaluate key developments in China’s economy, military capabilities, and foreign relations, during 2020.”
  • On 10 September, the General Services Administration (GSA) will have a webinar to discuss implementation of Section 889 of the “John S. McCain National Defense Authorization Act (NDAA) for FY 2019” (P.L. 115-232) that bars the federal government and its contractors from buying the equipment and services from Huawei, ZTE, and other companies from the People’s Republic of China.
  • The Federal Communications Commission (FCC) will hold a forum on 5G Open Radio Access Networks on 14 September. The FCC asserted
    • Chairman [Ajit] Pai will host experts at the forefront of the development and deployment of open, interoperable, standards-based, virtualized radio access networks to discuss this innovative new approach to 5G network architecture. Open Radio Access Networks offer an alternative to traditional cellular network architecture and could enable a diversity in suppliers, better network security, and lower costs.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.”
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The United States (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) and the Election Assistance Commission (EAC) “released the Election Risk Profile Tool, a user-friendly assessment tool to equip election officials and federal agencies in prioritizing and managing cybersecurity risks to the Election Infrastructure Subsector.” The agencies stated “[t]he new tool is designed to help state and local election officials understand the range of risks they face and how to prioritize mitigation efforts…[and] also addresses areas of greatest risk, ensures technical cybersecurity assessments and services are meeting critical needs, and provides a sound analytic foundation for managing election security risk with partners at the federal, state and local level.”
    • CISA and the EAC explained “[t]he Election Risk Profile Tool:
      • Is a user-friendly assessment tool for state and local election officials to develop a high-level risk profile across a jurisdiction’s specific infrastructure components;
      • Provides election officials a method to gain insights into their cybersecurity risk and prioritize mitigations;
      • Accepts inputs of a jurisdiction’s specific election infrastructure configuration; and
      • Outputs a tailored risk profile for jurisdictions, which identifies specific areas of highest risk and recommends associated mitigation measures that the jurisdiction could implement to address the risk areas.
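The tool’s basic flow — a jurisdiction’s infrastructure components in, a prioritized risk profile with recommended mitigations out — can be sketched in a few lines of Python. This is purely illustrative: the component names, baseline weights, and mitigation text below are hypothetical placeholders, not values taken from the actual CISA/EAC tool.

```python
# Illustrative sketch only: component names, baseline risk weights, and
# mitigations are hypothetical, not drawn from the CISA/EAC tool itself.
BASELINE_RISK = {
    "voter_registration_db": 0.9,      # internet-facing, high-value target
    "election_night_reporting": 0.8,   # public-facing results site
    "epollbooks": 0.6,                 # networked devices at polling places
    "vote_capture_devices": 0.5,       # typically air-gapped
}

MITIGATIONS = {
    "voter_registration_db": "enable MFA and maintain offline backups",
    "election_night_reporting": "serve results from a hardened static site",
    "epollbooks": "keep devices off public networks",
    "vote_capture_devices": "use paper records and post-election audits",
}

def risk_profile(components):
    """Rank a jurisdiction's components by baseline risk, highest first."""
    ranked = sorted(
        (c for c in components if c in BASELINE_RISK),
        key=lambda c: BASELINE_RISK[c],
        reverse=True,
    )
    return [(c, BASELINE_RISK[c], MITIGATIONS[c]) for c in ranked]

# A jurisdiction that runs e-pollbooks and its own registration database:
profile = risk_profile(["epollbooks", "voter_registration_db"])
```

The point of the shape, as in the real tool, is that the output is tailored to the jurisdiction’s inputs: components the jurisdiction does not operate simply drop out of the profile.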
  • The cybersecurity agencies of the Five Eyes nations have released a Joint Cybersecurity Advisory: Technical Approaches to Uncovering and Remediating Malicious Activity that “highlights technical approaches to uncovering malicious activity and includes mitigation steps according to best practices.” The agencies asserted “[t]he purpose of this report is to enhance incident response among partners and network administrators along with serving as a playbook for incident investigation.”
    • The Australian Cyber Security Centre, Canada’s Communications Security Establishment, the United States’ Cybersecurity and Infrastructure Security Agency, the United Kingdom’s National Cyber Security Centre, and New Zealand’s National Cyber Security Centre and Computer Emergency Response Team summarized the key takeaways from the Joint Advisory:
      • When addressing potential incidents and applying best practice incident response procedures:
      • First, collect and remove for further analysis:
        • Relevant artifacts,
        • Logs, and
        • Data.
      • Next, implement mitigation steps that avoid tipping off the adversary that their presence in the network has been discovered.
      • Finally, consider soliciting incident response support from a third-party IT security organization to:
        • Provide subject matter expertise and technical support to the incident response,
        • Ensure that the actor is eradicated from the network, and
        • Avoid residual issues that could result in follow-up compromises once the incident is closed.
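The advisory’s ordering — preserve artifacts, logs, and data first, then mitigate — can be illustrated with a small Python sketch that copies artifacts into an evidence directory and records SHA-256 hashes before any remediation touches the originals. The function and path names here are hypothetical examples, not anything prescribed by the Joint Advisory.

```python
# Minimal sketch of the advisory's first step: preserve relevant artifacts
# and logs (with integrity hashes) *before* mitigation steps that might
# alter or destroy them. Names and paths are hypothetical.
import hashlib
import shutil
from pathlib import Path

def preserve_artifacts(paths, evidence_dir):
    """Copy each artifact into evidence_dir and return a SHA-256 manifest."""
    evidence_dir = Path(evidence_dir)
    evidence_dir.mkdir(parents=True, exist_ok=True)
    manifest = {}
    for p in map(Path, paths):
        if not p.is_file():
            continue  # record only what exists; don't abort mid-collection
        dest = evidence_dir / p.name
        shutil.copy2(p, dest)  # copy2 preserves timestamps for the record
        manifest[p.name] = hashlib.sha256(dest.read_bytes()).hexdigest()
    return manifest  # the hash manifest supports later integrity checks
```

Only after the copies and hashes exist would mitigation (credential resets, host isolation) begin — consistent with the advisory’s warning against tipping off the adversary while evidence is still being gathered.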
  • The United States’ (U.S.) Department of Justice (DOJ) and Federal Trade Commission (FTC) signed an Antitrust Cooperation Framework with their counterpart agencies from Australia, Canada, New Zealand, and the United Kingdom. The Multilateral Mutual Assistance and Cooperation Framework for Competition Authorities (Framework) “aims to strengthen cooperation between the signatories, and provides the basis for a series of bilateral agreements among them focused on investigative assistance, including sharing confidential information and cross-border evidence gathering.” Given that a number of large technology companies are under investigation in the U.S., the European Union (EU) and elsewhere, signaling a shift in how technology multinationals are being viewed, this agreement may enable cross-border efforts to collectively address alleged abuses. However, the Framework “is not intended to be legally binding and does not give rise to legal rights or obligations under domestic or international law.” The Framework provides:
    • Recognising that the Participants can benefit by sharing their experience in developing, applying, and enforcing Competition Laws and competition policies, the Participants intend to cooperate and provide assistance, including by:
      • a) exchanging information on the development of competition issues, policies and laws;
      • b) exchanging experience on competition advocacy and outreach, including to consumers, industry, and government;
      • c) developing agency capacity and effectiveness by providing advice or training in areas of mutual interest, including through the exchange of officials and through experience-sharing events;
      • d) sharing best practices by exchanging information and experiences on matters of mutual interest, including enforcement methods and priorities; and
      • e) collaborating on projects of mutual interest, including via establishing working groups to consider specific issues.
  • Dynasplint Systems alerted the United States Department of Health and Human Services (HHS) that it suffered a breach affecting more than 100,000 people earlier this year. HHS’ Office of Civil Rights (OCR) is investigating possible violations of Health Insurance Portability and Accountability Act regulations regarding the safeguarding of patients’ health information. If Dynasplint failed to properly secure patient information or its systems, OCR could levy a multimillion-dollar fine for a breach of this size. For example, in late July, OCR fined a company over $1 million for the theft of an unencrypted laptop that exposed the personal information of a little more than 20,000 people.
    • Dynasplint, a Maryland manufacturer of range of motion splints, explained:
      • On June 4, 2020, the investigation determined that certain information was accessed without authorization during the incident.
      • The information may have included names, addresses, dates of birth, Social Security numbers, and medical information.
      • Dynasplint Systems reported this matter to the FBI and will provide whatever cooperation is necessary to hold perpetrators accountable.
  • The California Legislature has sent two bills to Governor Gavin Newsom (D) that would change how technology is regulated in the state, including one that would alter the “California Consumer Privacy Act” (AB 375) (CCPA) if the “California Privacy Rights Act” (CPRA) (Ballot Initiative 24) is not enacted by voters in the November election. The two bills are:
    • AB 1138 would amend the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtains explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.”
    •  AB 1281 would extend the carveout for employers to comply with the CCPA from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
  • Senators Shelley Moore Capito (R-WV), Amy Klobuchar (D-MN), and Jerry Moran (R-KS) have written “a letter to Federal Trade Commission (FTC) Chairman Joseph Simons urging the FTC to take action to address the troubling data collection and sharing practices of the mobile application (app) Premom” and “to request information on the steps that the FTC plans to take to address this issue.” They asserted:
    • A recent investigation from the International Digital Accountability Council (IDAC) indicated that Premom may have engaged in deceptive consumer data collection and processing, and that there may be material differences between Premom’s stated privacy policies and its actual data-sharing practices. Most troubling, the investigation found that Premom shared its users’ data without their consent.
    • Moore Capito, Klobuchar, and Moran stated “[i]n light of these concerning reports, and given the critical role that the FTC plays in enforcing federal laws that protect consumer privacy and data under Section 5 of the Federal Trade Commission Act and other sector specific laws, we respectfully ask that you respond to the following questions:
      • 1. Does the FTC treat persistent identifiers, such as the non-resettable device hardware identifiers discussed in the IDAC report, as personally identifiable information in relation to its general consumer data security and privacy enforcement authorities under Section 5 of the FTC Act?  
      • 2. Is the FTC currently investigating or does it plan to investigate Premom’s consumer data collection, transmission, and processing conduct described in the IDAC report to determine if the company has engaged in deceptive practices?
      • 3. Does the FTC plan to take any steps to educate users of the Premom app that the app may still be sharing their personal data without their permission if they have not updated the app? If not, does the FTC plan to require Premom to conduct such outreach?
      • 4. Please describe any unique or practically uncommon uses of encryption by the involved third-party companies receiving information from Premom that could be functionally interpreted to obfuscate oversight of the involved data transmissions.
      • 5. How can the FTC use its Section 5 authority to ensure that mobile apps are not deceiving consumers about their data collection and sharing practices and to preempt future potentially deceptive practices like those Premom may have engaged in?

Further Reading

  • “Justice Dept. Plans to File Antitrust Charges Against Google in Coming Weeks” By Katie Benner and Cecilia Kang – The New York Times; “The Justice Department could file a lawsuit against Google this month, overriding skepticism from its own top lawyers” By Tony Romm – The Washington Post; “There’s a partisan schism over the timing of a Google antitrust lawsuit” By Timothy B. Lee – Ars Technica. The New York Times explains in its deeply sourced article that United States Department of Justice (DOJ) attorneys want more time to build a better case against Google, but Attorney General William Barr is pressing for the filing of a suit as early as the end of this month in order for the Trump Administration to show voters it is taking on big tech. Additionally, a case against a tech company would help shore up the President’s right flank as he and other prominent conservatives continue to insist, in the absence of evidence, that technology companies are biased against the right. The team of DOJ attorneys has shrunk from 40 to about 20 as numerous lawyers asked off the case once it was clear what the Attorney General wanted. These articles also throw light on the split between Republican and Democratic state attorneys general in the case they have been working on, with the former accusing the latter of stalling for time in the hopes a Biden DOJ will be harsher on the company and the latter accusing the former of trying to file a narrow case while Donald Trump is still President that would impair efforts to address the range of Google’s alleged antitrust abuses.
  • “Facebook Moves to Limit Election Chaos in November” By Mike Isaac – The New York Times. The social network giant unveiled measures to fight misinformation the week before the United States election and afterwards should people try to make factually inaccurate claims about the results. Notably, political advertisements will be banned a week before the 3 November election, but this seems like pretty weak tea considering it will be business as usual until late October. Even though the company frames these moves as “additional steps we’re taking to help secure the integrity of the U.S. elections by encouraging voting, connecting people to authoritative information, and reducing the risks of post-election confusion,” the effect of misinformation, disinformation, and lies that proliferate on Facebook will have likely already taken root by late October. It is possible the company still wants the advertising revenue it would forgo if it immediately banned political advertising. Another proposed change is to provide accurate information about voting generally and COVID-19 and voting. In fact, the platform corrected a post of President Donald Trump’s that expressed doubts about mail-in voting.
  • “Washington firm ran fake Facebook accounts in Venezuela, Bolivia and Mexico, report finds” By Craig Timberg and Elizabeth Dwoskin – The Washington Post. In tandem with taking down fake content posted by the Internet Research Agency, Facebook also removed accounts traced back to a Washington, D.C. public relations firm, CLS Strategies, that was running multiple accounts to support the government in Bolivia and the opposition party in Venezuela, both of which are right wing. Using information provided by Facebook, Stanford University’s Internet Observatory released a report stating that “Facebook removed a network of 55 Facebook accounts, 42 Pages and 36 Instagram accounts attributed to the US-based strategic communications firm CLS Strategies for engaging in coordinated inauthentic behavior (CIB).” Stanford asserted these key takeaways:
    • 11 Facebook pages related to Bolivia mainly supported Bolivia’s Interim President Jeanine Áñez and disparaged Bolivia’s former president Evo Morales. All had similar creation dates and manager location settings.
    • Venezuela-focused assets supported and promoted Venezuelan opposition leaders but changed in tone in 2020, reflecting factional divides in the opposition and a turn away from Juan Guaidó.
    • In addition to fake accounts, removed Facebook accounts include six profiles that match the names and photos of CLS Strategies employees listed publicly on their website and appear to be their real accounts.
    • CLS Strategies has a disclosed contract with the Bolivian government to provide strategic communications counsel for Bolivia’s 2020 elections and to strengthen democracy and human rights in Bolivia.
    • Coordinated inauthentic behavior reports from Facebook and Twitter have increasingly included assets linked to marketing and PR firms originating and acting around the world. The firms’ actions violate the platforms’ terms by operating internationally and failing to identify their origins and motivations to users.
    • In its release on the issue, Facebook explained:
      • In August, we removed three networks of accounts, Pages and Groups. Two of them — from Russia and the US — targeted people outside of their country, and another from Pakistan focused on both domestic audiences in Pakistan and also in India. We have shared information about our findings with law enforcement, policymakers and industry partners.
  • “Belarusian Officials Shut Down Internet With Technology Made by U.S. Firm” By Ryan Gallagher – Bloomberg. A United States firm, Sandvine, sold deep packet inspection technology to the government in Belarus through a Russian intermediary. The technology was ostensibly to be used by the government to fend off dangers to the nation’s networks but was instead deployed to shut down numerous social media and news sites on the internet the day of the election. However, Belarusian activists quickly determined how to use workarounds, launching the current unrest that threatens to topple the regime. The same company’s technology has been used elsewhere in the world to cut off access to the internet as detailed by the University of Toronto’s Citizen Lab in 2018.
  • “Canada has effectively moved to block China’s Huawei from 5G, but can’t say so” – Reuters. In a move reminiscent of how the People’s Republic of China (PRC) tanked Qualcomm’s proposed purchase of NXP Semiconductors in 2018, Canada has effectively barred Huawei from its 5G networks by not deciding, which eventually sent a signal to its telecommunications companies to use Ericsson and Nokia instead. This way, there is no public announcement or policy statement the PRC can object to, and the country toes the line with its other Five Eyes partners that have banned Huawei in varying degrees. Additionally, given that two Canadian nationals are being held because Huawei Chief Financial Officer Meng Wanzhou is being detained in Canada awaiting extradition to the United States to face criminal charges, Ottawa needs to manage its relations with the PRC gingerly.


Image by Simon Steinberger from Pixabay

Further Reading, Other Developments, and Coming Events (31 August)

Here is today’s Further Reading, Other Developments, and Coming Events.

Coming Events

  • On 10 September, the General Services Administration (GSA) will have a webinar to discuss implementation of Section 889 of the “John S. McCain National Defense Authorization Act (NDAA) for FY 2019” (P.L. 115-232) that bars the federal government and its contractors from buying equipment and services from Huawei, ZTE, and other companies from the People’s Republic of China.
  • The Federal Communications Commission (FCC) will hold a forum on 5G Open Radio Access Networks on 14 September. The FCC asserted:
    • Chairman [Ajit] Pai will host experts at the forefront of the development and deployment of open, interoperable, standards-based, virtualized radio access networks to discuss this innovative new approach to 5G network architecture. Open Radio Access Networks offer an alternative to traditional cellular network architecture and could enable a diversity in suppliers, better network security, and lower costs.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.”
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • A group of Democratic Senators wrote the Federal Communications Commission (FCC) “to express our profound frustration that the [agency] has failed to take forceful action to keep households connected during the COVID-19 pandemic.” They asserted that “[a]s millions of American families face unprecedented financial pressures and educational challenges, we urge the FCC to reverse proposed changes to the Lifeline program, take immediate steps to open its assistance to more households, and ensure that its services meet the pressing needs of families during this crisis.”
    • They claimed
      • Since the first weeks of [FCC Chair Ajit Pai’s tenure], the FCC has sought to block new broadband providers’ participation in the Lifeline program, curtail benefits in tribal areas, exclude existing carriers, roll back reforms for registering new carriers, make it harder for new applicants to subscribe, prevent carriers from offering free in-person distribution of phones, reduce incentives to enroll subscribers, and add more barriers for participating carriers and subscribers. These proposals have been so extreme that they would lead to cutting off carriers serving almost 70% of Lifeline subscribers.
    • They urged Pai “to immediately take the following steps:
      • 1.) Take emergency measures to provide additional financial support to Lifeline providers during the pandemic to temporarily support unlimited mobile data and voice minutes, and notify Congress if additional funding is needed for such changes.
      • 2.) Extend all current FCC waivers on Lifeline usage and subscriber documentation requirements for at least a full year, until August 2021 or when we have recovered from the pandemic.
      • 3.) Close the currently outstanding Lifeline proposed rulemakings that would create new obstacles for eligible households and add unwarranted burden on carriers.
      • 4.) Pause the scheduled changes to the Lifeline program’s minimum service standards until the Commission studies such impacts on the market in its upcoming 2021 State of Lifeline Marketplace Report, to avoid disruptions to customers’ services.
      • 5.) Restore the monthly subsidy to $9.25 for plans offering voice services for subscribers who value voice over data-heavy plans and pause the planned decrease in contributions for voice support.
      • 6.) Work with states to increase the automated verification of state databases with the National Verifier program by the end of this year.
  • New Zealand’s National Cyber Security Centre (NCSC) released a “General Security Advisory: ongoing campaign of Denial of Service (DoS) attacks affecting New Zealand entities” after four days of DoS attacks against New Zealand’s stock market coming from somewhere offshore. The NCSC recommended best practices the Australian Cyber Security Centre (ACSC) had published. The NCSC stated
    • [It] is aware of an ongoing campaign of DoS attacks affecting New Zealand entities.
    • The campaign has included the targeting of a number of global entities, predominantly in the financial sector. 
    • The NCSC strongly encourages all organisations in this sector to consider the risk to their organisation of DoS and ensure appropriate mitigations are in place.
  • Senator Mark Warner (D-VA) sent letters to Dell, Apple, HP, Samsung, Google, Microsoft, Acer America, and ASUS USA asking the “companies to do what they can to help bridge the “homework gap” – the lack of reliable computer or internet access that prevents school-aged children from being able to do school work from home.” Warner’s letters come in response to the nationwide shortage of laptops and tablets facing families as many children will be starting school online this fall. Warner stated:
    • There are a range of actions your company can take, including educational product discounts, the provision of complimentary or donated computers (including for home lending programs many educational institutions operate), and the provision of refurbished or returned products in good working condition for school districts and higher education institutions to distribute to educators and students. While I understand the strains placed on the global supply chain, your prioritization of these matters would greatly assist struggling families at this challenging time.
  • The United States Department of Defense (DOD) updated its list of “Communist Chinese military companies” operating directly or indirectly in the United States in accordance with the statutory requirement of Section 1237 of the National Defense Authorization Act for Fiscal Year 1999, as amended.” The eleven companies from the People’s Republic of China (PRC) were added to the existing list sent “to Congress in June 2020,” some 20 years after Congress tasked the DOD with this responsibility. This action is most likely a response to a letter sent last year urging the DOD to fulfill this responsibility. Notably, any company on the list could be sanctioned by the President under the same authorities recently used against TikTok and WeChat.
    • In a September 2019 letter to Secretary of Defense Mark Esper, Senate Minority Leader Chuck Schumer (D-NY) and Senator Tom Cotton (R-AR) were joined by Representatives Ruben Gallego (D-AZ) and Mike Gallagher (R-WI) in asking whether the DOD has been updating a list of “those persons operating directly or indirectly in the United States or any of its territories and possessions that are Communist Chinese military companies” as directed by Section 1237 of the FY 1999 NDAA. They noted that China’s Communist Party has adopted a Military-Civilian Fusion strategy “to achieve its national objectives,” including the acquisition of U.S. technology through any means such as espionage, forced technology transfers, and the purchase of or investment in U.S. technology forms. Schumer, Cotton, Gallego, and Gallagher urged the Trump Administration “reexamine all statutory authorities at its disposal to confront the CCP’s strategy of Military-Civilian Fusion, including powers that have laid dormant for years.”
    • Unstated in this letter, however, is that the first part of Section 1237 grants the President authority to “exercise International Emergency Economic Powers Act (IEEPA) authorities (other than authorities relating to importation) without regard to section 202 of the IEEPA (50 U.S.C. 1701) in the case of any commercial activity in the United States by a person that is on the list.” IEEPA grants the President sweeping powers to prohibit transactions and block property and property interests for nations and other groups subject to an IEEPA national emergency declaration. Consequently, those companies identified by the DOD on a list per Section 1237 could be blocked and prohibited from doing business with U.S. entities and others and those that do business with such Chinese companies could be subject to enforcement actions by the U.S. government (e.g. the U.S.’s actions against ZTE for doing business with Iran in violation of an IEEPA national emergency).
    • The statute defines a “Communist Chinese military company” as “any person identified in the Defense Intelligence Agency publication numbered VP-1920-271-90, dated September 1990, or PC-1921-57-95, dated October 1995, and any update of those publications for the purposes of this section; and any other person that is owned or controlled by the People’s Liberation Army; and is engaged in providing commercial services, manufacturing, producing, or exporting.” Considering that the terms “owned” and “controlled” are not spelled out in this section, the executive branch may have very wide latitude in deeming a non-Chinese company as owned or controlled and therefore subject to the President’s use of IEEPA powers. Moreover, since the President already has the authority to declare an emergency and then use IEEPA powers, this language would seem to allow the President to bypass any such declaration and immediately use such powers, except those regarding importation, against any Chinese entities identified on this list by the Pentagon.
  • District of Columbia Attorney General Karl Racine (D) filed suit against Instacart alleging the company “violated the District’s Consumer Protection Procedures Act and tax law by: 
    • Charging District consumers millions of dollars in deceptive service fees: Prior to 2016, Instacart’s checkout screen contained an option to tip workers, set as a default 10 percent of the consumer’s subtotal for groceries that users could adjust. In 2016, Instacart swapped the tip option for a service fee, which was also set to a default 10 percent and could be adjusted, and displayed it where the tip option used to be. Consumers paid the service fee believing they were tipping workers. In reality, the service fee was a second charge—on top of a delivery fee—imposed by Instacart to cover delivery costs and operating expenses. Additionally, Instacart failed to clearly disclose that service fees were optional and that consumers could choose not to pay them.
    • Misleading consumers about how service fees contributed to worker pay: When Instacart announced the new service fees, it told consumers that “100% of the variable service amount is used to pay all shoppers more consistently for each and every delivery, not just the last shopper to touch the order.” Instacart also stated that the company collected a service fee because “multiple shoppers may have been involved in a single order” and the “service fee is used to pay this entire set of shoppers.” In fact, the shoppers who fulfilled a consumer’s order were paid the same whether or not a consumer paid the service fee.
    • Failing to pay hundreds of thousands of dollars in District sales tax: Under District law, Instacart is responsible for collecting sales tax on the delivery services it provides. The entire time Instacart has operated in the District, it has failed to collect sales tax on the service fees and delivery fees it charged users.
  • Two large United States (U.S.) technology companies are facing class actions in the Netherlands and the United Kingdom (UK) arguing that the companies’ use of third-party cookies to sell real-time bidding advertising violated the European Union’s General Data Protection Regulation (GDPR) by not obtaining people’s consent before their personal information is collected and processed. The suit against Oracle and Salesforce, brought by The Privacy Collective, a European non-profit, could result in damages of more than €10 billion.
  • As part of its lawsuit against Google “for deceptive and unfair practices used to obtain users’ location data, which Google then exploits for its lucrative advertising business,” the Office of the Attorney General of Arizona released emails obtained during the course of discovery that may demonstrate the company knew its interface and operating system were frustrating users’ attempts to truly turn off location data.
  • The eHealth Initiative & Foundation (eHI) and the Center for Democracy and Technology (CDT) released A Draft Consumer Privacy Framework for Health Data, “a collaborative effort addressing gaps in legal protections for consumer health data outside of the Health Insurance Portability and Accountability Act’s (HIPAA) coverage.” Feedback is welcome until 25 September.
    • The organizations asserted
      • The standards’ emphasis is on transparency, accountability, and the limitation on health data collection, disclosure, and use. Importantly, the standards:
        • (1) move beyond outdated notice and consent models,
        • (2) cover all health information, and
        • (3) cover all entities that use, disclose or collect consumer health information, regardless of the size or business model of the covered entity.
      • This proposal is not designed to be a replacement for necessary comprehensive data privacy legislation. Given that Congressional action to pass such a law is likely some time away, this effort is designed to build consensus on best practices and to do what we can now, in the interim, to shore up protections for non-HIPAA covered health data.

Further Reading

  • “Big Oil Faded. Will Big Tech?” By Shira Ovide – The New York Times. This piece suggests that the so-called Big Tech companies may someday wane as many energy companies like Exxon are currently doing. It makes the interesting point that a company or field can seem dominant right up until it is not, and its preeminence can then disappear rapidly, frequently for reasons that do not seem apparent or related. Ironically, Exxon essentially got pushed out of the Dow Jones Industrial Average because Apple had to split its stock because of its surging valuation. Another tech company, Salesforce, will replace Exxon.
  • “Apple wants to stop advertisers from following you around the web. Facebook has other ideas.” By Peter Kafka – Recode. Apple will extend a feature from Safari to its next iOS for iPhones where users will soon be asked whether they want to allow apps to track them across the web and other apps in order to deliver them targeted, personalized advertising. To no great surprise, it is being assumed many users will say no, diminishing a prime mode by which companies reap data and show people advertisements that are intimately tied to what they read and watch online. Consequently, advertisers will be less willing to spend dollars on more general ads, and income will be depressed for the two major players in this market: Facebook and Google. Facebook has already declared it will not use Apple’s device identifier unique to every iPhone or Apple Watch, meaning users downloading the Facebook app will not get the choice of whether to say no to the company’s tracking. It is not clear how well this workaround will mitigate the projected loss in ad revenue for Facebook, but it does represent the latest chapter in the fight between the two companies. Facebook has lined up with Epic Games, maker of Fortnite, in its suit against Apple regarding App Store policies. It is very likely Apple sees this change to iOS 14 as a means of burnishing its reputation as being more concerned about its users’ privacy than competitors in Silicon Valley, which it can afford to be considering it does not earn most of its revenue the same way Facebook does, and of currying favor in Washington and Brussels where it is facing antitrust scrutiny.
  • “Want a Free Amazon Halo Wearable? Just Hand Over Your Data to This Major Insurance Company” By Emily Mullin – OneZero. Amazon has teamed with insurer John Hancock to offer a wearable health and fitness tracker that will collect personal data on wearers, which is designed to nudge them into better behaviors and better health. This is not the first such pairing, and it raises a host of policy issues, for healthier people would be poised to reap benefits not available to less healthy people. Some insurers are offering modest amounts of cash or gift cards for exercising regularly or other benefits that would not go to less healthy people. These sorts of programs are similar to the employee health and wellness programs enshrined in the “Patient Protection and Affordable Care Act,” which studies have suggested do not work very well. Additionally, companies like Amazon and John Hancock will be collecting and processing all sorts of very sensitive personal information, making them likely targets of hacking operations. Also, there are privacy implications, for these wearable devices will likely allow companies to know the most intimate details of wearers’ lives.
  • “TikTok Deal Is Complicated by New Rules From China Over Tech Exports” By Paul Mozur, Raymond Zhong and David McCabe – The New York Times; “TikTok Is Said to Wrestle With Two Competing Offers” By Mike Isaac – The New York Times; “China’s new tech export restrictions further cloud US TikTok sale and raise the risk of protectionism” By Coco Feng, Tracy Qu and Amanda Lee – South China Morning Post; “China puts drones and laser tech on restricted export list after US tightens rules” By Sidney Leng – South China Morning Post; “TikTok Chief Executive Kevin Mayer Resigns” By Mike Isaac – The New York Times. In a surprise announcement from two agencies late last week, the People’s Republic of China changed its export control rules for the first time since 2008, likely to gain leverage over TikTok’s sale to a United States (U.S.) entity. Ostensibly, the changes are “to regulate technology exports, promote scientific and technological progress and economic and technological cooperation, and maintain national economic security,” but the inclusion of “personalised information recommendation service technology based on data analysis” and “artificial intelligence interactive interfaces” likely points to ByteDance’s app, TikTok. In fact, a researcher with the PRC Ministry of Commerce was quoted as asserting “[t]he time to publish the new update of the export control list has been expedited due to the TikTok sale.” Moreover, the PRC’s timeline for deciding on whether an export license is needed is the same as the Trump Administration’s second executive order directing ByteDance to divest TikTok. Incidentally, these changes are probably in response to the tightening of U.S. export controls against the PRC, which could set off retaliatory moves. In any event, Beijing will now have to approve any sale of TikTok operations in the U.S.
Also, Walmart has apparently joined forces with Microsoft in preparing a bid on TikTok in competition with Oracle, which threw its proverbial hat into the ring last week. And, new TikTok CEO Kevin Mayer stepped down in a surprise move citing ByteDance’s changed circumstances.
  • “Trump aides interviewing replacement for embattled FTC chair” By Leah Nylen, Betsy Woodruff Swan, John Hendel and Daniel Lippman – Politico. The Trump Administration may be trying to force out Federal Trade Commission (FTC) Chair Joe Simons, or it may merely be interviewing replacements in case he steps down next year, should President Donald Trump still be in the White House. Given the reports that Simons has resisted pressure from the White House to comply with the executive order on Section 230 by investigating social media platforms, Simons has likely not won any new fans at 1600 Pennsylvania Avenue. Having said that, removing an FTC Commissioner is much harder than other top positions in the U.S. government, and the FTC is designed to be insulated from political pressure. However, Commissioners are politicians, too, and carefully gauge the direction the wind is blowing. That being said, Simons has also sent out signals he will step down next year and return to private practice, so the interviewing of possible successors may be entirely normal in an Administration that usually does not operate normally.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Gordon Johnson from Pixabay

Further Reading, Other Developments, and Coming Events (13 August)

Here are Further Reading, Other Developments, and Coming Events:

Coming Events

  • On 18 August, the National Institute of Standards and Technology (NIST) will host the “Bias in AI Workshop, a virtual event to develop a shared understanding of bias in AI, what it is, and how to measure it.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.

Other Developments

  • Senate Intelligence Committee Acting Chair Marco Rubio (R-FL) and Vice Chairman Mark Warner (D-VA) released a statement indicating the committee had voted to adopt the fifth and final volume of its investigation of the Russian Federation’s interference in the 2016 election. The committee had submitted the report to the Intelligence Community for vetting and has received the report with edits and redactions. The report could be released sometime over the next few weeks. Rubio and Warner stated “the Senate Intelligence Committee voted to adopt the classified version of the final volume of the Committee’s bipartisan Russia investigation. In the coming days, the Committee will work to incorporate any additional views, as well as work with the Intelligence Community to formalize a properly redacted, declassified, publicly releasable version of the Volume 5 report.” The Senate Intelligence Committee has released four previous reports:
  • The National Institute of Standards and Technology (NIST) is accepting comments until 11 September on draft Special Publication 800-53B, “Control Baselines for Information Systems and Organizations,” a guidance document that will serve a key role in the United States government’s efforts to secure and protect the networks and systems it operates and those run by federal contractors. NIST explained:
    • This publication establishes security and privacy control baselines for federal information systems and organizations and provides tailoring guidance for those baselines. The use of the security control baselines is mandatory, in accordance with OMB Circular A-130 [OMB A-130] and the provisions of the Federal Information Security Modernization Act [FISMA], which requires the implementation of a set of minimum controls to protect federal information and information systems. Whereas use of the privacy control baseline is not mandated by law or [OMB A-130], SP 800-53B, along with other supporting NIST publications, is designed to help organizations identify the security and privacy controls needed to manage risk and satisfy the security and privacy requirements in FISMA, the Privacy Act of 1974 [PRIVACT], selected OMB policies (e.g., [OMB A-130]), and designated Federal Information Processing Standards (FIPS), among others.
  • The United States Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) released an “Election Vulnerability Reporting Guide” to provide “election administrators with a step-by-step guide, list of resources, and a template for establishing a successful vulnerability disclosure program to address possible vulnerabilities in their state and local election systems…[and] [t]he six steps include:
    • Step 1: Identify Systems Where You Would Accept Security Testing, and those Off-Limits
    • Step 2: Draft an Easy-to-Read Vulnerability Disclosure Policy (See Appendix III)
    • Step 3: Establish a Way to Receive Reports/Conduct Follow-On Communication
    • Step 4: Assign Someone to Thank and Communicate with Researchers
    • Step 5: Assign Someone to Vet and Fix the Vulnerabilities
    • Step 6: Consider Sharing Information with Other Affected Parties
  • The United Kingdom’s Information Commissioner’s Office (ICO) has issued “Guidance on AI and data protection” that “clarifies how you can assess the risks to rights and freedoms that AI can pose from a data protection perspective; and the appropriate measures you can implement to mitigate them.” The ICO explained “[w]hile data protection and ‘AI ethics’ overlap, this guidance does not provide generic ethical or design principles for your use of AI.” The ICO stated “[i]t corresponds to data protection principles, and is structured as follows:
    • part one addresses accountability and governance in AI, including data protection impact assessments (DPIAs);
    • part two covers fair, lawful and transparent processing, including lawful bases, assessing and improving AI system performance, and mitigating potential discrimination;
    • part three addresses data minimisation and security; and
    • part four covers compliance with individual rights, including rights related to automated decision-making.
  • 20 state attorneys general wrote Facebook Chief Executive Officer Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg “to request that you take additional steps to prevent Facebook from being used to spread disinformation and hate and to facilitate discrimination.” They also asked “that you take more steps to provide redress for users who fall victim to intimidation and harassment, including violence and digital abuse.” The attorneys general said that “[b]ased on our collective experience, we believe that Facebook should take additional actions including the following steps—many of which are highlighted in Facebook’s recent Civil Rights Audit—to strengthen its commitment to civil rights and fighting disinformation and discrimination:
    • Aggressively enforce Facebook policies against hate speech and organized hate organizations: Although Facebook has developed policies against hate speech and organizations that peddle it, we remain concerned that Facebook’s policies on Dangerous Individuals and Organizations, including but not limited to its policies on white nationalist and white supremacist content, are not enforced quickly and comprehensively enough. Content that violates Facebook’s own policies too often escapes removal just because it comes as coded language, rather than specific magic words. And even where Facebook takes steps to address a particular violation, it often fails to proactively address the follow-on actions by replacement or splinter groups that quickly emerge.
    • Allow public, third-party audits of hate content and enforcement: To gauge the ongoing progress of Facebook’s enforcement efforts, independent experts should be permitted access to the data necessary to conduct regular, transparent third-party audits of hate and hate-related misinformation on the platform, including any information made available to the Global Oversight Board. As part of this effort, Facebook should capture data on the prevalence of different forms of hate content on the platform, whether or not covered by Facebook’s own community standards, thus allowing the public to determine whether enforcement of anti-hate policies differs based on the type of hate content at issue.
    • Commit to an ongoing, independent analysis of Facebook’s content population scheme and the prompt development of best practices guidance: By funneling users toward particular types of content, Facebook’s content population scheme, including its algorithms, can push users into extremist online communities that feature divisive and inflammatory messages, often directed at particular groups. Although Facebook has conducted research and considered programs to reduce this risk, there is still no mandatory guidance for coders and other teams involved in content population. Facebook should commit to an ongoing, independent analysis of its content population scheme, including its algorithms, and also continuously implement mandatory protocols as best practices are identified to curb bias and prevent recommendations of hate content and groups.
    • Expand policies limiting inflammatory advertisements that vilify minority groups: Although Facebook currently prohibits ads that claim that certain people, because of their membership in a protected group, pose a threat to the physical safety of communities or the nation, its policies still allow attacks that characterize such groups as threats to national culture or values. The current prohibition should be expanded to include such ads.
  • New Zealand’s Ministry of Statistics “launched the Algorithm Charter for Aotearoa New Zealand” that “signals that [the nation’s agencies] are committed to being consistent, transparent and accountable in their use of algorithms.”
    • The Ministry explained “[t]he Algorithm Charter is part of a wider ecosystem and works together with existing tools, networks and research, including:
      • Principles for the Safe and Effective Use of Data and Analytics (Privacy Commissioner and Government Chief Data Steward, 2018)
      • Government Use of Artificial Intelligence in New Zealand (New Zealand Law Foundation and Otago University, 2019)
      • Trustworthy AI in Aotearoa – AI Principles (AI Forum New Zealand, 2020)
      • Open Government Partnership, an international agreement to increase transparency.
      • Data Protection and Use Policy (Social Wellbeing Agency, 2020)
      • Privacy, Human Rights and Ethics Framework (Ministry of Social Development).
  • The European Union (EU) imposed its first cyber sanctions under its Framework for a Joint EU Diplomatic Response to Malicious Cyber Activities (aka the cyber diplomacy toolbox) against six hackers and three entities from the Russian Federation, the People’s Republic of China (PRC) and the Democratic People’s Republic of Korea for the attack against the Organisation for the Prohibition of Chemical Weapons (OPCW) in the Netherlands, the malware attacks known as NotPetya and WannaCry, and Operation Cloud Hopper. The EU’s cyber sanctions follow sanctions the United States has placed on a number of people and entities from the same nations and also indictments the U.S. Department of Justice has announced over the years. The sanctions are part of the effort to levy costs on nations and actors that conduct cyber attacks. The EU explained:
    • The attempted cyber-attack was aimed at hacking into the Wi-Fi network of the OPCW, which, if successful, would have compromised the security of the network and the OPCW’s ongoing investigatory work. The Netherlands Defence Intelligence and Security Service (DISS) (Militaire Inlichtingen- en Veiligheidsdienst – MIVD) disrupted the attempted cyber-attack, thereby preventing serious damage to the OPCW.
    • “WannaCry” disrupted information systems around the world by targeting information systems with ransomware and blocking access to data. It affected information systems of companies in the Union, including information systems relating to services necessary for the maintenance of essential services and economic activities within Member States.
    • “NotPetya” or “EternalPetya” rendered data inaccessible in a number of companies in the Union, wider Europe and worldwide, by targeting computers with ransomware and blocking access to data, resulting amongst others in significant economic loss. The cyber-attack on a Ukrainian power grid resulted in parts of it being switched off during winter.
    • “Operation Cloud Hopper” has targeted information systems of multinational companies in six continents, including companies located in the Union, and gained unauthorised access to commercially sensitive data, resulting in significant economic loss.
  • The United States’ Federal Communications Commission (FCC) is asking for comments on the Department of Commerce’s National Telecommunications and Information Administration’s (NTIA) petition asking the agency to start a rulemaking to clarify alleged ambiguities in 47 USC 230 regarding the limits of the liability shield for the content others post online versus the liability protection for “good faith” moderation by the platform itself. The NTIA was acting per direction in an executive order allegedly aiming to correct online censorship. Executive Order 13925, “Preventing Online Censorship” was issued in late May after Twitter factchecked two of President Donald Trump’s Tweets regarding false claims made about mail voting in California in response to the COVID-19 pandemic. Comments are due by 2 September.
  • The Australian Competition & Consumer Commission (ACCC) released for public consultation a draft of “a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms, specifically Google and Facebook.” The government in Canberra had asked the ACCC to draft this code earlier this year after talks between the Australian Treasury and the platforms broke down.
    • The ACCC explained
      • The code would commence following the introduction and passage of relevant legislation in the Australian Parliament. The ACCC released an exposure draft of this legislation on 31 July 2020, with consultation on the draft due to conclude on 28 August 2020. Final legislation is expected to be introduced to Parliament shortly after conclusion of this consultation process.
    • This is not the ACCC’s first interaction with the companies. Late last year, the ACCC announced a legal action against Google “alleging they engaged in misleading conduct and made false or misleading representations to consumers about the personal location data Google collects, keeps and uses” according to the agency’s press release. In its initial filing, the ACCC is claiming that Google misled and deceived the public in contravention of the Australian Consumer Law, and Android users were harmed because those who switched off Location Services were unaware that their location information was still being collected and used by Google, for it was not readily apparent that Web & App Activity also needed to be switched off.
    • A year ago, the ACCC released its final report in its “Digital Platforms Inquiry” that “proposes specific recommendations aimed at addressing some of the actual and potential negative impacts of digital platforms in the media and advertising markets, and also more broadly on consumers.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) released “core guidance documentation for the Trusted Internet Connections (TIC) program, developed to assist agencies in protecting modern information technology architectures and services.” CISA explained “In accordance with the Office of Management and Budget (OMB) Memorandum (M) 19-26: Update to the TIC Initiative, TIC 3.0 expands on the original initiative to drive security standards and leverage advances in technology to secure a wide spectrum of agency network architectures.” Specifically, CISA released three core guidance documents:
    • Program Guidebook (Volume 1) – Outlines the modernized TIC program and includes its historical context
    • Reference Architecture (Volume 2) – Defines the concepts of the program to guide and constrain the diverse implementations of the security capabilities
    • Security Capabilities Catalog (Volume 3) – Indexes security capabilities relevant to TIC
  • Senators Ron Wyden (D-OR), Bill Cassidy (R-LA) and ten other Members wrote the Federal Trade Commission (FTC) urging the agency “to investigate widespread privacy violations by companies in the advertising technology (adtech) industry that are selling private data about millions of Americans, collected without their knowledge or consent from their phones, computers, and smart TVs.” They asked the FTC “to use its authority to conduct broad industry probes under Section 6(b) of the FTC Act to determine whether adtech companies and their data broker partners have violated federal laws prohibiting unfair and deceptive business practices.” They argued “[t]he FTC should not proceed with its review of the Children’s Online Privacy Protection Act (COPPA) Rule before it has completed this investigation.”
  • “100 U.S. women lawmakers and current and former legislators from around the world,” including Speaker of the House Nancy Pelosi (D-CA), sent a letter to Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg urging the company “to take decisive action to protect women from rampant and increasing online attacks on their platform that have caused many women to avoid or abandon careers in politics and public service.” They noted “[j]ust a few days ago, a manipulated and widely shared video that depicted Speaker Pelosi slurring her speech was once again circulating on major social media platforms, gaining countless views before TikTok, Twitter, and YouTube all removed the footage…[and] [t]he video remains on Facebook and is labeled “partly false,” continuing to gain millions of views.” The current and former legislators “called on Facebook to enforce existing rules, including:
    • Quick removal of posts that threaten candidates with physical violence, sexual violence or death, and that glorify, incite or praise violence against women; disable the relevant accounts, and refer offenders to law enforcement.
    • Eliminate malicious hate speech targeting women, including violent, objectifying or dehumanizing speech, statements of inferiority, and derogatory sexual terms;
    • Remove accounts that repeatedly violate terms of service by threatening, harassing or doxing or that use false identities to attack women leaders and candidates; and
    • Remove manipulated images or videos misrepresenting women public figures.
  • The United States’ Departments of Commerce and Homeland Security released an update “highlighting more than 50 activities led by industry and government that demonstrate progress in the drive to counter botnet threats.” In May 2018, the agencies submitted “A Report to the President on Enhancing the Resilience of the Internet and Communications Ecosystem Against Botnets and Other Automated, Distributed Threats” that identified a number of steps and prompted a follow-on “A Road Map Toward Resilience Against Botnets” released in November 2018.
  • United States (U.S.) Secretary of Commerce Wilbur Ross and European Commissioner for Justice Didier Reynders released a joint statement explaining that “[t]he U.S. Department of Commerce and the European Commission have initiated discussions to evaluate the potential for an enhanced EU-U.S. Privacy Shield framework to comply with the July 16 judgment of the Court of Justice of the European Union in the Schrems II case.”
    • Maximillian Schrems filed a complaint against Facebook with Ireland’s Data Protection Commission (DPC) in 2013, alleging that the company’s transfer of his personal data violated his rights under European Union law because of the mass U.S. surveillance revealed by former National Security Agency (NSA) contractor Edward Snowden. Ultimately, this case resulted in a 2015 Court of Justice of the European Union (CJEU) ruling that invalidated the Safe Harbor agreement under which the personal data of EU residents was transferred to the U.S. by commercial concerns. The EU and U.S. executed a follow-on agreement, the EU-U.S. Privacy Shield, that was designed to address some of the problems the CJEU turned up, and the U.S. passed a law, the “Judicial Redress Act of 2015” (P.L. 114-126), to provide EU citizens a way to exercise their EU rights in U.S. courts via the “Privacy Act of 1974.”
    • However, Schrems continued and soon sought to challenge the legality of the European Commission’s signing off on the Privacy Shield agreement, the adequacy decision issued in 2016, and also the use of standard contractual clauses (SCC) by companies for the transfer of personal data to the US. The CJEU struck down the adequacy decision, throwing into doubt many entities’ transfers out of the EU into the U.S. but upheld SCCs in a way that suggested EU data protection authorities (DPA) may need to review all such agreements to ensure they comply with EU law.
  • The European Commission (EC) announced “an in-depth investigation to assess the proposed acquisition of Fitbit by Google under the EU Merger Regulation.” The EC voiced its concern “that the proposed transaction would further entrench Google’s market position in the online advertising markets by increasing the already vast amount of data that Google could use for personalisation of the ads it serves and displays.” The EC detailed its “preliminary competition concerns:
    • Following its first phase investigation, the Commission has concerns about the impact of the transaction on the supply of online search and display advertising services (the sale of advertising space on, respectively, the result page of an internet search engine or other internet pages), as well as on the supply of “ad tech” services (analytics and digital tools used to facilitate the programmatic sale and purchase of digital advertising). By acquiring Fitbit, Google would acquire (i) the database maintained by Fitbit about its users’ health and fitness; and (ii) the technology to develop a database similar to Fitbit’s one.
    • The data collected via wrist-worn wearable devices appears, at this stage of the Commission’s review of the transaction, to be an important advantage in the online advertising markets. By increasing the data advantage of Google in the personalisation of the ads it serves via its search engine and displays on other internet pages, it would be more difficult for rivals to match Google’s online advertising services. Thus, the transaction would raise barriers to entry and expansion for Google’s competitors for these services, to the ultimate detriment of advertisers and publishers that would face higher prices and have less choice.
    • At this stage of the investigation, the Commission considers that Google:
      • is dominant in the supply of online search advertising services in the EEA countries (with the exception of Portugal for which market shares are not available);
      • holds a strong market position in the supply of online display advertising services at least in Austria, Belgium, Bulgaria, Croatia, Denmark, France, Germany, Greece, Hungary, Ireland, Italy, Netherlands, Norway, Poland, Romania, Slovakia, Slovenia, Spain, Sweden and the United Kingdom, in particular in relation to off-social networks display ads;
      • holds a strong market position in the supply of ad tech services in the EEA.
    • The Commission will now carry out an in-depth investigation into the effects of the transaction to determine whether its initial competition concerns regarding the online advertising markets are confirmed.
    • In addition, the Commission will also further examine:
      • the effects of the combination of Fitbit’s and Google’s databases and capabilities in the digital healthcare sector, which is still at a nascent stage in Europe; and
      • whether Google would have the ability and incentive to degrade the interoperability of rivals’ wearables with Google’s Android operating system for smartphones once it owns Fitbit.
    • In February after the deal had been announced, the European Data Protection Board (EDPB) made clear its position that Google and Fitbit will need to scrupulously observe the General Data Protection Regulation’s privacy and data security requirements if the body is to sign off on the proposed $2.1 billion acquisition. Moreover, at the time, Google had not informed European Union (EU) regulators of the proposed deal. The deal comes at a time when both EU and U.S. regulators are already investigating Google for alleged antitrust and anticompetitive practices, and the EDPB’s opinion could carry weight in this process.
  • The United States’ (U.S.) Department of Homeland Security released a Privacy Impact Assessment for the U.S. Border Patrol (USBP) Digital Forensics Programs that details how it may conduct searches of electronic devices at the U.S. border and ports of entry. DHS explained:
    • As part of USBP’s law enforcement duties, USBP may search and extract information from electronic devices, including: laptop computers; thumb drives; compact disks; digital versatile disks (DVDs); mobile phones; subscriber identity module (SIM) cards; digital cameras; vehicles; and other devices capable of storing electronic information.
    • Last year, a U.S. District Court held that U.S. Customs and Border Protection (CBP) and U.S. Immigration and Customs Enforcement’s (ICE) current practices for searches of smartphones and computers at the U.S. border are unconstitutional and the agencies must have reasonable suspicion before conducting such a search. However, the Court declined the plaintiffs’ request that the information taken off of their devices be expunged by the agencies. This ruling follows a Department of Homeland Security Office of the Inspector General (OIG) report that found CBP “did not always conduct searches of electronic devices at U.S. ports of entry according to its Standard Operating Procedures” and asserted that “[t]hese deficiencies in supervision, guidance, and equipment management, combined with a lack of performance measures, limit [CBP’s] ability to detect and deter illegal activities related to terrorism; national security; human, drug, and bulk cash smuggling; and child pornography.”
    • In terms of a legal backdrop, the United States Supreme Court has found that searches and seizures of electronic devices at borders and airports are subject to lesser legal standards than those conducted elsewhere in the U.S. under most circumstances. Generally, the government’s interest in securing the border against the flow of contraband and people not allowed to enter allows considerable leeway from the warrant requirements that govern many other types of searches. However, in recent years two federal appeals courts (the Fourth and Ninth Circuits) have held that searches of electronic devices require suspicion on the part of government agents while another appeals court (the Eleventh Circuit) held differently. Consequently, there is not a uniform legal standard for these searches.
  • The Inter-American Development Bank (IDB) and the Organization of American States (OAS) released their second assessment of cybersecurity across Latin America and the Caribbean that used the Cybersecurity Capacity Maturity Model for Nations (CMM) developed at the University of Oxford’s Global Cyber Security Capacity Centre (GSCC). The IDB and OAS explained:
    • When the first edition of the report “Cybersecurity: Are We Ready in Latin America and the Caribbean?” was released in March 2016, the IDB and the OAS aimed to provide the countries of Latin America and the Caribbean (LAC) not only with a picture of the state of cybersecurity but also guidance about the next steps that should be pursued to strengthen national cybersecurity capacities. This was the first study of its kind, presenting the state of cybersecurity with a comprehensive vision and covering all LAC countries.
    • The great challenges of cybersecurity, like those of the internet itself, are of a global nature. Therefore, it is undeniable that the countries of LAC must continue to foster greater cooperation among themselves, while involving all relevant actors, as well as establishing a mechanism for monitoring, analysis, and impact assessment related to cybersecurity both nationally and regionally. More data in relation to cybersecurity would allow for the introduction of a culture of cyberrisk management that needs to be extended both in the public and private sectors. Countries must be prepared to adapt quickly to the dynamic environment around us and make decisions based on a constantly changing threat landscape. Our member states may manage these risks by understanding the impact on and the likelihood of cyberthreats to their citizens, organizations, and national critical infrastructure. Moving to the next level of maturity will require a comprehensive and sustainable cybersecurity policy, supported by the country’s political agenda, with allocation of financial resources and qualified human capital to carry it out.
    • The COVID-19 pandemic will pass, but events that will require intensive use of digital technologies so that the world can carry on will continue happening. The challenge of protecting our digital space will, therefore, continue to grow. It is the hope of the IDB and the OAS that this edition of the report will help LAC countries to have a better understanding of their current state of cybersecurity capacity and be useful in the design of the policy initiatives that will lead them to increase their level of cyberresilience.
  • The European Data Protection Supervisor (EDPS) issued an opinion on “the European Commission’s action plan for a comprehensive Union policy on preventing money laundering and terrorism financing (C(2020)2800 final), published on 7 May 2020.” The EDPS asserted:
    • While the EDPS acknowledges the importance of the fight against money laundering and terrorism financing as an objective of general interest, we call for the legislation to strike a balance between the interference with the fundamental rights of privacy and personal data protection and the measures that are necessary to effectively achieve the general interest goals on anti-money laundering and countering the financing of terrorism (AML/CFT) (the principle of proportionality).
    • The EDPS recommends that the Commission monitors the effective implementation of the existing AML/CFT framework while ensuring that the GDPR and the data protection framework are respected and complied with. This is particularly relevant for the works on the interconnection of central bank account mechanisms and beneficial ownership registers that should be largely inspired by the principles of data minimisation, accuracy and privacy-by-design and by default.

Further Reading

  • “China already has your data. Trump’s TikTok and WeChat bans can’t stop that.” By Aynne Kokas – The Washington Post. This article persuasively makes the case that even if a ban on TikTok and WeChat were to work, and there are substantive questions as to how a ban would work given how widely the former has been downloaded, the People’s Republic of China (PRC) is almost certainly acquiring massive reams of data on Americans through a variety of apps, platforms, and games. For example, Tencent, owner of WeChat, has a 40% stake in Epic Games, maker of Fortnite, a massively popular multiplayer game (if you have never heard of it, ask one of the children in your family). Moreover, a recent change to PRC law mandates that companies operating in the PRC must share their databases for cybersecurity reviews, which may give the PRC an opportunity, aside from hacking and exfiltrating data from United States entities, to access data on Americans. In summation, if the Trump Administration is serious about stopping the flow of data from the U.S. to the PRC, these executive orders will do very little.
  • “Big Tech Makes Inroads With the Biden Campaign” by David McCabe and Kenneth P. Vogel – The New York Times. Most likely long before former Vice President Joe Biden clinched the Democratic nomination, advisers volunteered to help plot out his policy positions, a process that intensified this year. Of course, this includes technology policy, and many of those volunteering for the campaign’s Innovation Policy Committee have worked or are working for large technology companies directly or as consultants or lobbyists. This piece details some of these people and their relationships and how the Biden campaign is managing possible conflicts of interest. Naturally, those on the left wing of the Democratic Party calling for tighter antitrust, competition, and privacy regulation are concerned that Biden might be pulled away from these positions despite his public statements arguing that the United States government needs to get tougher with some practices.
  • “A Bible Burning, a Russian News Agency and a Story Too Good to Check Out” By Matthew Rosenberg and Julian E. Barnes – The New York Times. The Russian Federation seems to be using a new tactic, with some success, for sowing discord in the United States that is the information equivalent of throwing fuel onto a fire. In this case, a Russian outlet manufactured a fake story amplifying an actual event, and the story went viral after being seized on by some prominent Republicans, in part because it fit their preferred view of protestors. We will likely see more of this, and it is not confined to fake stories intended to appeal to the right; the same is happening with content meant for the left wing in the United States.
  • “Facebook cracks down on political content disguised as local news” by Sara Fischer – Axios. As part of its continuing effort to crack down on violations of its policies, Facebook will no longer allow groups with a political viewpoint to masquerade as news. The company and outside experts have identified a range of instances where groups propagating a viewpoint, as opposed to reporting, have used a Facebook exemption by pretending to be local news outlets.
  • “QAnon groups have millions of members on Facebook, documents show” By Ari Sen and Brandy Zadrozny – NBC News. It appears as if some Facebook employees are leaking the results of an internal investigation that identified more than 1 million users who are part of QAnon groups. Most likely these employees want the company to take a stronger stance on the conspiracy group QAnon like the company has with COVID-19 lies and misinformation.
  • And, since Senator Kamala Harris (D-CA) was named former Vice President Joe Biden’s (D-DE) vice presidential pick, this article has become even more relevant than when I highlighted it in late July: “New Emails Reveal Warm Relationship Between Kamala Harris And Big Tech” – HuffPost. Obtained via a Freedom of Information request, new emails from Senator Kamala Harris’ (D-CA) tenure as her state’s attorney general suggest she was willing to overlook the role Facebook, Google, and others played and still play in one of her signature issues: revenge porn. This article makes the case that Harris came down hard on a scammer running a revenge porn site but did not press the tech giants with any vigor to take down such material from their platforms. Consequently, the case is made that if Harris is former Vice President Joe Biden’s vice presidential candidate, this would signal a go-easy approach on large companies even though many Democrats have been calling to break up these companies and vigorously enforce antitrust laws. Harris has largely not engaged on tech issues during her tenure in the Senate. To be fair, many of these companies are headquartered in California and pump billions of dollars into the state’s economy annually, putting Harris in a tricky position politically. Of course, such pieces should be taken with a grain of salt since it may have been suggested or planted by one of Harris’ rivals for the vice president nomination or someone looking to settle a score.
  • “Unwanted Truths: Inside Trump’s Battles With U.S. Intelligence Agencies” by Robert Draper – The New York Times. A deeply sourced article on the outright antipathy between President Donald Trump and Intelligence Community officials, particularly over the issue of how deeply Russia interfered in the election in 2016. A number of former officials have been fired or forced out because they refused to knuckle under to the White House’s desire to soften or massage conclusions of Russia’s past and current actions to undermine the 2020 election in order to favor Trump.
  • “Huawei says it’s running out of chips for its smartphones because of US sanctions” By Kim Lyons – The Verge and “Huawei: Smartphone chips running out under US sanctions” by Joe McDonald – The Associated Press. United States (U.S.) sanctions have started biting the Chinese technology company Huawei, which announced it will likely run out of processor chips for its smartphones. U.S. sanctions bar any company from selling high technology items like processors to Huawei, and this capability is not independently available in the People’s Republic of China (PRC) at present.
  • “Targeting WeChat, Trump Takes Aim at China’s Bridge to the World” By Paul Mozur and Raymond Zhong – The New York Times. This piece explains WeChat, the app the Trump Administration is trying to ban in the United States (U.S.) without any warning. It is like a combination of Facebook, WhatsApp, news app, and payment platform and is used by more than 1.2 billion people.
  • “This Tool Could Protect Your Photos From Facial Recognition” By Kashmir Hill – The New York Times. Researchers at the University of Chicago have found a method of subtly altering photos of people that appears to foil most facial recognition technologies. However, a number of experts interviewed said it is too late to stop companies like Clearview AI.
  • “I Tried to Live Without the Tech Giants. It Was Impossible.” By Kashmir Hill – The New York Times. This New York Times reporter tried living without the products of large technology companies, which involved some fairly obvious challenges and some that were not so obvious. Of course, it was hard for her to skip Facebook, Instagram, and the like, but cutting out Google and Amazon proved hardest and basically impossible because of the latter’s cloud presence and the former’s web presence. The fact that some of the companies cannot be avoided if one wants to be online likely lends weight to those making the case these companies are anti-competitive.
  • “To Head Off Regulators, Google Makes Certain Words Taboo” by Adrianne Jeffries – The Markup. Apparently, in what is a standard practice at large companies, employees at Google were coached to avoid using certain terms or phrases that antitrust regulators would take notice of such as: “market,” “barriers to entry,” and “network effects.” The Markup obtained a 16 August 2019 document titled “Five Rules of Thumb For Written Communications” that starts by asserting “[w]ords matter…[e]specially in antitrust laws” and goes on to advise Google’s employees:
    • We’re out to help users, not hurt competitors.
    • Our users should always be free to switch, and we don’t lock anyone in.
    • We’ve got lots of competitors, so don’t assume we control or dominate any market.
    • Don’t try and define a market or estimate our market share.
    • Assume every document you generate, including email, will be seen by regulators.
  • “Facebook Fired An Employee Who Collected Evidence Of Right-Wing Pages Getting Preferential Treatment” By Craig Silverman and Ryan Mac – BuzzFeed News. A Facebook engineer was fired after adducing proof in an internal communications system that the social media platform is more willing to change false and negative ratings on claims made by conservative outlets and personalities than on those of any other viewpoint. If this is true, it would be the opposite of the narrative spun by the Trump Administration and many Republicans in Congress. Moreover, Facebook’s incentives would seem to align with giving conservatives more preferential treatment because many of these websites advertise on Facebook, the company probably does not want to get crosswise with the Administration, sensational posts and content drive engagement which increases user numbers that allows for higher ad rates, and it wants to appear fair and impartial.
  • “How Pro-Trump Forces Work the Refs in Silicon Valley” By Ben Smith – The New York Times. This piece traces the nearly four-decade-old effort of Republicans to sway mainstream media and now Silicon Valley to its viewpoint.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo credit: Gerd Altmann on Pixabay

EDPB Opines Encryption Ban Would Endanger A Nation’s Compliance with GDPR

As the US and others call on technology companies to develop the means to crack encrypted communications, an EU entity argues any nation with such a law would likely not meet the GDPR’s requirements.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

In response to a Member of the European Parliament’s letter, the European Data Protection Board (EDPB) articulated its view that any nation that implements an “encryption ban” would endanger its compliance with the General Data Protection Regulation (GDPR) and possibly result in companies domiciled in those countries not being able to transfer and process the personal data of EU citizens. However, as always, it bears note the EDPB’s view may not carry the day with the European Commission, Parliament, and courts.

The EDPB’s letter comes amidst another push by the Trump Administration, Republican allies in Congress, and other nations to have technology companies develop workarounds or backdoors to their end-to-end encrypted devices, apps, and systems. The proponents of this change claim online child sexual predators, terrorists, and other criminals are using products and services like WhatsApp, Telegram, and iPhones to defeat legitimate, targeted government surveillance and enforcement. They reason that unless technology companies abandon their unnecessarily absolutist position and work towards a technological solution, the number of bad actors communicating in ways that cannot be broken (aka “going dark”) will increase, allowing for greater crime and wrongdoing.

On the other side of the issue, technology companies, civil liberties and privacy experts, and computer scientists argue that any weakening of or backdoors to encryption will eventually be stolen and exposed, making it easier for criminals to hack, steal, and exfiltrate. They assert the internet and digital age are built on secure communications and threatening this central feature would wreak havoc beyond the crimes the US and other governments are seeking to prevent.

The EDPB stated:

Any ban on encryption or provisions weakening encryption would undermine the GDPR obligations on the concerned controllers and processors for an effective implementation of both data protection principles and the appropriate technical and organisational measures. Similar considerations apply to transfers to controllers or processors in any third countries adopting such bans or provisions. Security measures are therefore specifically mentioned among the elements the European Commission must take into account when assessing the adequacy of the level of protection in a third country. In the absence of such a decision, transfers are subject to appropriate safeguards or may be based on derogations; in any case the security of the personal data has to be ensured at all times.

The EDPB opined “that any encryption ban would seriously undermine compliance with the GDPR.” The EDPB continued, “[m]ore specifically, whatever the instrument used, it would represent a major obstacle in recognising a level of protection essentially equivalent to that ensured by the applicable data protection law in the EU, and would seriously question the ability of the concerned controllers and processors to comply with the security obligation of the regulation.”

The EDPB’s view is being articulated at a time when, as noted, a number of nations led by the United States (US) continue to press technology companies to allow them access to communications, apps, platforms, and devices that are encrypted. Last year, the US, United Kingdom, Australia, New Zealand, and Canada (the so-called Five Eyes nations) met, and in one of the resulting communiques the ministers asserted that

We are concerned where companies deliberately design their systems in a way that precludes any form of access to content, even in cases of the most serious crimes. This approach puts citizens and society at risk by severely eroding a company’s ability to identify and respond to the most harmful illegal content, such as child sexual exploitation and abuse, terrorist and extremist material and foreign adversaries’ attempts to undermine democratic values and institutions, as well as law enforcement agencies’ ability to investigate serious crime.

The five nations contended that “[t]ech companies should include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can obtain access to data in a readable and usable format.” The Five Eyes also claimed that “[t]hose companies should also embed the safety of their users in their system designs, enabling them to take action against illegal content…[and] [a]s part of this, companies and Governments must work together to ensure that the implications of changes to their services are well understood and that those changes do not compromise public safety.”

The Five Eyes applauded “approaches like Mark Zuckerberg’s public commitment to consulting Governments on Facebook’s recent proposals to apply end-to-end encryption to its messaging services…[and] [t]hese engagements must be substantive and genuinely influence design decisions.”

The Five Eyes added

We share concerns raised internationally, inside and outside of government, about the impact these changes could have on protecting our most vulnerable citizens, including children, from harm. More broadly, we call for detailed engagement between governments, tech companies, and other stakeholders to examine how proposals of this type can be implemented without negatively impacting user safety, while protecting cyber security and user privacy, including the privacy of victims.

In October 2019, in an open letter to Facebook CEO Mark Zuckerberg, US Attorney General William P. Barr, United Kingdom Home Secretary Priti Patel, Australia’s Minister for Home Affairs Peter Dutton, and then acting US Homeland Security Secretary Kevin McAleenan asked “that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services without ensuring that there is no reduction to user safety and without including a means for lawful access to the content of communications to protect our citizens.” In Facebook’s December 2019 response, Facebook Vice President and WhatsApp Head Will Cathcart and Facebook Vice President and Messenger Head Stan Chudnovsky stated “[c]ybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere…[and] [t]he ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.”

However, one of the Five Eyes nations has already taken legislative action to force technology companies and individuals to cooperate with law enforcement investigations in ways that could threaten encryption. In December 2018, Australia enacted the “Telecommunications and Other Legislation (Assistance and Access) Act 2018” (TOLA). As the Office of Australia’s Information Commissioner (OAIC) wrote of TOLA, “[t]he powers permitted under the Act have the potential to significantly weaken important privacy rights and protections under the Privacy Act…[and] [t]he encryption technology that can obscure criminal communications and pose a threat to national security is the same technology used by ordinary citizens to exercise their legitimate rights to privacy.”

In a related development, this week, Australia’s Independent National Security Legislation Monitor (INSLM) issued its report on the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). The Parliamentary Joint Committee on Intelligence and Security had requested that the INSLM review the statute, and so the INSLM engaged in a lengthy review, including input from the public. As explained in the report’s preface, the “INSLM independently reviews the operation, effectiveness and implications of national security and counter-terrorism laws; and considers whether the laws contain appropriate protections for individual rights, remain proportionate to terrorism or national security threats, and remain necessary.”

The INSLM claimed:

In this report I reject the notion that there is a binary choice that must be made between the effectiveness of agencies’ surveillance powers in the digital age on the one hand and the security of the internet on the other. Rather, I conclude that what is necessary is a law which allows agencies to meet technological challenges, such as those caused by encryption, but in a proportionate way and with proper rights protection. Essentially this can be done by updating traditional safeguards to meet those same technological challenges – notably, those who are trusted to authorise intrusive search and surveillance powers must be able to understand the technological context in which those powers operate, and their consequences. If, but only if, the key recommendations I set out in this report in this regard are adopted, TOLA will be such a law.

The INSLM stated “[t]he essential effects of TOLA are as follows:

a. Schedule 1 gives police and intelligence agencies new powers to agree or require significant industry assistance from communications providers.

b. Schedules 2, 3 and 4 update existing powers and, in some cases, extend them to new agencies.

c. Schedule 5 gives the Australian Security Intelligence Organisation (ASIO) significant new powers to seek and receive both voluntary and compulsory assistance.

The INSLM found:

  • In relation to Schedule 1, for the reasons set out in greater detail in the report, Technical Assistance Notice (TANs) and Technical Capability Notice (TCNs) should be authorised by a body which is independent of the issuing agency or government. These are powers designed to compel a Designated Communications Provider (DCP) to reveal private information or data of its customers and therefore the usual practice of independent authorisation should apply.
  • I am satisfied that the computer access warrant and associated powers conferred by Schedule 2 are both necessary and proportionate, subject to some amendments.
  • I am generally satisfied that the powers conferred by Schedules 3 and 4 are both necessary and proportionate, but there are some matters that should be addressed and further monitored.
  • I have concluded that Schedule 5 should be amended to limit its breadth and clarify its scope.


Image by OpenClipart-Vectors from Pixabay

Further Reading and Other Developments (29 June)


Other Developments

  • The Senate Commerce, Science, and Transportation Committee held an oversight hearing on the Federal Communications Commission (FCC) with the FCC Chair and four Commissioners.
  • New Zealand’s Parliament passed the “Privacy Act 2020,” a major update of its 1993 statute that would, according to New Zealand’s Privacy Commissioner, do the following:
    • Mandatory notification of harmful privacy breaches. If organisations or businesses have a privacy breach that poses a risk of serious harm, they are required to notify the Privacy Commissioner and affected parties. This change brings New Zealand in line with international best practice.
    • Introduction of compliance orders. The Commissioner may issue compliance notices to require compliance with the Privacy Act. Failure to follow a compliance notice could result in a fine of up to $10,000.
    • Binding access determinations. If an organisation or business refuses to make personal information available upon request, the Commissioner will have the power to demand release.
    • Controls on the disclosure of information overseas. Before disclosing New Zealanders’ personal information overseas, New Zealand organisations or businesses will need to ensure those overseas entities have similar levels of privacy protection to those in New Zealand.
    • New criminal offences. It will be an offence to mislead an organisation or business in a way that affects someone’s personal information or to destroy personal information if a request has been made for it. The maximum fine for these offences is $10,000.
    • Explicit application to businesses whether or not they have a legal or physical presence in New Zealand. If an international digital platform is carrying on business in New Zealand with New Zealanders’ personal information, there will be no question that it will be obliged to comply with New Zealand law regardless of where it, or its servers, are based.
  • The United States’ National Archives’ Information Security Oversight Office (ISOO) submitted its annual report to the White House and found:
    • Our Government’s ability to protect and share Classified National Security Information and Controlled Unclassified Information (CUI) continues to present serious challenges to our national security. While dozens of agencies now use various advanced technologies to accomplish their missions, a majority of them still rely on antiquated information security management practices. These practices have not kept pace with the volume of digital data that agencies create and these problems will worsen if we do not revamp our data collection methods for overseeing information security programs across the Government. We must collect and analyze data that more accurately reflects the true health of these programs in the digital age.
    • However, ISOO noted progress on efforts to better secure and protect CUI but added “[f]ull implementation will require additional resources, including dedicated funds and more full-time staff.”
    • Regarding classified information, ISOO found “Classified National Security Information policies and practices remain outdated and are unable to keep pace with the volume of digital data that agencies create.”
  • The Australian Strategic Policy Institute’s International Cyber Policy Centre released its most recent “Covid-19 Disinformation & Social Media Manipulation” report titled “ID2020, Bill Gates and the Mark of the Beast: how Covid-19 catalyses existing online conspiracy movements:”
    • Against the backdrop of the global Covid-19 pandemic, billionaire philanthropist Bill Gates has become the subject of a diverse and rapidly expanding universe of conspiracy theories. As an example, a recent poll found that 44% of Republicans and 19% of Democrats in the US now believe that Gates is linked to a plot to use vaccinations as a pretext to implant microchips into people. And it’s not just America: 13% of Australians believe that Bill Gates played a role in the creation and spread of the coronavirus, and among young Australians it’s 20%. Protests around the world, from Germany to Melbourne, have included anti-Gates chants and slogans.
    • This report takes a close look at a particular variant of the Gates conspiracy theories, which is referred to here as the ID2020 conspiracy (named after the non-profit ID2020 Alliance, which the conspiracy theorists claim has a role in the narrative), as a case study for examining the dynamics of online conspiracy theories on Covid-19. Like many conspiracy theories, that narrative builds on legitimate concerns, in this case about privacy and surveillance in the context of digital identity systems, and distorts them in extreme and unfounded ways.
  • The Pandemic Response Accountability Committee (PRAC) released “TOP CHALLENGES FACING FEDERAL AGENCIES: COVID-19 Emergency Relief and Response Efforts” for those agencies that received the bulk of funds under the “Coronavirus Aid, Relief, and Economic Security (CARES) Act” (P.L. 116-136). PRAC is housed within the Council of the Inspectors General on Integrity and Efficiency (CIGIE) and is composed of “21 Offices of Inspector General (OIG) overseeing agencies who received the bulk of the emergency funding.” PRAC stated:
    • CIGIE previously has identified information technology (IT) security and management as a long-standing, serious, and ubiquitous challenge that impacts agencies across the government, highlighting agencies’ dependence on reliable and secure IT systems to perform their mission-critical functions. Key areas of concern have included safeguarding federal systems against cyberattacks and insider threats, modernizing and managing federal IT systems, ensuring continuity of operations, and recruiting and retaining a highly skilled cybersecurity workforce.
    • These concerns remain a significant challenge, but are impacted by (1) widespread reliance on maximum telework to continue agency operations during the pandemic, which has strained agency networks and shifted IT resources, and (2) additional opportunities and targets for cyberattacks created by remote access to networks and increases in online financial activity.
  • Following the completion of a European Union-People’s Republic of China summit, European Commission President Ursula von der Leyen pointed to a number of ongoing technology-related issues between the EU and the PRC, including:
    • [W]e continue to have an unbalanced trade and investment relationship. We have not made the progress we aimed for in last year’s Summit statement in addressing market access barriers. We need to follow up on these commitments urgently. And we also need to have more ambition on the Chinese side in order to conclude negotiations on an investment agreement. These two actions would address the asymmetry in our respective market access and would improve the level playing field between us. In order to conclude the investment agreement, we would need in particular substantial commitments from China on the behaviour of state-owned enterprises, transparency in subsidies, and transparency on the topic of forced technology transfers.
    • We have raised these issues at the same time with President Xi and Premier Li that we expect that China will show the necessary level of ambition to conclude these negotiations by the end of this year. I think it is important that we have now a political, high-level approach on these topics.
    • I have also made it clear that China needs to engage seriously on a reform of the World Trade Organization, in particular on the future negotiations on industrial subsidies. This is the relevant framework where we have to work together on the topic – and it is a difficult topic – but this is the framework, which we have to establish to have common binding rules we agree on.
    • And we must continue to work on tackling Chinese overcapacity, for example in the steel and metal sectors, and in high technology. Here for us it is important that China comes back to the international negotiation table, that we sit down there and find solutions.
    • We also pointed out the importance of the digital transformation and its highly assertive approach to the security, the resilience and the stability of digital networks, systems and value chains. We have seen cyberattacks on hospitals and dedicated computing centres. Likewise, we have seen a rise of online disinformation. We pointed out clearly that this cannot be tolerated.
  • United States Secretary of State Mike Pompeo issued a statement titled “The Tide Is Turning Toward Trusted 5G Vendors,” in which he claimed:
    • The tide is turning against Huawei as citizens around the world are waking up to the danger of the Chinese Communist Party’s surveillance state. Huawei’s deals with telecommunications operators around the world are evaporating, because countries are only allowing trusted vendors in their 5G networks. Examples include the Czech Republic, Poland, Sweden, Estonia, Romania, Denmark, and Latvia. Recently, Greece agreed to use Ericsson rather than Huawei to develop its 5G infrastructure.
  • Germany’s highest court, the Bundesgerichtshof (BGH), rejected Facebook’s challenge to the country’s antitrust regulator’s finding that the company had abused its dominant position by combining data on German nationals and residents across its platforms. The matter now returns to a lower German court, which is expected to heed the higher court’s ruling and allow the Bundeskartellamt’s restrictions on Facebook’s activity.
  • France’s Conseil d’État upheld the Commission nationale de l’informatique et des libertés’ (CNIL) 2019 fine of €50 million against Google under the General Data Protection Regulation (GDPR) “for lack of transparency, inadequate information and lack of valid consent regarding the ads personalization.”
  • A Virginia court ruled against House Intelligence Committee Ranking Member Devin Nunes (R-CA) in his defamation suit against Twitter, Republican consultant Liz Mair, and the Twitter accounts @devincow and @DevinNunesMom.
  • The California Secretary of State has certified that the ballot initiative to add the “California Privacy Rights Act” to the state’s law, in large part to amend the “California Consumer Privacy Act” (CCPA) (AB 375), has qualified for November’s ballot.

Further Reading

  • “Wrongfully Accused by an Algorithm” – The New York Times. In what should have been predictable and foreseeable given the error rate of many facial recognition algorithms at correctly identifying people of color, an African American man was wrongly identified by this technology, leading to his arrest. Those in the field and experts stress positive identifications are supposed to be only one piece of evidence, but in this case, it was the only evidence police had. After a store loss-prevention specialist agreed that a person in a low-grade photo was the likely shoplifter, police arrested the man. Eventually, the charges were dismissed, initially without prejudice, leaving open the possibility of future prosecution, but later the district attorney cleared all charges and expunged the arrest.
  • “Pentagon Says it Needs ‘More Time’ Fixing JEDI Contract” – Nextgov. The saga of the Department of Defense’s Joint Enterprise Defense Infrastructure cloud contract continues. Amazon and Microsoft will need to submit revised bids for the possibly $10 billion procurement as the Department of Defense (DOD) is trying to cure the problems turned up by a federal court in the suit brought by Amazon. These bids would be evaluated later this summer, according to a recent DOD court filing. The next award of this contract could trigger another bid protest, just as the first award caused Amazon to challenge Microsoft’s victory.
  • “EU pushing ahead with digital tax despite U.S. resistance, top official says” – Politico. In an Atlantic Council event, European Commission Executive Vice President Margrethe Vestager stated the European Union will move ahead with an EU-wide digital services tax despite the recent pullout of the United States from talks on such a tax. The Organization for Economic Co-operation and Development had convened multi-lateral talks to resolve differences on how a global digital services tax will ideally function with most of the nations involved arguing for a 2% tax to be assessed in the nation where the transaction occurs as opposed to where the company is headquartered. EU officials claim agreement was within reach when the US removed itself from the talks. An EU-wide tax is of a piece with a more aggressive stance taken by the EU towards US technology companies, a number of which are currently under investigation for antitrust and anti-competitive behaviors.
  • “Verizon joins ad boycott of Facebook over hateful content” – Associated Press. The telecommunications company joined a number of other companies in pulling their advertising from Facebook in a campaign organized by the ADL (the Anti-Defamation League), the NAACP, Sleeping Giants, Color Of Change, Free Press, and Common Sense. The #StopHateforProfit campaign “asks large Facebook advertisers to show they will not support a company that puts profit over safety,” and thus far, a number of companies are doing just that, including Eddie Bauer, Patagonia, North Face, Ben & Jerry’s, and others. In a statement, a Facebook spokesperson stated “[o]ur conversations with marketers and civil rights organizations are about how, together, we can be a force for good.” While Facebook has changed course due to this and other pressure regarding content posted or ads placed on its platform, most recently removing a Trump campaign ad with Nazi imagery, the company has not changed its position on allowing political ads containing lies.
  • “The UK’s contact tracing app fiasco is a master class in mismanagement” – MIT Technology Review. This after-action report on the United Kingdom’s National Health Service’s efforts to build its own COVID-19 contact tracing app is grim. The NHS is basically scrapping its work and opting for the Google/Apple API. However, the government in London is claiming “we will now be taking forward a solution that brings together the work on our app and the Google/Apple solution.” A far too ambitious plan married to organizational chaos led to the crash of the NHS effort.
  • “Trump administration sees no loophole in new Huawei curb” – Reuters. Despite repeated arguments by trade experts that the most recent United States Department of Commerce regulations will not cut off Huawei’s access to high technology components, Secretary of Commerce Wilbur Ross claimed “[t]he Department of Commerce does not see any loopholes in this rule…[and] [w]e reaffirm that we will implement the rule aggressively and pursue any attempt to evade its intent.”
  • “Defense Department produces list of Chinese military-linked companies” – Axios. Likely in response to a letter sent last year by Senate Minority Leader Chuck Schumer (D-NY) and Senator Tom Cotton (R-AR), the Department of Defense (DOD) has finally fulfilled a requirement in the FY 1999 National Defense Authorization Act (NDAA) to update a list of “those persons operating directly or indirectly in the United States or any of its territories and possessions that are Communist Chinese military companies,” compiling a list of People’s Republic of China (PRC) entities linked to the PRC military. This provision in the FY 1999 NDAA also grants the President authority to “exercise International Emergency Economic Powers Act (IEEPA) authorities” against listed entities, which could include serious sanctions.
  • “Andrew Yang is pushing Big Tech to pay users for data” – The Verge. Former candidate for the Democratic Party’s presidential nomination Andrew Yang has launched the Data Dividend Project, “a movement dedicated to taking back control of our personal data: our data is our property, and if we allow companies to use it, we should get paid for it.” Additionally, “[i]ts primary objective is to establish and enforce data property rights under laws such as the California Consumer Privacy Act (CCPA), which went into effect on January 1, 2020.” California Governor Gavin Newsom proposed a similar program in very vague terms in a State of the State speech but never followed up on it, and Senator John Kennedy (R-LA) has introduced the “Own Your Own Data Act” (S. 806) to provide people with rights to sell their personal data.

Photo by Retha Ferguson from Pexels