Five Eyes Again Lean On Tech About Encryption

In the latest demand, the usual suspects are joined by two new nations in urging tech to stop using default encryption and to essentially build backdoors.

The Five Eyes (FVEY) intelligence alliance plus two Asian nations have released an “International Statement: End-To-End Encryption and Public Safety,” which represents the latest FVEY salvo in their campaign against technology companies using default end-to-end encryption. Again, the FVEY nations are casting the issues presented by encryption through the prism of child sexual abuse, terrorism, and other horrible crimes in order to keep technology companies on their proverbial policy backfoot. For, after all, how can a reasonable tech CEO argue for encryption when it is being used to commit and cover up unspeakable crimes?

However, in a sign that technology companies may be facing a growing coalition of governments, India and Japan joined the FVEY in this statement; whether this is a result of the recent Quadrilateral Security Dialogue is unclear, but it seems a fair assumption given that two of the FVEY nations, the United States and Australia, make up the other two members of the Quad. And, of course, the United Kingdom, Canada, and New Zealand are the three other members of the FVEY.

In the body of the statement, FVEY, Japan, and India asserted:

  • We, the undersigned, support strong encryption, which plays a crucial role in protecting personal data, privacy, intellectual property, trade secrets and cyber security.  It also serves a vital purpose in repressive states to protect journalists, human rights defenders and other vulnerable people, as stated in the 2017 resolution of the UN Human Rights Council.  Encryption is an existential anchor of trust in the digital world and we do not support counter-productive and dangerous approaches that would materially weaken or limit security systems. 
  • Particular implementations of encryption technology, however, pose significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children. We urge industry to address our serious concerns where encryption is applied in a way that wholly precludes any legal access to content.  We call on technology companies to work with governments to take the following steps, focused on reasonable, technically feasible solutions:
    • Embed the safety of the public in system designs, thereby enabling companies to act against illegal content and activity effectively with no reduction to safety, and facilitating the investigation and prosecution of offences and safeguarding the vulnerable;
    • Enable law enforcement access to content in a readable and usable format where an authorisation is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight; and
    • Engage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.

So, on the one hand, these nations recognize the indispensable role encryption plays in modern communications and in the fight against authoritarian regimes and “do not support counter-productive and dangerous approaches that would materially weaken or limit security systems.” But, on the other hand, “[p]articular implementations of encryption technology” are putting children at risk and letting terrorism thrive. Elsewhere in the statement we learn that the implementation in question is “[e]nd-to-end encryption that precludes lawful access to the content of communications in any circumstances.”

And so these nations want companies like Facebook, Apple, Google, and others to take certain steps that would presumably maintain strong encryption but would allow access to certain communications for law enforcement purposes. These nations propose “[e]mbed[ding] the safety of the public in system designs,” which is a nice phrase and wonderful rhetoric, but what does this mean practically? That companies should not use default encryption? Perhaps. But let’s be honest about the second order effects if American tech companies dispensed with default encryption. Sophisticated criminals and terrorists understand encryption and would still choose to encrypt their devices, apps, and communications; they would merely have to go to the time and trouble of enabling it themselves rather than having it turned on by default. To be fair, neophyte and careless criminals and terrorists may not know to do so, and their communications would be fairly easy to acquire.

Another likely second order effect is that apps and software offering very hard-to-break encryption would no longer be made or legally offered in FVEY nations. Consequently, the enterprising individual interested in encryption that cannot be broken or tapped by governments would seek out, and likely find, such technology produced in other countries through a variety of means. It is unlikely encryption will get put back in the bottle because the FVEY and friends want it so.

Moreover, given the current technological landscape, the larger point here is that building backdoors into encryption or weakening encryption puts legitimate, desirable communications, activities, and transactions at greater risk of being intercepted. Why would this be so? Because it would take less effort and computing power to crack a weaker encryption key.
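The arithmetic behind that last point can be sketched in a few lines of Python. This is purely an illustration of why shorter or weakened keys are cheaper to attack, not anything drawn from the statement itself:

```python
# Illustration only: why weakening a key helps every attacker, not just police.
# Brute-forcing a k-bit symmetric key takes on the order of 2**k guesses in
# the worst case, so every bit shaved off a key halves an attacker's work.

def brute_force_trials(key_bits: int) -> int:
    """Worst-case number of guesses needed to exhaust a key space."""
    return 2 ** key_bits

# A hypothetically weakened 128-bit key versus a full 256-bit key:
work_ratio = brute_force_trials(256) // brute_force_trials(128)
# The weakened key is not "half as strong" -- it is 2**128 times cheaper
# to search exhaustively, which is why experts treat any deliberate
# weakening as a gift to well-resourced adversaries as well as governments.
```

The same exponential logic applies to any deliberate weakening: the gap between the intended key strength and the weakened one compounds, which is why less effort and computing power suffices to crack the weaker key.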

But, sure, a world in which my midnight snacking does not lead to weight gain would be amazing. And so it is with the FVEY’s call for strong encryption they could essentially defeat as needed. Eventually, the keys, technology, or means would be leaked or stolen as has happened time and time again. Most recently, there was a massive exfiltration of the Central Intelligence Agency’s (CIA) Vault 7 hacking tools and sources and methods. It would only be a matter of time before the tools to defeat encryption were stolen or compromised.

Perhaps there is a conceptual framework or technology that would achieve the FVEY’s goal, but, at present, it will entail tradeoffs that will make people less secure in their online communications. And, in the defense of the FVEY, they are proposing to “[e]ngage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.” Again, very nice phraseology that does not tell us much.

Of course, the FVEY nations are calling for access under proper authorization. However, in the U.S. that might not even entail an adversarial process in a court, for under the Foreign Intelligence Surveillance Act (FISA), there is no such process in the secret proceedings. Additionally, in the same vein, the phrase “subject to strong safeguards and oversight” is downright comical if the U.S. system is to be the template, given the range of shortcomings and failures of national security agencies in meeting U.S. law relating to surveillance.

The FVEY, Japan, and India conclude with:

We are committed to working with industry to develop reasonable proposals that will allow technology companies and governments to protect the public and their privacy, defend cyber security and human rights and support technological innovation.  While this statement focuses on the challenges posed by end-to-end encryption, that commitment applies across the range of encrypted services available, including device encryption, custom encrypted applications and encryption across integrated platforms.  We reiterate that data protection, respect for privacy and the importance of encryption as technology changes and global Internet standards are developed remain at the forefront of each state’s legal framework.  However, we challenge the assertion that public safety cannot be protected without compromising privacy or cyber security.  We strongly believe that approaches protecting each of these important values are possible and strive to work with industry to collaborate on mutually agreeable solutions.

More having one’s cake and eating it, too. These nations think strong encryption is possible alongside a means of accessing encrypted communications related to crimes. This seems to be contrary to expert opinion on the matter.

As mentioned, this is not the FVEY’s first attempt to press technology companies. In October 2019, the U.S., the UK, and Australia sent a letter to Facebook CEO Mark Zuckerberg “to request that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services without ensuring that there is no reduction to user safety and without including a means for lawful access to the content of communications to protect our citizens.” These governments claimed “[w]e support strong encryption…[and] respect promises made by technology companies to protect users’ data…[but] [w]e must find a way to balance the need to secure data with public safety and the need for law enforcement to access the information they need to safeguard the public, investigate crimes, and prevent future criminal activity.” The officials asserted that “[c]ompanies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes.”

In summer 2019 the FVEY issued a communique in which it urged technology companies “to include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can obtain access to data in a readable and usable format.” Interestingly, at that time, these nations lauded Facebook for “approaches like Mark Zuckerberg’s public commitment to consulting Governments on Facebook’s recent proposals to apply end-to-end encryption to its messaging services…[and] [t]hese engagements must be substantive and genuinely influence design decisions.” This raises the question of what, if anything, changed between the issuance of this communique and the October 2019 letter to Zuckerberg. In any event, the communique followed the Five Eyes’ 2018 “Statement of Principles on Access to Evidence and Encryption,” which articulated these nations’ commitment to working with technology companies to address encryption and the need for law enforcement agencies to meet their public safety and protection obligations.

In Facebook’s December 2019 response, Facebook Vice President and WhatsApp Head Will Cathcart and Facebook Vice President and Messenger Head Stan Chudnovsky stated “[c]ybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere…[and] [t]he ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.”

Moreover, one of the FVEY nations has enacted a law that could result in orders to technology companies to decrypt encrypted communications. In December 2018, Australia enacted the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). As the Office of Australia’s Information Commissioner (OAIC) wrote of TOLA, “[t]he powers permitted under the Act have the potential to significantly weaken important privacy rights and protections under the Privacy Act…[and] [t]he encryption technology that can obscure criminal communications and pose a threat to national security is the same technology used by ordinary citizens to exercise their legitimate rights to privacy.”

This past summer, Australia’s Independent National Security Legislation Monitor (INSLM) issued its report on TOLA. The Parliamentary Joint Committee on Intelligence and Security had requested that the INSLM review the statute, and so the INSLM engaged in a lengthy review, including input from the public. As explained in the report’s preface, the “INSLM independently reviews the operation, effectiveness and implications of national security and counter-terrorism laws; and considers whether the laws contain appropriate protections for individual rights, remain proportionate to terrorism or national security threats, and remain necessary.”

The INSLM claimed:

In this report I reject the notion that there is a binary choice that must be made between the effectiveness of agencies’ surveillance powers in the digital age on the one hand and the security of the internet on the other. Rather, I conclude that what is necessary is a law which allows agencies to meet technological challenges, such as those caused by encryption, but in a proportionate way and with proper rights protection. Essentially this can be done by updating traditional safeguards to meet those same technological challenges – notably, those who are trusted to authorise intrusive search and surveillance powers must be able to understand the technological context in which those powers operate, and their consequences. If, but only if, the key recommendations I set out in this report in this regard are adopted, TOLA will be such a law.

The European Union may have a different view, however. In a response to a letter from a Member of the European Parliament, the European Data Protection Board (EDPB) articulated its view that any nation that implements an “encryption ban” would endanger its compliance with the General Data Protection Regulation (GDPR) and possibly result in companies domiciled in those countries not being able to transfer and process the personal data of EU citizens. However, as always, it bears noting that the EDPB’s view may not carry the day with the European Commission, Parliament, and courts.

The EDPB stated:

Any ban on encryption or provisions weakening encryption would undermine the GDPR obligations on the concerned controllers and processors for an effective implementation of both data protection principles and the appropriate technical and organisational measures. Similar considerations apply to transfers to controllers or processors in any third countries adopting such bans or provisions. Security measures are therefore specifically mentioned among the elements the European Commission must take into account when assessing the adequacy of the level of protection in a third country. In the absence of such a decision, transfers are subject to appropriate safeguards or may be based on derogations; in any case the security of the personal data has to be ensured at all times.

The EDPB opined “that any encryption ban would seriously undermine compliance with the GDPR.” The EDPB continued, “[m]ore specifically, whatever the instrument used,  it would represent a major  obstacle in recognising a level of protection essentially equivalent to that ensured by the applicable  data protection law in the EU, and would seriously question the ability of the concerned controllers and processors to comply with the security obligation of the regulation.”

© Michael Kans and Michael Kans Blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans and Michael Kans Blog, with appropriate and specific direction to the original content.

Image by OpenClipart-Vectors from Pixabay
