ePrivacy Exception Proposed

Late last month, an EU Parliament committee advanced a broad exception to the EU’s privacy regulations, but the measure has not yet been enacted.

My apologies. The first version of this post erroneously asserted the derogation to the ePrivacy Directive had been enacted. It has not, and this post has been re-titled and updated to reflect this fact.

As the European Union (EU) continues to work on enacting a modernized successor to the ePrivacy Directive (Directive 2002/58/EC) to complement the General Data Protection Regulation (GDPR), it has proposed an exemption to manage a change in another EU law that sweeps “number-independent interpersonal communications services” into the current regulatory structure for electronic communications. The policy justification for a categorical exemption to the ePrivacy Directive is combatting child sexual abuse online. This derogation of EU law would be limited to at most five years, and quite possibly less if the EU can enact a successor to the ePrivacy Directive, an ePrivacy Regulation. However, it is unclear when this derogation will be agreed upon and enacted.

In September 2020, the European Commission (EC) issued “a Proposal for a Regulation on a temporary derogation from certain provisions of the ePrivacy Directive 2002/58/EC as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online.” The derogation has not been enacted even though the change in law that prompted it took effect on 21 December 2020. Meanwhile, the EC has also issued a draft compromise ePrivacy Regulation, the result of extensive negotiations. The GDPR was enacted with an update of the ePrivacy Directive in mind.

In early December, an EU Parliament committee approved the proposed derogation, but the full Parliament has not yet acted upon the measure. The Parliament needs to reach agreement with the Presidency of the Council and the European Commission. In its press release, the Committee on Civil Liberties, Justice and Home Affairs explained:

The proposed regulation will provide for limited and temporary changes to the rules governing the privacy of electronic communications so that over-the-top (“OTT”) interpersonal communication services, such as web messaging, voice over Internet Protocol (VoIP), chat and web-based email services, can continue to detect, report and remove child sexual abuse online on a voluntary basis.

Article 1 sets out the scope and aim of the temporary regulation:

This Regulation lays down temporary and strictly limited rules derogating from certain obligations laid down in Directive 2002/58/EC, with the sole objective of enabling providers of number-independent interpersonal communications services to continue the use of technologies for the processing of personal and other data to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services.

The EC explained the legal and policy background for the exemption to the ePrivacy Directive:

  • On 21 December 2020, with the entry into application of the European Electronic Communications Code (EECC), the definition of electronic communications services will be replaced by a new definition, which includes number-independent interpersonal communications services. From that date on, these services will, therefore, be covered by the ePrivacy Directive, which relies on the definition of the EECC. This change concerns communications services like webmail messaging services and internet telephony.
  • Certain providers of number-independent interpersonal communications services are already using specific technologies to detect child sexual abuse on their services and report it to law enforcement authorities and to organisations acting in the public interest against child sexual abuse, and/or to remove child sexual abuse material. These organisations refer to national hotlines for reporting child sexual abuse material, as well as organisations whose purpose is to reduce child sexual exploitation, and prevent child victimisation, located both within the EU and in third countries.
  • Child sexual abuse is a particularly serious crime that has wide-ranging and serious life-long consequences for victims. In hurting children, these crimes also cause significant and long-term social harm. The fight against child sexual abuse is a priority for the EU. On 24 July 2020, the European Commission adopted an EU strategy for a more effective fight against child sexual abuse, which aims to provide an effective response, at EU level, to the crime of child sexual abuse. The Commission announced that it will propose the necessary legislation to tackle child sexual abuse online effectively including by requiring relevant online services providers to detect known child sexual abuse material and oblige them to report that material to public authorities by the second quarter of 2021. The announced legislation will be intended to replace this Regulation, by putting in place mandatory measures to detect and report child sexual abuse, in order to bring more clarity and certainty to the work of both law enforcement and relevant actors in the private sector to tackle online abuse, while ensuring respect of the fundamental rights of the users, including in particular the right to freedom of expression and opinion, protection of personal data and privacy, and providing for mechanisms to ensure accountability and transparency.

The EC baldly asserts the problem of online child sexual abuse justifies a loophole to the broad prohibition on violating the privacy of EU persons. The EC did note that the fight against this sort of crime is a political priority for the EC, one that ostensibly puts the EU close to the views of the Five Eyes nations that have been pressuring technology companies to end the practice of making apps and hardware encrypted by default.

The EC explained:

The present proposal therefore presents a narrow and targeted legislative interim solution with the sole objective of creating a temporary and strictly limited derogation from the applicability of Articles 5(1) and 6 of the ePrivacy Directive, which protect the confidentiality of communications and traffic data. This proposal respects the fundamental rights, including the rights to privacy and protection of personal data, while enabling providers of number-independent interpersonal communications services to continue using specific technologies and continue their current activities to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services, pending the adoption of the announced long-term legislation. Voluntary efforts to detect solicitation of children for sexual purposes (“grooming”) also must be limited to the use of existing, state-of-the-art technology that corresponds to the safeguards set out. This Regulation should cease to apply in December 2025.

The EC added “[i]n case the announced long-term legislation is adopted and enters into force prior to this date, that legislation should repeal the present Regulation.”

In November, the European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski published his opinion on the temporary, limited derogation from the EU’s regulation of electronic communications and privacy. Wiewiórowski cautioned that a short-term exception, however well-intended, would lead to future loopholes that would ultimately undermine the purpose of the legislation. Moreover, Wiewiórowski found that the derogation lacks sufficiently specific guidance and safeguards and is not proportionate. Wiewiórowski argued:

  • In particular, he notes that the measures envisaged by the Proposal would constitute an interference with the fundamental rights to respect for private life and data protection of all users of very popular electronic communications services, such as instant messaging platforms and applications. Confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life. Even voluntary measures by private companies constitute an interference with these rights when the measures involve the monitoring and analysis of the content of communications and processing of personal data.
  • The EDPS wishes to underline that the issues at stake are not specific to the fight against child abuse but to any initiative aiming at collaboration of the private sector for law enforcement purposes. If adopted, the Proposal will inevitably serve as a precedent for future legislation in this field. The EDPS therefore considers it essential that the Proposal is not adopted, even in the form of a temporary derogation, until all the necessary safeguards set out in this Opinion are integrated.
  • In particular, in the interest of legal certainty, the EDPS considers that it is necessary to clarify whether the Proposal itself is intended to provide a legal basis for the processing within the meaning of the GDPR, or not. If not, the EDPS recommends clarifying explicitly in the Proposal which legal basis under the GDPR would be applicable in this particular case.
  • In this regard, the EDPS stresses that guidance by data protection authorities cannot substitute compliance with the requirement of legality. It is insufficient to provide that the temporary derogation is “without prejudice” to the GDPR and to mandate prior consultation of data protection authorities. The co-legislature must take its responsibility and ensure that the proposed derogation complies with the requirements of Article 15(1), as interpreted by the CJEU.
  • In order to satisfy the requirement of proportionality, the legislation must lay down clear and precise rules governing the scope and application of the measures in question and imposing minimum safeguards, so that the persons whose personal data is affected have sufficient guarantees that data will be effectively protected against the risk of abuse.
  • Finally, the EDPS is of the view that the five-year period as proposed does not appear proportional given the absence of (a) a prior demonstration of the proportionality of the envisaged measure and (b) the inclusion of sufficient safeguards within the text of the legislation. He considers that the validity of any transitional measure should not exceed 2 years.

The Five Eyes nations (Australia, Canada, New Zealand, the United Kingdom, and the United States) issued a joint statement in which their ministers called for quick action.

In this statement, we highlight how from 21 December 2020, the ePrivacy Directive, applied without derogation, will make it easier for children to be sexually exploited and abused without detection – and how the ePrivacy Directive could make it impossible both for providers of internet communications services, and for law enforcement, to investigate and prevent such exploitation and abuse. It is accordingly essential that the European Union adopt urgently the derogation to the ePrivacy Directive as proposed by the European Commission in order for the essential work carried out by service providers to shield endangered children in Europe and around the world to continue.

Without decisive action, from 21 December 2020 internet-based messaging services and e-mail services captured by the European Electronic Communications Code’s (EECC) new, broader definition of ‘electronic communications services’ are covered by the ePrivacy Directive. The providers of electronic communications services must comply with the obligation to respect the confidentiality of communications and the conditions for processing communications data in accordance with the ePrivacy Directive. In the absence of any relevant national measures made under Article 15 of that Directive, this will have the effect of making it illegal for service providers operating within the EU to use their current tools to protect children, with the impact on victims felt worldwide.

As mentioned, this derogation comes at a time when the EC and the EU nations are trying to finalize and enact an ePrivacy Regulation. In the original 2017 proposal, the EC stated:

The ePrivacy Directive ensures the protection of fundamental rights and freedoms, in particular the respect for private life, confidentiality of communications and the protection of personal data in the electronic communications sector. It also guarantees the free movement of electronic communications data, equipment and services in the Union.

The ePrivacy Regulation is intended to work in concert with the GDPR, and the draft 2020 regulation contains the following passages explaining the intended interplay of the two regulatory schemes:

  • Regulation (EU) 2016/679 regulates the protection of personal data. This Regulation protects in addition the respect for private life and communications. The provisions of this Regulation particularise and complement the general rules on the protection of personal data laid down in Regulation (EU) 2016/679. This Regulation therefore does not lower the level of protection enjoyed by natural persons under Regulation (EU) 2016/679. The provisions particularise Regulation (EU) 2016/679 as regards personal data by translating its principles into specific rules. If no specific rules are established in this Regulation, Regulation (EU) 2016/679 should apply to any processing of data that qualify as personal data. The provisions complement Regulation (EU) 2016/679 by setting forth rules regarding subject matters that are not within the scope of Regulation (EU) 2016/679, such as the protection of the rights of end-users who are legal persons. Processing of electronic communications data by providers of electronic communications services and networks should only be permitted in accordance with this Regulation. This Regulation does not impose any obligations on the end-user. End-users who are legal persons may have rights conferred by Regulation (EU) 2016/679 to the extent specifically required by this Regulation.
  • While the principles and main provisions of Directive 2002/58/EC of the European Parliament and of the Council remain generally sound, that Directive has not fully kept pace with the evolution of technological and market reality, resulting in an inconsistent or insufficiently effective protection of privacy and confidentiality in relation to electronic communications. Those developments include the entrance on the market of electronic communications services that from a consumer perspective are substitutable to traditional services, but do not have to comply with the same set of rules. Another development concerns new techniques that allow for tracking of online behaviour of end-users, which are not covered by Directive 2002/58/EC. Directive 2002/58/EC should therefore be repealed and replaced by this Regulation.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Guillaume Périgois on Unsplash

Canada Releases Privacy Bill

Canada’s newly released privacy bill shares commonalities with U.S. bills but features a stronger enforcement regime that could result in fines of up to 5% of annual worldwide revenue for the worst violations.

The government in Ottawa has introduced in Parliament the “Digital Charter Implementation Act, 2020” (Bill C-11) that would dramatically reform the nation’s privacy laws and significantly expand the power of the Office of Privacy Commissioner (OPC). The bill consists of two main parts, the Consumer Privacy Protection Act (CPPA) and the Personal Information and Data Protection Tribunal Act, and would partially repeal Canada’s federal privacy law: Personal Information Protection and Electronic Documents Act. Notably, the bill would allow the OPC to levy fines up to 5% of worldwide revenue or $25 million CAD (roughly $20 million USD), whichever is higher. Canadians would also get a private right of action under certain conditions.

Broadly, this bill shares many characteristics with a number of bills introduced in the United States Congress by Democratic Members. Consent would be needed in most cases where a Canadian’s personal information is collected, processed, used, shared, or disclosed although there are some notable exceptions. Canada’s federal privacy regulator would be able to seek and obtain stiff fines for non-compliance.

The bill explains its purpose:

The purpose of this Act is to establish — in an era in which data is constantly flowing across borders and geographical boundaries and significant economic activity relies on the analysis, circulation and exchange of personal information — rules to govern the protection of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.

The Department of Industry (aka Innovation, Science and Economic Development Canada) released this summary of the bill:

The Government of Canada has tabled the Digital Charter Implementation Act, 2020 to strengthen privacy protections for Canadians as they engage in commercial activities. The Act will create the Consumer Privacy Protection Act (CPPA), which will modernize Canada’s existing private sector privacy law, and will also create the new Personal information and Data Protection Tribunal Act, which will create the Personal Information and Data Tribunal, an entity that can impose administrative monetary penalties for privacy violations. Finally, the Act will repeal Part 2 of the existing Personal Information Protection and Electronic Documents Act (PIPEDA) and turn it into stand-alone legislation, the Electronic Documents Act. With each of these steps, the government is building a Canada where citizens have confidence that their data is safe and privacy is respected, while unlocking innovation that promotes a strong economy.

The Department added:

  • Changes enabled by CPPA will enhance individuals’ control over their personal information, such as by requesting its deletion, creating new data mobility rights that promote consumer choice and innovation, and by creating new transparency requirements over uses of personal information in areas such as artificial intelligence systems.
  • CPPA will also promote responsible innovation by reducing regulatory burden. A new exception to consent will address standard business practices; a new regime to clarify how organizations are to handle de-identified personal information, and another new exception to consent to allow organizations to disclose personal information for socially beneficial purposes, such as public health research, for example.
  • The new legislative changes will strengthen privacy enforcement and oversight in a manner similar to certain provinces and some of Canada’s foreign trading partners. It does so by: granting the Office of the Privacy Commissioner of Canada (OPC) order-making powers, which can compel organizations to comply with the law; force them to stop certain improper activities or uses of personal information; and order organizations to preserve information relevant to an OPC investigation. The new law will also enable administrative monetary penalties for serious contraventions of the law, subject to a maximum penalty of 3% of global revenues.
  • The introduction of the Personal Information and Data Tribunal Act will establish a new Data Tribunal, which will be responsible for determining whether to assign administrative monetary penalties that are recommended by the OPC following its investigations, determining the amount of any penalties and will also hear appeals of OPC orders and decisions. The Tribunal will provide for access to justice and contribute to the further development of privacy expertise by providing expeditious reviews of the OPC’s orders.
  • The Electronic Documents Act will take the electronic documents provisions of PIPEDA and enact them in standalone legislation. This change will simplify federal privacy laws and will better align the federal electronic documents regime to support service delivery initiatives by the Treasury Board Secretariat.

In a summary, the Department explained:

Under the CPPA, the Privacy Commissioner would have broad order-making powers, including the ability to force an organization to comply with its requirements under the CPPA and the ability to order a company to stop collecting data or using personal information. In addition, the Privacy Commissioner would also be able to recommend that the Personal Information and Data Protection Tribunal impose a fine. The legislation would provide for administrative monetary penalties of up to 3% of global revenue or $10 million [CAD] for non-compliant organizations. It also contains an expanded range of offences for certain serious contraventions of the law, subject to a maximum fine of 5% of global revenue or $25 million [CAD].
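To make the two penalty tiers just quoted concrete, they can be sketched as a small calculation. This is purely illustrative; the function names are my own, and I am reading both caps as "whichever is higher," which the bill states explicitly for the offence tier.

```python
def cppa_admin_penalty_cap(global_revenue_cad: float) -> float:
    # Administrative monetary penalties recommended by the OPC:
    # up to 3% of global revenue or CAD $10 million.
    return max(0.03 * global_revenue_cad, 10_000_000)


def cppa_offence_fine_cap(global_revenue_cad: float) -> float:
    # Fines for the most serious offences:
    # up to 5% of global revenue or CAD $25 million, whichever is higher.
    return max(0.05 * global_revenue_cad, 25_000_000)
```

For a large platform with CAD $2 billion in global revenue, the revenue-based figure dominates both tiers; for a small firm, the fixed floors of $10 million and $25 million apply instead.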

The CPPA broadly defines what constitutes “personal information” and what is therefore covered and protected by the bill. It would be “information about an identifiable individual,” a much wider scope than almost all the legislation in the United States, for example. Consequently, even information derived through processing that was not directly or indirectly collected from a person would seem to be covered by the bill. And, speaking of processing, the CPPA limits how personal information may be collected and used, specifically “only for purposes that a reasonable person would consider appropriate in the circumstances.”

Moreover, an entity can only collect personal information needed for purposes disclosed before or at the time of collection, and only with the consent of the person. However, the CPPA would allow for “implied consent” if “the organization establishes that it is appropriate…taking into account the reasonable expectations of the individual and the sensitivity of the personal information that is to be collected, used or disclosed.” And, if the entity decides to collect and use personal information for any new purpose, it must obtain the consent of people in Canada before doing so. What’s more, organizations cannot condition the provision of products or services on people providing consent for collection of personal information beyond what is necessary. And, of course, consent gained under false, deceptive, or misleading pretenses is not valid, and people may withdraw consent at any time.

In terms of the disclosures an organization must make about its purposes, the CPPA would require more than most proposed U.S. federal privacy laws. For example, an entity must tell people the specific personal information to be collected, processed, used, or disclosed, the reasonable consequences of any of the aforementioned, and the names of third parties or types of third parties with whom personal information would be shared.

The CPPA is very much like U.S. privacy bills in that there are numerous exceptions as to when consent is not needed for collecting, processing, and using personal information. Principally, this would be when a reasonable person would expect or understand this could happen or so long as the collection and processing activities are not to influence a person’s decisions or behavior. Activities that would fall in the former category are things such as collection, use, and processing needed to deliver a product or service, protecting the organization’s systems and information security, or the due diligence necessary to protect the organization from commercial risk. Moreover, if collection, use, and processing are in the public interest and consent cannot be readily obtained, then the organization may proceed. The same is true if there is an emergency situation that imperils the life or health of a person so long as disclosure to the person is made in writing expeditiously afterwards. However, neither consent nor knowledge are required for transfers of personal information to service providers, in employment settings, to prevent fraud, and for a number of other enumerated purposes.

There are wide exceptions to the consent requirement relating to collection and use of personal information in the event of investigations of breaches of agreements or contravention of federal or provincial law. Likewise, consent may not be needed if an organization is disclosing personal information to government institutions. Similarly, the collection and use of public information is authorized subject to regulations.

However, the CPPA makes clear that certain exceptions to the consent and knowledge requirements are simply not operative when the personal information in question is an “electronic address” or is stored on a computer system. In these cases, consent or knowledge would be needed before such collection of personal information is legal.

Organizations must dispose of personal information when it is no longer needed for the purpose it was originally collected except for personal information collected and used for decision making. In this latter case, information must be retained in case the person about whom the decision was made wants access. Organizations must dispose of personal information about a person upon his or her request unless doing so would result in the disposal of other people’s information or there is a Canadian law barring such disposal. If the organization refuses the request to dispose, it must inform the person in writing. If the organization grants the request, it must direct service providers to do the same and confirm destruction.

Organizations would have a duty to ensure personal information is accurate, and the applicability of this duty would turn on whether the information is being used to make decisions, is being shared with third parties, and if the information is being used on an ongoing basis.

The CPPA would impose security requirements for organizations collecting, using, and holding personal information. These data would need protection “through physical, organizational and technological security safeguards” appropriate to the sensitivity of the information. Specifically, these security safeguards “must protect personal information against, among other things, loss, theft and unauthorized access, disclosure, copying, use and modification.” Breaches must be reported as soon as feasible to the OPC and to affected people if there is a reasonable belief of “real risk of significant harm to an individual.” Significant harm is defined as “bodily harm, humiliation, damage to reputation or relationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property.” Real risk of significant harm is determined on the basis of

  • the sensitivity of the personal information involved in the breach;
  • the probability that the personal information has been, is being or will be misused; and
  • any other prescribed factor.
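For organizations building breach-response tooling, the three statutory factors above can be modeled as a simple decision helper. This is only a sketch: the field names and the any-factor-triggers-notice reading are my assumptions, not terms or tests from the bill.

```python
from dataclasses import dataclass


@dataclass
class BreachAssessment:
    # Illustrative encoding of the CPPA's "real risk of significant harm" factors.
    sensitive_information: bool       # sensitivity of the personal information involved
    likely_misuse: bool               # probability the information has been, is being, or will be misused
    other_prescribed_risk: bool = False  # "any other prescribed factor"


def must_notify(assessment: BreachAssessment) -> bool:
    # A conservative reading: report to the OPC and affected individuals
    # if any factor indicates a real risk of significant harm.
    return (assessment.sensitive_information
            or assessment.likely_misuse
            or assessment.other_prescribed_risk)
```

In practice the statutory test calls for a weighing of the factors rather than a mechanical checklist, so a real implementation would feed a human review, not replace it.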

Organizations will also have a duty to explain their policies and practices under this act in plain language, including:

  • a description of the type of personal information under the organization’s control;
  • a general account of how the organization makes use of personal information, including how the organization applies the exceptions to the requirement to obtain consent under this Act;
  • a general account of the organization’s use of any automated decision system to make predictions, recommendations or decisions about individuals that could have significant impacts on them;
  • whether or not the organization carries out any international or interprovincial transfer or disclosure of personal information that may have reasonably foreseeable privacy implications;
  • how an individual may make a request for disposal under section 55 or access under section 63; and
  • the business contact information of the individual to whom complaints or requests for information may be made.

Canadian nationals and residents would be able to access their personal information. Notably, “[o]n request by an individual, an organization must inform them of whether it has any personal information about them, how it uses the information and whether it has disclosed the information.” Access must also be granted to the requesting person. If the organization has disclosed a person’s information, when she makes a request to access, she must be told the names or types of third parties to whom her information was disclosed. Moreover, organizations using automated decision-making processes would have further responsibilities: “[i]f the organization has used an automated decision system to make a prediction, recommendation or decision about the individual, the organization must, on request by the individual, provide them with an explanation of the prediction, recommendation or decision and of how the personal information that was used to make the prediction, recommendation or decision was obtained.” Additionally, if a person has been granted access to his personal information and it “is not accurate, up-to-date or complete,” then the organization must amend it and send the corrected information to third parties that have access to the information.

There are provisions requiring data portability (deemed “data mobility” by the CPPA). All organizations subject to the data mobility framework must transfer personal information upon request. People must be able to lodge complaints with organizations over compliance with the CPPA regarding their personal information. Organizations may not re-identify de-identified personal information.

Organizations would be able to draft and submit codes of conduct to the OPC for approval so long as they “provide[] for substantially the same or greater protection of personal information as some or all of the protection provided under this Act.” Likewise, an entity may apply to the OPC “for approval of a certification program that includes

(a) a code of practice that provides for substantially the same or greater protection of personal information as some or all of the protection provided under this Act;

(b) guidelines for interpreting and implementing the code of practice;

(c) a mechanism by which an entity that operates the program may certify that an organization is in compliance with the code of practice;

(d) a mechanism for the independent verification of an organization’s compliance with the code of practice;

(e) disciplinary measures for non-compliance with the code of practice by an organization, including the revocation of an organization’s certification; and

(f) anything else that is provided in the regulations.

However, complying with an approved code of conduct or certification program does not by itself mean an entity is complying with the CPPA.

The OPC would be granted a range of new powers to enforce the CPPA through compliance orders (which resemble administrative actions taken by the United States Federal Trade Commission) that can be appealed to a new Personal Information and Data Protection Tribunal (Tribunal) and ultimately enforced in federal court if necessary. People in Canada would also get the right to sue in the event the OPC or the new Tribunal finds an entity has contravened the CPPA.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by James Wheeler from Pixabay

Further Reading, Other Developments, and Coming Events (11 November)

Further Reading

  • “ICE, IRS Explored Using Hacking Tools, New Documents Show” By Joseph Cox — Vice. Federal agencies other than the Federal Bureau of Investigation (FBI) and the Intelligence Community (IC) appear to be interested in utilizing some of the capabilities offered by the private sector to access devices or networks in the name of investigating cases.
  • “China’s tech industry relieved by Biden win – but not relaxed” By Josh Horwitz and Yingzhi Yang — Reuters. While a Biden Administration will almost certainly lower the temperature between Beijing and Washington, the People’s Republic of China is intent on addressing the pressure points used by the Trump Administration to inflict pain on its technology industry.
  • “Trump Broke the Internet. Can Joe Biden Fix It?” By Gilad Edelman — WIRED. This piece provides a view of the waterfront in technology policy under a Biden Administration.
  • “YouTube is awash with election misinformation — and it isn’t taking it down” By Rebecca Heilweil — Recode. For unexplained reasons, YouTube seems to have avoided the scrutiny facing Facebook and Twitter on their content moderation policies. Why it has escaped that scrutiny is not clear, but the Google-owned platform hosted much more election-related misinformation than the other social media platforms.
  • “Frustrated by internet service providers, cities and schools push for more data” By Cyrus Farivar — NBC News. Internet service providers are not helping cities and states identify families eligible for low-cost internet to help children attend school virtually. They have claimed these data are proprietary, so jurisdictions have gotten creative about identifying such families.

Other Developments

  • The Consumer Product Safety Commission’s (CPSC) Office of the Inspector General (OIG) released its annual Federal Information Security Modernization Act (FISMA) audit and found “that although management continues to make progress in implementing the FISMA requirements much work remains to be done.” More particularly, it was “determined that the CPSC has not implemented an effective information security program and practices in accordance with FISMA requirements.” The OIG asserted:
    • The CPSC information security program was not effective because the CPSC has not developed a holistic formal approach to manage information security risks or to effectively utilize information security resources to address previously identified information security deficiencies. Although the CPSC has begun to develop an Enterprise Risk Management (ERM) program to guide risk management practices at the CPSC, explicit guidance and processes to address information security risks and integrate those risks into the broader agency-wide ERM program has not been developed.
    • In addition, the CPSC has not leveraged the relevant information security risk management guidance prescribed by NIST to develop an approach to manage information security risk.
    • Further, as asserted by CPSC personnel, the CPSC has limited resources to operate the information security program and to address the extensive FISMA requirements and related complex cybersecurity challenges.
    • Therefore, the CPSC has not dedicated the resources necessary to fully address these challenges and requirements. The CPSC began addressing previously identified information security deficiencies but was not able to address all deficiencies in FY 2020.
  • The United States (U.S.) Department of Justice (DOJ) announced the seizure of 27 websites allegedly used by Iran’s Islamic Revolutionary Guard Corps (IRGC) “to further a global covert influence campaign…in violation of U.S. sanctions targeting both the Government of Iran and the IRGC.” The DOJ contended:
    • Four of the domains purported to be genuine news outlets but were actually controlled by the IRGC and targeted audiences in the United States, to covertly influence United States policy and public opinion, in violation of the Foreign Agents Registration Act (FARA). The remainder targeted audiences in other parts of the world.  This seizure warrant follows an earlier seizure of 92 domains used by the IRGC for similar purposes.
  • The United Nations (UN) Special Rapporteur on the right to privacy Joseph Cannataci issued his annual report that “constitutes a preliminary assessment as the evidence base required to reach definitive conclusions on whether privacy-intrusive, anti-COVID-19 measures are necessary and proportionate in a democratic society is not yet available.” Cannataci added “[a] more definitive report is planned for mid-2021, when 16 months of evidence will be available to allow a more accurate assessment.” He “addresse[d] two particular aspects of the impact of COVID-19 on the right to privacy: data protection and surveillance.” The Special Rapporteur noted:
    • While the COVID-19 pandemic has generated much debate about the value of contact tracing and reliance upon technologies that track citizens and those they encounter, the use of information and technology is not new in managing public health emergencies. What is concerning in some States are reports of how technology is being used and the degree of intrusion and control being exerted over citizens – possibly to little public health effect.
    • The Special Rapporteur concluded:
      • It is far too early to assess definitively whether some COVID-19-related measures might be unnecessary or disproportionate. The Special Rapporteur will continue to monitor the impact of surveillance in epidemiology on the right to privacy and report to the General Assembly in 2021. The main privacy risk lies in the use of non-consensual methods, such as those outlined in the section on hybrid systems of surveillance, which could result in function creep and be used for other purposes that may be privacy intrusive.
      • Intensive and omnipresent technological surveillance is not the panacea for pandemic situations such as COVID-19. This has been especially driven home by those countries in which the use of conventional contact-tracing methods, without recourse to smartphone applications, geolocation or other technologies, has proven to be most effective in countering the spread of COVID-19.
      • If a State decides that technological surveillance is necessary as a response to the global COVID-19 pandemic, it must make sure that, after proving both the necessity and proportionality of the specific measure, it has a law that explicitly provides for such surveillance measures (as in the example of Israel).
      • A State wishing to introduce a surveillance measure for COVID-19 purposes, should not be able to rely on a generic provision in law, such as one stating that the head of the public health authority may “order such other action be taken as he [or she] may consider appropriate”. That does not provide explicit and specific safeguards which are made mandatory both under the provisions of Convention 108 and Convention 108+, and based on the jurisprudence of the European Court of Human Rights. Indeed, if the safeguard is not spelled out in sufficient detail, it cannot be considered an adequate safeguard.
  • The University of Toronto’s Citizen Lab issued its submission to the Government of Canada’s “public consultation on the renewal of its Responsible Business Conduct (RBC) strategy, which is intended to provide guidance to the Government of Canada and Canadian companies active abroad with respect to their business activities.” Citizen Lab addressed “Canadian technology companies and the threat they pose to human rights abroad” and noted two of its reports on Canadian companies whose technologies were used to violate human rights:
    • In 2018, the Citizen Lab released a report documenting Netsweeper installations on public IP networks in ten countries that each presented widespread human rights concerns. This research revealed that Netsweeper technology was used to block: (1) political content sites, including websites linked to political groups, opposition groups, local and foreign news, and regional human rights issues in Bahrain, Kuwait, Yemen, and UAE; (2) LGBTQ content as a result of Netsweeper’s pre-defined ‘Alternative Lifestyles’ content category, as well as Google searches for keywords relating to LGBTQ content (e.g., the words “gay” or “lesbian”) in the UAE, Bahrain, and Yemen; (3) non-pornographic websites under the mis-categorization of sites like the World Health Organization and the Center for Health and Gender Equity as “pornography”; (4) access to news reporting on the Rohingya refugee crisis and violence against Muslims from multiple news outlets for users in India; (5) Blogspot-hosted websites in Kuwait by categorizing them as “viruses” as well as a range of political content from local and foreign news and a website that monitors human rights issues in the region; and (6) websites like Date.com, Gay.com (the Los Angeles LGBT Center), Feminist.org, and others through categorizing them as “web proxies.” 
    • In 2018, the Citizen Lab released a report documenting the use of Sandvine/Procera devices to redirect users in Turkey and Syria to spyware, as well as the use of such devices to hijack the Internet users’ connections in Egypt, redirecting them to revenue-generating content. These examples highlight some of the ways in which this technology can be used for malicious purposes. The report revealed how Citizen Lab researchers identified a series of devices on the networks of Türk Telekom—a large and previously state-owned ISP in Turkey—being used to redirect requests from users in Turkey and Syria who attempted to download certain common Windows applications like antivirus software and web browsers. Through the use of Sandvine/Procera technology, these users were instead redirected to versions of those applications that contained hidden malware. 
    • Citizen Lab made a number of recommendations:
      • Reform Canadian export law:  
        • Clarify that all Canadian exports are subject to the mandatory analysis set out in section 7.3(1) and section 7.4 of the Export and Import Permits Act (EIPA). 
        • Amend section 3(1) the EIPA such that the human rights risks of an exported good or technology provide an explicit basis for export control.
        • Amend the EIPA to include a ‘catch-all’ provision that subjects cyber-surveillance technology to export control, even if not listed on the Export Control List, when there is evidence that the end-use may be connected with internal repression and/or the commission of serious violations of international human rights or international humanitarian law. 
      • Implement mandatory human rights due diligence legislation:
        • Similar to the French duty of vigilance law, impose a human rights due diligence requirement on businesses such that they are required to perform human rights risk assessments, develop mitigation strategies, implement an alert system, and develop a monitoring and public reporting scheme. 
        • Ensure that the mandatory human rights due diligence legislation provides a statutory mechanism for liability where a company fails to conform with the requirements under the law. 
      • Expand and strengthen the Canadian Ombudsperson for Responsible Enterprise (CORE): 
        • Expand the CORE’s mandate to cover technology sector businesses operating abroad.
        • Expand the CORE’s investigatory mandate to include the power to compel companies and executives to produce testimony, documents, and other information for the purposes of joint and independent fact-finding.
        • Strengthen the CORE’s powers to hold companies to account for human rights violations abroad, including the power to impose fines and penalties and to impose mandatory orders.
        • Expand the CORE’s mandate to assist victims to obtain legal redress for human rights abuses. This could include the CORE helping enforce mandatory human rights due diligence requirements, imposing penalties and/or additional statutory mechanisms for redress when requirements are violated.
        • Increase the CORE’s budgetary allocations to ensure that it can carry out its mandate.
  • A week before the United States’ (U.S.) election, the White House’s Office of Science and Technology Policy (OSTP) issued a report titled “Advancing America’s Global Leadership in Science and Technology: Highlights from the Trump Administration’s First Term: 2017-2020” that highlights the Administration’s purported achievements. OSTP claimed:
    • Over the past four years, President Trump and the entire Administration have taken decisive action to help the Federal Government do its part in advancing America’s global science and technology (S&T) preeminence. The policies enacted and investments made by the Administration have equipped researchers, health professionals, and many others with the tools to tackle today’s challenges, such as the COVID-19 pandemic, and have prepared the Nation for whatever the future holds.

Coming Events

  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

Photo by Brett Sayles from Pexels

Further Reading, Other Developments, and Coming Events (14 October)

Further Reading

  • “The Man Who Speaks Softly—and Commands a Big Cyber Army” By Garrett Graff — WIRED. A profile of General Paul Nakasone, the leader of both the United States’ National Security Agency (NSA) and Cyber Command, who has operated mostly in the background during the tumultuous Trump Administration. He has likely set the template for both organizations going forward for some time. A fascinating read chock-full of insider details.
  • “Facebook Bans Anti-Vaccination Ads, Clamping Down Again” by Mike Isaac — The New York Times. In another sign of the social media platform responding to pressure in the United States and Europe, it was announced that anti-vaccination advertisements would no longer be accepted. This follows bans on Holocaust denial and QAnon material. Of course, this newest announcement is a classic Facebook half-step. Only paid advertisements will be banned, but users can continue to post about their opposition to vaccination.
  • “To Mend a Broken Internet, Create Online Parks” By Eli Pariser — WIRED. An interesting argument that a public online space maintained by the government much like parks or public libraries may be just what democracies across the globe need to roll back the tide of extremism and division.
  • “QAnon is tearing families apart” By Travis Andrews — The Washington Post. This is a terrifying tour through the fallout of the QAnon conspiracy that sucks some in so deeply they are marginally connected to reality in many ways.
  • “AT&T has trouble figuring out where it offers government-funded Internet” By John Brodkin — Ars Technica. So, yeah, about all that government cash given to big telecom companies that was supposed to bring more broadband coverage. Turns out, they definitely took the cash. The broadband service has been a much more elusive thing to verify. In one example, AT&T may or may not have provided service to 133,000 households in Mississippi after receiving funds from the Federal Communications Commission (FCC). Mississippi state authorities are arguing most of the service is non-existent. AT&T is basically saying it’s all a misunderstanding.

Other Developments

  • The California Attorney General’s Office (AG) has released yet another revision of the regulations necessary to implement the “California Consumer Privacy Act” (CCPA) (AB 375), and comments are due by 28 October. Of course, if Proposition 24 passes next month, the “California Privacy Rights Act” will largely replace the CCPA, requiring the drafting of even more regulations. Nonetheless, what everyone thought was the final set of CCPA regulations took effect on 14 August, but the notice from the Office of Administrative Law revealed that the AG had withdrawn four portions of the proposed regulations. In the new draft regulations, the AG explained:
    • Proposed section 999.306, subd. (b)(3), provides examples of how businesses that collect personal information in the course of interacting with consumers offline can provide the notice of right to opt-out of the sale of personal information through an offline method.
    • Proposed section 999.315, subd. (h), provides guidance on how a business’s methods for submitting requests to opt-out should be easy and require minimal steps. It provides illustrative examples of methods designed with the purpose or substantial effect of subverting or impairing a consumer’s choice to opt-out.
    • Proposed section 999.326, subd. (a), clarifies the proof that a business may require an authorized agent to provide, as well as what the business may require a consumer to do to verify their request.
    • Proposed section 999.332, subd. (a), clarifies that businesses subject to either section 999.330, section 999.331, or both of these sections are required to include a description of the processes set forth in those sections in their privacy policies.
  • Facebook announced an update to its “hate speech policy to prohibit any content that denies or distorts the Holocaust.” Facebook claimed:
    • Following a year of consultation with external experts, we recently banned anti-Semitic stereotypes about the collective power of Jews that often depicts them running the world or its major institutions.  
    • Today’s announcement marks another step in our effort to fight hate on our services. Our decision is supported by the well-documented rise in anti-Semitism globally and the alarming level of ignorance about the Holocaust, especially among young people. According to a recent survey of adults in the US aged 18-39, almost a quarter said they believed the Holocaust was a myth, that it had been exaggerated or they weren’t sure.
  • In a 2018 interview, Facebook CEO Mark Zuckerberg asserted:
    • I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong…
    • What we will do is we’ll say, “Okay, you have your page, and if you’re not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.” But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed.
    • He clarified in a follow up email:
      • I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.
      • Our goal with fake news is not to prevent anyone from saying something untrue — but to stop fake news and misinformation spreading across our services. If something is spreading and is rated false by fact checkers, it would lose the vast majority of its distribution in News Feed. And of course if a post crossed line into advocating for violence or hate against a particular group, it would be removed. These issues are very challenging but I believe that often the best way to fight offensive bad speech is with good speech.
  • The Government Accountability Office (GAO) issued an evaluation of the Trump Administration’s 5G Strategy and found more processes and actions are needed if this plan to vault the United States (U.S.) ahead of other nations is to come to fruition. Specifically, the report “examines the extent to which the Administration has developed a national strategy on 5G that address our six desirable characteristics of an effective national strategy.” The GAO identified the six desirable characteristics: (1) purpose, scope, and methodology; (2) problem definition and risk assessment; (3) goals, subordinate objectives, activities, and performance measures; (4) resources, investments, and risk management; (5) organizational roles, responsibilities, and coordination; and (6) integration and implementation. However, this assessment is necessarily limited, for National Security Council staff took the highly unusual approach of not engaging with the GAO, which may be another norm broken by the Trump Administration. The GAO stated “[t]he March 2020 5G national strategy partially addresses five of our desirable characteristics of an effective national strategy and does not address one, as summarized in table 1.”
    • The GAO explained:
      • According to National Telecommunications and Information Administration (NTIA) and Office of Science and Technology Policy (OSTP) officials, the 5G national strategy was intentionally written to be at a high level and as a result, it may not include all elements of our six desirable characteristics of national strategies. These officials stated that the 5G implementation plan required by the Secure 5G and Beyond Act of 2020 is expected to include specific details, not covered in the 5G national strategy, on the U.S. government’s response to 5G risks and challenges. The implementation plan is expected to align and correspond to the lines of effort in the 5G national strategy. NTIA officials told us that the implementation plan to the 5G national strategy would be finalized by the end of October 2020. However, the officials we spoke to were unable to provide details on the final content of the implementation plan such as whether the plan would include all elements of our six desirable characteristics of national strategies given that it was not final. National strategies and their implementation plans should include all elements of the six desirable characteristics to enhance their usefulness as guidance and to ensure accountability and coordinate investments. Until the administration ensures that the implementation plan includes all elements of the six desirable characteristics, the guidance the plan provides decision makers in allocating resources to address 5G risks and challenges will likely be limited.
  • The Irish Council for Civil Liberties (ICCL) wrote the European Commission (EC) to make the case the United Kingdom (UK) is not deserving of an adequacy decision after Brexit because of institutional and cultural weaknesses at the Information Commissioner’s Office (ICO). The ICCL argued that the ICO has been one of the most ineffectual enforcers of the General Data Protection Regulation (GDPR), especially with respect to what the ICCL called the largest data infringement under the GDPR and the largest data breach of all time: Real-Time Bidding. The ICCL took the ICO to task for having not followed through on fining companies for GDPR violations and for having a tiny staff dedicated to data protection and technology issues. The ICCL invoked Article 45 of the GDPR to encourage the EC to deny the UK the adequacy decision it would need in order to transfer the personal data of EU residents to the UK.
  • In an unrelated development, the Information Commissioner’s Office (ICO) wrapped up its investigation into Facebook and Cambridge Analytica and detailed its additional findings in a letter to the Digital, Culture, Media and Sport Select Committee in the House of Commons. ICO head Elizabeth Denham asserted:
    • [w]e concluded that SCL Elections Ltd and Cambridge Analytica (SCL/CA) were purchasing significant volumes of commercially available personal data (at one estimate over 130 billion data points), in the main about millions of US voters, to combine it with the Facebook derived insight information they had obtained from an academic at Cambridge University, Dr Aleksandr Kogan, and elsewhere. In the main their models were also built from ‘off the shelf’ analytical tools and there was evidence that their own staff were concerned about some of the public statements the leadership of the company were making about their impact and influence.
    • From my review of the materials recovered by the investigation I have found no further evidence to change my earlier view that SCL/CA were not involved in the EU referendum campaign in the UK – beyond some initial enquiries made by SCL/CA in relation to UKIP data in the early stages of the referendum process. This strand of work does not appear to have then been taken forward by SCL/CA.
    • I have concluded my wider investigations of several organisations on both the remain and the leave side of the UK’s referendum about membership of the EU. I identified no significant breaches of the privacy and electronic marketing regulations and data protection legislation that met the threshold for formal regulatory action. Where the organisation continued in operation, I have provided advice and guidance to support better future compliance with the rules.
    • During the investigation concerns about possible Russian interference in elections globally came to the fore. As I explained to the sub-committee in April 2019, I referred details of reported possible Russia-located activity to access data linked to the investigation to the National Crime Agency. These matters fall outside the remit of the ICO. We did not find any additional evidence of Russian involvement in our analysis of material contained in the SCL / CA servers we obtained.
  • The United States Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) issued a joint cybersecurity advisory regarding “recently observed advanced persistent threat (APT) actors exploiting multiple legacy vulnerabilities in combination with a newer privilege escalation vulnerability.” CISA and the FBI revealed that these tactics have penetrated systems related to elections but claimed there has been no degrading of the integrity of electoral systems.
  • The agencies stated:
    • The commonly used tactic, known as vulnerability chaining, exploits multiple vulnerabilities in the course of a single intrusion to compromise a network or application. 
    • This recent malicious activity has often, but not exclusively, been directed at federal and state, local, tribal, and territorial (SLTT) government networks. Although it does not appear these targets are being selected because of their proximity to elections information, there may be some risk to elections information housed on government networks.
    • CISA is aware of some instances where this activity resulted in unauthorized access to elections support systems; however, CISA has no evidence to date that integrity of elections data has been compromised.
  • Canada’s Privacy Commissioner Daniel Therrien released the “2019-2020 Annual Report to Parliament on the Privacy Act and Personal Information Protection and Electronic Documents Act” and asserted:
    • Technologies have been very useful in halting the spread of COVID-19 by allowing essential activities to continue safely. They can and do serve the public good.
    • At the same time, however, they raise new privacy risks. For example, telemedicine creates risks to doctor-patient confidentiality when virtual platforms involve commercial enterprises. E-learning platforms can capture sensitive information about students’ learning disabilities and other behavioural issues.
    • As the pandemic speeds up digitization, basic privacy principles that would allow us to use public health measures without jeopardizing our rights are, in some cases, best practices rather than requirements under the existing legal framework.
    • We see, for instance, that the law has not properly contemplated privacy protection in the context of public-private partnerships, nor does it mandate app developers to consider Privacy by Design, or the principles of necessity and proportionality.
    • The law is simply not up to protecting our rights in a digital environment. Risks to privacy and other rights are heightened by the fact that the pandemic is fueling rapid societal and economic transformation in a context where our laws fail to provide Canadians with effective protection.
    • In our previous annual report, we shared our vision of how best to protect the privacy rights of Canadians and called on parliamentarians to adopt rights-based privacy laws.
    • We noted that privacy is a fundamental human right (the freedom to live and develop free from surveillance). It is also a precondition for exercising other human rights, such as equality rights in an age when machines and algorithms make decisions about us, and democratic rights when technologies can thwart democratic processes.
    • Regulating privacy is essential not only to support electronic commerce and digital services; it is a matter of justice.

Coming Events

  • The European Union Agency for Cybersecurity (ENISA), Europol’s European Cybercrime Centre (EC3) and the Computer Emergency Response Team for the EU Institutions, Bodies and Agencies (CERT-EU) will hold the 4th annual IoT Security Conference series “to raise awareness on the security challenges facing the Internet of Things (IoT) ecosystem across the European Union:”
    • Artificial Intelligence – 14 October at 15:00 to 16:30 CET
    • Supply Chain for IoT – 21 October at 15:00 to 16:30 CET
  • The House Intelligence Committee will conduct a virtual hearing titled “Misinformation, Conspiracy Theories, and ‘Infodemics’: Stopping the Spread Online.”
  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • On October 29, the Federal Trade Commission (FTC) will hold a workshop titled “Green Lights & Red Flags: FTC Rules of the Road for Business” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • The Senate Commerce, Science, and Transportation Committee will reportedly hold a hearing on 29 October regarding 47 U.S.C. 230 with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Thanks for your Like • donations welcome from Pixabay

Five Eyes Again Lean On Tech About Encryption

In the latest demand, the usual suspects are joined by two new nations in urging tech to stop using default encryption and to essentially build backdoors.

The Five Eyes (FVEY) intelligence alliance plus two Asian nations have released an “International Statement: End-To-End Encryption and Public Safety,” which represents the latest FVEY salvo in their campaign against technology companies using default end-to-end encryption. Again, the FVEY nations are casting the issues presented by encryption through the prism of child sexual abuse, terrorism, and other horrible crimes in order to keep technology companies on their proverbial policy back foot. For, after all, how can the reasonable tech CEO argue for encryption when it is being used to commit and cover up unspeakable crimes?

However, in a sign that the field of governments aligned against default encryption may be growing, India and Japan joined the FVEY in this statement. Whether this is a result of the recent Quadrilateral Security Dialogue is unclear, but it seems a fair assumption given that two of the FVEY nations, the United States and Australia, make up the other two members of the Quad. And, of course, the United Kingdom, Canada, and New Zealand are the three other members of the FVEY.

In the body of the statement, FVEY, Japan, and India asserted:

  • We, the undersigned, support strong encryption, which plays a crucial role in protecting personal data, privacy, intellectual property, trade secrets and cyber security.  It also serves a vital purpose in repressive states to protect journalists, human rights defenders and other vulnerable people, as stated in the 2017 resolution of the UN Human Rights Council.  Encryption is an existential anchor of trust in the digital world and we do not support counter-productive and dangerous approaches that would materially weaken or limit security systems. 
  • Particular implementations of encryption technology, however, pose significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children. We urge industry to address our serious concerns where encryption is applied in a way that wholly precludes any legal access to content.  We call on technology companies to work with governments to take the following steps, focused on reasonable, technically feasible solutions:
    • Embed the safety of the public in system designs, thereby enabling companies to act against illegal content and activity effectively with no reduction to safety, and facilitating the investigation and prosecution of offences and safeguarding the vulnerable;
    • Enable law enforcement access to content in a readable and usable format where an authorisation is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight; and
    • Engage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.

So, on the one hand, these nations recognize the indispensable role encryption plays in modern communications and in the fight against authoritarian regimes and “do not support counter-productive and dangerous approaches that would materially weaken or limit security systems.” But, on the other hand, “[p]articular implementations of encryption technology” are putting children at risk and letting terrorism thrive. Elsewhere in the statement we learn that the implementation in question is “[e]nd-to-end encryption that precludes lawful access to the content of communications in any circumstances.”

And, so these nations want companies like Facebook, Apple, Google, and others to take certain steps that would presumably maintain strong encryption but would allow access to certain communications for law enforcement purposes. These nations propose “[e]mbed[ding] the safety of the public in systems designs,” which is a nice phrase and wonderful rhetoric, but what does this mean practically? Companies should not use default encryption? Perhaps. But, let’s be honest about second-order effects if American tech companies dispensed with default encryption. Sophisticated criminals and terrorists understand encryption and will still choose to encrypt their devices, apps, and communications; encryption would simply no longer be on by default, and people would have to go to the time and trouble of enabling it themselves. To be fair, neophyte and careless criminals and terrorists may not know to do so, and their communications would be fairly easy to acquire.

Another likely second-order effect is that apps and software offering encryption that is very hard to break would no longer be made or legally offered in FVEY nations. Consequently, the enterprising individual interested in encryption that cannot be broken or tapped by governments will seek, and likely find, such technology produced in other countries. It is unlikely encryption will get put back in the bottle just because the FVEY and friends want it so.

Moreover, given the current technological landscape, the larger point here is that building backdoors into encryption or weakening encryption puts legitimate, desirable communications, activities, and transactions at greater risk of being intercepted. Why would this be so? Because it would take less effort and computing power to crack a weaker encryption key.
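The arithmetic behind that claim can be sketched quickly. This is my own illustration, not anything from the statement: brute-forcing an n-bit symmetric key takes on average 2^(n-1) trial decryptions, so every bit of strength removed halves an attacker's expected work, and a substantially weakened key becomes exponentially cheaper to crack.

```python
# Hypothetical illustration of why weakening encryption aids interception:
# the expected number of guesses to brute-force an n-bit symmetric key is
# 2**(n-1), so each bit removed from the key halves the attacker's work.

def expected_trials(key_bits: int) -> int:
    """Average number of trial decryptions to brute-force a key_bits key."""
    return 2 ** (key_bits - 1)

# Comparing a full-strength 128-bit key against a weakened 64-bit key:
# the weakened key is cheaper to crack by a factor of 2**64.
ratio = expected_trials(128) // expected_trials(64)
print(f"128-bit vs. 64-bit average work ratio: {ratio} (= 2**64)")
```

The point of the sketch is only the exponential relationship; real-world attacks on weakened systems (escrowed keys, backdoored implementations) are typically even cheaper than brute force.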

But, sure, a world in which my midnight snacking does not lead to weight gain would be amazing. And so it is with the FVEY’s call for strong encryption they could essentially defeat as needed. Eventually, the keys, technology, or means would be leaked or stolen as has happened time and time again. Most recently, there was a massive exfiltration of the Central Intelligence Agency’s (CIA) Vault 7 hacking tools and sources and methods. It would only be a matter of time before the tools to defeat encryption were stolen or compromised.

Perhaps there is a conceptual framework or technology that would achieve the FVEY’s goal, but, at present, it will entail tradeoffs that will make people less secure in their online communications. And, in the defense of the FVEY, they are proposing to “[e]ngage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.” Again, very nice phraseology that does not tell us much.

Of course, the FVEY nations are calling for access under proper authorization. However, in the U.S. that might not even entail an adversarial process in a court, for under the Foreign Intelligence Surveillance Act (FISA), there is no such process in the secret proceedings. Additionally, in the same vein, the phrase “subject to strong safeguards and oversight” is downright comical if the U.S. system is to be the template given the range of shortcomings and failures of national security agencies in meeting U.S. law relating to surveillance.

The FVEY, Japan, and India conclude with:

We are committed to working with industry to develop reasonable proposals that will allow technology companies and governments to protect the public and their privacy, defend cyber security and human rights and support technological innovation.  While this statement focuses on the challenges posed by end-to-end encryption, that commitment applies across the range of encrypted services available, including device encryption, custom encrypted applications and encryption across integrated platforms.  We reiterate that data protection, respect for privacy and the importance of encryption as technology changes and global Internet standards are developed remain at the forefront of each state’s legal framework.  However, we challenge the assertion that public safety cannot be protected without compromising privacy or cyber security.  We strongly believe that approaches protecting each of these important values are possible and strive to work with industry to collaborate on mutually agreeable solutions.

More having one’s cake and eating it, too. These nations think strong encryption can coexist with a means of accessing encrypted communications related to crimes, which seems contrary to expert opinion on the matter.

As mentioned, this is not the FVEY’s first attempt to press technology companies. In October 2019, the U.S., the UK, and Australia sent a letter to Facebook CEO Mark Zuckerberg “to request that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services without ensuring that there is no reduction to user safety and without including a means for lawful access to the content of communications to protect our citizens.” These governments claimed “[w]e support strong encryption…[and] respect promises made by technology companies to protect users’ data…[but] [w]e must find a way to balance the need to secure data with public safety and the need for law enforcement to access the information they need to safeguard the public, investigate crimes, and prevent future criminal activity.” The officials asserted that “[c]ompanies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes.”

In summer 2019 the FVEY issued a communique in which it urged technology companies “to include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can obtain access to data in a readable and usable format.” Interestingly, at that time, these nations lauded Facebook for “approaches like Mark Zuckerberg’s public commitment to consulting Governments on Facebook’s recent proposals to apply end-to-end encryption to its messaging services…[and] [t]hese engagements must be substantive and genuinely influence design decisions.” This raises the question of what, if anything, changed between the issuance of this communique and the recent letter to Zuckerberg. In any event, this communique followed the Five Eyes 2018 “Statement of Principles on Access to Evidence and Encryption,” which articulated these nations’ commitment to working with technology companies to address encryption and the need for law enforcement agencies to meet their public safety and protection obligations.

In Facebook’s December 2019 response, Facebook Vice President and WhatsApp Head Will Cathcart and Facebook Vice President and Messenger Head Stan Chudnovsky stated “[c]ybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere…[and] [t]he ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.”

Moreover, one of the FVEY nations has enacted a law that could result in orders to technology companies to decrypt encrypted communications. In December 2018, Australia enacted the “Telecommunications and Other Legislation (Assistance and Access) Act 2018” (TOLA). As the Office of Australia’s Information Commissioner (OAIC) wrote of TOLA, “[t]he powers permitted under the Act have the potential to significantly weaken important privacy rights and protections under the Privacy Act…[and] [t]he encryption technology that can obscure criminal communications and pose a threat to national security is the same technology used by ordinary citizens to exercise their legitimate rights to privacy.”

This past summer, Australia’s Independent National Security Legislation Monitor (INSLM) issued its report on the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). The Parliamentary Joint Committee on Intelligence and Security had requested that the INSLM review the statute, and so the INSLM engaged in a lengthy review, including input from the public. As explained in the report’s preface, the “INSLM independently reviews the operation, effectiveness and implications of national security and counter-terrorism laws; and considers whether the laws contain appropriate protections for individual rights, remain proportionate to terrorism or national security threats, and remain necessary.”

The INSLM claimed:

In this report I reject the notion that there is a binary choice that must be made between the effectiveness of agencies’ surveillance powers in the digital age on the one hand and the security of the internet on the other. Rather, I conclude that what is necessary is a law which allows agencies to meet technological challenges, such as those caused by encryption, but in a proportionate way and with proper rights protection. Essentially this can be done by updating traditional safeguards to meet those same technological challenges – notably, those who are trusted to authorise intrusive search and surveillance powers must be able to understand the technological context in which those powers operate, and their consequences. If, but only if, the key recommendations I set out in this report in this regard are adopted, TOLA will be such a law.

The European Union may have a different view, however. In a response to a Member of the European Parliament’s (MEP) letter, the European Data Protection Board (EDPB) articulated its view that any nation that implements an “encryption ban” would endanger its compliance with the General Data Protection Regulation (GDPR) and possibly result in companies domiciled in those countries not being able to transfer and process the personal data of EU citizens. However, as always, it bears noting the EDPB’s view may not carry the day with the European Commission, Parliament, and courts.

The EDPB stated:

Any ban on encryption or provisions weakening encryption would undermine the GDPR obligations on the concerned controllers and processors for an effective implementation of both data protection principles and the appropriate technical and organisational measures. Similar considerations apply to transfers to controllers or processors in any third countries adopting such bans or provisions. Security measures are therefore specifically mentioned among the elements the European Commission must take into account when assessing the adequacy of the level of protection in a third country. In the absence of such a decision, transfers are subject to appropriate safeguards or may be based on derogations; in any case the security of the personal data has to be ensured at all times.

The EDPB opined “that any encryption ban would seriously undermine compliance with the GDPR.” The EDPB continued, “[m]ore specifically, whatever the instrument used, it would represent a major obstacle in recognising a level of protection essentially equivalent to that ensured by the applicable data protection law in the EU, and would seriously question the ability of the concerned controllers and processors to comply with the security obligation of the regulation.”


Image by OpenClipart-Vectors from Pixabay

Further Reading, Other Developments, and Coming Events (4 September)

Here is today’s Further Reading, Other Developments, and Coming Events.

Coming Events

  • The United States-China Economic and Security Review Commission will hold a hearing on 9 September on “U.S.-China Relations in 2020: Enduring Problems and Emerging Challenges” to “evaluate key developments in China’s economy, military capabilities, and foreign relations, during 2020.”
  • On 10 September, the General Services Administration (GSA) will have a webinar to discuss implementation of Section 889 of the “John S. McCain National Defense Authorization Act (NDAA) for FY 2019” (P.L. 115-232), which bars the federal government and its contractors from buying equipment and services from Huawei, ZTE, and other companies from the People’s Republic of China.
  • The Federal Communications Commission (FCC) will hold a forum on 5G Open Radio Access Networks on 14 September. The FCC asserted
    • Chairman [Ajit] Pai will host experts at the forefront of the development and deployment of open, interoperable, standards-based, virtualized radio access networks to discuss this innovative new approach to 5G network architecture. Open Radio Access Networks offer an alternative to traditional cellular network architecture and could enable a diversity in suppliers, better network security, and lower costs.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.”
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The United States (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) and the Election Assistance Commission (EAC) “released the Election Risk Profile Tool, a user-friendly assessment tool to equip election officials and federal agencies in prioritizing and managing cybersecurity risks to the Election Infrastructure Subsector.” The agencies stated “[t]he new tool is designed to help state and local election officials understand the range of risks they face and how to prioritize mitigation efforts…[and] also addresses areas of greatest risk, ensures technical cybersecurity assessments and services are meeting critical needs, and provides a sound analytic foundation for managing election security risk with partners at the federal, state and local level.”
    • CISA and the EAC explained “[t]he Election Risk Profile Tool:
      • Is a user-friendly assessment tool for state and local election officials to develop a high-level risk profile across a jurisdiction’s specific infrastructure components;
      • Provides election officials a method to gain insights into their cybersecurity risk and prioritize mitigations;
      • Accepts inputs of a jurisdiction’s specific election infrastructure configuration; and
      • Outputs a tailored risk profile for jurisdictions, which identifies specific areas of highest risk and recommends associated mitigation measures that the jurisdiction could implement to address the risk areas.
  • The cybersecurity agencies of the Five Eyes nations have released a Joint Cybersecurity Advisory: Technical Approaches to Uncovering and Remediating Malicious Activity that “highlights technical approaches to uncovering malicious activity and includes mitigation steps according to best practices.” The agencies asserted “[t]he purpose of this report is to enhance incident response among partners and network administrators along with serving as a playbook for incident investigation.”
    • The Australian Cyber Security Centre, Canada’s Communications Security Establishment, the United States’ Cybersecurity and Infrastructure Security Agency, the United Kingdom’s National Cyber Security Centre, and New Zealand’s National Cyber Security Centre and Computer Emergency Response Team summarized the key takeaways from the Joint Advisory:
      • When addressing potential incidents and applying best practice incident response procedures:
      • First, collect and remove for further analysis:
        • Relevant artifacts,
        • Logs, and
        • Data.
      • Next, implement mitigation steps that avoid tipping off the adversary that their presence in the network has been discovered.
      • Finally, consider soliciting incident response support from a third-party IT security organization to:
        • Provide subject matter expertise and technical support to the incident response,
        • Ensure that the actor is eradicated from the network, and
        • Avoid residual issues that could result in follow-up compromises once the incident is closed.
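The ordering the agencies stress can be summarized in a minimal sketch. This is my own illustration, not code from the advisory; the names and structure are assumptions chosen for readability:

```python
# A minimal sketch of the advisory's recommended ordering: preserve evidence
# first, then mitigate without alerting the adversary, then consider
# third-party incident response support before closing the incident.

INCIDENT_PLAYBOOK = [
    ("collect", ["relevant artifacts", "logs", "data"]),
    ("mitigate", ["apply fixes without tipping off the adversary"]),
    ("engage_third_party", ["subject matter expertise",
                            "confirm the actor is eradicated",
                            "avoid residual issues after closure"]),
]

def phases_in_order(playbook):
    """Return the phase names, preserving the advisory's ordering."""
    return [phase for phase, _steps in playbook]

# Evidence collection must precede any mitigation step; mitigating first
# risks destroying artifacts and revealing the investigation.
order = phases_in_order(INCIDENT_PLAYBOOK)
assert order.index("collect") < order.index("mitigate")
```

The design point the advisory makes is the one encoded in the final assertion: mitigation applied before collection can destroy the very evidence an investigation needs.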
  • The United States’ (U.S.) Department of Justice (DOJ) and Federal Trade Commission (FTC) signed an Antitrust Cooperation Framework with their counterpart agencies from Australia, Canada, New Zealand, and the United Kingdom. The Multilateral Mutual Assistance and Cooperation Framework for Competition Authorities (Framework) “aims to strengthen cooperation between the signatories, and provides the basis for a series of bilateral agreements among them focused on investigative assistance, including sharing confidential information and cross-border evidence gathering.” Given that a number of large technology companies are under investigation in the U.S., the European Union (EU) and elsewhere, signaling a shift in how technology multinationals are being viewed, this agreement may enable cross-border efforts to collectively address alleged abuses. However, the Framework “is not intended to be legally binding and does not give rise to legal rights or obligations under domestic or international law.” The Framework provides:
    • Recognising that the Participants can benefit by sharing their experience in developing, applying, and enforcing Competition Laws and competition policies, the Participants intend to cooperate and provide assistance, including by:
      • a) exchanging information on the development of competition issues, policies and laws;
      • b) exchanging experience on competition advocacy and outreach, including to consumers, industry, and government;
      • c) developing agency capacity and effectiveness by providing advice or training in areas of mutual interest, including through the exchange of officials and through experience-sharing events;
      • d) sharing best practices by exchanging information and experiences on matters of mutual interest, including enforcement methods and priorities; and
      • e) collaborating on projects of mutual interest, including via establishing working groups to consider specific issues.
  • Dynasplint Systems alerted the United States Department of Health and Human Services (HHS) that it suffered a breach affecting more than 100,000 people earlier this year. HHS’ Office of Civil Rights (OCR) is investigating possible violations of Health Insurance Portability and Accountability Act regulations regarding the safeguarding of patients’ health information. If Dynasplint failed to properly secure patient information or its systems, OCR could levy a multimillion-dollar fine for a breach of this size. For example, in late July, OCR fined a company over $1 million for the theft of an unencrypted laptop that exposed the personal information of a little more than 20,000 people.
    • Dynasplint, a Maryland manufacturer of range of motion splints, explained:
      • On June 4, 2020, the investigation determined that certain information was accessed without authorization during the incident.
      • The information may have included names, addresses, dates of birth, Social Security numbers, and medical information.
      • Dynasplint Systems reported this matter to the FBI and will provide whatever cooperation is necessary to hold perpetrators accountable.
  • The California Legislature has sent two bills to Governor Gavin Newsom (D) that would change how technology is regulated in the state, including one that would alter the “California Consumer Privacy Act” (AB 375) (CCPA) if the “California Privacy Rights Act” (CPRA) (Ballot Initiative 24) is not enacted by voters in the November election. The two bills are:
    • AB 1138 would amend the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtains explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.”
    •  AB 1281 would extend the carveout for employers to comply with the CCPA from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
  • Senators Shelley Moore Capito (R-WV), Amy Klobuchar (D-MN), and Jerry Moran (R-KS) have written “a letter to Federal Trade Commission (FTC) Chairman Joseph Simons urging the FTC to take action to address the troubling data collection and sharing practices of the mobile application (app) Premom” and “to request information on the steps that the FTC plans to take to address this issue.” They asserted:
    • A recent investigation from the International Digital Accountability Council (IDAC) indicated that Premom may have engaged in deceptive consumer data collection and processing, and that there may be material differences between Premom’s stated privacy policies and its actual data-sharing practices. Most troubling, the investigation found that Premom shared its users’ data without their consent.
    • Moore Capito, Klobuchar, and Moran stated “[i]n light of these concerning reports, and given the critical role that the FTC plays in enforcing federal laws that protect consumer privacy and data under Section 5 of the Federal Trade Commission Act and other sector-specific laws, we respectfully ask that you respond to the following questions:
      • 1. Does the FTC treat persistent identifiers, such as the non-resettable device hardware identifiers discussed in the IDAC report, as personally identifiable information in relation to its general consumer data security and privacy enforcement authorities under Section 5 of the FTC Act?  
      • 2. Is the FTC currently investigating or does it plan to investigate Premom’s consumer data collection, transmission, and processing conduct described in the IDAC report to determine if the company has engaged in deceptive practices?
      • 3. Does the FTC plan to take any steps to educate users of the Premom app that the app may still be sharing their personal data without their permission if they have not updated the app? If not, does the FTC plan to require Premom to conduct such outreach?
      • 4. Please describe any unique or practically uncommon uses of encryption by the involved third-party companies receiving information from Premom that could be functionally interpreted to obfuscate oversight of the involved data transmissions.
      • 5. How can the FTC use its Section 5 authority to ensure that mobile apps are not deceiving consumers about their data collection and sharing practices and to preempt future potentially deceptive practices like those Premom may have engaged in?

Further Reading

  • “Justice Dept. Plans to File Antitrust Charges Against Google in Coming Weeks” By Katie Benner and Cecilia Kang – The New York Times; “The Justice Department could file a lawsuit against Google this month, overriding skepticism from its own top lawyers” By Tony Romm – The Washington Post; “There’s a partisan schism over the timing of a Google antitrust lawsuit” By Timothy B. Lee – Ars Technica. The New York Times explains in its deeply sourced article that United States Department of Justice (DOJ) attorneys want more time to build a better case against Google, but that Attorney General William Barr is pressing for the filing of a suit as early as the end of this month in order for the Trump Administration to show voters it is taking on big tech. Additionally, a case against a tech company would help shore up the President’s right flank as he and other prominent conservatives continue to insist, in the absence of evidence, that technology companies are biased against the right. The team of DOJ attorneys has shrunk from 40 to about 20 as numerous lawyers asked off the case once it was clear what the Attorney General wanted. These articles also shed light on the split between Republican and Democratic state attorneys general in the parallel case they have been working on: Republicans accuse Democrats of stalling in the hope that a Biden DOJ will be harsher on the company, while Democrats accuse Republicans of trying to file a narrow case while Donald Trump is still President, which would impair efforts to address the full range of Google’s alleged antitrust abuses.
  • “Facebook Moves to Limit Election Chaos in November” By Mike Isaac – The New York Times. The social network giant unveiled measures to fight misinformation in the week before the United States election and afterward, should people try to make factually inaccurate claims about the results. Notably, political advertisements will be banned a week before the 3 November election, but this seems like pretty weak tea considering it will be business as usual until late October. Even though the company frames these moves as “additional steps we’re taking to help secure the integrity of the U.S. elections by encouraging voting, connecting people to authoritative information, and reducing the risks of post-election confusion,” the effect of misinformation, disinformation, and lies that proliferate on Facebook will have likely already taken root by late October. It is possible the company still wants the advertising revenue it would forgo if it immediately banned political advertising. Another proposed change is to provide accurate information about voting generally and about COVID-19 and voting. In fact, the platform corrected a post of President Donald Trump’s that expressed doubts about mail-in voting.
  • “Washington firm ran fake Facebook accounts in Venezuela, Bolivia and Mexico, report finds” By Craig Timberg and Elizabeth Dwoskin – The Washington Post. In tandem with taking down fake content posted by the Internet Research Agency, Facebook also removed accounts traced back to a Washington, D.C. public relations firm, CLS Strategies, that was running multiple accounts to support the government in Bolivia and the opposition party in Venezuela, both of which are right wing. Using information provided by Facebook, Stanford University’s Internet Observatory released a report stating that “Facebook removed a network of 55 Facebook accounts, 42 Pages and 36 Instagram accounts attributed to the US-based strategic communications firm CLS Strategies for engaging in coordinated inauthentic behavior (CIB).” Stanford asserted these key takeaways:
    • 11 Facebook pages related to Bolivia mainly supported Bolivia’s Interim President Jeanine Áñez and disparaged Bolivia’s former president Evo Morales. All had similar creation dates and manager location settings.
    • Venezuela-focused assets supported and promoted Venezuelan opposition leaders but changed in tone in 2020, reflecting factional divides in the opposition and a turn away from Juan Guaidó.
    • In addition to fake accounts, removed Facebook accounts include six profiles that match the names and photos of CLS Strategies employees listed publicly on their website and appear to be their real accounts.
    • CLS Strategies has a disclosed contract with the Bolivian government to provide strategic communications counsel for Bolivia’s 2020 elections and to strengthen democracy and human rights in Bolivia.
    • Coordinated inauthentic behavior reports from Facebook and Twitter have increasingly included assets linked to marketing and PR firms originating and acting around the world. The firms’ actions violate the platforms’ terms by operating internationally and failing to identify their origins and motivations to users.
    • In its release on the issue, Facebook explained:
      • In August, we removed three networks of accounts, Pages and Groups. Two of them — from Russia and the US — targeted people outside of their country, and another from Pakistan focused on both domestic audiences in Pakistan and also in India. We have shared information about our findings with law enforcement, policymakers and industry partners.
  • “Belarusian Officials Shut Down Internet With Technology Made by U.S. Firm” By Ryan Gallagher – Bloomberg. A United States firm, Sandvine, sold deep packet inspection technology to the government in Belarus through a Russian intermediary. The technology was ostensibly to be used by the government to fend off dangers to the nation’s networks but was instead deployed to shut down access to numerous social media and news sites on the day of the election. However, Belarusian activists quickly devised workarounds, helping fuel the current unrest that threatens to topple the regime. The same company’s technology has been used elsewhere in the world to cut off access to the internet, as detailed by the University of Toronto’s Citizen Lab in 2018.
  • “Canada has effectively moved to block China’s Huawei from 5G, but can’t say so” – Reuters. In a move reminiscent of how the People’s Republic of China (PRC) tanked Qualcomm’s proposed purchase of NXP Semiconductors in 2018, Canada has effectively barred Huawei from its 5G networks by not deciding, which eventually sent a signal to its telecommunications companies to use Ericsson and Nokia instead. This way, there is no public announcement or policy statement the PRC can object to, and the country toes the line with its other Five Eyes partners that have banned Huawei in varying degrees. Additionally, given that two Canadian nationals are being held because Huawei Chief Financial Officer Meng Wanzhou is being detained in Canada awaiting extradition to the United States to face criminal charges, Ottawa needs to manage its relations with the PRC gingerly.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Simon Steinberger from Pixabay

Further Reading and Other Developments (17 July)

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

Speaking of which, the Technology Policy Update is being published daily during the week, and here are the Other Developments and Further Reading from this week.

Other Developments

  • Acting Senate Intelligence Committee Chair Marco Rubio (R-FL), Senate Foreign Relations Committee Chair Jim Risch (R-ID), and Senators Chris Coons (D-DE) and John Cornyn (R-TX) wrote Secretary of Commerce Wilbur Ross and Secretary of Defense Mike Esper “to ask that the Administration take immediate measures to bring the most advanced digital semiconductor manufacturing capabilities to the United States…[which] are critical to our American economic and national security and while our nation leads in the design of semiconductors, we rely on international manufacturing for advanced semiconductor fabrication.” This letter follows the Trump Administration’s May announcement that the Taiwan Semiconductor Manufacturing Company (TSMC) agreed to build a $12 billion plant in Arizona. It also bears note that one of the amendments pending to the “National Defense Authorization Act for Fiscal Year 2021” (S.4049) would establish a grants program to stimulate semiconductor manufacturing in the US.
  • Senators Mark R. Warner (D-VA), Mazie K. Hirono (D-HI) and Bob Menendez (D-NJ) sent a letter to Facebook “regarding its failure to prevent the propagation of white supremacist groups online and its role in providing such groups with the organizational infrastructure and reach needed to expand.” They also “criticized Facebook for being unable or unwilling to enforce its own Community Standards and purge white supremacist and other violent extremist content from the site” and posed “a series of questions regarding Facebook’s policies and procedures against hate speech, violence, white supremacy and the amplification of extremist content.”
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) published the Pipeline Cyber Risk Mitigation Infographic that was “[d]eveloped in coordination with the Transportation Security Administration (TSA)…[that] outlines activities that pipeline owners/operators can undertake to improve their ability to prepare for, respond to, and mitigate against malicious cyber threats.”
  • Representative Kendra Horn (D-OK) and 10 other Democrats introduced legislation “requiring the U.S. government to identify, analyze, and combat efforts by the Chinese government to exploit the COVID-19 pandemic” that was endorsed by “[t]he broader Blue Dog Coalition” according to their press release. The “Preventing China from Exploiting COVID-19 Act” (H.R.7484) “requires the Director of National Intelligence—in coordination with the Secretaries of Defense, State, and Homeland Security—to prepare an assessment of the different ways in which the Chinese government has exploited or could exploit the pandemic, which originated in China, in order to advance China’s interests and to undermine the interests of the United States, its allies, and the rules-based international order.” Horn and her cosponsors stated “[t]he assessment must be provided to Congress within 90 days and posted in unclassified form on the DNI’s website.”
  • The Supreme Court of Canada upheld the “Genetic Non-Discrimination Act” and denied a challenge to the legality of the statute brought by the government of Quebec, the Attorney General of Canada, and others. The court found:
    • The pith and substance of the challenged provisions is to protect individuals’ control over their detailed personal information disclosed by genetic tests, in the broad areas of contracting and the provision of goods and services, in order to address Canadians’ fears that their genetic test results will be used against them and to prevent discrimination based on that information. This matter is properly classified within Parliament’s power over criminal law. The provisions are supported by a criminal law purpose because they respond to a threat of harm to several overlapping public interests traditionally protected by the criminal law — autonomy, privacy, equality and public health.
  • The U.S.-China Economic and Security Review Commission published a report “analyzing the evolution of U.S. multinational enterprises (MNE) operations in China from 2000 to 2017.” The Commission found MNE’s operations in the People’s Republic of China “may indirectly erode the United States’ domestic industrial competitiveness and technological leadership relative to China” and “as U.S. MNE activity in China increasingly focuses on the production of high-end technologies, the risk that U.S. firms are unwittingly enabling China to achieve its industrial policy and military development objectives rises.”
  • The Federal Communications Commission (FCC) and Huawei filed their final briefs in their lawsuit before the United States Court of Appeals for the Fifth Circuit arising from the FCC’s designation of Huawei as a “covered company” for purposes of a rule that denies Universal Service Funds (USF) “to purchase or obtain any equipment or services produced or provided by a covered company posing a national security threat to the integrity of communications networks or the communications supply chain.” Huawei claimed in its brief that “[t]he rulemaking and “initial designation” rest on the FCC’s national security judgments…[b]ut such judgments fall far afield of the FCC’s statutory authority and competence.” Huawei also argued “[t]he USF rule, moreover, contravenes the Administrative Procedure Act (APA) and the Due Process Clause.” The FCC responded in its filing that “Huawei challenges the FCC’s decision to exclude carriers whose networks are vulnerable to foreign interference, contending that the FCC has neither statutory nor constitutional authority to make policy judgments involving “national security”…[but] [t]hese arguments are premature, as Huawei has not yet been injured by the Order.” The FCC added “Huawei’s claim that the Communications Act textually commits all policy determinations with national security implications to the President is demonstrably false.”
  • European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski released his Strategy for 2020-2024, “which will focus on Digital Solidarity.” Wiewiórowski explained that “three core pillars of the EDPS strategy outline the guiding actions and objectives for the organisation to the end of 2024:
    • Foresight: The EDPS will continue to monitor legal, social and technological advances around the world and engage with experts, specialists and data protection authorities to inform its work.
    • Action: To strengthen the EDPS’ supervision, enforcement and advisory roles, the EDPS will promote coherence in the activities of enforcement bodies in the EU and develop tools to assist the EU institutions, bodies and agencies to maintain the highest standards in data protection.
    • Solidarity: While promoting digital justice and privacy for all, the EDPS will also enforce responsible and sustainable data processing, to positively impact individuals and maximise societal benefits in a just and fair way.
  • Facebook released a Civil Rights Audit, an investigation into Facebook’s policies and practices that “began in 2018 at the behest and encouragement of the civil rights community and some members of Congress.” Those charged with conducting the audit explained that they “vigorously advocated for more and would have liked to see the company go further to address civil rights concerns in a host of areas that are described in detail in the report,” including but not limited to:
    • A stronger interpretation of its voter suppression policies — an interpretation that makes those policies effective against voter suppression and prohibits content like the Trump voting posts — and more robust and more consistent enforcement of those policies leading up to the US 2020 election.
    • More visible and consistent prioritization of civil rights in company decision-making overall.
    • More resources invested to study and address organized hate against Muslims, Jews and other targeted groups on the platform.
    • A commitment to go beyond banning explicit references to white separatism and white nationalism to also prohibit express praise, support and representation of white separatism and white nationalism even where the terms themselves are not used.
    • More concrete action and specific commitments to take steps to address concerns about algorithmic bias or discrimination.
    • They added that “[t]his report outlines a number of positive and consequential steps that the company has taken, but at this point in history, the Auditors are concerned that those gains could be obscured by the vexing and heartbreaking decisions Facebook has made that represent significant setbacks for civil rights.”
  • The National Security Commission on Artificial Intelligence (NSCAI) released a white paper titled “The Role of AI Technology in Pandemic Response and Preparedness” that “outlines a series of investments and initiatives that the United States must undertake to realize the full potential of AI to secure our nation against pandemics.” NSCAI noted it had released two previous white papers.
  • Secretary of Defense Mark Esper announced that Chief Technology Officer Michael J.K. Kratsios has “been designated to serve as Acting Under Secretary of Defense for Research and Engineering” even though he does not have a degree in science. The last Under Secretary held a PhD. However, Kratsios worked for venture capitalist Peter Thiel who backed President Donald Trump when he ran for office in 2016.
  • The United States’ Department of Transportation’s Federal Railroad Administration (FRA) issued research “to develop a cyber security risk analysis methodology for communications-based connected railroad technologies…[and] [t]he use-case-specific implementation of the methodology can identify potential cyber attack threats, system vulnerabilities, and consequences of the attack, with risk assessment and identification of promising risk mitigation strategies.”
  • In a blog post, a National Institute of Standards and Technology (NIST) economist asserted cybercrime may be having a much larger impact on the United States’ economy than previously thought:
    • In a recent NIST report, I looked at losses in the U.S. manufacturing industry due to cybercrime by examining an underutilized dataset from the Bureau of Justice Statistics, which is the most statistically reliable data that I can find. I also extended this work to look at the losses in all U.S. industries. The data is from a 2005 survey of 36,000 businesses with 8,079 responses, which is also by far the largest sample that I could identify for examining aggregated U.S. cybercrime losses. Using this data, combined with methods for examining uncertainty in data, I extrapolated upper and lower bounds, putting 2016 U.S. manufacturing losses to be between 0.4% and 1.7% of manufacturing value-added or between $8.3 billion and $36.3 billion. The losses for all industries are between 0.9% and 4.1% of total U.S. gross domestic product (GDP), or between $167.9 billion and $770.0 billion. The lower bound is 40% higher than the widely cited, but largely unconfirmed, estimates from McAfee.
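The extrapolation quoted above is simple proportional arithmetic: a lower and upper loss rate applied to a base economic figure. The sketch below illustrates that calculation; the value-added and GDP base figures are assumptions back-solved from the reported ranges for illustration, not numbers taken from the NIST report itself.

```python
# Illustrative sketch of the bound arithmetic behind the NIST estimate.
# The base figures are assumptions chosen so the outputs land near the
# reported ranges; they are not taken from the report itself.

MFG_VALUE_ADDED = 2.1e12  # assumed 2016 U.S. manufacturing value-added (~$2.1T)
US_GDP = 18.7e12          # assumed 2016 U.S. gross domestic product (~$18.7T)

def loss_bounds(base, low_rate, high_rate):
    """Apply lower/upper loss rates to a base figure, returning dollar bounds."""
    return base * low_rate, base * high_rate

mfg_lo, mfg_hi = loss_bounds(MFG_VALUE_ADDED, 0.004, 0.017)  # 0.4%-1.7% range
gdp_lo, gdp_hi = loss_bounds(US_GDP, 0.009, 0.041)           # 0.9%-4.1% range

print(f"Manufacturing losses: ${mfg_lo/1e9:.1f}B to ${mfg_hi/1e9:.1f}B")
print(f"All-industry losses:  ${gdp_lo/1e9:.1f}B to ${gdp_hi/1e9:.1f}B")
```

The wide spread between the bounds reflects the uncertainty methods applied to the underlying 2005 survey data, not measurement precision.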
  • The Government Accountability Office (GAO) advised the Federal Communications Commission (FCC) that it needs a comprehensive strategy for implementing 5G across the United States. The GAO concluded:
    • FCC has taken a number of actions regarding 5G deployment, but it has not clearly developed specific and measurable performance goals and related measures–with the involvement of relevant stakeholders, including National Telecommunications and Information Administration (NTIA)–to manage the spectrum demands associated with 5G deployment. This makes FCC unable to demonstrate whether the progress being made in freeing up spectrum is achieving any specific goals, particularly as it relates to congested mid-band spectrum. Additionally, without having established specific and measurable performance goals with related strategies and measures for mitigating 5G’s potential effects on the digital divide, FCC will not be able to assess the extent to which its actions are addressing the digital divide or what actions would best help all Americans obtain access to wireless networks.
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) issued “Time Guidance for Network Operators, Chief Information Officers, and Chief Information Security Officers” “to inform public and private sector organizations, educational institutions, and government agencies on time resilience and security practices in enterprise networks and systems…[and] to address gaps in available time testing practices, increasing awareness of time-related system issues and the linkage between time and cybersecurity.”
  • Fifteen Democratic Senators sent a letter to the Department of Defense, Office of the Director of National Intelligence (ODNI), Department of Homeland Security (DHS), Federal Bureau of Investigation (FBI), and U.S. Cyber Command, urging them “to take additional measures to fight influence campaigns aimed at disenfranchising voters, especially voters of color, ahead of the 2020 election.” They called on these agencies to ensure that:
    • The American people and political candidates are promptly informed about the targeting of our political processes by foreign malign actors, and that the public is provided regular periodic updates about such efforts leading up to the general election.
    • Members of Congress and congressional staff are appropriately and adequately briefed on continued findings and analysis involving election related foreign disinformation campaigns and the work of each agency and department to combat these campaigns.
    • Findings and analysis involving election related foreign disinformation campaigns are shared with civil society organizations and independent researchers to the maximum extent which is appropriate and permissible.
    • Secretary Esper and Director Ratcliffe implement a social media information sharing and analysis center (ISAC) to detect and counter information warfare campaigns across social media platforms as authorized by section 5323 of the Fiscal Year 2020 National Defense Authorization Act.
    • Director Ratcliffe implement the Foreign Malign Influence Response Center to coordinate a whole of government approach to combatting foreign malign influence campaigns as authorized by section 5322 of the Fiscal Year 2020 National Defense Authorization Act.
  • The Information Technology and Innovation Foundation (ITIF) unveiled an issue brief “Why New Calls to Subvert Commercial Encryption Are Unjustified” arguing “that government efforts to subvert encryption would negatively impact individuals and businesses.” ITIF offered these “key takeaways:”
    • Encryption gives individuals and organizations the means to protect the confidentiality of their data, but it has interfered with law enforcement’s ability to prevent and investigate crimes and foreign threats.
    • Technological advances have long frustrated some in the law enforcement community, giving rise to multiple efforts to subvert commercial use of encryption, from the Clipper Chip in the 1990s to the San Bernardino case two decades later.
    • Having failed in these prior attempts to circumvent encryption, some law enforcement officials are now calling on Congress to invoke a “nuclear option”: legislation banning “warrant-proof” encryption.
    • This represents an extreme and unjustified measure that would do little to take encryption out of the hands of bad actors, but it would make commercial products less secure for ordinary consumers and businesses and damage U.S. competitiveness.
  • The White House released an executive order in which President Donald Trump determined “that the Special Administrative Region of Hong Kong (Hong Kong) is no longer sufficiently autonomous to justify differential treatment in relation to the People’s Republic of China (PRC or China) under the particular United States laws and provisions thereof set out in this order.” Trump further determined “the situation with respect to Hong Kong, including recent actions taken by the PRC to fundamentally undermine Hong Kong’s autonomy, constitutes an unusual and extraordinary threat, which has its source in substantial part outside the United States, to the national security, foreign policy, and economy of the United States…[and] I hereby declare a national emergency with respect to that threat.” The executive order would continue the Administration’s process of changing policy to ensure Hong Kong is treated the same as the PRC.
  • President Donald Trump also signed a bill passed in response to the People’s Republic of China (PRC) passing legislation the United States and others claim will strip Hong Kong of the protections the PRC agreed to maintain for 50 years after the United Kingdom (UK) handed over the city. The “Hong Kong Autonomy Act” “requires the imposition of sanctions on Chinese individuals and banks who are included in an annual State Department list found to be subverting Hong Kong’s autonomy” according to the bill’s sponsor Representative Brad Sherman (D-CA).
  • Representative Stephen Lynch, who chairs the House Oversight and Reform Committee’s National Security Subcommittee, sent letters to Apple and Google “after the Office of the Director of National Intelligence (ODNI) and the Federal Bureau of Investigation (FBI) confirmed that mobile applications developed, operated, or owned by foreign entities, including China and Russia, could potentially pose a national security risk to American citizens and the United States” according to his press release. Citing letters the technology companies sent to the Subcommittee, he noted that:
    • Apple confirmed that it does not require developers to submit “information on where user data (if any such data is collected by the developer’s app) will be housed” and that it “does not decide what user data a third-party app can access, the user does.”
    • Google stated that it does “not require developers to provide the countries in which their mobile applications will house user data” and acknowledged that “some developers, especially those with a global user base, may store data in multiple countries.”
    • Lynch is seeking “commitments from Apple and Google to require information from application developers about where user data is stored, and to make users aware of that information prior to downloading the application on their mobile devices.”
  • Minnesota Attorney General Keith Ellison announced a settlement with Frontier Communications that “concludes the three major investigations and lawsuits that the Attorney General’s office launched into Minnesota’s major telecoms providers for deceptive, misleading, and fraudulent practices.” The Office of the Attorney General (OAG) stated
    • Based on its investigation, the Attorney General’s Office alleged that Frontier used a variety of deceptive and misleading practices to overcharge its customers, such as: billing customers more than they were quoted by Frontier’s agents; failing to disclose fees and surcharges in its sales presentations and advertising materials; and billing customers for services that were not delivered.
    • The OAG “also alleged that Frontier sold Minnesotans expensive internet services with so-called “maximum speed” ratings that were not attainable, and that Frontier improperly advertised its service as “reliable,” when in fact it did not provide enough bandwidth for customers to consistently receive their expected service.”
  • The European Data Protection Board (EDPB) issued guidelines “on the criteria of the Right to be Forgotten in the search engines cases under the GDPR” that “focuses solely on processing by search engine providers and delisting requests submitted by data subjects” even though Article 17 of the General Data Protection Regulation applies to all data controllers. The EDPB explained “This paper is divided into two topics:
    • The first topic concerns the grounds a data subject can rely on for a delisting request sent to a search engine provider pursuant to Article 17.1 GDPR.
    • The second topic concerns the exceptions to the Right to request delisting according to Article 17.3 GDPR.
  • The Australian Competition & Consumer Commission (ACCC) “is seeking views on draft Rules and accompanying draft Privacy Impact Assessment that authorise third parties who are accredited at the ‘unrestricted’ level to collect Consumer Data Right (CDR) data on behalf of another accredited person.” The ACCC explained “[t]his will allow accredited persons to utilise other accredited parties to collect CDR data and provide other services that facilitate the provision of goods and services to consumers.” In a March explanatory statement, the ACCC stated “[t]he CDR is an economy-wide reform that will apply sector-by-sector, starting with the banking sector…[and] [t]he objective of the CDR is to provide individual and business consumers (consumers) with the ability to efficiently and conveniently access specified data held about them by businesses (data holders), and to authorise the secure disclosure of that data to third parties (accredited data recipients) or to themselves.” The ACCC noted “[t]he CDR is regulated by both the ACCC and the Office of the Australian Information Commissioner (OAIC) as it concerns both competition and consumer matters as well as the privacy and confidentiality of consumer data.” Input is due by 20 July.
  • Office of the Inspector General (OIG) for the Department of the Interior (Interior) found that even though the agency spends $1.4 billion annually on cybersecurity “[g]uarding against increasing cybersecurity threats” remains one of Interior’s top challenges. The OIG asserted Interior “continues to struggle to implement an enterprise information technology (IT) security program that balances compliance, cost, and risk while enabling bureaus to meet their diverse missions.”
  • In a summary of its larger investigation into “Security over Information Technology Peripheral Devices at Select Office of Science Locations,” the Department of Energy’s Office of the Inspector General (OIG) stated that it “identified weaknesses related to access controls and configuration settings” for peripheral devices (e.g., thumb drives, printers, scanners, and other connected devices) “similar in type to those identified in prior evaluations of the Department’s unclassified cybersecurity program.”
  • The House Homeland Security Committee’s Cybersecurity, Infrastructure Protection, and Innovation Subcommittee Ranking Member John Katko (R-NY) introduced “a comprehensive national cybersecurity improvement package,” according to his press release, consisting of these bills:
    • The “Cybersecurity and Infrastructure Security Agency Director and Assistant Directors Act:”  This bipartisan measure takes steps to improve guidance and long-term strategic planning by stabilizing the CISA Director and Assistant Directors positions. Specifically, the bill:
      • Creates a 5-year term for the CISA Director, with a limit of 2 terms. The term of office for the current Director begins on the date the Director began to serve.
      • Elevates the Director to the equivalent of a Deputy Secretary and Military Service Secretaries.
      • Depoliticizes the Assistant Director positions, appointed by the Secretary of the Department of Homeland Security (DHS), categorizing them as career public servants. 
    • The “Strengthening the Cybersecurity and Infrastructure Security Agency Act of 2020:” This measure mandates a comprehensive review of CISA in an effort to strengthen its operations, improve coordination, and increase oversight of the agency. Specifically, the bill:
      • Requires CISA to review how additional appropriations could be used to support programs for national risk management, federal information systems management, and public-private cybersecurity and integration. It also requires a review of workforce structure and current facilities and projected needs. 
      • Mandates that CISA provide a report to the House and Senate Homeland Security Committees within one year of enactment. CISA must also provide a report and recommendations to GSA on facility needs.
      • Requires GSA to provide a review to the Administration and the House and Senate Committees on CISA facility needs within 30 days of the Congressional report.
    • The “CISA Public-Private Talent Exchange Act:” This bill requires CISA to create a public-private workforce program to facilitate the exchange of ideas, strategies, and concepts between federal and private sector cybersecurity professionals. Specifically, the bill:
      • Establishes a public-private cyber exchange program allowing government and industry professionals to work in one another’s field.
      • Expands existing private outreach and partnership efforts. 
  • The Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) is ordering United States federal civilian agencies “to apply the July 2020 Security Update for Windows Servers running DNS (CVE-2020-1350), or the temporary registry-based workaround if patching is not possible within 24 hours.” CISA stated “[t]he software update addresses a significant vulnerability where a remote attacker could exploit it to take control of an affected system and run arbitrary code in the context of the Local System Account.” CISA Director Christopher Krebs explained “due to the wide prevalence of Windows Server in civilian Executive Branch agencies, I’ve determined that immediate action is necessary, and federal departments and agencies need to take this remote code execution vulnerability in Windows Server’s Domain Name System (DNS) particularly seriously.”
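The registry-based workaround CISA references comes from Microsoft’s published mitigation for CVE-2020-1350; a rough sketch of what it amounts to is below (Windows cmd commands, run from an elevated prompt on an affected Windows Server — verify against Microsoft’s current guidance before relying on it):

```shell
:: Temporary mitigation for CVE-2020-1350 when the July 2020 security
:: update cannot be applied within 24 hours: cap the maximum size of
:: inbound TCP-based DNS response packets, then restart the DNS service.
reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\DNS\Parameters" /v TcpReceivePacketSize /t REG_DWORD /d 0xFF00 /f
net stop DNS
net start DNS

:: After the security update is installed, remove the workaround:
:: reg delete "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\DNS\Parameters" /v TcpReceivePacketSize /f
```

The workaround only limits exposure to oversized TCP DNS responses; the directive still requires agencies to apply the security update itself.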
  • The United States (US) Department of State has imposed “visa restrictions on certain employees of Chinese technology companies that provide material support to regimes engaging in human rights abuses globally,” a move aimed at Huawei. In its statement, the Department stated “Companies impacted by today’s action include Huawei, an arm of the Chinese Communist Party’s (CCP) surveillance state that censors political dissidents and enables mass internment camps in Xinjiang and the indentured servitude of its population shipped all over China.” The Department claimed “[c]ertain Huawei employees provide material support to the CCP regime that commits human rights abuses.”
  • Earlier in the month, the US Departments of State, Treasury, Commerce, and Homeland Security issued an “advisory to highlight the harsh repression in Xinjiang.” The agencies explained:
    • Businesses, individuals, and other persons, including but not limited to academic institutions, research service providers, and investors (hereafter “businesses and individuals”), that choose to operate in Xinjiang or engage with entities that use labor from Xinjiang elsewhere in China should be aware of reputational, economic, and, in certain instances, legal, risks associated with certain types of involvement with entities that engage in human rights abuses, which could include Withhold Release Orders (WROs), civil or criminal investigations, and export controls.
  • The United Kingdom’s National Cyber Security Centre (NCSC), Canada’s Communications Security Establishment (CSE), the United States’ National Security Agency (NSA), and the United States’ Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) issued a joint advisory on a Russian hacking organization whose efforts have “targeted various organisations involved in COVID-19 vaccine development in Canada, the United States and the United Kingdom, highly likely with the intention of stealing information and intellectual property relating to the development and testing of COVID-19 vaccines.” The agencies named APT29 (also known as ‘the Dukes’ or ‘Cozy Bear’), “a cyber espionage group, almost certainly part of the Russian intelligence services,” as the culprit behind “custom malware known as ‘WellMess’ and ‘WellMail.’”
    • This alert follows May advisories issued by Australia, the US, and the UK on hacking threats related to the pandemic. Australia’s Department of Foreign Affairs and Trade (DFAT) and the Australian Cyber Security Centre (ACSC) issued “Advisory 2020-009: Advanced Persistent Threat (APT) actors targeting Australian health sector organisations and COVID-19 essential services” that asserted “APT groups may be seeking information and intellectual property relating to vaccine development, treatments, research and responses to the outbreak as this information is now of higher value and priority globally.” CISA and NCSC issued a joint advisory for the healthcare sector, especially companies and entities engaged in fighting COVID-19. The agencies stated that they have evidence that Advanced Persistent Threat (APT) groups “are exploiting the COVID-19 pandemic as part of their cyber operations.” In an unclassified public service announcement, the Federal Bureau of Investigation (FBI) and CISA named the People’s Republic of China as a nation waging a cyber campaign against U.S. COVID-19 researchers. The agencies stated they “are issuing this announcement to raise awareness of the threat to COVID-19-related research.”
  • The National Initiative for Cybersecurity Education (NICE) has released for public comment Draft National Institute of Standards and Technology (NIST) Special Publication (SP) 800-181 Revision 1, the Workforce Framework for Cybersecurity (NICE Framework), with comments due by 28 August. The draft features several updates, including:
    • an updated title to be more inclusive of the variety of workers who perform cybersecurity work,
    • definition and normalization of key terms,
    • principles that facilitate agility, flexibility, interoperability, and modularity,
    • introduction of competencies.
  • Representatives Glenn Thompson (R-PA), Collin Peterson (D-MN), and James Comer (R-KY) sent a letter to the Federal Communications Commission (FCC) “questioning the Commission’s April 20, 2020 Order granting Ligado’s application to deploy a terrestrial nationwide network to provide 5G services.”
  • The European Commission (EC) is asking for feedback on part of its recently released data strategy by 31 July. The EC stated it is aiming “to create a single market for data, where data from public bodies, business and citizens can be used safely and fairly for the common good…[and] [t]his initiative will draw up rules for common European data spaces (covering areas like the environment, energy and agriculture) to:
    • make better use of publicly held data for research for the common good
    • support voluntary data sharing by individuals
    • set up structures to enable key organisations to share data.”
  • The United Kingdom’s Parliament is asking for feedback on its legislative proposal to regulate Internet of Things (IoT) devices. The Department for Digital, Culture, Media & Sport explained “the obligations within the government’s proposed legislative framework would fall mainly on the manufacturer if they are based in the UK, or if not based in the UK, on their UK representative.” The Department is also “developing an enforcement approach with relevant stakeholders to identify an appropriate enforcement body to be granted day to day responsibility and operational control of monitoring compliance with the legislation.” The Department also touted the publication of the European Telecommunications Standards Institute’s (ETSI) standard, which sets a “security baseline for Internet-connected consumer devices and provides a basis for future Internet of Things product certification schemes.”
  • Facebook issued a white paper, titled “CHARTING A WAY FORWARD: Communicating Towards People-Centered and Accountable Design About Privacy,” in which the company states its desire to be involved in shaping a United States privacy law (See below for an article on this). Facebook concluded:
    • Facebook recognizes the responsibility we have to make sure that people are informed about the data that we collect, use, and share.
    • That’s why we support globally consistent comprehensive privacy laws and regulations that, among other things, establish people’s basic rights to be informed about how their information is collected, used, and shared, and impose obligations for organizations to do the same, including the obligation to build internal processes that maintain accountability.
    • As improvements to technology challenge historic approaches to effective communications with people about privacy, companies and regulators need to keep up with changing times.
    • To serve the needs of a global community, on both the platforms that exist now and those that are yet to be developed, we want to work with regulators, companies, and other interested third parties to develop new ways of informing people about their data, empowering them to make meaningful choices, and holding ourselves accountable.
    • While we don’t have all the answers, there are many opportunities for businesses and regulators to embrace modern design methods, new opportunities for better collaboration, and innovative ways to hold organizations accountable.
  • Four Democratic Senators sent Facebook a letter “about reports that Facebook has created fact-checking exemptions for people and organizations who spread disinformation about the climate crisis on its social media platform” following a New York Times article this week on the social media giant’s practices regarding climate disinformation. Senators Elizabeth Warren (D-MA), Tom Carper (D-DE), Sheldon Whitehouse (D-RI), and Brian Schatz (D-HI) argued “[i]f Facebook is truly ‘committed to fighting the spread of false news on Facebook and Instagram,’ the company must immediately acknowledge in its fact-checking process that the climate crisis is not a matter of opinion and act to close loopholes that allow climate disinformation to spread on its platform.” They posed a series of questions to Facebook CEO Mark Zuckerberg on these practices, requesting answers by 31 July.
  • A Canadian court has found that the Canadian Security Intelligence Service (CSIS) “admittedly collected information in a manner that is contrary to this foundational commitment and then relied on that information in applying for warrants under the Canadian Security Intelligence Service Act, RSC 1985, c C-23 [CSIS Act],” according to a court summary of its redacted decision. The court further stated “[t]he Service and the Attorney General also admittedly failed to disclose to the Court the Service’s reliance on information that was likely collected unlawfully when seeking warrants, thereby breaching the duty of candour owed to the Court.” The court added “[t]his is not the first time this Court has been faced with a breach of candour involving the Service…[and] [t]he events underpinning this most recent breach were unfolding as recommendations were being implemented by the Service and the Attorney General to address previously identified candour concerns.” CSIS was found to have illegally collected and used metadata in a 2016 case on its conduct between 2006 and 2016. In response to the most recent ruling, CSIS is vowing to implement a range of reforms. The National Security and Intelligence Review Agency (NSIRA) is pledging the same.
  • The United Kingdom’s National Police Chiefs’ Council (NPCC) announced the withdrawal of “[t]he ‘Digital device extraction – information for complainants and witnesses’ form and ‘Digital Processing Notice’ (‘the relevant forms’) circulated to forces in February 2019 [that] are not sufficient for their intended purpose.” In mid-June, the UK’s data protection authority, the Information Commissioner’s Office (ICO) unveiled its “finding that police data extraction practices vary across the country, with excessive amounts of personal data often being extracted, stored, and made available to others, without an appropriate basis in existing data protection law.” This withdrawal was also due, in part, to a late June Court of Appeal decision.  
  • A range of public interest and advocacy organizations sent a letter to Speaker of the House Nancy Pelosi (D-CA) and House Minority Leader Kevin McCarthy (R-CA) noting “there are intense efforts underway to do exactly that, via current language in the House and Senate versions of the FY2021 National Defense Authorization Act (NDAA) that ultimately seek to reverse the FCC’s recent bipartisan and unanimous approval of Ligado Networks’ regulatory plans.” They urged the leaders “not [to] endorse efforts by the Department of Defense and its allies to veto commercial spectrum authorizations…[and] [t]he FCC has proven itself to be the expert agency on resolving spectrum disputes based on science and engineering and should be allowed to do the job Congress authorized it to do.” In late April, the FCC’s “decision authorize[d] Ligado to deploy a low-power terrestrial nationwide network in the 1526-1536 MHz, 1627.5-1637.5 MHz, and 1646.5-1656.5 MHz bands that will primarily support Internet of Things (IoT) services.” The agency argued the order “provides regulatory certainty to Ligado, ensures adjacent band operations, including Global Positioning System (GPS), are sufficiently protected from harmful interference, and promotes more efficient and effective use of [the U.S.’s] spectrum resources by making available additional spectrum for advanced wireless services, including 5G.”
  • The European Data Protection Supervisor (EDPS) rendered his opinion on the European Commission’s White Paper on Artificial Intelligence: a European approach to excellence and trust and recommended the following for the European Union’s (EU) regulation of artificial intelligence (AI):
    • applies both to EU Member States and to EU institutions, offices, bodies and agencies;
    • is designed to protect from any negative impact, not only on individuals, but also on communities and society as a whole;
    • proposes a more robust and nuanced risk classification scheme, ensuring any significant potential harm posed by AI applications is matched by appropriate mitigating measures;
    • includes an impact assessment clearly defining the regulatory gaps that it intends to fill;
    • avoids overlap of different supervisory authorities and includes a cooperation mechanism.
    • Regarding remote biometric identification, the EDPS supports the idea of a moratorium on the deployment, in the EU, of automated recognition in public spaces of human features, not only of faces but also of gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioural signals, so that an informed and democratic debate can take place and until the moment when the EU and Member States have all the appropriate safeguards, including a comprehensive legal framework in place to guarantee the proportionality of the respective technologies and systems for the specific use case.
  • The Bundesamt für Verfassungsschutz (BfV), Germany’s domestic security agency, released a summary of its annual report in which it claimed:
    • The Russian Federation, the People’s Republic of China, the Islamic Republic of Iran and the Republic of Turkey remain the main countries engaged in espionage activities and trying to exert influence on Germany.
    • The ongoing digital transformation and the increasingly networked nature of our society increases the potential for cyber attacks, worsening the threat of cyber espionage and cyber sabotage.
    • The intelligence services of the Russian Federation and the People’s Republic of China in particular carry out cyber espionage activities against German agencies. One of their tasks is to boost their own economies with the help of information gathered by the intelligence services. This type of information-gathering campaign severely threatens the success and development opportunities of German companies.
    • To counteract this threat, Germany has a comprehensive cyber security architecture in place, which is operated by a number of different authorities. The BfV plays a major role in investigating and defending against cyber threats by detecting attacks, attributing them to specific attackers, and using the knowledge gained from this to draw up prevention strategies. The National Cyber Response Centre, in which the BfV plays a key role, was set up to consolidate the co-operation between the competent agencies. The National Cyber Response Centre aims to optimise the exchange of information between state agencies and to improve the co-ordination of protective and defensive measures against potential IT incidents.

Further Reading

  • “Trump confirms cyberattack on Russian trolls to deter them during 2018 midterms” – The Washington Post. In an interview with former George W. Bush speechwriter Marc Thiessen, President Donald Trump confirmed he ordered a widely reported retaliatory attack on the Russian Federation’s Internet Research Agency as a means of preventing interference during the 2018 mid-term election. Trump claimed this was the first action the United States took against Russian hacking, even though his predecessor warned Russian President Vladimir Putin to stop such activities and imposed sanctions at the end of 2016. The timing of Trump’s revelation is interesting given the ongoing furor over reports of Russian bounties paid to Taliban fighters for killing Americans, which the Trump Administration may have known of but did little or nothing to stop.
  • “Germany proposes first-ever use of EU cyber sanctions over Russia hacking” – Deutsche Welle. Germany is looking to use the European Union’s (EU) cyber sanctions powers against Russia for the alleged 2015 exfiltration of 16 GB of data from the Bundestag’s systems, including from Chancellor Angela Merkel’s office. Germany has been alleging that Fancy Bear (aka APT28) and Russia’s military intelligence service, the GRU, carried out the attack. Germany has circulated its case for sanctions to other EU nations and EU leadership. In 2017, the European Council declared “[t]he EU diplomatic response to malicious cyber activities will make full use of measures within the Common Foreign and Security Policy, including, if necessary, restrictive measures…[and] [a] joint EU response to malicious cyber activities would be proportionate to the scope, scale, duration, intensity, complexity, sophistication and impact of the cyber activity.”
  • “Wyden Plans Law to Stop Cops From Buying Data That Would Need a Warrant” – VICE. Following on a number of reports that federal, state, and local law enforcement agencies are essentially sidestepping the Fourth Amendment by buying location and other data from people’s smartphones, Senator Ron Wyden (D-OR) is going to draft legislation that would seemingly close what he and other civil libertarians are calling a loophole to the warrant requirement.
  • “Amazon Backtracks From Demand That Employees Delete TikTok” – The New York Times. Amazon first instructed its employees on 11 July to remove ByteDance’s app, TikTok, from company devices and then reversed course the same day, claiming the email had been erroneously sent out. The strange episode capped another tumultuous week for ByteDance as the Trump Administration intensifies pressure in a number of ways on the company, which officials claim is subject to the laws of the People’s Republic of China (PRC) and hence must share information with the government in Beijing. ByteDance counters that the app is marketed in the United States through a subsidiary not subject to PRC law. ByteDance also said it would no longer offer the app in Hong Kong after a change in PRC law extended the PRC’s reach into the former British colony. TikTok was also recently banned in India as part of a larger struggle between India and the PRC. Additionally, the Democratic National Committee warned staff this week about using the app.
  • “Is it time to delete TikTok? A guide to the rumors and the real privacy risks.” – The Washington Post. A columnist and a security specialist found ByteDance’s app vacuums up information from users, but so do Facebook and other similar apps. They scrutinized TikTok’s privacy policy and where the data went, and they could not say with certainty that it goes to and stays on servers in the US and Singapore.
  • “California investigating Google for potential antitrust violations” – Politico. California Attorney General Xavier Becerra is going to conduct his own investigation of Google, separate from the investigation of the company’s advertising practices being conducted by virtually every other state in the United States. It was unclear why Becerra opted against joining the larger probe launched in September 2019. Of course, the Trump Administration’s Department of Justice is also investigating Google and could file suit as early as this month.
  • “How May Google Fight an Antitrust Case? Look at This Little-Noticed Paper” – The New York Times. In a filing with the Australian Competition and Consumer Commission (ACCC), Google claimed it does not control the online advertising market, pointing to a number of indicia that argue against a monopolistic situation. The company is likely to make the same case to the United States’ government in its antitrust inquiry. However, similar arguments did not gain traction before the European Commission, which levied a €1.49 billion fine for “breaching EU antitrust rules” in March 2019.
  •  “Who Gets the Banhammer Now?” – The New York Times. This article examines possible motives for the recent wave of action by social media platforms to police a fraction of the extreme and hateful speech activists and others have been asking them to take down for years. This piece makes the argument that social media platforms are businesses and operate as such and expecting them to behave as de facto public squares dedicated to civil political and societal discourse is more or less how we ended up where we are.
  • “TikTok goes tit-for-tat in appeal to MPs: ‘stop political football’” – The Australian. ByteDance is lobbying hard in Canberra to talk Members of Parliament out of possibly banning TikTok, as the United States has said it is considering. While ByteDance claims the data collected on users in Australia is sent to the US or Singapore, some experts are arguing that just maintaining and improving the app would necessarily result in some non-People’s Republic of China (PRC) user data making its way back to the PRC. As Australia’s relationship with the PRC has grown more fraught, with allegations that PRC hackers infiltrated Parliament and the Prime Minister all but saying PRC hackers were targeting hospitals and medical facilities, the government in Canberra could follow India’s lead and ban the app.
  • “Calls for inquiry over claims Catalan lawmaker’s phone was targeted” – The Guardian. British and Spanish newspapers are reporting that an official in Catalonia who favors separating the region from Spain may have had his smartphone compromised with industrial grade spyware typically used only by law enforcement and counterterrorism agencies. The President of the Parliament of Catalonia Roger Torrent claims his phone was hacked for domestic political purposes, which other Catalan leaders argued, too. A spokesperson for the Spanish government said “[t]he government has no evidence that the speaker of the Catalan parliament has been the victim of a hack or theft involving his mobile.” However, the University of Toronto’s CitizenLab, the entity that researched and claimed that Israeli firm NSO Group’s spyware was deployed via WhatsApp to spy on a range of journalists, officials, and dissidents, often by their own governments, confirmed that Torrent’s phone was compromised.
  • “While America Looks Away, Autocrats Crack Down on Digital News Sites” – The New York Times. The Trump Administration’s combative relationship with the media in the United States may be encouraging other nations to crack down on digital media outlets trying to hold those governments to account.
  •  “How Facebook Handles Climate Disinformation” – The New York Times. Even though the social media giant has moved aggressively to take down false and inaccurate COVID-19 posts, climate disinformation lives on the social media platform largely unmolested for a couple of reasons. First, Facebook marks these sorts of posts as opinion and take the approach that opinions should be judged under an absolutist free speech regime. Moreover, Facebook asserts posts of this sort do not pose any imminent harm and therefore do not need to be taken down. Despite having teams of fact checkers to vet posts of demonstrably untrue information, Facebook chooses not to, most likely because material that elicits strong reactions from users drive engagement that, in turn, drives advertising dollars.
  • “Here’s how President Trump could go after TikTok” – The Washington Post. This piece lays out two means the Trump Administration could employ to press ByteDance in the immediate future: use of the May 2019 Executive Order “Securing the Information and Communications Technology and Services Supply Chain” or the Committee on Foreign Investment in the United States process examining ByteDance’s acquisition of Musical.ly, the app that became TikTok. Left unmentioned in this article is the possibility of the Federal Trade Commission (FTC) examining its 2019 settlement with ByteDance over violations of the “Children’s Online Privacy Protection Act” (COPPA).
  • “You’re Doomscrolling Again. Here’s How to Snap Out of It.” – The New York Times. If you find yourself endlessly looking through social media feeds, this piece explains why and how you might stop doing so.
  • “UK selling spyware and wiretaps to 17 repressive regimes including Saudi Arabia and China” – The Independent. There are allegations that the British government has ignored its own regulations on selling equipment and systems that can be used for surveillance and spying to other governments with spotty human rights records. Specifically, the United Kingdom (UK) has sold £75m worth of such technology to countries that non-governmental organizations (NGOs) rate as “not free.” The buyers include nations such as the People’s Republic of China (PRC), the Kingdom of Saudi Arabia, Bahrain, and others. Not surprisingly, NGOs and the minority Labour party are calling for an investigation and changes.
  • “Google sued for allegedly tracking users in apps even after opting out” – c/net. Boies Schiller Flexner filed a suit, which will undoubtedly seek class action status, over Google’s alleged practice of continuing to track users even when they turned off tracking features. This follows a suit filed by the same firm against Google in June, claiming its browser Chrome still tracks people when they switch to incognito mode.
  • “Secret Trump order gives CIA more powers to launch cyberattacks” – Yahoo! News. It turns out that in addition to signing National Security Presidential Memorandum (NSPM) 13, which revamped and eased offensive cyber operations for the Department of Defense, President Donald Trump signed a presidential finding that has allowed the Central Intelligence Agency (CIA) to launch its own offensive cyber attacks, mainly at Russia and Iran, according to unnamed former United States (US) officials cited in this blockbuster story. Now the decision to commence an attack is not vetted by the National Security Council; rather, the CIA makes the decision. Consequently, there have been a number of attacks on US adversaries that until now have not been associated with the US. And the CIA is apparently not informing the National Security Agency or Cyber Command of its operations, raising the risk of US cyber forces working at cross purposes or against one another in cyberspace. Moreover, a recently released report blamed the lax security environment at the CIA for a massive exfiltration of hacking tools released by WikiLeaks.
  • “Facebook’s plan for privacy laws? ‘Co-creating’ them with Congress” – Protocol. In concert with the release of a new white paper, Facebook Deputy Chief Privacy Officer Rob Sherman sat for an interview in which he pledged the company’s willingness to work with Congress to co-develop a national privacy law. However, he would not comment on any of the many privacy bills released thus far or the policy contours of a bill Facebook would favor except for advocating for an enhanced notice and consent regime under which people would be better informed about how their data is being used. Sherman also shrugged off suggestions Facebook may not be welcome given its record of privacy violations. Finally, it bears mention that similar efforts by other companies at the state level have not succeeded as of yet. For example, Microsoft’s efforts in Washington state have not borne fruit in the passage of a privacy law.
  • “Deepfake used to attack activist couple shows new disinformation frontier” – Reuters. We are at the beginning of a new age of disinformation in which fake photographs and video will be used to wage campaigns against nations, causes, and people. An activist and his wife were accused of being terrorist sympathizers by what appeared to be a university student but was apparently an elaborate ruse by someone or some group looking to defame the couple. Small errors gave away the ruse this time, but advances in technology are likely to make detection all the harder.
  • “Biden, billionaires and corporate accounts targeted in Twitter hack” – The Washington Post. Policymakers and security experts were alarmed when the accounts of major figures like Bill Gates and Barack Obama were hacked yesterday by a group seeking to solicit bitcoin. They argue Twitter was lucky this time, and a more ideologically motivated enemy may seek to cause havoc, say, in the United States’ coming election. A number of experts are claiming the penetration of the platform must have involved internal controls for so many high-profile accounts to be taken over at the same time.
  • “TikTok Enlists Army of Lobbyists as Suspicions Over China Ties Grow” – The New York Times. ByteDance’s payments for lobbying services in Washington doubled between the last quarter of 2019 and the first quarter of 2020, as the company has retained more than 35 lobbyists to push back against the Trump Administration’s rhetoric and policy changes. The company is fighting a floated proposal to ban the TikTok app on national security grounds, which would cut it off from another of its top markets after India banned it and scores of other apps from the People’s Republic of China. Even if the Administration does not bar use of the app in the United States, the company faces legislation, to be acted upon next week by a Senate committee, that would ban its use on federal networks and devices. Moreover, ByteDance’s acquisition of the app that became TikTok is undergoing a retrospective national security review by an inter-agency committee that could result in an unwinding of the deal. Additionally, the Federal Trade Commission (FTC) has been urged to review ByteDance’s compliance with a 2019 settlement resolving allegations that the company violated regulations protecting children’s privacy, which could result in multi-billion dollar liability if wrongdoing is found.
  • “Why Google and Facebook Are Racing to Invest in India” – Foreign Policy. With New Delhi banning 59 apps and platforms from the People’s Republic of China (PRC), two American firms have invested in an Indian giant with an eye toward the nearly 500 million Indians not yet online. Reliance Industries’ Jio Platforms has sold stakes to Google and Facebook worth $4.5 billion and $5.7 billion, respectively, giving them prized positions as the company looks to expand into 5G and other online ventures. This will undoubtedly give the United States’ online giants a leg up in vying with competitors for the world’s second most populous nation.
  • “‘Outright Lies’: Voting Misinformation Flourishes on Facebook” – ProPublica. In this piece published with First Draft, “a global nonprofit that researches misinformation,” an analysis of the most popular claims made about mail voting shows that many of them are inaccurate or false, thus violating the platform’s terms of service; yet Facebook had done nothing to remove them or mark them as inaccurate until this article was being written.
  • “Inside America’s Secretive $2 Billion Research Hub” – Forbes. Using contract information obtained through Freedom of Information requests and interviews, light is shed on the little-known non-profit MITRE Corporation that has been helping the United States government address numerous technological problems since the late 1950s. The article uncovers some of its latest, federally funded projects that are raising eyebrows among privacy advocates: technology to lift people’s fingerprints from social media pictures, technology to scan and copy Internet of Things (IoT) devices from a distance, a scanner to read a person’s DNA, and others.
  • “The FBI Is Secretly Using A $2 Billion Travel Company As A Global Surveillance Tool” – Forbes. In his second blockbuster article in a week, Forbes reporter Thomas Brewster exposes how the United States (US) government is using questionable court orders to gather travel information from the three companies that essentially provide airlines, hotels, and other travel entities with back-end functions for reservations and bookings. The three companies, one of which, Sabre, is a US multinational, have masses of information on you if you have ever traveled, and US law enforcement agencies, namely the Federal Bureau of Investigation, are using a 1789 statute to obtain orders all three companies have to obey for information in tracking suspects. Allegedly, this capability has only been used to track terror suspects but will now reportedly be used for COVID-19 tracking.
  • “With Trump CIA directive, the cyber offense pendulum swings too far” – Yahoo! News. Former United States (US) National Coordinator for Security, Infrastructure Protection, and Counter-terrorism Richard Clarke argues against the Central Intelligence Agency (CIA) having carte blanche in conducting cyber operations without the review or input of other federal agencies. He suggests that the CIA in particular, and agencies in general, tend to push their authority to the extreme, which in this case could lead to incidents and lasting precedents in cyberspace that may haunt the US. Clarke also intimated that it may have been the CIA and not Israel that launched cyber attacks on infrastructure facilities in Tehran this month and last.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

EDPB Opines Encryption Ban Would Endanger A Nation’s Compliance with GDPR

As the US and others call on technology companies to develop the means to crack encrypted communications, an EU entity argues any nation with such a law would likely not meet the GDPR’s requirements.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

In response to a letter from a Member of the European Parliament (MEP), the European Data Protection Board (EDPB) articulated its view that any nation implementing an “encryption ban” would endanger its compliance with the General Data Protection Regulation (GDPR) and possibly prevent companies domiciled in those countries from transferring and processing the personal data of EU citizens. However, as always, it bears noting that the EDPB’s view may not carry the day with the European Commission, Parliament, and courts.

The EDPB’s letter comes amidst another push by the Trump Administration, Republican allies in Congress, and other nations to have technology companies develop workarounds or backdoors to their end-to-end encrypted devices, apps, and systems. The proponents of this change claim online child sexual predators, terrorists, and other criminals are using products and services like WhatsApp, Telegram, and iPhones to defeat legitimate, targeted government surveillance and enforcement. They reason that unless technology companies abandon their unnecessarily absolutist position and work towards a technological solution, the number of bad actors communicating in ways that cannot be broken (aka “going dark”) will increase, allowing for greater crime and wrongdoing.

On the other side of the issue, technology companies, civil liberties and privacy experts, and computer scientists argue that any weakening of or backdoors to encryption will eventually be stolen and exposed, making it easier for criminals to hack, steal, and exfiltrate. They assert the internet and digital age are built on secure communications and threatening this central feature would wreak havoc beyond the crimes the US and other governments are seeking to prevent.

The EDPB stated

Any ban on encryption or provisions weakening encryption would undermine the GDPR obligations on the concerned controllers and processors for an effective implementation of both data protection principles and the appropriate technical and organisational measures. Similar considerations apply to transfers to controllers or processors in any third countries adopting such bans or provisions. Security measures are therefore specifically mentioned among the elements the European Commission must take into account when assessing the adequacy of the level of protection in a third country. In the absence of such a decision, transfers are subject to appropriate safeguards or may be based on derogations; in any case the security of the personal data has to be ensured at all times.

The EDPB opined “that any encryption ban would seriously undermine compliance with the GDPR.” The EDPB continued, “[m]ore specifically, whatever the instrument used, it would represent a major obstacle in recognising a level of protection essentially equivalent to that ensured by the applicable data protection law in the EU, and would seriously question the ability of the concerned controllers and processors to comply with the security obligation of the regulation.”

The EDPB’s view is being articulated at a time when, as noted, a number of nations led by the United States (US) continue to press technology companies to allow them access to communications, apps, platforms, and devices that are encrypted. Last year, the US, United Kingdom, Australia, New Zealand, and Canada (the so-called Five Eyes nations) met, and in one of the resulting communiqués the Five Eyes ministers asserted that

We are concerned where companies deliberately design their systems in a way that precludes any form of access to content, even in cases of the most serious crimes. This approach puts citizens and society at risk by severely eroding a company’s ability to identify and respond to the most harmful illegal content, such as child sexual exploitation and abuse, terrorist and extremist material and foreign adversaries’ attempts to undermine democratic values and institutions, as well as law enforcement agencies’ ability to investigate serious crime.

The five nations contended that “[t]ech companies should include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can obtain access to data in a readable and usable format.” The Five Eyes also claimed that “[t]hose companies should also embed the safety of their users in their system designs, enabling them to take action against illegal content…[and] [a]s part of this, companies and Governments must work together to ensure that the implications of changes to their services are well understood and that those changes do not compromise public safety.”

The Five Eyes applauded “approaches like Mark Zuckerberg’s public commitment to consulting Governments on Facebook’s recent proposals to apply end-to-end encryption to its messaging services…[and] [t]hese engagements must be substantive and genuinely influence design decisions.”

The Five Eyes added

We share concerns raised internationally, inside and outside of government, about the impact these changes could have on protecting our most vulnerable citizens, including children, from harm. More broadly, we call for detailed engagement between governments, tech companies, and other stakeholders to examine how proposals of this type can be implemented without negatively impacting user safety, while protecting cyber security and user privacy, including the privacy of victims.

In October 2019, in an open letter to Facebook CEO Mark Zuckerberg, US Attorney General William P. Barr, United Kingdom Home Secretary Priti Patel, Australia’s Minister for Home Affairs Peter Dutton, and then acting US Homeland Security Secretary Kevin McAleenan asked “that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services without ensuring that there is no reduction to user safety and without including a means for lawful access to the content of communications to protect our citizens.” In Facebook’s December 2019 response, Facebook Vice President and WhatsApp Head Will Cathcart and Facebook Vice President and Messenger Head Stan Chudnovsky stated “[c]ybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere…[and] [t]he ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.”

However, one of the Five Eyes nations has already taken legislative action to force technology companies and individuals to cooperate with law enforcement investigations in ways that could threaten encryption. In December 2018, Australia enacted the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). As the Office of Australia’s Information Commissioner (OAIC) wrote of TOLA, “[t]he powers permitted under the Act have the potential to significantly weaken important privacy rights and protections under the Privacy Act…[and] [t]he encryption technology that can obscure criminal communications and pose a threat to national security is the same technology used by ordinary citizens to exercise their legitimate rights to privacy.”

In a related development, this week Australia’s Independent National Security Legislation Monitor (INSLM) issued its report on the “Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018” (TOLA). The Parliamentary Joint Committee on Intelligence and Security had requested that the INSLM review the statute, and so the INSLM engaged in a lengthy review, including input from the public. As explained in the report’s preface, the “INSLM independently reviews the operation, effectiveness and implications of national security and counter-terrorism laws; and considers whether the laws contain appropriate protections for individual rights, remain proportionate to terrorism or national security threats, and remain necessary.”

INSLM claimed

In this report I reject the notion that there is a binary choice that must be made between the effectiveness of agencies’ surveillance powers in the digital age on the one hand and the security of the internet on the other. Rather, I conclude that what is necessary is a law which allows agencies to meet technological challenges, such as those caused by encryption, but in a proportionate way and with proper rights protection. Essentially this can be done by updating traditional safeguards to meet those same technological challenges – notably, those who are trusted to authorise intrusive search and surveillance powers must be able to understand the technological context in which those powers operate, and their consequences. If, but only if, the key recommendations I set out in this report in this regard are adopted, TOLA will be such a law.

INSLM stated “[t]he essential effects of TOLA are as follows:

a. Schedule 1 gives police and intelligence agencies new powers to agree or require significant industry assistance from communications providers.

b. Schedules 2, 3 and 4 update existing powers and, in some cases, extend them to new agencies.

c. Schedule 5 gives the Australian Security Intelligence Organisation (ASIO) significant new powers to seek and receive both voluntary and compulsory assistance.

INSLM found

  • In relation to Schedule 1, for the reasons set out in greater detail in the report, Technical Assistance Notice (TANs) and Technical Capability Notice (TCNs) should be authorised by a body which is independent of the issuing agency or government. These are powers designed to compel a Designated Communications Provider (DCP) to reveal private information or data of its customers and therefore the usual practice of independent authorisation should apply.
  • I am satisfied that the computer access warrant and associated powers conferred by Schedule 2 are both necessary and proportionate, subject to some amendments.
  • I am generally satisfied that the powers conferred by Schedules 3 and 4 are both necessary and proportionate, but there are some matters that should be addressed and further monitored.
  • I have concluded that Schedule 5 should be amended to limit its breadth and clarify its scope.


Image by OpenClipart-Vectors from Pixabay