EU Regulators Settle Dispute Over Proper Punishment of Twitter For Breach

The EDPB uses its GDPR powers to manage a dispute between DPAs.

The European Data Protection Board (EDPB) concluded its first use of powers granted under the General Data Protection Regulation (GDPR) (Regulation (EU) 2016/679 of the European Parliament and of the Council) to resolve a dispute among EU regulators over how to apply the GDPR in punishing a violator. In this case, the EDPB had to referee how Twitter should be punished for a data breach arising from a bug in its Android app. Ireland’s Data Protection Commission (DPC) and unnamed concerned supervisory authorities (CSAs) disagreed about how Twitter should be fined for the GDPR breach, triggering a previously unused article of the GDPR that put the EDPB in charge of resolving the dispute. The EDPB considered the objections raised by other EU agencies and found that the DPC needed to recalculate its proposed fine, which had been set at a maximum of $300,000 out of a possible $69.2 million. Thereafter, the DPC revised its decision and determined that “an administrative fine of €450,000 on Twitter” is “an effective, proportionate and dissuasive measure.”

The DPC issued a revised decision incorporating the EDPB’s decision in the case, which arose from a glitch that changed a person’s protected tweets to unprotected. Twitter users may protect their tweets, meaning only certain people, usually just followers, can see this content. However, a bug in Twitter’s Android app thwarted a person’s choice to protect their tweets, as the DPC explained:

The bug that resulted in this data breach meant that, if a user operating an Android device changed the email address associated with that Twitter account, their tweets became unprotected and consequently were accessible to the wider public without the user’s knowledge.

The DPC said this breach occurred between September 2017 and January 2019, affecting 88,726 EU and European Economic Area (EEA) users, and on 8 January 2019, Twitter alerted the DPC, triggering an investigation. Twitter revealed:

On 26 December 2018, we received a bug report through our bug bounty program that if a Twitter user with a protected account, using Twitter for Android, changed their email address the bug would result in their account being unprotected.

Article 33(1) of the GDPR requires breaches to be reported to a DPA within 72 hours in most cases:

In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay.
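As a rough illustration (not part of the DPC’s decision), the Article 33(1) clock runs from the moment the controller becomes aware of the breach. A minimal sketch of the deadline arithmetic, using hypothetical timestamps chosen to mirror the dates in this case:

```python
from datetime import datetime, timedelta

# Hypothetical timestamps for illustration only; the DPC's decision
# does not state times of day.
became_aware = datetime(2019, 1, 3, 10, 0)   # controller becomes aware of the breach
notified = datetime(2019, 1, 8, 9, 30)       # notification reaches the supervisory authority

deadline = became_aware + timedelta(hours=72)  # Article 33(1) outer limit
late = notified > deadline
delay = notified - deadline if late else timedelta(0)

print(f"deadline: {deadline}, late: {late}, delay beyond 72h: {delay}")
```

With awareness on 3 January and notification on 8 January, the notification falls roughly two days outside the 72-hour window, which is consistent with the DPC’s finding of non-compliance with Article 33(1).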

However, Twitter offered this explanation as to why it had not reported the breach within the 72-hour window:

The severity of the issue – and that it was reportable – was not appreciated until 3 January 2019 at which point Twitter’s incident response process was put into action.

Additionally, Article 33(5) would become relevant during the DPC investigation:

The controller shall document any personal data breaches, comprising the facts relating to the personal data breach, its effects and the remedial action taken. That documentation shall enable the supervisory authority to verify compliance with this Article.

Consequently, Twitter, as the controller, had a responsibility to document all the relevant facts about the data breach and to report the breach within 72 hours of becoming aware of it, subject to a range of exceptions.

Shortly thereafter, the DPC named itself the lead supervisory authority (LSA), investigated, and reached its proposed decision in late April, which it then circulated to the other concerned supervisory authorities under Article 60 of the GDPR. And this is where the need for the EDPB to step in began.

Irish Data Protection Commissioner Helen Dixon explained the scope of the subsequent investigation:

  1. Whether Twitter International Company (TIC) complied with its obligations, in accordance with Article 33(1) GDPR, to notify the Commission of the Breach without undue delay and, where feasible, not later than 72 hours after having become aware of it; and
  2. Whether TIC complied with its obligation under Article 33(5) to document the Breach.

Dixon found that TIC did not comply with Article 33(1) and found unpersuasive TIC’s main claim that, because Twitter, Inc., its processor under EU law, did not alert TIC in a timely fashion, TIC did not need to meet the 72-hour window. Moreover, Dixon found TIC did not meet its Article 33(5) obligation to document the breach in a way that would allow its compliance with Article 33 to be verified. However, the size of the fine became the issue necessitating that the EDPB step in, because the Austrian Supervisory Authority (Österreichische Datenschutzbehörde), the Hamburg Supervisory Authority (Der Hamburgische Beauftragte für Datenschutz und Informationsfreiheit) and the Italian Supervisory Authority (Garante per la protezione dei dati personali) made “relevant and reasoned” objections.

Per the GDPR, the EDPB intervened. Article 60 of the GDPR provides that if a CSA “expresses a relevant and reasoned objection to the draft decision [of the LSA], the lead supervisory authority shall, if it does not follow the relevant and reasoned objection or is of the opinion that the objection is not relevant or reasoned, submit the matter to the consistency mechanism.” Article 65 likewise provides that where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned,” the EDPB must step in and work towards a final binding decision. This process was established so that enforcement of the GDPR would be uniform throughout the EU and to forestall the possibility that one DPA or a small group of DPAs would construe the data protection regime in ways contrary to its intention. As it is, there have already been allegations that some DPAs have been ineffective or lenient towards alleged offenders.

In its mid-November statement, the EDPB said it “adopted by a 2/3 majority of its members its first dispute resolution decision on the basis of Art. 65 GDPR.” The EDPB stated:

The Irish supervisory authority (SA) issued the draft decision following an own-volition inquiry and investigations into Twitter International Company, after the company notified the Irish SA of a personal data breach on 8 January 2019. In May 2020, the Irish SA shared its draft decision with the concerned supervisory authorities (CSAs) in accordance with Art. 60(3) GDPR. The CSAs then had four weeks to submit their relevant and reasoned objections (RROs). Among others, the CSAs issued RROs on the infringements of the GDPR identified by the lead supervisory authority (LSA), the role of Twitter International Company as the (sole) data controller, and the quantification of the proposed fine.

It appears from the EDPB’s statement that other DPAs/SAs had objected to the size of the fine (which can be as high as 2% of annual worldwide turnover for these infringements), to how Twitter violated the GDPR, and to Twitter’s culpability based on whether it was the sole controller of the personal data or whether other controllers might also be held responsible.

According to the DPC, the EDPB ultimately decided that

…the [DPC] is required to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC, and to amend its Draft Decision by increasing the level of the fine in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.

Dixon went back and reasoned through the breach and compliance. She stressed that the GDPR infringements were largely separate from the substance of the breach itself, which is why the administrative fine was low. Nonetheless, Dixon reexamined the evidence in light of the EDPB’s decision and concluded in relevant part:

  • I therefore consider that the nature of the obligations arising under Article 33(1) and Article 33(5) are such that, compliance is central to the overall functioning of the supervision and enforcement regime performed by supervisory authorities in relation to both the specific issue of personal data breaches but also the identification and assessment of wider issues of non-compliance by controllers. As such, non-compliance with these obligations has serious consequences in that it risks undermining the effective exercise by supervisory authorities of their functions under the GDPR. With regard to the nature of the specific infringements in these circumstances, it is clear, having regard to the foregoing, that in the circumstances of this case, the delayed notification under Article 33(1) inevitably delayed the Commission’s assessment of the Breach. With regard to Article 33(5), the deficiencies in the “documenting” of the Breach by TIC impacted on the Commission’s overall efficient assessment of the Breach, necessitating the raising of multiple queries concerning the facts and sequencing surrounding the notification of the Breach.
  • Accordingly, having regard to the potential for damage to data subjects caused by the delayed notification to the Commission (which I have set out above in the context of Article 83(2)(a)), the corollary of this is that any category of personal data could have been affected by the delayed notification. Whilst, as stated above, there was no direct evidence of damage, at the same time, it cannot be definitively said that there was no damage to data subjects or no affected categories of personal data.

Dixon also recalculated the fine, which she noted was capped at the higher of €10 million or 2% of annual worldwide turnover, after once again turning aside TIC’s argument that it is independent of Twitter, Inc. for purposes of determining a fine. Dixon determined the appropriate administrative fine would be about $500,000; Twitter’s worldwide revenue was $3.46 billion in 2019 (meaning a maximum penalty of $69.2 million). Dixon explained:
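The cap Dixon applied comes from Article 83(4) GDPR: the higher of €10 million or 2% of total worldwide annual turnover, whichever is greater. A minimal sketch of that arithmetic, using the figures reported here (Twitter’s 2019 revenue of $3.46 billion; mixing the euro floor with a dollar-denominated turnover is a simplification for illustration):

```python
# Article 83(4) GDPR cap: the higher of EUR 10M or 2% of worldwide annual turnover.
# Currency conversion is ignored here for simplicity.
TEN_MILLION = 10_000_000

def article_83_4_cap(annual_turnover: float) -> float:
    """Return the maximum administrative fine under Article 83(4)."""
    return max(TEN_MILLION, 0.02 * annual_turnover)

twitter_2019_revenue = 3_460_000_000  # USD, as reported above
cap = article_83_4_cap(twitter_2019_revenue)
print(f"maximum fine: ${cap:,.0f}")  # 2% of $3.46B = $69.2M, matching the figure above

fine = 500_000  # the fine Dixon imposed, in USD
print(f"imposed fine as a share of the cap: {fine / cap:.2%}")
```

The imposed fine of roughly $500,000 thus works out to well under 1% of the maximum available under Article 83(4), which helps explain the CSAs’ objections that the original proposed range was too low.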

Having regard to all of the foregoing, and, in particular, having had due regard to all of the factors which I am required to consider under Articles 83(2)(a) to (k), as applicable, and in the interests of effectiveness, proportionality and deterrence, and in light of the re-assessment of the elements I have implemented and documented above in accordance with the EDPB Decision, I have decided to impose an administrative fine of $500,000, which equates (in my estimation for this purpose) to €450,000. In deciding to impose a fine in this amount, I have had regard to the previous range of the fine, set out in the Draft Decision (of $150,000 – $300,000), and to the binding direction in the EDPB Decision, at paragraph 207 thereof, that the level of the fine should be increased “..in order to ensure it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality established by Article 83(1) GDPR and taking into account the criteria of Article 83(2) GDPR.”

In its Article 65 decision, the EDPB judged the various objections to the DPC’s proposed decision against Article 4(24) of the GDPR:

‘relevant and reasoned objection’ means an objection to a draft decision as to whether there is an infringement of this Regulation, or whether envisaged action in relation to the controller or processor complies with this Regulation, which clearly demonstrates the significance of the risks posed by the draft decision as regards the fundamental rights and freedoms of data subjects and, where applicable, the free flow of personal data within the Union;

The EDPB ultimately decided “the fine proposed in the Draft Decision is too low and therefore does not fulfil its purpose as a corrective measure, in particular it does not meet the requirements of Article 83(1) GDPR of being effective, dissuasive and proportionate.” The EDPB directed the DPC “to re-assess the elements it relies upon to calculate the amount of the fixed fine to be imposed on TIC so as to ensure it is appropriate to the facts of the case.” However, the EDPB turned aside a number of other objections raised by EU DPAs as failing to meet the standard of review in Article 4(24):

  • the competence of the LSA;
  • the qualification of the roles of TIC and Twitter, Inc., respectively;
  • the infringements of the GDPR identified by the LSA;
  • the existence of possible additional (or alternative) infringements of the GDPR;
  • the lack of a reprimand.

However, the EDPB stressed:

Regarding the objections deemed not to meet the requirements stipulated by Art 4(24) GDPR, the EDPB does not take any position on the merit of any substantial issues raised by these objections. The EDPB reiterates that its current decision is without any prejudice to any assessments the EDPB may be called upon to make in other cases, including with the same parties, taking into account the contents of the relevant draft decision and the objections raised by the CSAs.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


EDPB Concludes First Use of Powers To Resolve Differences Between DPAs in Twitter Enforcement Action

The EDPB announces but does not yet release its decision on the dispute between SAs in the EU over the appropriate punishment for Twitter’s data breaches.

The European Data Protection Board (EDPB) has used its powers under the General Data Protection Regulation (GDPR) for the first time to resolve a dispute between data protection authorities (DPA) in the European Union (EU) over an enforcement action. Unidentified DPAs had objected to the proposed action Ireland’s Data Protection Commission (DPC) had circulated, obligating the EDPB to use its Article 65 powers to craft a resolution to the disputed part of the action. The enforcement action concerned Twitter data breaches in 2018 and 2019. Now, the DPC has a month to craft a decision on the basis of the EDPB decision unless it challenges that decision in the Court of Justice of the European Union (CJEU).

The DPC submitted its draft decision on the Twitter breach to other DPAs in May in accordance with Article 60 of the GDPR. The DPC stated “[t]he draft decision focusses on whether Twitter International Company has complied with Articles 33(1) and 33(5) of the GDPR” (i.e., the provisions pertaining to data breach notification and documentation). The DPC further explained:

  • This draft decision is one of a number of significant developments in DPC inquiries into “big tech” companies this week. Deputy Commissioner Graham Doyle has confirmed that: “In addition to submitting this draft decision to other EU supervisory authorities, we have this week sent a preliminary draft decision to WhatsApp Ireland Limited for their final submissions which will be taken in to account by the DPC before preparing a draft decision in that matter also for Article 60 purposes. The inquiry into WhatsApp Ireland examines its compliance with Articles 12 to 14 of the GDPR in terms of transparency including in relation to transparency around what information is shared with Facebook.”
  • The DPC has also completed the investigation phase of a complaint-based inquiry which focuses on Facebook Ireland’s obligations to establish a lawful basis for personal data processing. This inquiry is now in the decision-making phase at the DPC.

In its statement this week, the EDPB said it “adopted by a 2/3 majority of its members its first dispute resolution decision on the basis of Art. 65 GDPR.” The EDPB stated:

The Irish supervisory authority (SA) issued the draft decision following an own-volition inquiry and investigations into Twitter International Company, after the company notified the Irish SA of a personal data breach on 8 January 2019. In May 2020, the Irish SA shared its draft decision with the concerned supervisory authorities (CSAs) in accordance with Art. 60(3) GDPR. The CSAs then had four weeks to submit their relevant and reasoned objections (RROs). Among others, the CSAs issued RROs on the infringements of the GDPR identified by the lead supervisory authority (LSA), the role of Twitter International Company as the (sole) data controller, and the quantification of the proposed fine.

It appears from the EDPB’s statement that other DPAs/SAs had objected to the size of the fine (which can be as high as 2% of annual worldwide turnover for these infringements), to how Twitter violated the GDPR, and to Twitter’s culpability based on whether it was the sole controller of the personal data or whether other controllers might also be held responsible.

The EDPB asserted:

The Irish SA shall adopt its final decision on the basis of the EDPB decision, which will be addressed to the controller, without undue delay and at the latest one month after the EDPB has notified its decision. The LSA and CSAs shall notify the EDPB of the date the final decision was notified to the controller. Following this notification, the EDPB will publish its decision on its website.

The EDPB also published FAQs on the Article 65 procedure.

More recently, the EDPB issued draft guidance construing a key authority in the GDPR designed to guide and coordinate investigations that cross borders in the European Union (EU). An LSA is supposed to consider “relevant and reasoned objections” to draft decisions submitted by CSAs. If an LSA rejects such feedback, then the GDPR action gets kicked over to the EDPB. However, since this has only happened once, the EDPB thought it appropriate to define the term so that all the EU DPAs would understand which objections are relevant and reasoned.

The EDPB explained that the guidance “aims at establishing a common understanding of the notion of the terms ‘relevant and reasoned’, including what should be considered when assessing whether an objection ‘clearly demonstrates the significance of the risks posed by the draft decision.’” The EDPB noted that because “the unfamiliarity surrounding ‘what constitutes relevant and reasoned objection’ has the potential to create misunderstandings and inconsistent applications by the supervisory authorities,” the EU legislator suggested that the EDPB issue guidelines on this concept (end of Recital 124 GDPR).

Article 60 of the GDPR provides that if a CSA “expresses a relevant and reasoned objection to the draft decision [of the LSA], the lead supervisory authority shall, if it does not follow the relevant and reasoned objection or is of the opinion that the objection is not relevant or reasoned, submit the matter to the consistency mechanism.” Article 65 likewise provides that where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned,” the EDPB must step in and work towards a final binding decision. This process was established so that enforcement of the GDPR would be uniform throughout the EU and to forestall the possibility that one DPA or a small group of DPAs would construe the data protection regime in ways contrary to its intention. As it is, there have already been allegations that some DPAs have been ineffective or lenient towards alleged offenders.



Further Reading, Other Developments, and Coming Events (3 November)

Further Reading

  • “How Facebook and Twitter plan to handle election day disinformation” By Sam Dean — The Los Angeles Times; “How Big Tech is planning for election night” By Heather Kelly — The Washington Post; and “What to Expect From Facebook, Twitter and YouTube on Election Day” By Mike Isaac, Kate Conger and Daisuke Wakabayashi — The New York Times. Social media platforms have prepared for today and will use a variety of measures to combat lies, disinformation, and misinformation from sources foreign and domestic. Incidentally, my read is that these tech companies are more worried, as measured by resource allocation, about problematic domestic content.
  • “QAnon received earlier boost from Russian accounts on Twitter, archives show” By Joseph Menn — Reuters. Researchers have delved into Twitter’s archive of taken down or suspended Russian accounts and are now claiming that the Russian Federation was pushing and amplifying the QAnon conspiracy in the United States (U.S.) as early as November 2017. This revised view of Russian involvement differs from conclusions reached in August that Russia played a small but significant role in the proliferation of the conspiracy.
  • “Facial recognition used to identify Lafayette Square protester accused of assault” By Justin Jouvenal and Spencer S. Hsu — The Washington Post. When police were firing pepper balls and rolling gas canisters towards protestors in Lafayette Park in Washington, D.C. on 1 June 2020, a protestor was filmed assaulting two officers. A little-known, not publicly revealed facial recognition platform available to many federal, state, and local law enforcement agencies in the Capital area produced a match from the footage. Now, a man stands accused of crimes during a Black Lives Matter march, and civil liberties and privacy advocates are objecting to the use of the National Capital Region Facial Recognition Investigative Leads System (NCRFRILS) on a number of grounds, including the demonstrated weakness of these systems in accurately identifying people of color, the fact it has been used but not disclosed to a number of defendants, and the potential chilling effect it will have on people going to protests. Law enforcement officials claim there are strict privacy and process safeguards and an identification alone cannot be used as the basis of an arrest.
  • “CISA’s political independence from Trump will be an Election Day asset” By Joseph Marks — The Washington Post. The United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) has established its bona fides with nearly all stakeholders inside and outside the U.S. government since its creation in 2018. Many credit its Director, Christopher Krebs, and many also praise the agency for conveying information and tips about cyber and other threats without incurring the ire of a White House and President antithetical to any hint of Russian hacking or interference in U.S. elections. However, today and the next few days may prove the agency’s mettle as it will be in uncharted territory trying to tamp down fear about unfounded rumors and looking to discredit misinformation and disinformation in close to real time.
  • “Trump allies, largely unconstrained by Facebook’s rules against repeated falsehoods, cement pre-election dominance” By Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. Yet another article showing that if Facebook has a bias, it is against liberal viewpoints and content, for conservative material is allowed even if it breaks the rules of the platform. Current and former employees and an analysis support this finding. The Trump family has been among those who have benefitted from the kid gloves used by the company regarding posts that would have likely resulted in consequences for other users. However, smaller conservative outlets and less prominent conservative figures have faced consequences for content that violates Facebook’s standards.

Other Developments

  • The European Union’s (EU) Executive Vice-President Margrethe Vestager made a speech titled “Building trust in technology,” in which she previewed one long-awaited draft EU law on technology and another to address antitrust and anti-competitive practices of large technology companies. Vestager stated “in just a few weeks, we plan to publish two draft laws that will help to create a more trustworthy digital world.” Both drafts are expected on 2 December and represent key pieces of the new EU leadership’s Digital Strategy, the bloc’s initiative to update EU laws to account for changes in technology since the beginning of the century. The Digital Services Act will address and reform the legal treatment of both online commerce and online content. The draft Digital Markets Act would give the European Commission (EC) more tools to combat the same competition and market dominance issues the United States (U.S.), Australia, and other nations are starting to tackle vis-à-vis companies like Apple, Amazon, Facebook, and Google.
    • Regarding the Digital Services Act, Vestager asserted:
      • One part of those rules will be the Digital Services Act, which will update the E-Commerce Directive, and require digital services to take more responsibility for dealing with illegal content and dangerous products. 
      • Those services will have to put clear and simple procedures in place, to deal with notifications about illegal material on their platforms. They’ll have to make it harder for dodgy traders to use the platform, by checking sellers’ IDs before letting them on the platform. And they’ll also have to provide simple ways for users to complain, if they think their material should not have been removed – protecting the right of legitimate companies to do business, and the right of individuals to freedom of expression. 
      • Those new responsibilities will help to keep Europeans just as safe online as they are in the physical world. They’ll protect legitimate businesses, which follow the rules, from being undercut by others who sell cheap, dangerous products. And by applying the same standards, all over Europe, they’ll make sure every European can rely on the same protection – and that digital businesses of all sizes can easily operate throughout Europe, without having to meet the costs of complying with different rules in different EU countries. 
      • The new rules will also require digital services – especially the biggest platforms – to be open about the way they shape the digital world that we see. They’ll have to report on what they’ve done to take down illegal material. They’ll have to tell us how they decide what information and products to recommend to us, and which ones to hide – and give us the ability to influence those decisions, instead of simply having them made for us. And they’ll have to tell us who’s paying for the ads that we see, and why we’ve been targeted by a certain ad.  
      • But to really give people trust in the digital world, having the right rules in place isn’t enough. People also need to know that those rules really work – that even the biggest companies will actually do what they’re supposed to. And to make sure that happens, there’s no substitute for effective enforcement.  
      • And effective enforcement is a vital part of the draft laws that we’ll propose in December. For instance, the Digital Services Act will improve the way national authorities cooperate, to make sure the rules are properly enforced, throughout the EU. Our proposal won’t change the fundamental principle, that digital services should be regulated by their home country. But it will set up a permanent system of cooperation that will help those regulators work more effectively, to protect consumers all across Europe. And it will give the EU power to step in, when we need to, to enforce the rules against some very large platforms.
    • Vestager also discussed the competition law the EC hopes to enact:
      • So, to keep our markets fair and open to competition, it’s vital that we have the right toolkit in place. And that’s what the second set of rules we’re proposing – what we call the Digital Markets Act – is for. 
      • That proposal will have two pillars. The first of those pillars will be a clear list of dos and don’ts for big digital gatekeepers, based on our experience with the sorts of behaviour that can stop markets working well. 
      • For instance, the decisions that gatekeepers take, about how to rank different companies in search results, can make or break businesses in dozens of markets that depend on the platform. And if platforms also compete in those markets themselves, they can use their position as player and referee to help their own services succeed, at the expense of their rivals. For instance, gatekeepers might manipulate the way that they rank different businesses, to show their own services more visibly than their rivals’. So, the proposal that we’ll put forward in a few weeks’ time will aim to ban this particular type of unfair self-preferencing. 
      • We also know that these companies can collect a lot of data about companies that rely on their platform – data which they can then use, to compete against those very same companies in other markets. That can seriously damage fairness in these markets – which is why our proposal aims to ban big gatekeepers from misusing their business users’ data in that way. 
      • These clear dos and don’ts will allow us to act much faster and more effectively, to tackle behaviour that we know can stop markets working well. But we also need to be ready for new situations, where digitisation creates deep, structural failures in the way our markets work.  
      • Once a digital company gets to a certain size, with the big network of users and the huge collections of data that brings, it can be very hard for anyone else to compete – even if they develop a much better service. So, we face a constant risk that big companies will succeed in pushing markets to a tipping point, sending them on a rapid, unstoppable slide towards monopoly – and creating yet another powerful gatekeeper. 
      • One way to deal with risks like this would be to stop those emerging gatekeepers from locking users into their platform. That could mean, for instance, that those gatekeepers would have to make it easier for users to switch platform, or to use more than one service. That would keep the market open for competition, by making it easier for innovative rivals to compete. But right now, we don’t have the power to take this sort of action when we see these risks arising. 
      • It can also be difficult to deal with large companies which use the power of their platforms again and again, to take over one related market after another. We can deal with that issue with a series of cases – but the risk is that we’ll always find ourselves playing catch-up, while platforms move from market to market, using the same strategies to drive out their rivals. 
      • The risk, though, is that we’ll have a fragmented system, with different rules in different EU countries. That would make it hard to tackle huge platforms that operate throughout Europe, and to deal with other problems that you find in digital markets in many EU countries. And it would mean that businesses and consumers across Europe can’t all rely on the same protection. 
      • That’s why the second pillar of the Digital Markets Act would put a harmonised market investigation framework in place across the single market, giving us the power to tackle market failures like this in digital markets, and stop new ones from emerging. That would give us a harmonised set of rules that would allow us to investigate certain structural problems in digital markets. And if necessary, we could take action to make these markets contestable and competitive.
  • California Governor Gavin Newsom (D) vetoed one of the bills sent to him to amend the “California Consumer Privacy Act” (CCPA) (AB 375) last week. In mid-October, he signed two bills amending the CCPA, one of which will take effect only if voters do not enact the “California Privacy Rights Act” (CPRA) (Proposition 24) in the November election. If voters do approve the CPRA, the CCPA and its amendments would likely become moot once the CPRA takes effect in 2023.
    • Newsom vetoed AB 1138, which would have amended the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtained explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.” In his veto message, Newsom explained that while he agrees with the spirit of the legislation, it would create unnecessary confusion and overlap with federal law without any increase in protection for children. He signaled an openness to working with the legislature on this issue, however.
    • Newsom signed AB 1281, which extends the CCPA’s carveout for employment-related information from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
    • Newsom also signed AB 713 that would:
      • except from the CCPA information that was deidentified in accordance with specified federal law, or was derived from medical information, protected health information, individually identifiable health information, or identifiable private information, consistent with specified federal policy, as provided.
      • except from the CCPA a business associate of a covered entity, as defined, that is governed by federal privacy, security, and data breach notification rules if the business associate maintains, uses, and discloses patient information in accordance with specified requirements.
      • except information that is collected for, used in, or disclosed in research, as defined.
      • additionally prohibit a business or other person from reidentifying information that was deidentified, unless a specified exception is met.
      • beginning January 1, 2021, require a contract for the sale or license of deidentified information to include specified provisions relating to the prohibition of reidentification, as provided.
  • The Government Accountability Office (GAO) published a report, requested by the chair of a House committee, on a Federal Communications Commission (FCC) program that subsidizes broadband providers serving high-cost areas, typically for people in rural or hard-to-reach locations. Even though the FCC provided nearly $5 billion in 2019 through the Universal Service Fund’s (USF) high-cost program, the agency lacks the data to determine whether the goals of the program are being met. House Energy and Commerce Committee Chair Frank Pallone Jr (D-NJ) had asked for the assessment, which could well form the basis for future changes to how the FCC funds broadband and sets conditions for the use of those funds.
    • The GAO noted:
      • According to stakeholders GAO interviewed, FCC faces three key challenges to accomplish its high-cost program performance goals: (1) accuracy of FCC’s broadband deployment data, (2) broadband availability on tribal lands, and (3) maintaining existing fixed-voice infrastructure and attaining universal mobile service. For example, although FCC adopted a more precise method of collecting and verifying broadband availability data, stakeholders expressed concern the revised data would remain inaccurate if carriers continue to overstate broadband coverage for marketing and competitive reasons. Overstating coverage impairs FCC’s efforts to promote universal voice and broadband since an area can become ineligible for high-cost support if a carrier reports that service already exists in that area. FCC has also taken actions to address the lack of broadband availability on tribal lands, such as making some spectrum available to tribes for wireless broadband in rural areas. However, tribal stakeholders told GAO that some tribes are unable to secure funding to deploy the infrastructure necessary to make use of spectrum for wireless broadband purposes.
    • The GAO concluded:
      • The effective use of USF resources is critical given the importance of ensuring that Americans have access to voice and broadband services. Overall, FCC’s high-cost program performance goals and measures are often not conducive to providing FCC with high-quality performance information. The performance goals lack measurable or quantifiable bases upon which numeric targets can be set. Further, while FCC’s performance measures met most of the key attributes of effective measures, the measures often lacked linkage, clarity, and objectivity. Without such clarity and specific desired outcomes, FCC lacks performance information that could help FCC make better-informed decisions about how to allocate the program’s resources to meet ongoing and emerging challenges to ensuring universal access to voice and broadband services. Furthermore, the absence of public reporting of this information leaves stakeholders, including Congress, uncertain about the program’s effectiveness to deploy these services.
    • The GAO recommended that:
      • The Chairman of FCC should revise the high-cost performance goals so that they are measurable and quantifiable.
      • The Chairman of FCC should ensure high-cost performance measures align with key attributes of successful performance measures, including ensuring that measures clearly link with performance goals and have specified targets.
      • The Chairman of FCC should ensure the high-cost performance measure for the goal of minimizing the universal service contribution burden on consumers and businesses takes into account user-fee leading practices, such as equity and sustainability considerations.
      • The Chairman of FCC should publicly and periodically report on the progress it has made for its high-cost program’s performance goals, for example, by including relevant performance information in its Annual Broadband Deployment Report or the USF Monitoring Report.
  • An Irish court has ruled that the Data Protection Commission (DPC) must cover the legal fees of Maximillian Schrems for litigating and winning the case in which the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). It has been estimated that Schrems’ legal costs in the complaint against Facebook could total between €2 and €5 million, the low end of which would represent 10% of the DPC’s budget this year. In addition, the DPC has itself spent almost €3 million litigating the case.
  • House Transportation and Infrastructure Chair Peter DeFazio (D-OR) and Ranking Member Sam Graves (R-MO) asked that the Government Accountability Office (GAO) review the implications of the Federal Communications Commission’s (FCC) 2019 proposal to allow half of the Safety Band (i.e. 5.9 GHz spectrum) to be used for wireless communications even though the United States (U.S.) Department of Transportation weighed in against the proposal. DeFazio and Graves are asking for a study of “the safety implications of sharing more than half of the 5.9 GHz spectrum band” that is currently “reserved exclusively for transportation safety purposes.” This portion of the spectrum was set aside in 1999 for connected vehicle technologies, according to DeFazio and Graves, and given the rise of automobile technology that requires spectrum, they think the FCC’s proposed proceeding “may significantly affect the efficacy of current and future applications of vehicle safety technologies.” DeFazio and Graves asked the GAO to review the following:
    • What is the current status of 5.9 GHz wireless transportation technologies, both domestically and internationally?
    • What are the views of industry and other stakeholders on the potential uses of these technologies, including scenarios where the 5.9 GHz spectrum is shared among different applications?
    • In a scenario in which the 5.9 GHz spectrum band is shared among different applications, what effect would this have on the efficacy, including safety, of current wireless transportation technology deployments?
    • What options exist for automakers and highway management agencies to ensure the safe deployment of connected vehicle technologies in a scenario in which the 5.9 GHz spectrum band is shared among different applications?
    • How, if at all, have DOT and FCC assessed current and future needs for 5.9 GHz spectrum in transportation applications? 
    • How, if at all, have DOT and FCC coordinated to develop a federal spectrum policy for connected vehicles?
  • The Harvard Law School’s Cyberlaw Clinic and the Electronic Frontier Foundation (EFF) have published “A Researcher’s Guide to Some Legal Risks of Security Research,” which is timely given the pending case before the Supreme Court of the United States that will determine the scope of the “Computer Fraud and Abuse Act” (CFAA). The Cyberlaw Clinic and EFF stated:
    • Just last month, over 75 prominent security researchers signed a letter urging the Supreme Court not to interpret the CFAA, the federal anti-hacking/computer crime statute, in a way that would criminalize swaths of valuable security research.
    • In the report, the Cyberlaw Clinic and EFF explained:
      • This guide overviews broad areas of potential legal risk related to security research, and the types of security research likely implicated. We hope it will serve as a useful starting point for concerned researchers and others. While the guide covers what we see as the main areas of legal risk for security researchers, it is not exhaustive.
    • In Van Buren v. United States, the Court will consider the question of “[w]hether a person who is authorized to access information on a computer for certain purposes violates Section 1030(a)(2) of the Computer Fraud and Abuse Act if he accesses the same information for an improper purpose.” In this case, the defendant was a police officer who took money as part of a sting operation to illegally use his access to Georgia’s database of license plates to obtain information about a person. The Eleventh Circuit Court of Appeals denied his appeal of his conviction under the CFAA per a previous ruling in that circuit that “a defendant violates the CFAA not only when he obtains information that he has no “rightful[]” authorization whatsoever to acquire, but also when he obtains information “for a nonbusiness purpose.”
    • As the Cyberlaw Clinic and EFF noted in the report:
      • Currently, courts are divided about whether the statute’s prohibition on “exceed[ing] authorized access” applies to people who have authorization to access data (for some purpose), but then access it for a (different) purpose that violates a contractual terms of service or computer use policy. Some courts have found that the CFAA covers activities that do not circumvent any technical access barriers, from making fake profiles that violate Facebook’s terms of service to running a search on a government database without permission. Other courts, disagreeing with the preceding approach, have held that a verbal or contractual prohibition alone cannot render access punishable under the CFAA.
  • The United Kingdom’s Institution of Engineering and Technology (IET) and the National Cyber Security Centre (NCSC) have published “Code of Practice: Cyber Security and Safety” to help engineers develop technological and digital products with both safety and cybersecurity in mind. The IET and NCSC explained:
    • The aim of this Code is to help organizations accountable for safety-related systems manage cyber security vulnerabilities that could lead to hazards. It does this by setting out principles that when applied will improve the interaction between the disciplines of system safety and cyber security, which have historically been addressed as distinct activities.
    • The objectives for organizations that follow this Code are: (a) to manage the risk to safety from any insecurity of digital technology; (b) to be able to provide assurance that this risk is acceptable; and (c) where there is a genuine conflict between the controls used to achieve safety and security objectives, to ensure that these are properly resolved by the appropriate authority.
    • It should be noted that the focus of this Code is on the safety and security of digital technology, but it is recognized that addressing safety and cyber security risk is not just a technological issue: it involves people, process, physical and technological aspects. It should also be noted that whilst digital technology is central to the focus, other technologies can be used in pursuit of a cyber attack. For example, the study of analogue emissions (electromagnetic, audio, etc.) may give away information being digitally processed and thus analogue interfaces could be used to provide an attack surface.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Pexels from Pixabay

Further Reading, Other Developments, and Coming Events (27 October)

Further Reading

  • “The Police Can Probably Break Into Your Phone” By Jack Nicas — The New York Times. So, about “Going Dark.” Turns out nations and law enforcement officials have either oversold the barrier that default end-to-end encryption on phones creates or did not understand the access that police were already getting to many encrypted phones. This piece is based in large part on the Upturn report showing that United States (U.S.) law enforcement agencies have multiple means of hacking into encrypted or protected smartphones. The point is made that encryption does not block access so much as make it harder and quite pricey. If an iPhone or Android user stores data in the cloud, then getting access is not a problem, but data encrypted on a phone requires serious technological means to access. But, this article points to another facet of the Upturn report: police have very little in the way of policy or guidance on how to handle data in ways that respect privacy and possibly even the laws of their jurisdictions.
  • “Pornhub Doesn’t Care” By Samantha Cole and Emanuel Maiberg — Vice. One of the world’s biggest pornography sites seems to have a poor track record at taking down non-consensual pornography. A number of women were duped into filming pornography they were told would not be distributed online, or only in certain jurisdictions. The proprietor lied, and now many of them are faced with having these clips turn up again and again on Pornhub and other sites even when the sites use digital fingerprinting of such videos. These technological screening methods can be easily defeated. Worse still, Pornhub, and its parent company, MindGeek, did not start responding to requests from these women to have their videos taken down until they began litigating against the man who had masterminded the filming of the non-consensual videos.
  • “‘Machines set loose to slaughter’: the dangerous rise of military AI” By Frank Pasquale — The Guardian. This long read lays out some of the possibilities that may come to pass if artificial intelligence is used to create autonomous weapons or robots. Most of the outcomes sound like science fiction, but then who could have foreseen a fleet of drones operated by the United States across the Middle East?
  • “How The Epoch Times Created a Giant Influence Machine” By Kevin Roose — The New York Times. An interesting tale of how a fringe publication may be on its way to becoming one of the biggest purveyors of right-wing material online.
  • “Schools Clamored for Seesaw’s App. That Was Good News, and Bad News.” By Stephanie Clifford — The New York Times. The pandemic has led to the rise of another educational app.
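
The claim in the Pornhub item above that fingerprint-based screening "can be easily defeated" comes down to how matching works: an exact fingerprint, such as a cryptographic hash of the file bytes, changes completely if even one bit of the video is altered, so trivial re-encoding or trimming evades it. Platforms therefore use perceptual hashes instead, but those too can be fooled by heavier edits. A minimal sketch of the exact-match brittleness (a short byte string stands in for a real video file; this is illustrative, not any platform's actual system):

```python
import hashlib

# Stand-in for the raw bytes of a video file.
original = b"...video bytes..."

# Simulate a trivial modification: flip a single bit of the first byte.
modified = bytearray(original)
modified[0] ^= 0x01

# An exact fingerprint (SHA-256 of the bytes) of each version.
h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(modified)).hexdigest()

# The two fingerprints share no meaningful similarity, so an
# exact-match blocklist fails against even the smallest edit.
print(h1 == h2)  # False
```

This is why takedown systems rely on perceptual hashing (fingerprints that tolerate re-encoding), though adversarial cropping, mirroring, or overlays can still slip past those.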

Other Developments

  • The United Kingdom’s (UK) Parliamentary Business, Energy and Industrial Strategy (BEIS) Committee wrote a number of companies, including technology firms, “to seek answers in relation to the Committee’s inquiry exploring the extent to which businesses in the UK are exploiting the forced labour of Uyghur in the Xinjiang region of China” according to the committee’s press release. The committee wrote to Amazon and TikTok because, as the chair of the committee, Member of Parliament Nusrat Ghani, asserted:
    • The Australian Strategic Policy Institute’s (ASPI) ‘Uyghurs for Sale’ report names 82 foreign and Chinese companies directly or indirectly benefiting from the exploitation of Uyghur workers in Xinjiang. The companies listed in the Australian Strategic Policy Institute’s report span industries including the fashion, retail and information technology sectors. On the BEIS Committee, we are determined to ask prominent businesses operating in Britain in these sectors what they are doing to ensure their profits are not on the back of forced labour in China. These businesses are trusted by many British consumers and I hope they will repay this faith by coming forward to answer these questions and also take up the opportunity to give evidence to the Business Committee in public.
    • In its March report, the ASPI argued:
      • The Chinese government has facilitated the mass transfer of Uyghur and other ethnic minority citizens from the far west region of Xinjiang to factories across the country. Under conditions that strongly suggest forced labour, Uyghurs are working in factories that are in the supply chains of at least 82 well-known global brands in the technology, clothing and automotive sectors, including Apple, BMW, Gap, Huawei, Nike, Samsung, Sony and Volkswagen.
      • This report estimates that more than 80,000 Uyghurs were transferred out of Xinjiang to work in factories across China between 2017 and 2019, and some of them were sent directly from detention camps. The estimated figure is conservative and the actual figure is likely to be far higher. In factories far away from home, they typically live in segregated dormitories, undergo organised Mandarin and ideological training outside working hours, are subject to constant surveillance, and are forbidden from participating in religious observances. Numerous sources, including government documents, show that transferred workers are assigned minders and have limited freedom of movement.
      • China has attracted international condemnation for its network of extrajudicial ‘re-education camps’ in Xinjiang. This report exposes a new phase in China’s social re-engineering campaign targeting minority citizens, revealing new evidence that some factories across China are using forced Uyghur labour under a state-sponsored labour transfer scheme that is tainting the global supply chain.
  • A group of nations worked together to find and apprehend individuals accused of laundering ill-gotten funds for cyber criminals. The United States (U.S.) indicted the accused. Europol explained:
    • An unprecedented international law enforcement operation involving 16 countries has resulted in the arrest of 20 individuals suspected of belonging to the QQAAZZ criminal network which attempted to launder tens of millions of euros on behalf of the world’s foremost cybercriminals. 
    • Some 40 house searches were carried out in Latvia, Bulgaria, the United Kingdom, Spain and Italy, with criminal proceedings initiated against those arrested by the United States, Portugal, the United Kingdom and Spain. The largest number of searches in the case were carried out in Latvia in operations led by the Latvian State Police (Latvijas Valsts Policija). Bitcoin mining equipment was also seized in Bulgaria.
    • This international sweep follows a complex investigation led by the Portuguese Judicial Police (Polícia Judiciária) together with the United States Attorney Office for the Western District of Pennsylvania and the FBI’s Pittsburgh Field Office, alongside the Spanish National Police (Policia Nacional) and the regional Catalan police (Mossos D’esquadra) and law enforcement authorities from the United Kingdom, Latvia, Bulgaria, Georgia, Italy, Germany, Switzerland, Poland, Czech Republic, Australia, Sweden, Austria and Belgium with coordination efforts led by Europol. 
    • The U.S. Department of Justice (DOJ) claimed:
      • Comprised of several layers of members from Latvia, Georgia, Bulgaria, Romania, and Belgium, among other countries, the QQAAZZ network opened and maintained hundreds of corporate and personal bank accounts at financial institutions throughout the world to receive money from cybercriminals who stole it from bank accounts of victims.  The funds were then transferred to other QQAAZZ-controlled bank accounts and sometimes converted to cryptocurrency using “tumbling” services designed to hide the original source of the funds.  After taking a fee of up to 40 to 50 percent, QQAAZZ returned the balance of the stolen funds to their cybercriminal clientele.  
      • The QQAAZZ members secured these bank accounts by using both legitimate and fraudulent Polish and Bulgarian identification documents to create and register dozens of shell companies which conducted no legitimate business activity. Using these registration documents, the QQAAZZ members then opened corporate bank accounts in the names of the shell companies at numerous financial institutions around the world, thereby generating hundreds of QQAAZZ-controlled bank accounts available to receive stolen funds from cyber thieves.
      • QQAAZZ advertised its services as a “global, complicit bank drops service” on Russian-speaking online cybercriminal forums where cybercriminals gather to offer or seek specialized skills or services needed to engage in a variety of cybercriminal activities. The criminal gangs behind some of the world’s most harmful malware families (e.g.: Dridex, Trickbot, GozNym, etc.) are among those cybercriminal groups that benefited from the services provided by QQAAZZ. 
  • Representatives Anna Eshoo (D-CA) and Bobby L. Rush (D-IL), and Senator Ron Wyden (D-OR) wrote the Privacy and Civil Liberties Oversight Board (PCLOB) asking that the privacy watchdog “investigate the federal government’s surveillance of recent protests, the legal authorities for that surveillance, the government’s adherence to required procedures in using surveillance equipment, and the chilling effect that federal government surveillance has had on protesters.”
    • They argued:
      • Many agencies have or may have surveilled protesters, according to press reports and agency documents.
        • The Customs and Border Protection (CBP) deployed various aircraft – including AS350 helicopters, a Cessna single-engine airplane, and Predator drones – that logged 270 hours of aerial surveillance footage over 15 cities, including Minneapolis, New York City, Buffalo, Philadelphia, Detroit, and Washington, D.C.
        • The FBI flew Cessna 560 aircraft over protests in Washington, D.C., in June, and reporting shows that the FBI has previously equipped such aircraft with ‘dirt boxes,’ equipment that can collect cell phone location data, along with sophisticated cameras for long-range, persistent video surveillance.
        • In addition to specific allegations of protester surveillance, the Drug Enforcement Administration (DEA) was granted broad authority to “conduct covert surveillance” over protesters responding to the murder of Mr. Floyd.
    • Eshoo, Rush, and Wyden claimed:
      • Recent surveillance of protests involves serious threats to liberty and requires a thorough investigation. We ask that PCLOB thoroughly investigate, including by holding public hearings, the following issues and issue a public report about its findings:
        • (1) Whether and to what extent federal government agencies surveilled protests by collecting or processing personal information of protesters.
        • (2) What legal authorities agencies are using as the basis for surveillance, an unclassified enumeration of claimed statutory or other authorities, and whether agencies followed required procedures for using surveillance equipment, acquiring and processing personal data, receiving appropriate approvals, and providing needed transparency.
        • (3) To what extent the threat of surveillance has a chilling effect on protests.
  • Ireland’s Data Protection Commission (DPC) has opened two inquiries into Facebook and Instagram for potential violations under the General Data Protection Regulation (GDPR) and Ireland’s Data Protection Act 2018. This is not the only regulatory action the DPC has against Facebook, which is headquartered in Dublin. The DPC is reportedly trying to stop Facebook from transferring personal data out of the European Union (EU) and into the United States (U.S.) using standard contractual clauses (SCC) in light of the EU-U.S. Privacy Shield being struck down. The DPC stated “Instagram is a social media platform which is used widely by children in Ireland and across Europe…[and] [t]he DPC has been actively monitoring complaints received from individuals in this area and has identified potential concerns in relation to the processing of children’s personal data on Instagram which require further examination.”
    • The DPC explained the two inquiries:
      • This Inquiry will assess Facebook’s reliance on certain legal bases for its processing of children’s personal data on the Instagram platform. The DPC will set out to establish whether Facebook has a legal basis for the ongoing processing of children’s personal data and if it employs adequate protections and or restrictions on the Instagram platform for such children. This Inquiry will also consider whether Facebook meets its obligations as a data controller with regard to transparency requirements in its provision of Instagram to children.
      • This Inquiry will focus on Instagram profile and account settings and the appropriateness of these settings for children. Amongst other matters, this Inquiry will explore Facebook’s adherence with the requirements in the GDPR in respect to Data Protection by Design and Default and specifically in relation to Facebook’s responsibility to protect the data protection rights of children as vulnerable persons.
  • The United States’ National Institute of Standards and Technology (NIST) issued a draft version of the Cybersecurity Profile for the Responsible Use of Positioning, Navigation and Timing (PNT) Services (NISTIR 8323). Comments are due by 23 November.
    • NIST explained:
      • NIST has developed this PNT cybersecurity profile to help organizations identify systems, networks, and assets dependent on PNT services; identify appropriate PNT services; detect the disruption and manipulation of PNT services; and manage the associated risks to the systems, networks, and assets dependent on PNT services. This profile will help organizations make deliberate, risk-informed decisions on their use of PNT services.
    • In its June request for information (RFI), NIST explained “Executive Order 13905, Strengthening National Resilience Through Responsible Use of Positioning, Navigation, and Timing Services, was issued on February 12, 2020 and seeks to protect the national and economic security of the United States from disruptions to PNT services that are vital to the functioning of technology and infrastructure, including the electrical power grid, communications infrastructure and mobile devices, all modes of transportation, precision agriculture, weather forecasting, and emergency response.” The EO directed NIST “to develop and make available, to at least the appropriate agencies and private sector users, PNT profiles.”

Coming Events

  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on 28 October regarding 47 U.S.C. 230 titled “Does Section 230’s Sweeping Immunity Enable Big Tech Bad Behavior?” with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.
  • On 29 October, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

“How Encryption Works” by Afsal CMK is licensed under CC BY 4.0

Further Reading, Other Developments, and Coming Events (22 October)

Further Reading

  • “A deepfake porn Telegram bot is being used to abuse thousands of women” By Matt Burgess — WIRED UK. A bot set loose on Telegram can take pictures of women, and apparently teenagers as well, and “takes off” their clothing, rendering fake nude images of people who never posed for them. This seems to be the next iteration of deepfake porn, a problem that will surely get worse until governments legislate against it and technology companies have incentives to locate and take down such material.
  • “The Facebook-Twitter-Trump Wars Are Actually About Something Else” By Charlie Warzel — The New York Times. This piece makes the case that there are no easy fixes for American democracy or for misinformation on social media platforms.
  • “Facebook says it rejected 2.2m ads for breaking political campaigning rules” — Agence France-Presse. Facebook’s Vice President of Global Affairs and Communications Nick Clegg said the social media giant is employing artificial intelligence and humans to find and remove political advertisements that violate policy in order to avoid a repeat of 2016, when untrue information and misinformation played roles in both Brexit and the election of Donald Trump as President of the United States.
  • “Huawei Fallout—Game-Changing New China Threat Strikes At Apple And Samsung” By Zak Doffman — Forbes. Smartphone manufacturers from the People’s Republic of China (PRC) appear ready to step into the projected void caused by the United States (U.S.) strangling off Huawei’s access to chips. Xiaomi and Oppo have already seen sales surge worldwide and are poised to pick up where Huawei is being forced to leave off, perhaps demonstrating the limits of U.S. power to blunt the rise of PRC technology companies.
  • “As Local News Dies, a Pay-for-Play Network Rises in Its Place” By Davey Alba and Jack Nicas — The New York Times. With the decline and demise of many local media outlets in the United States, new groups are stepping into the void, and some are politically minded but not transparent about their biases. The organization uncovered in this article is nakedly Republican and is running and planting articles at both legitimate and artificial news sites for pay. Sometimes conservative donors pay, sometimes campaigns do. Democrats are engaged in the same activity but apparently to a lesser extent. These sorts of activities will only further erode faith in the U.S. media.
  • “Forget Antitrust Laws. To Limit Tech, Some Say a New Regulator Is Needed.” By Steve Lohr — The New York Times. This piece argues that antitrust enforcement actions are plodding, tending to take years to finish. Consequently, this body of law is inadequate to the task of addressing the market dominance of big technology companies. Instead, a new regulatory body is needed along the lines of those regulating the financial services industries, one more nimble than antitrust enforcement. Given the problems in that industry with respect to regulation, this may not be the best model.
  • “‘Do Not Track’ Is Back, and This Time It Might Work” By Gilad Edelman — WIRED. Looking to utilize the requirement in the “California Consumer Privacy Act” (CCPA) (AB 375) that regulated entities respect and effectuate the use of a one-time opt-out mechanism, a group of entities has come together to build and roll out the Global Privacy Control. In theory, users could download this technical specification to their phones and computers, install it, use it once, and then all websites would be on notice regarding that person’s privacy preferences. Such a mechanism would address the problem turned up by Consumer Reports’ recent report on the difficulty of trying to opt out of having one’s personal information sold.
  • “EU countries sound alarm about growing anti-5G movement” By Laurens Cerulus — Politico. 15 European Union (EU) nations wrote the European Commission (EC) warning that the nascent anti-5G movement borne of conspiracy thinking and misinformation threatens the EU’s position vis-à-vis the United States (U.S.) and the People’s Republic of China (PRC). There have been more than 200 documented arson attacks in the EU, with most occurring in the United Kingdom, France, and the Netherlands. These nations called for a more muscular, more forceful debunking of the lies and misinformation being spread about 5G.
  • “Security firms call Microsoft’s effort to disrupt botnet to protect against election interference ineffective” By Jay Greene — The Washington Post. Microsoft seemingly acted alongside the United States (U.S.) Cyber Command to take down and impair the operation of Trickbot, but now cybersecurity experts are questioning how effective Microsoft’s efforts really were. Researchers have shown the Russian-operated Trickbot has already stood its operations back up and dispersed across servers around the world, showing how difficult it is to address some cyber threats.
  • “Governments around the globe find ways to abuse Facebook” By Sara Fischer and Ashley Gold — Axios. This piece puts a different spin than a recent BuzzFeed News article on the challenges Facebook faces in countries around the world, especially those whose governments ruthlessly use the platform to spread lies and misinformation. The new article paints Facebook as a well-meaning company being taken advantage of, while the other portrays a company indifferent to content moderation except in nations where lapses cause it political problems, such as the United States, the European Union, and other western democracies.
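The Global Privacy Control noted above is, at bottom, a simple signal a browser attaches to every web request. As a rough illustration only (assuming the draft GPC specification’s `Sec-GPC: 1` request header; the function name and dictionary-based interface here are hypothetical, not from any particular site’s implementation), a site subject to the CCPA might detect the signal like this:

```python
# Minimal sketch of server-side detection of the Global Privacy Control
# signal. Assumption: the draft GPC specification, under which a browser
# sends the "Sec-GPC: 1" request header to express an opt-out preference.
def gpc_opt_out(headers: dict) -> bool:
    """Return True if a request's headers carry the GPC opt-out signal."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {name.lower(): value for name, value in headers.items()}
    return normalized.get("sec-gpc", "").strip() == "1"

print(gpc_opt_out({"Sec-GPC": "1"}))         # True: treat as an opt-out
print(gpc_opt_out({"User-Agent": "test"}))   # False: no signal present
```

Under the CCPA, a regulated business honoring such a signal would treat the request as an exercise of the consumer’s right to opt out of the sale of personal information, without requiring the user to opt out on each site individually.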

Other Developments

  • The United States (U.S.) Department of Justice’s (DOJ) Cyber-Digital Task Force (Task Force) issued “Cryptocurrency: An Enforcement Framework,” which “provides a comprehensive overview of the emerging threats and enforcement challenges associated with the increasing prevalence and use of cryptocurrency; details the important relationships that the Department of Justice has built with regulatory and enforcement partners both within the United States government and around the world; and outlines the Department’s response strategies.” The Task Force noted “[t]his document does not contain any new binding legal requirements not otherwise already imposed by statute or regulation.” The Task Force summarized the report:
    • [I]n Part I, the Framework provides a detailed threat overview, cataloging the three categories into which most illicit uses of cryptocurrency typically fall: (1) financial transactions associated with the commission of crimes; (2) money laundering and the shielding of legitimate activity from tax, reporting, or other legal requirements; and (3) crimes, such as theft, directly implicating the cryptocurrency marketplace itself. 
    • Part II explores the various legal and regulatory tools at the government’s disposal to confront the threats posed by cryptocurrency’s illicit uses, and highlights the strong and growing partnership between the Department of Justice and the Securities and Exchange Commission, the Commodity Futures Trading Commission, and agencies within the Department of the Treasury, among others, to enforce federal law in the cryptocurrency space.
    • Finally, the Enforcement Framework concludes in Part III with a discussion of the ongoing challenges the government faces in cryptocurrency enforcement—particularly with respect to business models (employed by certain cryptocurrency exchanges, platforms, kiosks, and casinos), and to activity (like “mixing” and “tumbling,” “chain hopping,” and certain instances of jurisdictional arbitrage) that may facilitate criminal activity.    
  • The White House’s Office of Science and Technology Policy (OSTP) has launched a new website for the United States’ (U.S.) quantum initiative and released a report titled “Quantum Frontiers: Report On Community Input To The Nation’s Strategy For Quantum Information Science.” The Quantum Initiative flows from the “National Quantum Initiative Act” (P.L. 115-368) “to provide for a coordinated Federal program to accelerate quantum research and development for the economic and national security of the United States.” The OSTP explained that the report “outlines eight frontiers that contain core problems with fundamental questions confronting quantum information science (QIS) today:
    • Expanding Opportunities for Quantum Technologies to Benefit Society
    • Building the Discipline of Quantum Engineering
    • Targeting Materials Science for Quantum Technologies
    • Exploring Quantum Mechanics through Quantum Simulations
    • Harnessing Quantum Information Technology for Precision Measurements
    • Generating and Distributing Quantum Entanglement for New Applications
    • Characterizing and Mitigating Quantum Errors
    • Understanding the Universe through Quantum Information
    • OSTP asserted “[t]hese frontier areas, identified by the QIS research community, are priorities for the government, private sector, and academia to explore in order to drive breakthrough R&D.”
  • The New York Department of Financial Services (NYDFS) published its report on the July 2020 Twitter hack during which a team of hackers took over a number of high-profile accounts (e.g. Barack Obama, Kim Kardashian West, Jeff Bezos, and Elon Musk) in order to perpetrate a cryptocurrency scam. The NYDFS has jurisdiction over cryptocurrencies and companies dealing in them in New York. The NYDFS found that the hackers used the most basic means to acquire the access needed to take over accounts. The NYDFS explained:
    • Given that Twitter is a publicly traded, $37 billion technology company, it was surprising how easily the Hackers were able to penetrate Twitter’s network and gain access to internal tools allowing them to take over any Twitter user’s account. Indeed, the Hackers used basic techniques more akin to those of a traditional scam artist: phone calls where they pretended to be from Twitter’s Information Technology department. The extraordinary access the Hackers obtained with this simple technique underscores Twitter’s cybersecurity vulnerability and the potential for devastating consequences. Notably, the Twitter Hack did not involve any of the high-tech or sophisticated techniques often used in cyberattacks–no malware, no exploits, and no backdoors.
    • The implications of the Twitter Hack extend far beyond this garden-variety fraud. There are well-documented examples of social media being used to manipulate markets and interfere with elections, often with the simple use of a single compromised account or a group of fake accounts. In the hands of a dangerous adversary, the same access obtained by the Hackers–the ability to take control of any Twitter users’ account–could cause even greater harm.
    • The Twitter Hack demonstrates the need for strong cybersecurity to curb the potential weaponization of major social media companies. But our public institutions have not caught up to the new challenges posed by social media. While policymakers focus on antitrust and content moderation problems with large social media companies, their cybersecurity is also critical. In other industries that are deemed critical infrastructure, such as telecommunications, utilities, and finance, we have established regulators and regulations to ensure that the public interest is protected. With respect to cybersecurity, that is what is needed for large, systemically important social media companies.
    • The NYDFS recommended cybersecurity measures that cryptocurrency companies in New York should implement to avoid similar hacks, including those in its own cybersecurity regulations that bind its regulated entities. The NYDFS also called for a national regulator to fill the gap left by the absence of any dedicated regulator of Twitter and other massive social media platforms. The NYDFS asserted:
      • Social media companies currently have no dedicated regulator. They are subject to the same general oversight applicable to other companies. For instance, the SEC’s regulations for all public companies apply to public social media companies, and antitrust and related laws and regulations enforced by the Department of Justice and the FTC apply to social media companies as they do to all companies. Social media companies are also subject to generally applicable laws, such as the California Consumer Privacy Act and the New York SHIELD Act. The European Union’s General Data Protection Regulation, which regulates the storage and use of personal data, also applies to social media entities doing business in Europe.
      • But there are no regulators that have the authority to uniformly regulate social media platforms that operate over the internet, and to address the cybersecurity concerns identified in this Report. That regulatory vacuum must be filled.
      • A useful starting point is to create a “systemically important” designation for large social media companies, like the designation for critically important bank and non-bank financial institutions. In the wake of the 2007-08 financial crisis, Congress established a new regulatory framework for financial institutions that posed a systemic threat to the financial system of the United States. An institution could be designated as a Systemically Important Financial Institution (“SIFI”) “where the failure of or a disruption to the functioning of a financial market utility or the conduct of a payment, clearing, or settlement activity could create, or increase, the risk of significant liquidity or credit problems spreading among financial institutions or markets and thereby threaten the stability of the financial system of the United States.”
      • The risks posed by social media to our consumers, economy, and democracy are no less grave than the risks posed by large financial institutions. The scale and reach of these companies, combined with the ability of adversarial actors who can manipulate these systems, require a similarly bold and assertive regulatory approach.
      • The designation of an institution as a SIFI is made by the Financial Stability Oversight Council (“FSOC”), which Congress established to “identify risks to the financial stability of the United States” and to provide enhanced supervision of SIFIs. The FSOC also “monitors regulatory gaps and overlaps to identify emerging sources of systemic risk.” In determining whether a financial institution is systemically important, the FSOC considers numerous factors including: the effect that a failure or disruption to an institution would have on financial markets and the broader financial system; the nature of the institution’s transactions and relationships; the nature, concentration, interconnectedness, and mix of the institution’s activities; and the degree to which the institution is regulated.
      • An analogue to the FSOC should be established to identify systemically important social media companies. This new Oversight Council should evaluate the reach and impact of social media companies, as well as the society-wide consequences of a social media platform’s misuse, to determine which companies they should designate as systemically important. Once designated, those companies should be subject to enhanced regulation, such as through the provision of “stress tests” to evaluate the social media companies’ susceptibility to key threats, including cyberattacks and election interference.
      • Finally, the success of such oversight will depend on the establishment of an expert agency to oversee designated social media companies. Systemically important financial companies designated by the FSOC are overseen by the Federal Reserve Board, which has a long-established and deep expertise in banking and financial market stability. A regulator for systemically important social media would likewise need deep expertise in areas such as technology, cybersecurity, and disinformation. This expert regulator could take various forms; it could be a completely new agency or could reside within an established agency or at an existing regulator.
  • The Government Accountability Office (GAO) evaluated how well the Trump Administration has been implementing the “Open, Public, Electronic and Necessary Government Data Act of 2018” (OPEN Government Data Act) (P.L. 115-435). As the GAO explained, this statute “requires federal agencies to publish their information as open data using standardized, nonproprietary formats, making data available to the public open by default, unless otherwise exempt…[and] codifies and expands on existing federal open data policy including the Office of Management and Budget’s (OMB) memorandum M-13-13 (M-13-13), Open Data Policy—Managing Information as an Asset.”
    • The GAO stated
      • To continue moving forward with open government data, the issuance of OMB implementation guidance should help agencies develop comprehensive inventories of their data assets, prioritize data assets for publication, and decide which data assets should or should not be made available to the public.
      • Implementation of this statutory requirement is critical to agencies’ full implementation and compliance with the act. In the absence of this guidance, agencies, particularly agencies that have not previously been subject to open data policies, could fall behind in meeting their statutory timeline for implementing comprehensive data inventories.
      • It is also important for OMB to meet its statutory responsibility to biennially report on agencies’ performance and compliance with the OPEN Government Data Act and to coordinate with General Services Administration (GSA) to improve the quality and availability of agency performance data that could inform this reporting. Access to this information could inform Congress and the public on agencies’ progress in opening their data and complying with statutory requirements. This information could also help agencies assess their progress and improve compliance with the act.
    • The GAO made three recommendations:
      • The Director of OMB should comply with its statutory requirement to issue implementation guidance to agencies to develop and maintain comprehensive data inventories. (Recommendation 1)
      • The Director of OMB should comply with the statutory requirement to electronically publish a report on agencies’ performance and compliance with the OPEN Government Data Act. (Recommendation 2)
      • The Director of OMB, in collaboration with the Administrator of GSA, should establish policy to ensure the routine identification and correction of errors in electronically published performance information. (Recommendation 3)
  • The United States’ (U.S.) National Security Agency (NSA) issued a cybersecurity advisory titled “Chinese State-Sponsored Actors Exploit Publicly Known Vulnerabilities,” that “provides Common Vulnerabilities and Exposures (CVEs) known to be recently leveraged, or scanned-for, by Chinese state-sponsored cyber actors to enable successful hacking operations against a multitude of victim networks.” The NSA recommended a number of mitigations generally for U.S. entities, including:
    • Keep systems and products updated and patched as soon as possible after patches are released.
    • Expect that data stolen or modified (including credentials, accounts, and software) before the device was patched will not be alleviated by patching, making password changes and reviews of accounts a good practice.
    • Disable external management capabilities and set up an out-of-band management network.
    • Block obsolete or unused protocols at the network edge and disable them in device configurations.
    • Isolate Internet-facing services in a network Demilitarized Zone (DMZ) to reduce the exposure of the internal network.
    • Enable robust logging of Internet-facing services and monitor the logs for signs of compromise.
    • The NSA then proceeded to recommend specific fixes.
    • The NSA provided this policy backdrop:
      • One of the greatest threats to U.S. National Security Systems (NSS), the U.S. Defense Industrial Base (DIB), and Department of Defense (DOD) information networks is Chinese state-sponsored malicious cyber activity. These networks often undergo a full array of tactics and techniques used by Chinese state-sponsored cyber actors to exploit computer networks of interest that hold sensitive intellectual property, economic, political, and military information. Since these techniques include exploitation of publicly known vulnerabilities, it is critical that network defenders prioritize patching and mitigation efforts.
      • The same process for planning the exploitation of a computer network by any sophisticated cyber actor is used by Chinese state-sponsored hackers. They often first identify a target, gather technical information on the target, identify any vulnerabilities associated with the target, develop or re-use an exploit for those vulnerabilities, and then launch their exploitation operation.
  • Belgium’s data protection authority (DPA) (Autorité de protection des données in French or Gegevensbeschermingsautoriteit in Dutch) (APD-GBA) has reportedly found that the Transparency & Consent Framework (TCF) developed by the Interactive Advertising Bureau (IAB) violates the General Data Protection Regulation (GDPR). The Real-Time Bidding (RTB) system used for online behavioral advertising allegedly transmits the personal information of European Union residents without their consent even before a popup appears on their screens asking for consent. The APD-GBA is the lead DPA in the EU in investigating the RTB and will likely now circulate its findings and recommendations to other EU DPAs before any enforcement commences.
  • None Of Your Business (noyb) announced “[t]he Irish High Court has granted leave for a “Judicial Review” against the Irish Data Protection Commission (DPC) today…[and] [t]he legal action by noyb aims to swiftly implement the [Court of Justice of the European Union (CJEU)] Decision prohibiting Facebook’s” transfer of personal data from the European Union to the United States (U.S.). Last month, after the DPC directed Facebook to stop transferring the personal data of EU citizens to the U.S., the company filed suit in the Irish High Court to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge.
    • noyb further asserted:
      • Instead of making a decision in the pending procedure, the DPC has started a second, new investigation into the same subject matter (“Parallel Procedure”), as widely reported (see original reporting by the WSJ). No logical reasons for the Parallel Procedure were given, but the DPC has maintained that Mr Schrems will not be heard in this second case, as he is not a party in this Parallel Procedure. This Parallel Procedure was criticised by Facebook publicly (link) and instantly blocked by a Judicial Review by Facebook (see report by Reuters).
      • Today’s Judicial Review by noyb is in many ways the counterpart to Facebook’s Judicial Review: While Facebook wants to block the second procedure by the DPC, noyb wants to move the original complaints procedure towards a decision.
      • Earlier this summer, the CJEU struck down the adequacy decision for the agreement between the EU and U.S. that had provided the easiest means to transfer the personal data of EU citizens to the U.S. for processing under the General Data Protection Regulation (GDPR) (i.e. the EU-U.S. Privacy Shield). In the case known as Schrems II, the CJEU also cast doubt on whether standard contractual clauses (SCC) used to transfer personal data to the U.S. would pass muster given the grounds for finding the Privacy Shield inadequate: the U.S.’s surveillance regime and lack of meaningful redress for EU citizens. Consequently, it has appeared as if data protection authorities throughout the EU would need to revisit SCCs for transfers to the U.S., and it appears the DPC was looking to stop Facebook from using its SCCs. Facebook is apparently arguing in its suit that it will suffer “extremely significant adverse effects” if the DPC’s decision is implemented.
  • Most likely with the aim of helping British chances for an adequacy decision from the European Union (EU), the United Kingdom’s Information Commissioner’s Office (ICO) published guidance that “discusses the right of access [under the General Data Protection Regulation (GDPR)] in detail.” The ICO explained the guidance “is aimed at data protection officers (DPOs) and those with specific data protection responsibilities in larger organisations…[but] does not specifically cover the right of access under Parts 3 and 4 of the Data Protection Act 2018.”
    • The ICO explained
      • The right of access, commonly referred to as subject access, gives individuals the right to obtain a copy of their personal data from you, as well as other supplementary information.
  • The Government Accountability Office (GAO) released the report the House Education and Labor Committee’s Ranking Member requested on the data security and data privacy practices of public schools. Representative Virginia Foxx (R-NC) asked the GAO “to review the security of K-12 students’ data. This report examines (1) what is known about recently reported K-12 cybersecurity incidents that compromised student data, and (2) the characteristics of school districts that experienced these incidents.” Strangely, the report did not have the GAO’s customary conclusions or recommendations. Nonetheless, the GAO found:
    • Ninety-nine student data breaches reported from July 1, 2016 through May 5, 2020 compromised the data of students in 287 school districts across the country, according to our analysis of K-12 Cybersecurity Resource Center (CRC) data (see fig. 3). Some breaches involved a single school district, while others involved multiple districts. For example, an attack on a vendor system in the 2019-2020 school year affected 135 districts. While information about the number of students affected was not available for every reported breach, examples show that some breaches affected thousands of students, for instance, when a cybercriminal accessed 14,000 current and former students’ personally identifiable information (PII) in one district.
    • The 99 reported student data breaches likely understate the number of breaches that occurred, for different reasons. Reported incidents sometimes do not include sufficient information to discern whether data were breached. We identified 15 additional incidents in our analysis of CRC data in which student data might have been compromised, but the available information was not definitive. In addition, breaches can go undetected for some time. In one example, the personal information of hundreds of thousands of current and former students in one district was publicly posted for 2 years before the breach was discovered.
    • The CRC identified 28 incidents involving videoconferences from April 1, 2020 through May 5, 2020, some of which disrupted learning and exposed students to harm. In one incident, 50 elementary school students were exposed to pornography during a virtual class. In another incident in a different district, high school students were targeted with hate speech during a class, resulting in the cancellation that day of all classes using the videoconferencing software. These incidents also raise concerns about the potential for violating students’ privacy. For example, one district is reported to have instructed teachers to record their class sessions. Teachers said that students’ full names were visible to anyone viewing the recording.
    • The GAO found gaps in the protection and enforcement of student privacy by the United States government:
      • [The Department of] Education is responsible for enforcing the Family Educational Rights and Privacy Act (FERPA), which addresses the privacy of PII in student education records and applies to all schools that receive funds under an applicable program administered by Education. If parents or eligible students believe that their rights under FERPA have been violated, they may file a formal complaint with Education. In response, Education is required to take appropriate actions to enforce and deal with violations of FERPA. However, because the department’s authority under FERPA is directly related to the privacy of education records, Education’s security role is limited to incidents involving potential violations under FERPA. Further, FERPA amendments have not directly addressed educational technology use.
      • The “Children’s Online Privacy Protection Act” (COPPA) requires the Federal Trade Commission (FTC) to issue and enforce regulations concerning children’s privacy. The COPPA Rule, which took effect in 2000 and was later amended in 2013, requires operators of covered websites or online services that collect personal information from children under age 13 to provide notice and obtain parental consent, among other things. COPPA generally applies to the vendors who provide educational technology, rather than to schools directly. However, according to FTC guidance, schools can consent on behalf of parents to the collection of students’ personal information if such information is used for a school-authorized educational purpose and for no other commercial purpose.
  • Upturn, an advocacy organization that “advances equity and justice in the design, governance, and use of technology,” has released a report showing that United States (U.S.) law enforcement agencies have multiple means of hacking into encrypted or protected smartphones. There have long been means and vendors available in the U.S. and abroad for breaking into phones despite the claims of a number of nations like the Five Eyes (U.S., the United Kingdom, Australia, Canada, and New Zealand) that default end-to-end encryption was a growing problem that allowed those preying on children and engaged in terrorism to go undetected. In terms of possible bias, Upturn “is supported by the Ford Foundation, the Open Society Foundations, the John D. and Catherine T. MacArthur Foundation, Luminate, the Patrick J. McGovern Foundation, and Democracy Fund.”
    • Upturn stated:
      • Every day, law enforcement agencies across the country search thousands of cellphones, typically incident to arrest. To search phones, law enforcement agencies use mobile device forensic tools (MDFTs), a powerful technology that allows police to extract a full copy of data from a cellphone — all emails, texts, photos, location, app data, and more — which can then be programmatically searched. As one expert puts it, with the amount of sensitive information stored on smartphones today, the tools provide a “window into the soul.”
      • This report documents the widespread adoption of MDFTs by law enforcement in the United States. Based on 110 public records requests to state and local law enforcement agencies across the country, our research documents more than 2,000 agencies that have purchased these tools, in all 50 states and the District of Columbia. We found that state and local law enforcement agencies have performed hundreds of thousands of cellphone extractions since 2015, often without a warrant. To our knowledge, this is the first time that such records have been widely disclosed.
    • Upturn argued:
      • Law enforcement use these tools to investigate not only cases involving major harm, but also for graffiti, shoplifting, marijuana possession, prostitution, vandalism, car crashes, parole violations, petty theft, public intoxication, and the full gamut of drug-related offenses. Given how routine these searches are today, together with racist policing policies and practices, it’s more than likely that these technologies disparately affect and are used against communities of color.
      • We believe that MDFTs are simply too powerful in the hands of law enforcement and should not be used. But recognizing that MDFTs are already in widespread use across the country, we offer a set of preliminary recommendations that we believe can, in the short-term, help reduce the use of MDFTs. These include:
        • banning the use of consent searches of mobile devices,
        • abolishing the plain view exception for digital searches,
        • requiring easy-to-understand audit logs,
        • enacting robust data deletion and sealing requirements, and
        • requiring clear public logging of law enforcement use.

Coming Events

  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on 28 October regarding 47 U.S.C. 230 titled “Does Section 230’s Sweeping Immunity Enable Big Tech Bad Behavior?” with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.
  • On 29 October, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


EDPB Releases Draft Guidelines On “Relevant And Reasoned Objection”

The EDPB seeks to define when a DPA’s criticism of another DPA’s draft GDPR decision must be heeded.

The European Data Protection Board (EDPB) has issued a draft of its construction of a key authority in the General Data Protection Regulation (GDPR) designed to guide and coordinate investigations that cross borders in the European Union (EU). A lead supervisory authority (LSA) is supposed to consider “relevant and reasoned objections” to draft decisions submitted by concerned supervisory authorities (CSA). If an LSA rejects such feedback, then the GDPR action gets kicked over to the EDPB. However, since this has only happened once, the EDPB thought it appropriate to define the term so all the EU data protection authorities (DPA) would understand what objections are relevant and reasoned.

The EDPB explained that the guidance “aims at establishing a common understanding of the notion of the terms ‘relevant and reasoned’, including what should be considered when assessing whether an objection ‘clearly demonstrates the significance of the risks posed by the draft decision.’” The EDPB noted that “[t]he unfamiliarity surrounding ‘what constitutes relevant and reasoned objection’ has the potential to create misunderstandings and inconsistent applications by the supervisory authorities,” which is why “the EU legislator (sic) suggested that the EDPB should issue guidelines on this concept (end of Recital 124 GDPR).”

Article 60 of the GDPR provides that if a CSA “expresses a relevant and reasoned objection to the draft decision [of the LSA], the lead supervisory authority shall, if it does not follow the relevant and reasoned objection or is of the opinion that the objection is not relevant or reasoned, submit the matter to the consistency mechanism.” Article 65 also provides that where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned,” the EDPB must step in and work towards a final binding decision. This process was established so that enforcement of the GDPR would be uniform throughout the EU and to forestall the possibility that one DPA or a small group of DPAs would construe the data protection regime in ways contrary to its intention. As it is, there have already been allegations that some DPAs have been ineffective or lenient towards alleged offenders.

The EDPB stated in relevant part:

  • Article 4(24) GDPR defines “relevant and reasoned objection” as “an objection to a draft decision as to whether there is an infringement of this Regulation, or whether envisaged action in relation to the controller or processor complies with this Regulation, which clearly demonstrates the significance of the risks posed by the draft decision as regards the fundamental rights and freedoms of data subjects and, where applicable, the free flow of personal data within the Union”.
  • This concept serves as a threshold in situations where CSAs aim to object to a (revised) draft decision to be adopted by the LSA under Article 60 GDPR.
  • In order for the objection to be considered as “relevant”, there must be a direct connection between the objection and the draft decision at issue. More specifically, the objection needs to concern either whether there is an infringement of the GDPR or whether the envisaged action in relation to the controller or processor complies with the GDPR.
  • In order for the objection to be “reasoned”, it needs to include clarifications and arguments as to why an amendment of the decision is proposed (i.e. the alleged legal / factual mistakes of the draft decision). It also needs to demonstrate how the change would lead to a different conclusion as to whether there is an infringement of the GDPR or whether the envisaged action in relation to the controller or processor complies with the GDPR.

Of course, this guidance is released at a time when the EDPB is using these powers for the first time. Unnamed DPAs have objected to a proposed ruling by Ireland’s Data Protection Commission (DPC), the lead DPA investigating the 2018 and 2019 Twitter data breaches. Consequently, the disagreement has been handed off to the EDPB, and depending on how this matter is resolved, the body could decide Twitter’s punishment, including a possible fine of up to 4% of its worldwide revenue. What’s more, the DPC is the lead agency investigating Facebook’s WhatsApp and Instagram, among other large technology companies, and may have to relinquish those decisions as well if other DPAs disagree with its proposed punishment for any wrongdoing.

The DPC submitted its draft decision on the Twitter breach to other DPAs in May. The DPC stated “[t]he draft decision focusses on whether Twitter International Company has complied with Articles 33(1) and 33(5) of the GDPR” (i.e. the provisions pertaining to data breaches and proper notification protocol). The DPC further explained:

  • This draft decision is one of a number of significant developments in DPC inquiries into “big tech” companies this week. Deputy Commissioner Graham Doyle has confirmed that: “In addition to submitting this draft decision to other EU supervisory authorities, we have this week sent a preliminary draft decision to WhatsApp Ireland Limited for their final submissions which will be taken in to account by the DPC before preparing a draft decision in that matter also for Article 60 purposes.  The inquiry into WhatsApp Ireland examines its compliance with Articles 12 to 14 of the GDPR in terms of transparency including in relation to transparency around what information is shared with Facebook.“
  • The DPC has also completed the investigation phase of a complaint-based inquiry which focuses on Facebook Ireland’s obligations to establish a lawful basis for personal data processing. This inquiry is now in the decision-making phase at the DPC.

Under Article 65, now that the draft decision on Twitter has been handed over to the EDPB, the Board has one month to get two-thirds of its members to agree to a binding decision it may draft. If this is not achieved, the Board has another two weeks to secure a simple majority, and if that fails, EDPB Chair Andrea Jelinek alone may decide. Consequently, it is possible the EDPB redrafts the DPC decision and tries to get buy-in from the DPAs that make up the Board to support a stronger punishment of Twitter.



Further Reading, Other Developments, and Coming Events (22 September)

Coming Events

  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Intellectual Property Subcommittee will hold a hearing on 23 September titled “Examining Threats to American Intellectual Property: Cyber-attacks and Counterfeits During the COVID-19 Pandemic” with these witnesses:
    • Adam Hickey, Deputy Assistant Attorney General National Security Division, Department of Justice
    • Clyde Wallace, Deputy Assistant Director Cyber Division, Federal Bureau of Investigation
    • Steve Francis, Assistant Director, HSI Global Trade Investigations Division Director, National Intellectual Property Rights Center, U.S. Immigration and Customs Enforcement, Department of Homeland Security
    • Bryan S. Ware, Assistant Director for Cybersecurity Cyber Security and Infrastructure Security Agency, Department of Homeland Security
  • On 23 September, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “Revisiting the Need for Federal Data Privacy Legislation,” with these witnesses:
    • The Honorable Julie Brill, Former Commissioner, Federal Trade Commission
    • The Honorable William Kovacic, Former Chairman and Commissioner, Federal Trade Commission
    • The Honorable Jon Leibowitz, Former Chairman and Commissioner, Federal Trade Commission
    • The Honorable Maureen Ohlhausen, Former Commissioner and Acting Chairman, Federal Trade Commission
    • Mr. Xavier Becerra, Attorney General, State of California
  • The House Energy and Commerce Committee’s Consumer Protection and Commerce Subcommittee will hold a virtual hearing “Mainstreaming Extremism: Social Media’s Role in Radicalizing America” on 23 September with these witnesses:
    • Marc Ginsburg, President, Coalition for a Safer Web
    • Tim Kendall, Chief Executive Officer, Moment
    • Taylor Dumpson, Hate Crime Survivor and Cyber-Harassment Target
    • John Donahue, Fellow, Rutgers University Miller Center for Community Protection and Resilience, Former Chief of Strategic Initiatives, New York City Police Department
  • On 23 September, the Senate Homeland Security and Governmental Affairs Committee will hold a hearing to consider the nomination of Chad Wolf to be the Secretary of Homeland Security.
  • The Senate Armed Services Committee will hold a closed briefing on 24 September “on Department of Defense Cyber Operations in Support of Efforts to Protect the Integrity of U.S. National Elections from Malign Actors” with:
    • Kenneth P. Rapuano, Assistant Secretary of Defense for Homeland Defense and Global Security
    • General Paul M. Nakasone, Commander, U.S. Cyber Command and Director, National Security Agency/Chief, Central Security Service
  • On 24 September, the Senate Homeland Security and Governmental Affairs Committee will hold a hearing on “Threats to the Homeland” with:
    • Christopher A. Wray, Director, Federal Bureau of Investigation
    • Christopher Miller, Director, National Counterterrorism Center
    • Kenneth Cuccinelli, Senior Official Performing the Duties of the Deputy Secretary of Homeland Security
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September and has made available its agenda with these items:
    • Facilitating Shared Use in the 3.1-3.55 GHz Band. The Commission will consider a Report and Order that would remove the existing non-federal allocations from the 3.3-3.55 GHz band as an important step toward making 100 megahertz of spectrum in the 3.45-3.55 GHz band available for commercial use, including 5G, throughout the contiguous United States. The Commission will also consider a Further Notice of Proposed Rulemaking that would propose to add a co-primary, non-federal fixed and mobile (except aeronautical mobile) allocation to the 3.45-3.55 GHz band as well as service, technical, and competitive bidding rules for flexible-use licenses in the band. (WT Docket No. 19-348)
    • Expanding Access to and Investment in the 4.9 GHz Band. The Commission will consider a Sixth Report and Order that would expand access to and investment in the 4.9 GHz (4940-4990 MHz) band by providing states the opportunity to lease this spectrum to commercial entities, electric utilities, and others for both public safety and non-public safety purposes. The Commission also will consider a Seventh Further Notice of Proposed Rulemaking that would propose a new set of licensing rules and seek comment on ways to further facilitate access to and investment in the band. (WP Docket No. 07-100)
    • Improving Transparency and Timeliness of Foreign Ownership Review Process. The Commission will consider a Report and Order that would improve the timeliness and transparency of the process by which it seeks the views of Executive Branch agencies on any national security, law enforcement, foreign policy, and trade policy concerns related to certain applications filed with the Commission. (IB Docket No. 16-155)
    • Promoting Caller ID Authentication to Combat Spoofed Robocalls. The Commission will consider a Report and Order that would continue its work to implement the TRACED Act and promote the deployment of caller ID authentication technology to combat spoofed robocalls. (WC Docket No. 17-97)
    • Combating 911 Fee Diversion. The Commission will consider a Notice of Inquiry that would seek comment on ways to dissuade states and territories from diverting fees collected for 911 to other purposes. (PS Docket Nos. 20-291, 09-14)
    • Modernizing Cable Service Change Notifications. The Commission will consider a Report and Order that would modernize requirements for notices cable operators must provide subscribers and local franchising authorities. (MB Docket Nos. 19-347, 17-105)
    • Eliminating Records Requirements for Cable Operator Interests in Video Programming. The Commission will consider a Report and Order that would eliminate the requirement that cable operators maintain records in their online public inspection files regarding the nature and extent of their attributable interests in video programming services. (MB Docket No. 20-35, 17-105)
    • Reforming IP Captioned Telephone Service Rates and Service Standards. The Commission will consider a Report and Order, Order on Reconsideration, and Further Notice of Proposed Rulemaking that would set compensation rates for Internet Protocol Captioned Telephone Service (IP CTS), deny reconsideration of previously set IP CTS compensation rates, and propose service quality and performance measurement standards for captioned telephone services. (CG Docket Nos. 13-24, 03-123)
    • Enforcement Item. The Commission will consider an enforcement action.

Other Developments

  • The United States (U.S.) Department of Justice (DOJ) has indicted two Iranian nationals for allegedly hacking into systems in the U.S., Europe, and the Middle East dating back to 2013 to engage in espionage and sometimes theft.
    • The DOJ claimed in its press release:
      • According to a 10-count indictment returned on Sept. 15, 2020, Hooman Heidarian, a/k/a “neo,” 30, and Mehdi Farhadi, a/k/a “Mehdi Mahdavi” and “Mohammad Mehdi Farhadi Ramin,” 34, both of Hamedan, Iran, stole hundreds of terabytes of data, which typically included confidential communications pertaining to national security, foreign policy intelligence, non-military nuclear information, aerospace data, human rights activist information, victim financial information and personally identifiable information, and intellectual property, including unpublished scientific research.  In some instances, the defendants’ hacks were politically motivated or at the behest of Iran, including instances where they obtained information regarding dissidents, human rights activists, and opposition leaders.  In other instances, the defendants sold the hacked data and information on the black market for private financial gain.
      • The victims included several American and foreign universities, a Washington, D.C.-based think tank, a defense contractor, an aerospace company, a foreign policy organization, non-governmental organizations (NGOs), non-profits, and foreign government and other entities the defendants identified as rivals or adversaries to Iran.  In addition to the theft of highly protected and sensitive data, the defendants also vandalized websites, often under the pseudonym “Sejeal” and posted messages that appeared to signal the demise of Iran’s internal opposition, foreign adversaries, and countries identified as rivals to Iran, including Israel and Saudi Arabia.
  • Two United States (U.S.) agencies took coordinated action against an alleged cyber threat group and a front company for “a years-long malware campaign that targeted Iranian dissidents, journalists, and international companies in the travel sector.” The U.S. Department of the Treasury’s Office of Foreign Assets Control (OFAC) “imposed sanctions on Iranian cyber threat group Advanced Persistent Threat 39 (APT39), 45 associated individuals, and one front company…Rana Intelligence Computing Company (Rana)” per the agency’s press release. Treasury further claimed:
    • Rana advances Iranian national security objectives and the strategic goals of Iran’s Ministry of Intelligence and Security (MOIS) by conducting computer intrusions and malware campaigns against perceived adversaries, including foreign governments and other individuals the MOIS considers a threat. APT39 is being designated pursuant to E.O. 13553 for being owned or controlled by the MOIS, which was previously designated on February 16, 2012 pursuant to Executive Orders 13224, 13553, and 13572, which target terrorists and those responsible for human rights abuses in Iran and Syria, respectively.
    • The Federal Bureau of Investigation (FBI) provided “information on numerous malware variants and indicators of compromise (IOCs) associated with Rana to assist organizations and individuals in determining whether they may have been targeted.”
  • The United States (U.S.) Department of Justice (DOJ) also released grand jury indictments against five nationals of the People’s Republic of China and two Malaysians for extensive hacking and exfiltration of commercial and business information with an eye towards profiting from these crimes. The DOJ asserted in its press release:
    • In August 2019 and August 2020, a federal grand jury in Washington, D.C., returned two separate indictments (available here and here) charging five computer hackers, all of whom were residents and nationals of the People’s Republic of China (PRC), with computer intrusions affecting over 100 victim companies in the United States and abroad, including software development companies, computer hardware manufacturers, telecommunications providers, social media companies, video game companies, non-profit organizations, universities, think tanks, and foreign governments, as well as pro-democracy politicians and activists in Hong Kong.
    •  The intrusions, which security researchers have tracked using the threat labels “APT41,” “Barium,” “Winnti,” “Wicked Panda,” and “Wicked Spider,” facilitated the theft of source code, software code signing certificates, customer account data, and valuable business information.  These intrusions also facilitated the defendants’ other criminal schemes, including ransomware and “crypto-jacking” schemes, the latter of which refers to the group’s unauthorized use of victim computers to “mine” cryptocurrency. 
    • Also in August 2020, the same federal grand jury returned a third indictment charging two Malaysian businessmen who conspired with two of the Chinese hackers to profit from computer intrusions targeting the video game industry in the United States and abroad.  Shortly thereafter, the U.S. District Court for the District of Columbia issued arrest warrants for the two businessmen.  On Sept. 14, 2020, pursuant to a provisional arrest request from the United States with a view to their extradition, Malaysian authorities arrested them in Sitiawan.
  • On 21 September, the House of Representatives took and passed the following bills, according to summaries provided by the House Majority Whip’s office:
    • The “Effective Assistance in the Digital Era” (H.R. 5546) (Rep. Jeffries – Judiciary) This bill requires the Federal Bureau of Prisons to establish a system to exempt from monitoring any privileged electronic communications between incarcerated individuals and their attorneys or legal representatives.
    • The “Defending the Integrity of Voting Systems Act” (S. 1321) This bill broadens the definition of “protected computer” for purposes of computer fraud and abuse offenses under current law to include a computer that is part of a voting system.
    • The “Promoting Secure 5G Act of 2020” (H.R. 5698) This bill establishes a U.S. policy within international financial institutions (IFIs) to finance only 5G projects and other wireless technologies that include adequate security measures, in furtherance of national security aims to protect wireless networks from bad actors and foreign governments.
    • The “MEDIA Diversity Act of 2020” (H.R. 5567) This bill requires the FCC to consider market entry barriers for socially disadvantaged individuals in the communications marketplace.
    • The “Don’t Break Up the T-Band Act of 2020” as amended (H.R. 451) This bill repeals the requirement on the FCC to reallocate and auction the T-Band.  H.R. 451 also requires the FCC to adopt rules limiting the use of 9-1-1 fees by States or other taxing jurisdictions to (1) the support and implementation of 9-1-1 services and (2) operational expenses of public safety answering points.
    • It bears note that S. 1321 has already passed the Senate, and so it heads to the White House as the only election security bill to have made it through both houses of Congress.

Further Reading

  • “Justice Department expected to brief state attorneys general this week on imminent Google antitrust lawsuit” By Tony Romm — The Washington Post; “Justice Dept. Case Against Google Is Said to Focus on Search Dominance” By Cecilia Kang, Katie Benner, Steve Lohr and Daisuke Wakabayashi — The New York Times; “Justice Department, states to meet in possible prelude to Google antitrust suit” By Leah Nylen — Politico. Tomorrow, the United States Department of Justice (DOJ) will outline its proposed antitrust case against Google for state attorneys general, almost all of whom are investigating Google on the same grounds. Reportedly, the DOJ case is focused on the company’s dominance of online searches, notably its arrangements to make Google the default search engine on iPhone and Android devices, and not on its advertising practices. If the DOJ goes down this road, its case will be similar to the European Union’s (EU) 2018 case against Google for the same conduct, which resulted in EU residents being offered a choice of search engines on Android devices and a €4.34 billion fine. This development comes after articles earlier this month reported that Attorney General William Barr has been pushing DOJ attorneys and investigators, against the wishes of many, to wrap up the investigation in time for a pre-election filing that would allow President Donald Trump to claim he is being tough on big technology companies. However, if this comes to pass, Democratic attorneys general may decline to join the suit and may bring their own action also alleging violations in the online advertising realm that Google dominates. In this vein, Texas Attorney General Ken Paxton has been leading the state effort to investigate Google’s advertising business, which critics argue is anti-competitive. Also, according to DOJ attorneys who oppose what they see as Barr rushing the suit, this could lead to a weaker case that Google may be able to defeat in court.
Of course, this news comes shortly after word leaked from the Federal Trade Commission (FTC) that its case against Facebook could be filed regarding its purchase of rivals WhatsApp and Instagram.
  • “Why Japan wants to join the Five Eyes intelligence network” By Alan Weedon — ABC News. This piece makes the case as to why the United States, United Kingdom, Canada, Australia, and New Zealand may admit a new member to the Five Eyes soon: Japan. The case for the first Asian country is that it is a stable, western democracy, a key ally in the Pacific, and a bulwark against the influence of the People’s Republic of China (PRC). It is really this latter point that could carry the day, for the Five Eyes may need Japan’s expertise with the PRC and its technology to counter the former’s growing ambitions.
  • “The next Supreme Court justice could play a major role in cybersecurity and privacy decisions” By Joseph Marks — The Washington Post. There are a range of cybersecurity and technology cases that the Supreme Court will decide in the near future, and so whether President Donald Trump gets to appoint Justice Ruth Bader Ginsburg’s successor will be very consequential for policy in these areas. For example, the court could rule on the Computer Fraud and Abuse Act for the first time regarding whether researchers are violating the law by probing for weak spots in systems. There are also Fourth Amendment and Fifth Amendment cases pending with technology implications, as the former pertains to searches of devices by border guards and the latter to self-incrimination vis-à-vis suspects being required to unlock devices.
  • “Facebook Says it Will Stop Operating in Europe If Regulators Don’t Back Down” By David Gilbert — VICE. In a filing in its case against Ireland’s Data Protection Commission (DPC), Facebook made veiled threats that if the company is forced to stop transferring personal data to the United States, it may stop operating in the European Union altogether. Recently, the DPC informed Facebook that because Privacy Shield was struck down, it would need to stop transfers even though the company has been using standard contractual clauses, another method permitted in some cases under the General Data Protection Regulation. Despite Facebook’s representation, it seems a bit much that the company would leave the EU to any competitors looking to fill its shoes.
  • "As U.S. Increases Pressure, Iran Adheres to Toned-Down Approach" By Julian E. Barnes, David E. Sanger, Ronen Bergman and Lara Jakes — The New York Times. The Islamic Republic of Iran is showing remarkable restraint in its interactions with the United States in the face of continued, punitive actions against Tehran. And this is true also of its cyber operations. The country has made the calculus that any response could be used by President Donald Trump to great effect in closing the gap against front runner former Vice President Joe Biden. The same has been true of its cyber operations against Israel, which has reportedly conducted extensive operations inside Iran with considerable damage.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Further Reading, Other Developments, and Coming Events (14 September)

Coming Events

  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The House Homeland Security Committee will hold a hearing titled “Worldwide Threats to the Homeland” on 17 September with the following witnesses:
    • Chad Wolf, Acting Secretary, Department of Homeland Security
    • Christopher Wray, Director, Federal Bureau of Investigation
    • Christopher Miller, Director, National Counterterrorism Center (NCTC)
  • On 17 September, the House Energy and Commerce Committee’s Communications & Technology Subcommittee will hold a hearing titled “Trump FCC: Four Years of Lost Opportunities.”
  • The House Armed Services Committee’s Intelligence and Emerging Threats and Capabilities Subcommittee will hold a hearing titled “Interim Review of the National Security Commission on Artificial Intelligence Effort and Recommendations” with these witnesses:
    • Dr. Eric Schmidt, Chairman, National Security Commission on Artificial Intelligence
    • HON Robert Work, Vice Chairman, National Security Commission on Artificial Intelligence
    • HON Mignon Clyburn, Commissioner, National Security Commission on Artificial Intelligence
    • Dr. José-Marie Griffiths, Commissioner, National Security Commission on Artificial Intelligence
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” The agency has released its agenda and explained:
    • The workshop will also feature four panel discussions that will focus on: case studies on data portability rights in the European Union, India, and California; case studies on financial and health portability regimes; reconciling the benefits and risks of data portability; and the material challenges and solutions to realizing data portability’s potential.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September and has made available its agenda with these items:
    • Facilitating Shared Use in the 3.1-3.55 GHz Band. The Commission will consider a Report and Order that would remove the existing non-federal allocations from the 3.3-3.55 GHz band as an important step toward making 100 megahertz of spectrum in the 3.45-3.55 GHz band available for commercial use, including 5G, throughout the contiguous United States. The Commission will also consider a Further Notice of Proposed Rulemaking that would propose to add a co-primary, non-federal fixed and mobile (except aeronautical mobile) allocation to the 3.45-3.55 GHz band as well as service, technical, and competitive bidding rules for flexible-use licenses in the band. (WT Docket No. 19-348)
    • Expanding Access to and Investment in the 4.9 GHz Band. The Commission will consider a Sixth Report and Order that would expand access to and investment in the 4.9 GHz (4940-4990 MHz) band by providing states the opportunity to lease this spectrum to commercial entities, electric utilities, and others for both public safety and non-public safety purposes. The Commission also will consider a Seventh Further Notice of Proposed Rulemaking that would propose a new set of licensing rules and seek comment on ways to further facilitate access to and investment in the band. (WP Docket No. 07-100)
    • Improving Transparency and Timeliness of Foreign Ownership Review Process. The Commission will consider a Report and Order that would improve the timeliness and transparency of the process by which it seeks the views of Executive Branch agencies on any national security, law enforcement, foreign policy, and trade policy concerns related to certain applications filed with the Commission. (IB Docket No. 16-155)
    • Promoting Caller ID Authentication to Combat Spoofed Robocalls. The Commission will consider a Report and Order that would continue its work to implement the TRACED Act and promote the deployment of caller ID authentication technology to combat spoofed robocalls. (WC Docket No. 17-97)
    • Combating 911 Fee Diversion. The Commission will consider a Notice of Inquiry that would seek comment on ways to dissuade states and territories from diverting fees collected for 911 to other purposes. (PS Docket Nos. 20-291, 09-14)
    • Modernizing Cable Service Change Notifications. The Commission will consider a Report and Order that would modernize requirements for notices cable operators must provide subscribers and local franchising authorities. (MB Docket Nos. 19-347, 17-105)
    • Eliminating Records Requirements for Cable Operator Interests in Video Programming. The Commission will consider a Report and Order that would eliminate the requirement that cable operators maintain records in their online public inspection files regarding the nature and extent of their attributable interests in video programming services. (MB Docket No. 20-35, 17-105)
    • Reforming IP Captioned Telephone Service Rates and Service Standards. The Commission will consider a Report and Order, Order on Reconsideration, and Further Notice of Proposed Rulemaking that would set compensation rates for Internet Protocol Captioned Telephone Service (IP CTS), deny reconsideration of previously set IP CTS compensation rates, and propose service quality and performance measurement standards for captioned telephone services. (CG Docket Nos. 13-24, 03-123)
    • Enforcement Item. The Commission will consider an enforcement action.

Other Developments

  • After Ireland’s Data Protection Commission (DPC) directed Facebook to stop transferring the personal data of European Union citizens to the United States (U.S.), the company filed suit in Ireland’s courts to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge. Earlier this summer, the Court of Justice of the European Union (CJEU) struck down the adequacy decision for the agreement between the European Union (EU) and the U.S. that had provided the easiest means of transferring the personal data of EU citizens to the U.S. for processing under the General Data Protection Regulation (GDPR) (i.e., the EU-U.S. Privacy Shield). In the case known as Schrems II, the CJEU also cast doubt on whether standard contractual clauses (SCC) used to transfer personal data to the U.S. would pass muster given the grounds for finding the Privacy Shield inadequate: the U.S.’s surveillance regime and lack of meaningful redress for EU citizens. Consequently, it has appeared as if data protection authorities throughout the EU would need to revisit SCCs for transfers to the U.S., and it appears the DPC was looking to stop Facebook from using its SCCs. Facebook is apparently arguing in its suit that it will suffer “extremely significant adverse effects” if the DPC’s decision is implemented.
  • In a related development, the European Data Protection Board (EDPB) has established “a taskforce to look into complaints filed in the aftermath of the CJEU Schrems II judgement.” The EDPB noted the 101 identical complaints “lodged with EEA Data Protection Authorities against several controllers in the European Economic Area (EEA) member states regarding their use of Google/Facebook services which involve the transfer of personal data.” The Board added “[s]pecifically the complainants, represented by the NGO NOYB, claim that Google/Facebook transfer personal data to the U.S. relying on the EU-U.S. Privacy Shield or Standard Contractual Clauses and that according to the recent CJEU judgment in case C-311/18 the controller is unable to ensure an adequate protection of the complainants’ personal data.” The EDPB claimed “[t]he taskforce will analyse the matter and ensure a close cooperation among the members of the Board…[and] [t]his taskforce will prepare recommendations to assist controllers and processors with their duty to identify and implement appropriate supplementary measures to ensure adequate protection when transferring data to third countries.” EDPB Chair Andrea Jelinek cautioned “the implications of the judgment are wide-ranging, and the contexts of data transfers to third countries very diverse…[and] [t]herefore, there cannot be a one-size-fits-all, quick fix solution.” She added “[e]ach organisation will need to evaluate its own data processing operations and transfers and take appropriate measures.”
  • An Australian court ruled against Facebook in its efforts to dismiss a suit brought against the company for its role in retaining and providing personal data to Cambridge Analytica. A Federal Court of Australia dismissed Facebook’s filings to reverse a previous ruling that allowed the Office of the Australian Information Commissioner (OAIC) to sue Facebook’s United States and Irish entities.
    • In March, the OAIC filed suit in federal court in Australia, alleging the two companies transgressed the privacy rights of 311,127 Australians under Australia’s Privacy Act. The two companies could face liability as high as AUD $1.7 million per violation.
    • In its November 2018 report to Parliament titled “Investigation into the use of data analytics in political campaigns”, the United Kingdom’s Information Commissioner’s Office (ICO) explained:
      • One key strand of our investigation involved allegations that an app, ultimately referred to as ‘thisisyourdigitallife’, was developed by Dr Aleksandr Kogan and his company Global Science Research (GSR) in order to harvest the data of up to 87 million global Facebook users, including one million in the UK. Some of this data was then used by Cambridge Analytica, to target voters during the 2016 US Presidential campaign process.
    • In its July 2018 report titled “Democracy disrupted? Personal information and political influence,” the ICO explained:
      • The online targeted advertising model used by Facebook is very complex, and we believe a high level of transparency in relation to political advertising is vital. This is a classic big-data scenario: understanding what data is going into the system; how users’ actions on Facebook are determining what interest groups they are placed in; and then the rules that are fed into any dynamic algorithms that enable organisations to target individuals with specific adverts and messaging.
      • Our investigation found significant fair-processing concerns both in terms of the information available to users about the sources of the data that are being used to determine what adverts they see and the nature of the profiling taking place. There were further concerns about the availability and transparency of the controls offered to users over what ads and messages they receive. The controls were difficult to find and were not intuitive to the user if they wanted to control the political advertising they received. Whilst users were informed that their data would be used for commercial advertising, it was not clear that political advertising would take place on the platform.
      • The ICO also found that despite a significant amount of privacy information and controls being made available, overall they did not effectively inform the users about the likely uses of their personal information. In particular, more explicit information should have been made available at the first layer of the privacy policy. The user tools available to block or remove ads were also complex and not clearly available to users from the core pages they would be accessing. The controls were also limited in relation to political advertising.
  • The Australian Competition & Consumer Commission (ACCC) announced it “will be examining the experiences of Australian consumers, developers, suppliers and others in a new report scrutinising mobile app stores” according to the agency’s press release. The ACCC’s inquiry comes at the same time regulators in the United States and the European Union are investigating Apple and Google for their app store practices, which could lead to enforcement actions. The ACCC is also looking to institute a code that would require Google and Facebook to pay Australian media outlets for content used on their platforms. The ACCC stated that “[i]ssues to be examined include the use and sharing of data by apps, the extent of competition between Google and Apple’s app stores, and whether more pricing transparency is needed in Australia’s mobile apps market.” The ACCC added:
    • Consumers are invited to share their experiences with buying and using apps through a short survey. The ACCC has also released an issues paper seeking views and feedback from app developers and suppliers.
    • In the issues paper, the ACCC explained “[p]otential outcomes” could be:
      • findings regarding structural, competitive or behavioural issues affecting the supply of apps
      • increased information about competition, pricing and other practices in the supply of apps and on app marketplaces
      • ACCC action to address any conduct that raises concerns under the Competition and Consumer Act 2010, and
      • recommendations to the Government for legislative reform to address systemic issues.
  • The Government Accountability Office (GAO) found that United States Customs and Border Protection (CBP) has implemented spotty, incomplete privacy measures in using facial recognition technology (FRT) at ports of entry.
    • The House Homeland Security Committee and the Senate Homeland Security and Governmental Affairs Committee asked the GAO
      • to review United States (U.S.) Customs and Border Protection (CBP) and Transportation Security Administration’s (TSA) facial recognition technology capabilities for traveler identity verification. This report addresses (1) the status of CBP’s testing and deployment of facial recognition technology at ports of entry, (2) the extent to which CBP’s use of facial recognition technology has incorporated privacy principles consistent with applicable laws and policies, (3) the extent to which CBP has assessed the accuracy and performance of its facial recognition capabilities at ports of entry, and (4) the status of TSA’s testing of facial recognition capabilities and the extent to which TSA’s facial recognition pilot tests incorporated privacy principles.
    • The GAO noted:
      • Most recently, in 2017, we reported that CBP had made progress in testing biometric exit capabilities, including facial recognition technology, but challenges continued to affect CBP’s efforts to develop and implement a biometric exit system, such as differences in the logistics and infrastructure among ports of entry. As we previously reported, CBP had tested various biometric technologies in different locations to determine which type of technology could be deployed on a large scale without disrupting legitimate travel and trade, while still meeting its mandate to implement a biometric entry-exit system. Based on the results of its testing, CBP concluded that facial recognition technology was the most operationally feasible and traveler-friendly option for a comprehensive biometric solution. Since then, CBP has prioritized testing and deploying facial recognition technology at airports (referred to as air exit), with seaports and land ports of entry to follow. These tests and deployments are part of CBP’s Biometric Entry-Exit Program.
      • As part of TSA’s mission to protect the nation’s transportation systems and to ensure freedom of movement for people and commerce, TSA has been exploring facial recognition technology for identity verification at airport checkpoints. Since 2017, TSA has conducted a series of pilot tests—some in partnership with CBP—to assess the feasibility of using facial recognition technology to automate traveler identity verification at airport security checkpoints. In April 2018, TSA signed a policy memorandum with CBP on the development and implementation of facial recognition capabilities at airports.
    • The GAO made recommendations to CBP:
      • The Commissioner of CBP should ensure that the Biometric Entry-Exit Program’s privacy notices contain complete and current information, including all of the locations where facial recognition is used and how travelers can request to opt out as appropriate. (Recommendation 1)
      • The Commissioner of CBP should ensure that the Biometric Entry-Exit Program’s privacy signage is consistently available at all locations where CBP is using facial recognition. (Recommendation 2)
      • The Commissioner of CBP should direct the Biometric Entry-Exit Program to develop and implement a plan to conduct privacy audits of its commercial partners’, contractors’, and vendors’ use of personally identifiable information. (Recommendation 3)
      • The Commissioner of CBP should develop and implement a plan to ensure that the biometric air exit capability meets its established photo capture requirement. (Recommendation 4)
      • The Commissioner of CBP should develop a process by which Biometric Entry-Exit program officials are alerted when the performance of air exit facial recognition falls below established thresholds. (Recommendation 5)
  • The United States (U.S.) Agency for Global Media (USAGM) is being sued by the Open Technology Fund (OTF), an entity it funds and oversees, over USAGM’s efforts to remove and replace the OTF’s leadership.
    • Previously, the United States Court of Appeals for the District of Columbia Circuit enjoined USAGM from “taking any action to remove or replace any officers or directors of the OTF,” pending the outcome of the suit, which is being expedited.
    • Additionally, USAGM CEO and Chair of the Board Michael Pack is being accused in two different letters of seeking to compromise the integrity and independence of two organizations he oversees. There have been media accounts of the Trump Administration’s remaking of USAGM in ways critics contend are threatening the mission and effectiveness of the Open Technology Fund (OTF), a U.S. government non-profit designed to help dissidents and endangered populations throughout the world. The head of the OTF has been removed, evoking the ire of Members of Congress, and other changes have been implemented that are counter to the organization’s mission. Likewise, there are allegations that politically-motivated policy changes seek to remake the Voice of America (VOA) into a less independent entity.
      • In a letter to Pack, OTF argued that a number of recent actions Pack has undertaken have violated “firewall protections” in the organization’s grant agreement. They further argue that Pack is conflicted and should turn over the investigation to the United States (U.S.) Department of State’s Office of the Inspector General (OIG). OTF alleged the following:
        • 1. Attempts to compromise and undermine OTF’s independence: USAGM has repeatedly attempted to undermine OTF’s independence over the past several months.
        • 2. Attempts to compromise and undermine integrity: USAGM has also attempted to undermine the integrity of OTF by publicly making numerous false and misleading claims about OTF to the internet freedom community, the general public, and even to Congress.
        • 3. Attempts to compromise and undermine security: USAGM has attempted to undermine the security of OTF, our staff, and our project partners, many of whom operate in highly sensitive environments, by
          • 1) attempting to gain unauthorized and unsupervised access to our office space and
          • 2) by requesting vast amounts of sensitive information and documentation with no apparent grant-related purpose, and no regard for the security of that information and documentation
        • 4. Attempts to compromise and undermine privacy: Closely related to USAGM’s attempts to undermine OTF’s security, USAGM has also attempted to undermine the privacy of OTF’s staff and partners by requesting that OTF provide Personally Identifiable Information (PII) without a clearly articulated grant-related purpose, and with no guarantee that the PII will be handled in a secure manner.
        • 5. Attempts to compromise and undermine effectiveness: USAGM’s actions have undermined the effectiveness of OTF by:
          • 1) freezing and subsequently withholding $19,181,791 in congressionally appropriated funding from OTF, forcing OTF to issue stop-work orders to 49 of our 60 internet freedom projects;
          • 2) providing unjustified, duplicative, overbroad, and unduly burdensome requests for information and documentation, without any clear grant-related purpose, and with clearly unreasonable deadlines;
          • 3) attempting to divert and redirect funding obligated by USAGM to OTF in an effort to duplicate OTF’s work; and
          • 4) threatening to terminate OTF’s Grant Agreement.
    • OTF asserted
      • These actions individually serve to seriously undermine OTF’s organizational and programmatic effectiveness. In their combined aggregate they threaten to dismantle OTF’s basic ability to effectively carry out its congressionally mandated mission to the detriment of USAGM and the cause of internet freedom globally
    • A group of VOA journalists wrote the entity’s acting director, asserting that Pack’s actions “risk crippling programs and projects for some countries that are considered national security priorities.” They added:
      • He has ordered the firing of contract journalists, with no valid reason, by cancelling their visas, forcing them back to home countries where the lives of some of them may be in jeopardy. Now the purge appears to be expanding to include U.S. permanent residents and even U.S. citizens, with Mr. Pack recklessly expressing that being a journalist is “a great cover for a spy.”
  • The Cyberspace Solarium Commission (CSC) issued its latest white paper to address a continuing problem for the United States’ government: how to attract or train a sufficient cyber workforce when private sector salaries are generally better. In “Growing A Stronger Federal Cyber Workforce,” the CSC claimed “Currently more than one in three public-sector cyber jobs sits open…[and] [f]illing these roles has been a persistent and intractable problem over the past decade, in large part due to a lack of coordination and leadership.” The CSC averred “[i]n the context of this pervasive challenge, the fundamental purpose of this paper is to outline the elements required for a coherent strategy that enables substantive and coordinated investment in cyber workforce development and calls for a sustained investment in that strategy.” The CSC then proceeded to lay out “five elements to guide development of a federal cyber workforce strategy”:
    • Organize: Federal departments and agencies must have flexible tools for organizing and managing their workforce that can adapt to each organization’s individual mission while also providing coherence across the entirety of the federal government. To appropriately organize the federal cyber workforce, the CSC recommends properly identifying and utilizing cyber-specific occupational classifications to allow more tailored workforce policies, building a federal cyber service to provide clear and agile hiring authorities and other personnel management tools, and establishing coordination structures to provide clear leadership for federal workforce development efforts.
    • Recruit: Federal leaders must focus on the programs that make public service an attractive prospect to talented individuals. In many ways, the federal government’s greatest tool for recruitment is the mission and unique learning opportunities inherent in federal work. To capitalize on these advantages, the government should invest in existing programs such as CyberCorps: Scholarship for Service and the Centers of Academic Excellence, while also working to mitigate recruitment barriers that stem from the personnel security clearance process.
    • Develop: The federal government, like all cyber employers, cannot expect every new employee to have hands-on experience, a four-year degree, and a list of industry certifications. Rather, the federal government will be stronger if it draws from a broad array of educational backgrounds and creates opportunities for employees to gain knowledge and experience as they work. This effort will call for many innovative approaches, among which the Commission particularly recommends apprenticeship programs and upskilling opportunities to support cyber employee development.
    • Retain: Federal leaders should take a nuanced view of retention, recognizing that enabling talent to move flexibly between the public and private sectors enables a stronger cyber workforce overall. However, federal employers can take steps to encourage their employees to increase the time they spend in public service. Improving pay flexibility is a major consideration, but continuing the development of career pathways and providing interesting career development opportunities like rotational and exchange programs also can be critical. Of particular note, federal employers can increase retention of underrepresented groups through the removal of inequities and barriers to advancement in the workplace.
    • Stimulate growth: The federal government cannot simply recruit a larger share of the existing national talent pool. Rather, leaders must take steps to grow the talent pool itself in order to increase the numbers of those available for federal jobs. To promote growth of the talent pool nationwide, the federal government must first coordinate government efforts working toward this goal. Executive branch and congressional leaders should also invest in measures to promote diversity across the national workforce and incentivize research to provide a greater empirical understanding of cyber workforce dynamics. Finally, federal leaders must work to increase the military cyber workforce, which has a significant impact on the national cyber workforce because it serves as both a source and an employer of cyber talent.

Further Reading

  • "Oracle reportedly wins deal for TikTok’s US operations as ‘trusted tech partner’" By Tom Warren and Nick Statt – The Verge. ByteDance chose Oracle over Microsoft, but not as a buyer of its operations in the United States (U.S.), Australia, Canada, and New Zealand. Instead, Oracle is proposing to be TikTok’s trusted technology partner, which seems to mean hosting TikTok’s operations in the U.S. and managing its data as a means of allaying the concerns of the U.S. government about access by the People’s Republic of China (PRC).
  • "Why Do Voting Machines Break on Election Day?" By Adrianne Jeffries – The Markup. This piece seeks to debunk the hype by explaining that most voting issues are minor and easily fixed, which may well be a welcome message in the United States (U.S.) given the lies and fretting about the security and accuracy of the coming election. Nonetheless, the mechanical and systemic problems encountered by some Americans do speak to the need to update voting laws and standards. Among other problems are the high barriers to entry for firms making and selling voting machines.
  • "Twitter steps up its fight against election misinformation" By Elizabeth Dwoskin – The Washington Post. Twitter and Google announced policy changes like Facebook did last week to help tamp down untrue claims and lies about voting and elections in the United States. Twitter will take a number of different approaches to handling lies and untrue assertions. If past is prologue, President Donald Trump may soon look to test the limits of this policy as he did shortly after Facebook announced its policy changes. Google will adjust searches on election day to place respected, fact-oriented organizations at the top of search results.
  • "China’s ‘hybrid war’: Beijing’s mass surveillance of Australia and the world for secrets and scandal" By Andrew Probyn and Matthew Doran – ABC News; "Zhenhua Data leak: personal details of millions around world gathered by China tech company" By Daniel Hurst in Canberra, Lily Kuo in Beijing and Charlotte Graham-McLay in Wellington – The Guardian. A massive database leaked to an American shows the breadth and range of information collected by a company in the People’s Republic of China (PRC) alleged to be working with the country’s military and security services. Zhenhua Data is denying any wrongdoing or anything untoward, but the database contains information on 2.4 million people, most of whom live in western nations in positions of influence and power, such as British and Australian Prime Ministers Boris Johnson and Scott Morrison. Academics claim this sort of compilation of information from public and private sources is unprecedented and would allow the PRC to run a range of influence operations.
  • "Europe Feels Squeeze as Tech Competition Heats Up Between U.S. and China" By Steven Erlanger and Adam Satariano – The New York Times. Structural challenges in the European Union (EU) and a lack of large technology companies have left the EU in a delicate position. It seeks to be the world’s de facto regulator but is having trouble keeping up with the United States and the People’s Republic of China, the two dominant nations in technology.


Image by PixelAnarchy from Pixabay

EDPB Steps Into Twitter Investigation

The body that consists of and oversees the EU’s DPAs will use its power under the GDPR to resolve a dispute between agencies over the punishment for Twitter’s data breaches.

The European Data Protection Board (EDPB) will soon have the opportunity to use a key power for the first time since its inception in order to resolve a dispute among data protection authorities (DPA) in the European Union (EU). Unnamed DPAs have objected to a proposed ruling by Ireland’s Data Protection Commission (DPC), the lead DPA investigating the 2018 and 2019 Twitter data breaches. Consequently, per the General Data Protection Regulation (GDPR), the disagreement has been handed off to the EDPB, and depending on how this matter is resolved, the body could decide Twitter’s punishment, including a possible fine of up to 4% of its worldwide revenue. What’s more, the DPC is the lead agency investigating Facebook’s WhatsApp and Instagram, among other large technology companies, and may have to relinquish those decisions as well if other DPAs disagree with the DPC’s proposed punishment for any wrongdoing.

The DPC submitted its draft decision on the Twitter breach to other DPAs in May in accordance with Article 60 of the GDPR. The DPC stated “[t]he draft decision focusses on whether Twitter International Company has complied with Articles 33(1) and 33(5) of the GDPR” (i.e., the provisions pertaining to data breach notification). The DPC further explained:

  • This draft decision is one of a number of significant developments in DPC inquiries into “big tech” companies this week. Deputy Commissioner Graham Doyle has confirmed that: “In addition to submitting this draft decision to other EU supervisory authorities, we have this week sent a preliminary draft decision to WhatsApp Ireland Limited for their final submissions which will be taken in to account by the DPC before preparing a draft decision in that matter also for Article 60 purposes. The inquiry into WhatsApp Ireland examines its compliance with Articles 12 to 14 of the GDPR in terms of transparency including in relation to transparency around what information is shared with Facebook.”
  • The DPC has also completed the investigation phase of a complaint-based inquiry which focuses on Facebook Ireland’s obligations to establish a lawful basis for personal data processing. This inquiry is now in the decision-making phase at the DPC.

Article 65 of the GDPR provides that the EDPB will make a binding decision on an investigation where “a supervisory authority concerned has raised a relevant and reasoned objection to a draft decision of the lead authority or the lead authority has rejected such an objection as being not relevant or reasoned.” In this case, at least one DPA has raised an objection to the DPC’s draft decision, thus triggering Article 65. The EDPB then has a month to get two-thirds of its members to agree to a binding decision it may draft. If this is not achieved, the Board has another two weeks to get a simple majority, and if this does not occur, then EDPB Chair Andrea Jelinek alone may decide. Consequently, it is possible the EDPB redrafts the DPC decision and tries to get buy-in from the DPAs that make up the Board to support a stronger punishment of Twitter.


Image by Roland Mey from Pixabay

Europe’s Highest Court Strikes Down Privacy Shield

The agreement that had allowed US companies to transfer the personal data of EU residents to the US was found to be invalid under EU law. The EU’s highest court seemed to indicate that standard contractual clauses, a frequently used means of transferring data, may be acceptable.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

In the second major ruling from the European Union (EU) this week, earlier today its highest court invalidated the agreement that had allowed multinational corporations and others to transfer the personal data of EU citizens to the United States (US) for commercial purposes since 2016. The court did not, however, find standard contractual clauses, the means by which many such transfers occur, illegal. This is the second case an Austrian privacy activist has brought alleging that Facebook was transferring his personal data to the US in violation of European law because US law, especially its surveillance programs, results in less protection and fewer rights. The first case resulted in the previous transfer agreement being found illegal, and now this case has resulted in much the same outcome. The import of this ruling is not immediately clear.

Maximillian Schrems filed a complaint against Facebook with the Data Protection Commission (DPC) in 2013, alleging that the company’s transfer of his personal data violated his rights under EU law because of the mass US surveillance revealed by former National Security Agency (NSA) contractor Edward Snowden. Ultimately, this case resulted in a 2015 Court of Justice of the European Union (CJEU) ruling that invalidated the Safe Harbor agreement under which the personal data of EU residents was transferred to the US by commercial concerns. The EU and US executed a follow-on agreement, the EU-US Privacy Shield, that was designed to address some of the problems the CJEU turned up, and the US passed a law, the “Judicial Redress Act of 2015” (P.L. 114-126), to provide EU citizens a way to exercise their EU rights in US courts via the “Privacy Act of 1974.”

However, Schrems continued and soon sought to challenge the legality of the European Commission’s approval of the Privacy Shield agreement (the adequacy decision issued in 2016), and also the use of standard contractual clauses (SCC) by companies for the transfer of personal data to the US. The European Data Protection Board (EDPB) explained in a recent decision on Denmark’s SCCs that

  • According to Article 28(3) General Data Protection Regulation (GDPR), the processing by a data processor shall be governed by a contract or other legal act under Union or Member State law that is binding on the processor with regard to the controller, setting out a set of specific aspects to regulate the contractual relationship between the parties. These include the subject-matter and duration of the processing, its nature and purpose, the type of personal data and categories of data subjects, among others.
  • Under Article 28(6) GDPR, without prejudice to an individual contract between the data controller and the data processor, the contract or the other legal act referred in paragraphs (3) and (4) of Article 28 GDPR may be based, wholly or in part on SCCs.

In a summary of its decision, the CJEU explained

The GDPR provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

The CJEU found

  • Regarding the level of protection required in respect of such a transfer, the Court holds that the requirements laid down for such purposes by the GDPR concerning appropriate safeguards, enforceable rights and effective legal remedies must be interpreted as meaning that data subjects whose personal data are transferred to a third country pursuant to standard data protection clauses must be afforded a level of protection essentially equivalent to that guaranteed within the EU by the GDPR, read in the light of the Charter. In those circumstances, the Court specifies that the assessment of that level of protection must take into consideration both the contractual clauses agreed between the data exporter established in the EU and the recipient of the transfer established in the third country concerned and, as regards any access by the public authorities of that third country to the data transferred, the relevant aspects of the legal system of that third country.
  • Regarding the supervisory authorities’ obligations in connection with such a transfer, the Court holds that, unless there is a valid Commission adequacy decision, those competent supervisory authorities are required to suspend or prohibit a transfer of personal data to a third country where they take the view, in the light of all the circumstances of that transfer, that the standard data protection clauses are not or cannot be complied with in that country and that the protection of the data transferred that is required by EU law cannot be ensured by other means, where the data exporter established in the EU has not itself suspended or put an end to such a transfer.

The CJEU stated “the limitations on the protection of personal data arising from the domestic law of the US on the access and use by US public authorities of such data transferred from the EU to that third country, which the Commission assessed in [its 2016 adequacy decision], are not circumscribed in a way that satisfies requirements that are essentially equivalent to those required under EU law, by the principle of proportionality, in so far as the surveillance programmes based on those provisions are not limited to what is strictly necessary.”

The CJEU found the process put in place by the US government to handle complaints inadequate. The 2016 Privacy Shield resulted in the creation of an Ombudsman post to which EU citizens could submit their complaints. This position is currently held by Under Secretary of State for Economic Growth, Energy, and the Environment Keith Krach.

The CJEU stated “the Ombudsperson mechanism referred to in that decision does not provide data subjects with any cause of action before a body which offers guarantees substantially equivalent to those required by EU law, such as to ensure both the independence of the Ombudsperson provided for by that mechanism and the existence of rules empowering the Ombudsperson to adopt decisions that are binding on the US intelligence services.”

The decision on SCCs is more ambiguous, as the circumstances under which they can be used are not entirely clear. In its decision, the CJEU made clear that SCCs are not necessarily sufficient under EU law:

although there are situations in which, depending on the law and practices in force in the third country concerned, the recipient of such a transfer is in a position to guarantee the necessary protection of the data solely on the basis of standard data protection clauses, there are others in which the content of those standard clauses might not constitute a sufficient means of ensuring, in practice, the effective protection of personal data transferred to the third country concerned. That is the case, in particular, where the law of that third country allows its public authorities to interfere with the rights of the data subjects to which that data relates.

Reaction from the parties was mixed, particularly on what the CJEU’s ruling means for SCCs even though there was agreement that the Privacy Shield will soon no longer govern data transfers from the EU to the US.

The DPC issued a statement in which it asserted

Today’s judgment provides just that, firmly endorsing the substance of the concerns expressed by the DPC (and by the Irish High Court) to the effect that EU citizens do not enjoy the level of protection demanded by EU law when their data is transferred to the United States. In that regard, while the judgment most obviously captures Facebook’s transfers of data relating to Mr Schrems, it is of course the case that its scope extends far beyond that, addressing the position of EU citizens generally.

The DPC added

So, while in terms of the points of principle in play, the Court has endorsed the DPC’s position, it has also ruled that the SCCs transfer mechanism used to transfer data to countries worldwide is, in principle, valid, although it is clear that, in practice, the application of the SCCs transfer mechanism to transfers of personal data to the United States is now questionable. This is an issue that will require further and careful examination, not least because assessments will need to be made on a case by case basis.

At a press conference, EC Vice-President Věra Jourová claimed the “CJEU declared the Privacy Shield decision invalid, but also confirmed that the standard contractual clauses remain a valid tool for the transfer of personal data to processors established in third countries.” She asserted “[t]his means that the transatlantic data flows can continue, based on the broad toolbox for international transfers provided by the GDPR, for instance binding corporate rules or SCCs.” Jourová contended with regard to next steps, “[w]e are not starting from scratch…[and] [o]n the contrary, the Commission has already been working intensively to ensure that this toolbox is fit for purpose, including the modernisation of the Standard Contractual Clauses.” Jourová stated “we will be working closely with our American counterparts, based on today’s ruling.”

European Commissioner for Justice Didier Reynders stated

  • First, I welcome the fact that the Court confirmed the validity of our Decision on SCCs.
    • We have been working already for some time on modernising these clauses and ensuring that our toolbox for international data transfers is fit for purpose.
    • Standard Contractual Clauses are in fact the most used tool for international transfers of personal data and we wanted to ensure they can be used by businesses and fully in line with EU law.
    • We are now advanced with this work and we will of course take into account the requirements of judgement.
    • We will work with the European Data Protection Board, as well as the 27 EU Member States. It will be very important to start the process to have a formal approval to modernise the Standard Contractual Clauses as soon as possible. We have been in an ongoing process about such a modernisation for some time, but with an attention to the different elements of the decision of the Court today.
  • My second point: The Court has invalidated the Privacy Shield. We have to study the judgement in detail and carefully assess the consequences of this invalidation.

Reynders stated that “[i]n the meantime, transatlantic data flows between companies can continue using other mechanisms for international transfers of personal data available under the GDPR.”

In a statement, US Secretary of Commerce Wilbur Ross said

While the Department of Commerce is deeply disappointed that the court appears to have invalidated the European Commission’s adequacy decision underlying the EU-U.S. Privacy Shield, we are still studying the decision to fully understand its practical impacts.

Ross continued

We have been and will remain in close contact with the European Commission and European Data Protection Board on this matter and hope to be able to limit the negative consequences to the $7.1 trillion transatlantic economic relationship that is so vital to our respective citizens, companies, and governments. Data flows are essential not just to tech companies—but to businesses of all sizes in every sector. As our economies continue their post-COVID-19 recovery, it is critical that companies—including the 5,300+ current Privacy Shield participants—be able to transfer data without interruption, consistent with the strong protections offered by Privacy Shield.

The Department of Commerce stated it “will continue to administer the Privacy Shield program, including processing submissions for self-certification and re-certification to the Privacy Shield Frameworks and maintaining the Privacy Shield List.” The agency added “[t]oday’s decision does not relieve participating organizations of their Privacy Shield obligations.”


Image by harakir from Pixabay