Further Reading, Other Development, and Coming Events (20 and 21 January 2021)

Further Reading

  • “Amazon’s Ring Neighbors app exposed users’ precise locations and home addresses” By Zack Whittaker — TechCrunch. Once again, Amazon’s home security platform suffers problems by way of users’ data being exposed or inadequately protected.
  • “Harassment of Chinese dissidents was warning signal on disinformation” By Shawna Chen and Bethany Allen-Ebrahimian — Axios. An example of how malicious online activity can spill into the real world: a number of Chinese dissidents were set upon by protestors.
  • “How Social Media’s Obsession with Scale Supercharged Disinformation” By Joan Donovan — Harvard Business Review. Companies like Facebook and Twitter emphasized scale over safety in trying to grow as quickly as possible. This led to a proliferation of fake accounts and proved fertile ground for the seeds of misinformation.
  • “The Moderation War Is Coming to Spotify, Substack, and Clubhouse” By Alex Kantrowitz — OneZero. The same issues with objectionable and abusive content plaguing Twitter, Facebook, YouTube, and others will almost certainly become an issue for the newer platforms, and in fact already are.
  • “Mexican president mounts campaign against social media bans” By Mark Stevenson — The Associated Press. Mexico’s leftist president, Andrés Manuel López Obrador, is vowing to lead international efforts to stop social media companies from censoring what he considers free speech. Whether this materializes into something substantial is not clear.
  • “As Trump Clashes With Big Tech, China’s Censored Internet Takes His Side” By Li Yuan — The New York Times. The government in Beijing is framing social media platforms’ ban of former President Donald Trump after the attempted insurrection as proof there is no untrammeled freedom of speech. This position helps bolster the oppressive policing of online content the People’s Republic of China (PRC) wages against its citizens. Quite separately, many Chinese people (or what appear to be actual people) are questioning what is often deemed the censoring of Trump in the United States (U.S.), a nation ostensibly committed to free speech. There is also widespread misunderstanding of social media platforms’ First Amendment right not to host content with which they disagree and their power to make such determinations without fear that the U.S. government will punish them, as is often the case in the PRC.
  • “Trump admin slams China’s Huawei, halting shipments from Intel, others – sources” By Karen Freifeld and Alexandra Alper — Reuters. On its way out of the proverbial door, the Trump Administration delivered parting shots to Huawei and the People’s Republic of China by revoking one license and denying others to sell the PRC tech giant semiconductors. The companies, including Intel, could appeal. Additionally, an estimated $400 million worth of applications for similar licenses are pending at the Department of Commerce and are now the domain of the new regime in Washington. It is too early to discern whether the Biden Administration will stand by, reverse, or modify these actions and the broader Trump Administration policy towards the PRC.
  • “Behind a Secret Deal Between Google and Facebook” By Daisuke Wakabayashi and Tiffany Hsu — The New York Times. The newspaper got its hands on an unredacted copy of the antitrust suit Texas Attorney General Ken Paxton and other attorneys general filed against Google, and it has details on the deal Facebook and Google allegedly struck to divide the online advertising world. Not only did Facebook ditch an effort launched by publishers to defeat Google’s overwhelming advantages in online advertising bidding, it joined Google’s rival effort with a guarantee that it would win a specified number of bids and more time to bid on ads. Google and Facebook naturally deny any wrongdoing.
  • “Biden and Trump Voters Were Exposed to Radically Different Coverage of the Capitol Riot on Facebook” By Colin Lecher and Jon Keegan — The Markup. Using a browser tool that the organization pays Facebook users to install, The Markup can track the type of material they see in their feeds. Facebook’s algorithm fed people material about the 6 January 2021 attempted insurrection based on their political views. Many have pointed out that this very dynamic creates filter bubbles that poison democracy and public discourse.
  • “Banning Trump won’t fix social media: 10 ideas to rebuild our broken internet – by experts” By Julia Carrie Wong — The Guardian. There are some fascinating proposals in this piece that could help address the problems of social media.
  • “Misinformation dropped dramatically the week after Twitter banned Trump and some allies” By Elizabeth Dwoskin and Craig Timberg — The Washington Post. Research showed that lies, misinformation, and disinformation about election fraud dropped by three-quarters after former President Donald Trump was banned from Twitter and other platforms. Other research showed that a small group of conservatives were responsible for up to 20% of misinformation on this and other conspiracies.
  • “This Was WhatsApp’s Plan All Along” By Shoshana Wodinsky — Gizmodo. This piece does a great job of breaking down into plain English the proposed changes to terms of service on WhatsApp that so enraged users that competitors Signal and Telegram have seen record-breaking downloads. Basically, it is all about reaping advertising dollars for Facebook through businesses and third-party partners using user data from business-related communications. Incidentally, WhatsApp has delayed the changes until March because of the pushback.
  • “Brussels eclipsed as EU countries roll out their own tech rules” By Laura Kayali and Mark Scott — Politico EU. The European Union (EU) had a hard enough task in trying to reach final language on a Digital Services Act and Digital Markets Act without nations like France, Germany, Poland, and others picking and choosing text from draft bills and enacting it into law. Brussels is not happy with this trend.

Other Developments

  • Federal Trade Commission (FTC) Chair Joseph J. Simons announced his resignation from the FTC effective on 29 January 2021 in keeping with tradition and past practice. This resignation clears the way for President Joe Biden to name the chair of the FTC, and along with FTC Commissioner Rohit Chopra’s nomination to head the Consumer Financial Protection Bureau (CFPB), the incoming President will get to nominate two Democratic FTC Commissioners, tipping the political balance of the FTC and likely ushering in a period of more regulation of the technology sector.
    • Simons also announced the resignation of senior staff: General Counsel Alden F. Abbott; Bureau of Competition Director Ian Conner; Bureau of Competition Deputy Directors Gail Levine and Daniel Francis; Bureau of Consumer Protection Director Andrew Smith; Bureau of Economics Director Andrew Sweeting; Office of Public Affairs Director Cathy MacFarlane; and Office of Policy Planning Director Bilal Sayyed.
  • In a speech last week, before he was sworn in, President Joe Biden announced his $1.9 trillion American Rescue Plan, and according to a summary, Biden will ask Congress to provide $10 billion for a handful of government-facing programs to improve technology (the quoted items below roughly add up to that figure; a short tally follows the list). Notably, Biden “is calling on Congress to launch the most ambitious effort ever to modernize and secure federal IT and networks.” Biden is proposing to dramatically increase funding for a fund that would allow agencies to borrow and then pay back funds to update their technology. Moreover, Biden is looking to push more money to a program to aid officials at agencies who oversee technology development and procurement.
    • Biden stated “[t]o remediate the SolarWinds breach and boost U.S. defenses, including of the COVID-19 vaccine process, President-elect Biden is calling on Congress to:
      • Expand and improve the Technology Modernization Fund. ​A $9 billion investment will help the U.S. launch major new IT and cybersecurity shared services at the Cyber Security and Information Security Agency (CISA) and the General Services Administration and complete modernization projects at federal agencies. ​In addition, the president-elect is calling on Congress to change the fund’s reimbursement structure in order to fund more innovative and impactful projects.
      • Surge cybersecurity technology and engineering expert hiring​. Providing the Information Technology Oversight and Reform fund with $200 million will allow for the rapid hiring of hundreds of experts to support the federal Chief Information Security Officer and U.S. Digital Service.
      • Build shared, secure services to drive transformational projects. Investing $300 million in no-year funding for Technology Transformation Services in the General Services Administration will drive secure IT projects forward without the need of reimbursement from agencies.
      • Improving security monitoring and incident response activities. ​An additional $690M for CISA will bolster cybersecurity across federal civilian networks, and support the piloting of new shared security and cloud computing services.
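    • The four quoted line items sum to slightly more than the $10 billion headline figure; below is a minimal back-of-the-envelope tally using only the amounts quoted above (an illustration, not an official scoring of the proposal).

```python
# Back-of-the-envelope tally of the IT and cybersecurity asks quoted above.
# All figures come from the American Rescue Plan summary cited in this item.
items = {
    "Technology Modernization Fund": 9_000_000_000,
    "IT Oversight and Reform fund (expert hiring surge)": 200_000_000,
    "Technology Transformation Services (GSA)": 300_000_000,
    "CISA security monitoring and incident response": 690_000_000,
}

total = sum(items.values())
print(f"Itemized total: ${total / 1e9:.2f} billion")  # -> roughly $10.19 billion
```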
  • The United States (U.S.) Department of Commerce issued an interim final rule pursuant to an executive order (EO) issued by former President Donald Trump to secure the U.S. information and communications supply chain. This rule will undoubtedly be reviewed by the Biden Administration and may be withdrawn or modified depending on the fate of the EO on which the rule relies.
    • In the interim final rule, Commerce explained:
      • These regulations create the processes and procedures that the Secretary of Commerce will use to identify, assess, and address certain transactions, including classes of transactions, between U.S. persons and foreign persons that involve information and communications technology or services designed, developed, manufactured, or supplied, by persons owned by, controlled by, or subject to the jurisdiction or direction of a foreign adversary; and pose an undue or unacceptable risk. While this interim final rule will become effective on March 22, 2021, the Department of Commerce continues to welcome public input and is thus seeking additional public comment. Once any additional comments have been evaluated, the Department is committed to issuing a final rule.
      • On November 27, 2019, the Department of Commerce (Department) published a proposed rule to implement the terms of the Executive Order. (84 FR 65316). The proposed rule set forth processes for (1) how the Secretary would evaluate and assess transactions involving ICTS to determine whether they pose an undue risk of sabotage to or subversion of the ICTS supply chain, or an unacceptable risk to the national security of the United States or the security and safety of U.S. persons; (2) how the Secretary would notify parties to transactions under review of the Secretary’s decision regarding the ICTS Transaction, including whether the Secretary would prohibit or mitigate the transaction; and (3) how parties to transactions reviewed by the Secretary could comment on the Secretary’s preliminary decisions. The proposed rule also provided that the Secretary could act without complying with the proposed procedures where required by national security. Finally, the Secretary would establish penalties for violations of mitigation agreements, the regulations, or the Executive Order.
      • In addition to seeking general public comment, the Department requested comments from the public on five specific questions: (1) Whether the Secretary should consider categorical exclusions or whether there are classes of persons whose use of ICTS cannot violate the Executive Order; (2) whether there are categories of uses or of risks that are always capable of being reliably and adequately mitigated; (3) how the Secretary should monitor and enforce any mitigation agreements applied to a transaction; (4) how the terms, “transaction,” “dealing in,” and “use of” should be clarified in the rule; and (5) whether the Department should add record-keeping requirements for information related to transactions.
      • The list of “foreign adversaries” consists of the following foreign governments and non-government persons: The People’s Republic of China, including the Hong Kong Special Administrative Region (China); the Republic of Cuba (Cuba); the Islamic Republic of Iran (Iran); the Democratic People’s Republic of Korea (North Korea); the Russian Federation (Russia); and Venezuelan politician Nicolás Maduro (Maduro Regime).
  • The Federal Trade Commission (FTC) adjusted its penalty amounts for inflation, including a boost to the per-violation penalty that virtually all of the privacy bills introduced in the last Congress would have allowed the agency to wield against first-time violators. The penalty for certain unfair and deceptive acts or practices was increased from $43,280 to $43,792 (the arithmetic behind the adjustment is sketched below).
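    • For readers curious about the arithmetic, annual inflation adjustments multiply the prior penalty amount by a cost-of-living multiplier and round to the nearest dollar. The minimal sketch below uses only the two figures above to back out the implied multiplier; it is an illustration, not the FTC’s published calculation.

```python
# The old and new per-violation amounts come from the FTC adjustment above;
# the cost-of-living multiplier is simply implied by those two figures.
old_penalty = 43_280
new_penalty = 43_792

implied_multiplier = new_penalty / old_penalty
print(f"Implied cost-of-living multiplier: {implied_multiplier:.5f}")  # ~1.01183

# Multiplying the prior amount by the multiplier and rounding to the nearest
# dollar reproduces the new amount.
print(f"Adjusted penalty: ${round(old_penalty * implied_multiplier):,}")  # $43,792
```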
  • The United States (U.S.) Department of State stood up its new Bureau of Cyberspace Security and Emerging Technologies (CSET) as it has long planned. At the beginning of the Trump Administration, the Department of State dismantled the Cyber Coordinator Office and gave its cybersecurity portfolio to the Bureau of Economic Affairs, which displeased Congressional stakeholders. In 2019, the department notified Congress of its plan to establish CSET. The department asserted:
    • The need to reorganize and resource America’s cyberspace and emerging technology security diplomacy through the creation of CSET is critical, as the challenges to U.S. national security presented by China, Russia, Iran, North Korea, and other cyber and emerging technology competitors and adversaries have only increased since the Department notified Congress in June 2019 of its intent to create CSET.
    • The CSET bureau will lead U.S. government diplomatic efforts on a wide range of international cyberspace security and emerging technology policy issues that affect U.S. foreign policy and national security, including securing cyberspace and critical technologies, reducing the likelihood of cyber conflict, and prevailing in strategic cyber competition.  The Secretary’s decision to establish CSET will permit the Department to posture itself appropriately and engage as effectively as possible with partners and allies on these pressing national security concerns.
    • The Congressional Members of the Cyberspace Solarium Commission made clear their disapproval of the decision. Senators Angus King (I-ME) and Ben Sasse (R-NE) and Representatives Mike Gallagher (R-WI) and Jim Langevin (D-RI) said:
      • In our report, we emphasize the need for a greater emphasis on international cyber policy at State. However, unlike the bipartisan Cyber Diplomacy Act, the State Department’s proposed Bureau will reinforce existing silos and […] hinder the development of a holistic strategy to promote cyberspace stability on the international stage. We urge President-elect Biden to pause this reorganization when he takes office in two weeks and work with Congress to enact meaningful reform to protect our country in cyberspace.
  • The Australian Cyber Security Centre (ACSC) released the Risk Identification Guidance, “developed to assist organisations in identifying risks associated with their use of suppliers, manufacturers, distributors and retailers (i.e. businesses that constitute their cyber supply chain),” and the Risk Management Guidance because “[c]yber supply chain risk management can be achieved by identifying the cyber supply chain, understanding cyber supply chain risk, setting cyber security expectations, auditing for compliance, and monitoring and improving cyber supply chain security practices.”
  • The United Kingdom’s Surveillance Camera Commissioner (SCC) issued “best practice guidance, ‘Facing the Camera’, to all police forces in England and Wales.” The SCC explained that “[t]he provisions of this document only apply to the use of facial recognition technology and the inherent processing of images by the police where such use is integral to a surveillance camera system being operated in ‘live time’ or ‘near real time’ operational scenarios.” Last summer, a British appeals court overturned a lower court decision that had found a police force’s use of facial recognition technology in a pilot program using live footage to be legal. The appeals court found the South Wales Police Force’s use of this technology violated “the right to respect for private life under Article 8 of the European Convention on Human Rights, data protection legislation, and the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010.” The SCC stated:
    • The SCC considers surveillance to be an intrusive investigatory power where it is conducted by the police which impacts upon those fundamental rights and freedoms of people, as set out by the European Convention of Human Rights (ECHR) and the Human Rights Act 1998. In the context of surveillance camera systems which make use of facial recognition technology, the extent of state intrusion in such matters is significantly increased by the capabilities of algorithms which are in essence, integral to the surveillance conduct seeking to harvest information, private information, metadata, data, personal data, intelligence and evidence. Each of the aforementioned are bound by laws and rules which ought to be separately and jointly considered and applied in a manner which is demonstrably lawful and ethical and engenders public trust and confidence.
    • Whenever the police seek to use technology in pursuit of a legitimate aim, the key question arises as to whether the degree of intrusion which is caused to the fundamental freedoms of citizens by the police surveillance conduct using surveillance algorithms (biometric or otherwise) is necessary in a democratic society when considered alongside the legality and proportionality of their endeavours and intent. The type of equipment/technology/modality which they choose to use to that end (e.g. LFR, ANPR, thermal imaging, gait analysis, movement sensors etc), the manner in which such technological means are deployed, (such as using static cameras at various locations, used with body worn cameras or other mobile means), and whether such technology is used overtly alongside or networked with other surveillance technologies, are all factors which may significantly influence the depth of intrusion caused by police conduct upon citizen’s rights.
  • The Senate confirmed the nomination of Avril Haines to be the new Director of National Intelligence by an 89-10 vote after Senator Tom Cotton (R-AR) removed his hold on her nomination. However, Senator Josh Hawley (R-MO) placed a hold on the nomination of Alejandro Mayorkas to be the next Secretary of Homeland Security and explained his action this way:
    • On Day 1 of his administration, President-elect Biden has said he plans to unveil an amnesty plan for 11 million immigrants in this nation illegally. This comes at a time when millions of American citizens remain out of work and a new migrant caravan has been attempting to reach the United States. Mr. Mayorkas has not adequately explained how he will enforce federal law and secure the southern border given President-elect Biden’s promise to roll back major enforcement and security measures. Just today, he declined to say he would enforce the laws Congress has already passed to secure the border wall system. Given this, I cannot consent to skip the standard vetting process and fast-track this nomination when so many questions remain unanswered.
  • Former Trump White House Cyber Coordinator Rob Joyce will replace Anne Neuberger as the National Security Agency’s (NSA) Director of Cybersecurity; Neuberger has been named the Biden White House’s Deputy National Security Advisor for Cyber and Emerging Technology. Neuberger’s portfolio at the NSA included “lead[ing] NSA’s cybersecurity mission, including emerging technology areas like quantum-resistant cryptography,” and presumably Joyce will assume the same responsibilities. Joyce lost his White House post when former National Security Advisor John Bolton restructured the NSC in 2018, a shakeup that also forced out his boss, former Homeland Security Advisor Tom Bossert. At the National Security Council, Neuberger will work to coordinate cybersecurity and emerging technology policy across agencies and funnel policy options up to the full NSC and ultimately the President, work that will include coordinating with Joyce.
  • The Supreme Court of the United States (SCOTUS) heard oral arguments on whether the Federal Trade Commission (FTC) Act gives the agency the power to seek monetary damages and restitution alongside permanent injunctions under Section 13(b). In AMG Capital Management, LLC v. FTC, the parties opposing the FTC argue the plain language of the statute does not allow for the seeking of restitution and monetary damages under this specific section of the FTC Act while the agency argues long accepted past practice and Congressional intent do, in fact, allow this relief to be sought when the FTC is seeking to punish violators of Section 5. The FTC is working a separate track to get a fix from Congress which could rewrite the FTC Act to make clear this sort of relief is legal. However, some stakeholders in the debate over privacy legislation may be using the case as leverage.
    • In October 2020, the FTC wrote the House and Senate committees with jurisdiction over the agency, asking for language to resolve the litigation over the power to seek and obtain restitution for victims of those who have violated Section 5 of the FTC Act and disgorgement of ill-gotten gains. The FTC is also asking that Congress clarify that the agency may act against violators even if their conduct has stopped, as it has done for more than four decades. Two federal appeals courts have ruled in ways that have limited the FTC’s long-used powers, and the Supreme Court of the United States is now set to rule on these issues in 2021. The FTC is claiming, however, that defendants are playing for time in the hopes that the FTC’s authority to seek and receive monetary penalties will ultimately be limited by the nation’s highest court. Judging by language tucked into a privacy bill introduced by the former chair of one of the committees, Congress may be willing to act soon.
    • The FTC asked the House Energy and Commerce and Senate Commerce, Science, and Transportation Committees “to take quick action to amend Section 13(b) [of the FTC Act i.e. 15 U.S.C. § 53(b)] to make clear that the Commission can bring actions in federal court under Section 13(b) even if conduct is no longer ongoing or impending when the suit is filed and can obtain monetary relief, including restitution and disgorgement, if successful.” The agency asserted “[w]ithout congressional action, the Commission’s ability to use Section 13(b) to provide refunds to consumer victims and to enjoin illegal activity is severely threatened.” All five FTC Commissioners signed the letter.
    • The FTC explained that adverse rulings by two federal appeals courts are constraining the agency from seeking relief for victims and punishment for violators of the FTC Act in federal courts below those two specific courts, but elsewhere defendants are either asking courts for a similar ruling or using delaying tactics in the hopes the Supreme Court upholds the two federal appeals courts:
      • …[C]ourts of appeals in the Third and Seventh Circuits have recently ruled that the agency cannot obtain any monetary relief under Section 13(b). Although review in the Supreme Court is pending, these lower court decisions are already inhibiting our ability to obtain monetary relief under 13(b). Not only do these decisions already prevent us from obtaining redress for consumers in the circuits where they issued, prospective defendants are routinely invoking them in refusing to settle cases with agreed-upon redress payments.
      • Moreover, defendants in our law enforcement actions pending in other circuits are seeking to expand the rulings to those circuits and taking steps to delay litigation in anticipation of a potential Supreme Court ruling that would allow them to escape liability for any monetary relief caused by their unlawful conduct. This is a significant impediment to the agency’s effectiveness, its ability to provide redress to consumer victims, and its ability to prevent entities who violate the law from profiting from their wrongdoing.
  • The United Kingdom’s Information Commissioner’s Office (ICO) issued guidance for British entities that may be affected by the massive SolarWinds hack that has compromised many key systems in the United States. The ICO advised (a minimal version-check sketch follows the quoted guidance):
    • Organisations should immediately check whether they are using a version of the software that has been compromised. These are versions 2019.4 HF 5, 2020.2 with no hotfix installed, and 2020.2 HF 1.
    • Organisations must also determine if the personal data they hold has been affected by the cyber-attack. If a reportable personal data breach is found, UK data controllers are required to inform the ICO within 72 hours of discovering the breach. Reports can be submitted online or organisations can call the ICO’s personal data breach helpline for advice on 0303 123 1113, option 2.
    • Organisations subject to the NIS Regulation will also need to determine if this incident has led to a “substantial impact on the provision” of its digital services and report to the ICO.
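    • A minimal sketch of the ICO’s first step, assuming an organisation can already obtain the installed SolarWinds Orion version string from its console or inventory tooling; the compromised version strings are taken from the guidance above, while the function, hostnames, and inventory structure are purely illustrative.

```python
# Check reported SolarWinds Orion Platform versions against the builds the
# ICO guidance identifies as compromised. How the version string is collected
# (Orion console, registry, asset inventory) is left to the organisation.
COMPROMISED_VERSIONS = {
    "2019.4 HF 5",
    "2020.2",        # 2020.2 with no hotfix installed
    "2020.2 HF 1",
}

def is_compromised(installed_version: str) -> bool:
    """Return True if the reported Orion version matches a compromised build."""
    return installed_version.strip() in COMPROMISED_VERSIONS

# Example usage against a small, hypothetical inventory of hosts.
inventory = {"orion-01": "2020.2 HF 1", "orion-02": "2020.2 HF 2"}
for host, version in inventory.items():
    status = "compromised build - investigate" if is_compromised(version) else "not on the ICO list"
    print(f"{host}: Orion {version} -> {status}")
```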
  • Europol announced the takedown of “the world’s largest illegal marketplace on the dark web” in an operation coordinated by the following nations: “Germany, Australia, Denmark, Moldova, Ukraine, the United Kingdom (the National Crime Agency), and the USA (DEA, FBI, and IRS).” Europol added:
    • The Central Criminal Investigation Department in the German city of Oldenburg arrested an Australian citizen who is the alleged operator of DarkMarket near the German-Danish border over the weekend. The investigation, which was led by the cybercrime unit of the Koblenz Public Prosecutor’s Office, allowed officers to locate and close the marketplace, switch off the servers and seize the criminal infrastructure – more than 20 servers in Moldova and Ukraine supported by the German Federal Criminal Police office (BKA). The stored data will give investigators new leads to further investigate moderators, sellers, and buyers. 
  • The Enforcement Bureau (Bureau) of the Federal Communications Commission (FCC) issued an enforcement advisory intended to remind people that use of amateur and personal radios to commit crimes is itself a criminal offense that could warrant prosecution. The notice was issued because the FCC says it is aware of discussion among some users of how these means of communication may be preferable to social media platforms, which have been cracking down on extremist material since the attempted insurrection at the United States Capitol on 6 January. The Bureau stated:
    • The Bureau has become aware of discussions on social media platforms suggesting that certain radio services regulated by the Commission may be an alternative to social media platforms for groups to communicate and coordinate future activities.  The Bureau recognizes that these services can be used for a wide range of permitted purposes, including speech that is protected under the First Amendment of the U.S. Constitution.  Amateur and Personal Radio Services, however, may not be used to commit or facilitate crimes. 
    • Specifically, the Bureau reminds amateur licensees that they are prohibited from transmitting “communications intended to facilitate a criminal act” or “messages encoded for the purpose of obscuring their meaning.” Likewise, individuals operating radios in the Personal Radio Services, a category that includes Citizens Band radios, Family Radio Service walkie-talkies, and General Mobile Radio Service, are prohibited from using those radios “in connection with any activity which is against Federal, State or local law.” Individuals using radios in the Amateur or Personal Radio Services in this manner may be subject to severe penalties, including significant fines, seizure of the offending equipment, and, in some cases, criminal prosecution.
  • The European Data Protection Board (EDPB) issued its “Strategy for 2021-2023” in order “[t]o be effective in confronting the main challenges ahead.” The EDPB cautioned:
    • This Strategy does not provide an exhaustive overview of the work of the EDPB in the years to come. Rather it sets out the four main pillars of our strategic objectives, as well as set of key actions to help achieve those objectives. The EDPB will implement this Strategy within its Work Program, and will report on the progress achieved in relation to each Pillar as part of its annual reports.
    • The EDPB listed and explained the four pillars of its strategy:
      • PILLAR 1: ADVANCING HARMONISATION AND FACILITATING COMPLIANCE. The EDPB will continue to strive for a maximum degree of consistency in the application of data protection rules and limit fragmentation among Member States. In addition to providing practical, easily understandable and accessible guidance, the EDPB will develop and promote tools that help to implement data protection into practice, taking into account practical experiences of different stakeholders on the ground.
      • PILLAR 2: SUPPORTING EFFECTIVE ENFORCEMENT AND EFFICIENT COOPERATION BETWEEN NATIONAL SUPERVISORY AUTHORITIES. The EDPB is fully committed to support cooperation between all national supervisory authorities that work together to enforce European data protection law. We will streamline internal processes, combine expertise and promote enhanced coordination. We intend not only to ensure a more efficient functioning of the cooperation and consistency mechanisms, but also to strive for the development of a genuine EU-wide enforcement culture among supervisory authorities.
      • PILLAR 3: A FUNDAMENTAL RIGHTS APPROACH TO NEW TECHNOLOGIES. The protection of personal data helps to ensure that technology, new business models and society develop in accordance with our values, such as human dignity, autonomy and liberty. The EDPB will continuously monitor new and emerging technologies and their potential impact on the fundamental rights and daily lives of individuals. Data protection should work for all people, particularly in the face of processing activities presenting the greatest risks to individuals’ rights and freedoms (e.g. to prevent discrimination). We will help to shape Europe’s digital future in line with our common values and rules. We will continue to work with other regulators and policymakers to promote regulatory coherence and enhanced protection for individuals.
      • PILLAR 4: THE GLOBAL DIMENSION. The EDPB is determined to set and promote high EU and global standards for international data transfers to third countries in the private and the public sector, including in the law enforcement sector. We will reinforce our engagement with the international community to promote EU data protection as a global model and to ensure effective protection of personal data beyond EU borders.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) revealed that all but one of the videoconferencing platforms that it and other data protection authorities (DPAs) wrote to in a July 2020 open letter, urging them to “adopt principles to guide them in addressing some key privacy risks,” have responded. The ICO explained:
    • Microsoft, Cisco, Zoom and Google replied to the open letter. The joint signatories thank these companies for engaging on this important matter and for acknowledging and responding to the concerns raised. In their responses the companies highlighted various privacy and security best practices, measures, and tools that they advise are implemented or built-in to their video teleconferencing services.
    • The information provided by these companies is encouraging. It is a constructive foundation for further discussion on elements of the responses that the joint signatories feel would benefit from more clarity and additional supporting information.
    • The ICO stated:
      • The joint signatories have not received a response to the open letter from Houseparty. They strongly encourage Houseparty to engage with them and respond to the open letter to address the concerns raised.
  • The European Union Agency for Cybersecurity (ENISA) “launched a public consultation, which runs until 7 February 2021, on its draft of the candidate European Union Cybersecurity Certification Scheme on Cloud Services (EUCS)…[that] aims to further improve the Union’s internal market conditions for cloud services by enhancing and streamlining the services’ cybersecurity guarantees.” ENISA stated:
    • There are challenges to the certification of cloud services, such as a diverse set of market players, complex systems and a constantly evolving landscape of cloud services, as well as the existence of different schemes in Member States. The draft EUCS candidate scheme tackles these challenges by calling for cybersecurity best practices across three levels of assurance and by allowing for a transition from current national schemes in the EU. The draft EUCS candidate scheme is a horizontal and technological scheme that intends to provide cybersecurity assurance throughout the cloud supply chain, and form a sound basis for sectoral schemes.
    • More specifically, the draft EUCS candidate scheme:
      • Is a voluntary scheme;
      • The scheme’s certificates will be applicable across the EU Member States;
      • Is applicable for all kinds of cloud services – from infrastructure to applications;
      • Boosts trust in cloud services by defining a reference set of security requirements;
      • Covers three assurance levels: ‘Basic’, ‘Substantial’ and ‘High’;
      • Proposes a new approach inspired by existing national schemes and international standards;
      • Defines a transition path from national schemes in the EU;
      • Grants a three-year certification that can be renewed;
      • Includes transparency requirements such as the location of data processing and storage.

Coming Events

  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on the nomination of Gina Raimondo to be the Secretary of Commerce on 26 January.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Peggy und Marco Lachmann-Anke from Pixabay

Further Reading, Other Developments, and Coming Events (19 January 2021)

Further Reading

  • “Hong Kong telecoms provider blocks website for first time, citing security law” — Reuters; “A Hong Kong Website Gets Blocked, Raising Censorship Fears” By Paul Mozur and Aaron Krolik — The New York Times. The Hong Kong Broadband Network (HKBN) blocked access to HKChronicles, a website about the 2019 protests against the People’s Republic of China (PRC), under a recently enacted security law that critics had warned would lead to exactly this sort of outcome. Allegedly, the Hong Kong police had invoked the National Security Law for the first time, and other telecommunications companies have followed suit.
  • “Biden to counter China tech by urging investment in US: adviser” By Yifan Yu — Nikkei Asia. President-elect Joe Biden’s head of the National Economic Council said at a public event that the Biden Administration would focus less on tariffs and other similar instruments to counter the People’s Republic of China (PRC). Instead, the incoming President would try to foster investment in United States companies and technologies to fend off the PRC’s growing strength in a number of crucial fields. Also, a Biden Administration would work more with traditional U.S. allies to contest policies from Beijing.
  • “Revealed: walkie-talkie app Zello hosted far-right groups who stormed Capitol” By Micah Loewinger and Hampton Stall — The Guardian. Some of the rioters and insurrectionists who attacked the United States Capitol on 6 January were using another, lesser-known communications app, Zello, to coordinate their actions. The app has since taken down a number of right-wing and extremist groups that have flourished for months, if not years, on the platform. It remains to be seen how smaller platforms will be scrutinized under a Biden Presidency. Zello has reportedly been aware that these groups have been using its platform and opted not to police their conduct.
  • “They Used to Post Selfies. Now They’re Trying to Reverse the Election.” By Stuart A. Thompson and Charlie Warzel — The New York Times. The three people profiled, each of whom amassed a considerable extremist following, seem to be part believer and part opportunist. A fascinating set of profiles.
  • “Telegram tries, and fails, to remove extremist content” By Mark Scott — Politico. Platforms other than Facebook and Twitter are struggling to moderate right-wing and extremist content that violates their policies and terms of service.

Other Developments

  • The Biden-Harris transition team announced that a statutorily established science advisor will now be a member of the Cabinet and named its nominee for this and other positions. The Office of Science and Technology Policy (OSTP) was created by executive order in the Ford Administration and then codified by Congress. However, the OSTP Director has not been a member of the Cabinet alongside the Senate-confirmed Secretaries and others. President-elect Joe Biden has decided to elevate the OSTP Director to the Cabinet, likely to signal the importance of science and technology in his Administration. The current OSTP exercised unusual influence in the Trump Administration under OSTP Associate Director Michael Kratsios, shaping policy in realms like artificial intelligence and national security.
    • In the press release, the transition team explained:
      • Dr. Eric Lander will be nominated as Director of the OSTP and serve as the Presidential Science Advisor. The president-elect is elevating the role of science within the White House, including by designating the Presidential Science Advisor as a member of the Cabinet for the first time in history. One of the country’s leading scientists, Dr. Lander was a principal leader of the Human Genome Project and has been a pioneer in the field of genomic medicine. He is the founding director of the Broad Institute of MIT and Harvard, one of the nation’s leading research institutes. During the Obama-Biden administration, he served as external Co-Chair of the President’s Council of Advisors on Science and Technology. Dr. Lander will be the first life scientist to serve as Presidential Science Advisor.
      • Dr. Alondra Nelson will serve as OSTP Deputy Director for Science and Society. A distinguished scholar of science, technology, social inequality, and race, Dr. Nelson is president of the Social Science Research Council, an independent, nonprofit organization linking social science research to practice and policy. She is also a professor at the Institute for Advanced Study, one of the nation’s most distinguished research institutes, located in Princeton, NJ.
      • Dr. Frances H. Arnold and Dr. Maria Zuber will serve as the external Co-Chairs of the President’s Council of Advisors on Science and Technology (PCAST). An expert in protein engineering, Dr. Arnold is the first American woman to win the Nobel Prize in Chemistry. Dr. Zuber, an expert in geophysics and planetary science, is the first woman to lead a NASA spacecraft mission and has chaired the National Science Board. They are the first women to serve as co-chairs of PCAST.
      • Dr. Francis Collins will continue serving in his role as Director of the National Institutes of Health.
      • Kei Koizumi will serve as OSTP Chief of Staff and is one of the nation’s leading experts on the federal science budget.
      • Narda Jones, who will serve as OSTP Legislative Affairs Director, was Senior Technology Policy Advisor and Counsel for the Democratic staff of the U.S. Senate Committee on Commerce, Science and Transportation.
  • The United States (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) issued a report on supply chain security prepared by a public-private sector advisory body, which represents one of the lines of effort of the U.S. government to better secure technology and electronics that emanate from the People’s Republic of China (PRC). CISA’s National Risk Management Center co-chairs the Information and Communications Technology (ICT) Supply Chain Risk Management (SCRM) Task Force along with the Information Technology Sector Coordinating Council and the Communications Sector Coordinating Council. The ICT SCRM Task Force published its Year 2 Report, which “builds upon” its Interim Report, and asserted:
    • Over the past year, the Task Force has expanded upon its first-year progress to advance meaningful partnership around supply chain risk management. Specifically, the Task Force:
      • Developed reference material to support overcoming legal obstacles to information sharing
      • Updated the Threat Evaluation Report, which evaluates threats to suppliers, with additional scenarios and mitigation measures for the corresponding threat scenarios
      • Produced a report and case studies providing in-depth descriptions of control categories and information regarding when and how to use a Qualified List to manage supply chain risks
      • Developed a template for SCRM compliance assessments and internal evaluations of alignment to industry standards
      • Analyzed the current and potential impacts from the COVID-19 pandemic, and developed a system map to visualize ICT supply chain routes and identify chokepoints
      • Surveyed supply chain related programs and initiatives that provide opportunities for potential Task Force engagement
    • Congress established an entity to address and help police supply chain risk at the end of 2018 in the “Strengthening and Enhancing Cyber-capabilities by Utilizing Risk Exposure Technology Act” (SECURE Act) (P.L. 115-390). The Federal Acquisition Security Council (FASC) has a number of responsibilities, including:
      • developing an information sharing process for agencies to circulate decisions throughout the federal government made to exclude entities determined to be IT supply chain risks
      • establishing a process by which entities determined to be IT supply chain risks may be excluded from procurement government-wide (exclusion orders) or suspect IT must be removed from government systems (removal orders)
      • creating an exception process under which IT from an entity subject to a removal or exclusion order may be used if warranted by national interest or national security
      • issuing recommendations for agencies on excluding entities and IT from the IT supply chain and “consent for a contractor to subcontract” and mitigation steps entities would need to take in order for the Council to rescind a removal or exclusion order
    • In September 2020, the FASC released an interim regulation, effective upon publication, that “implement[s] the requirements of the laws that govern the operation of the FASC, the sharing of supply chain risk information, and the exercise of its authorities to recommend issuance of removal and exclusion orders to address supply chain security risks…”
  • The Australian government has released its bill to remake how platforms like Facebook, Google, and others may use the content of news media, including provisions for payment. The “Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020” “establishes a mandatory code of conduct to help support the sustainability of the Australian news media sector by addressing bargaining power imbalances between digital platforms and Australian news businesses.” The agency charged with developing the legislation, the Australian Competition and Consumer Commission (ACCC), has tussled with Google in particular over what this law would look like, with the technology giant threatening to withdraw from Australia altogether. The ACCC had determined in its July 2019 Digital Platform Inquiry:
    • that there is a bargaining power imbalance between digital platforms and news media businesses so that news media businesses are not able to negotiate for a share of the revenue generated by the digital platforms and to which the news content created by the news media businesses contributes. Government intervention is necessary because of the public benefit provided by the production and dissemination of news, and the importance of a strong independent media in a well-functioning democracy.
    • In an Explanatory Memorandum, it is explained:
      • The Bill establishes a mandatory code of conduct to address bargaining power imbalances between digital platform services and Australian news businesses…by setting out six main elements:
        • bargaining–which require the responsible digital platform corporations and registered news business corporations that have indicated an intention to bargain, to do so in good faith;
        • compulsory arbitration–where parties cannot come to a negotiated agreement about remuneration relating to the making available of covered news content on designated digital platform services, an arbitral panel will select between two final offers made by the bargaining parties;
        • general requirements –which, among other things, require responsible digital platform corporations to provide registered news business corporations with advance notification of planned changes to an algorithm or internal practice that will have a significant effect on covered news content;
        • non-differentiation requirements –responsible digital platform corporations must not differentiate between the news businesses participating in the Code, or between participants and non-participants, because of matters that arise in relation to their participation or non-participation in the Code;
        • contracting out–the Bill recognises that a digital platform corporation may reach a commercial bargain with a news business outside the Code about remuneration or other matters. It provides that parties who notify the ACCC of such agreements would not need to comply with the general requirements, bargaining and compulsory arbitration rules (as set out in the agreement); and
        • standard offers –digital platform corporations may make standard offers to news businesses, which are intended to reduce the time and cost associated with negotiations, particularly for smaller news businesses. If the parties notify the ACCC of an agreed standard offer, those parties do not need to comply with bargaining and compulsory arbitration (as set out in the agreement);
  • The Federal Trade Commission (FTC) has reached a settlement with a mobile advertising company over “allegations that it failed to provide in-game rewards users were promised for completing advertising offers.” The FTC unanimously agreed to the proposed settlement, under which Tapjoy, Inc. is barred “from misleading users about the rewards they can earn and must monitor its third-party advertiser partners to ensure they do what is necessary to enable Tapjoy to deliver promised rewards to consumers.” The FTC drafted a 20-year settlement that will obligate Tapjoy, Inc. to refrain from certain practices that violate the FTC Act; in this case that includes not making false claims about the rewards people can get if they take or do not take some action in an online game. Tapjoy, Inc. will also need to submit compliance reports, keep records, and make materials available to the FTC upon demand. Any failure to meet the terms of the settlement could prompt the FTC to seek redress in federal court, including penalties of more than $43,000 per violation.
    • In the complaint, the FTC outlined Tapjoy, Inc.’s illegal conduct:
      • Tapjoy operates an advertising platform within mobile gaming applications (“apps”). On the platform, Tapjoy promotes offers of in-app rewards (e.g., virtual currency) to consumers who complete an action, such as taking a survey or otherwise engaging with third-party advertising. Often, these consumers must divulge personal information or spend money. In many instances, Tapjoy never issues the promised reward to consumers who complete an action as instructed, or only issues the currency after a substantial delay. Consumers who attempt to contact Tapjoy to complain about missing rewards find it difficult to do so, and many consumers who complete an action as instructed and are able to submit a complaint nevertheless do not receive the promised reward.  Tapjoy has received hundreds of thousands of complaints concerning its failure to issue promised rewards to consumers. Tapjoy nevertheless has withheld rewards from consumers who have completed all required actions.
    • In its press release, the FTC highlighted the salient terms of the settlement:
      • As part of the proposed settlement, Tapjoy is prohibited from misrepresenting the rewards it offers consumers and the terms under which they are offered. In addition, the company must clearly and conspicuously display the terms under which consumers can receive such rewards and must specify that the third-party advertisers it works with determine if a reward should be issued. Tapjoy also will be required to monitor its advertisers to ensure they are following through on promised rewards, investigate complaints from consumers who say they did not receive their rewards, and discipline advertisers who deceive consumers.
    • FTC Commissioners Rohit Chopra and Rebecca Kelly Slaughter issued a joint statement, and in their summary section, they asserted:
      • The explosive growth of mobile gaming has led to mounting concerns about harmful practices, including unlawful surveillance, dark patterns, and facilitation of fraud.
      • Tapjoy’s failure to properly police its mobile gaming advertising platform cheated developers and gamers out of promised compensation and rewards.
      • The Commission must closely scrutinize today’s gaming gatekeepers, including app stores and advertising middlemen, to prevent harm to developers and gamers.
    • On the last point, Chopra and Kelly Slaughter argued:
      • We should all be concerned that gatekeepers can harm developers and squelch innovation. The clearest example is rent extraction: Apple and Google charge mobile app developers on their platforms up to 30 percent of sales, and even bar developers from trying to avoid this tax through offering alternative payment systems. While larger gaming companies are pursuing legal action against these practices, developers and small businesses risk severe retaliation for speaking up, including outright suspension from app stores – an effective death sentence.
      • This market structure also has cascading effects on gamers and consumers. Under heavy taxation by Apple and Google, developers have been forced to adopt alternative monetization models that rely on surveillance, manipulation, and other harmful practices.
  • The United Kingdom’s (UK) High Court ruled against the use of general warrants for online surveillance by the UK’s security agencies (MI5, MI6, and the Government Communications Headquarters (GCHQ)). Privacy International (PI), a British advocacy organization, had brought the suit after Edward Snowden revealed the scope of the United States National Security Agency’s (NSA) surveillance activities, including bulk collection of information, a significant portion of which required hacking. PI sued in a special tribunal formed to resolve claims against British security agencies, where the government asserted general warrants would suffice for purposes of mass hacking. PI disagreed and argued this was counter to 250 years of established law in the UK requiring that warrants be based on reasonable suspicion, specific in what is being sought, and proportionate. The High Court agreed with PI.
    • In its statement after the ruling, PI asserted:
      • Because general warrants are by definition not targeted (and could therefore apply to hundreds, thousands or even millions of people) they violate individuals’ right not to have their property searched without lawful authority, and are therefore illegal.
      • The adaptation of these 250-year-old principles to modern government hacking and property interference is of great significance. The Court signals that fundamental constitutional principles still need to be applied in the context of surveillance and that the government cannot circumvent traditional protections afforded by the common law.
  • In Indiana, the attorney general is calling on the governor “to adopt a safe harbor rule I proposed that would incentivize companies to take strong data protection measures, which will reduce the scale and frequency of cyberattacks in Indiana.” Attorney General Curtis Hill urged Governor Eric J. Holcomb to allow a change in the state’s data security regulations to be made effective.
    • The proposed rule provides (a short sketch of its structure follows the quoted text):
      • Procedures adopted under IC 24-4.9-3-3.5(c) are presumed reasonable if the procedures comply with this section, including one (1) of the following applicable standards:
        • (1) A covered entity implements and maintains a cybersecurity program that complies with the National Institute of Standards and Technology (NIST) cybersecurity framework and follows the most recent version of one (1) of the following standards:
          • (A) NIST Special Publication 800-171.
          • (B) NIST SP 800-53.
          • (C) The Federal Risk and Authorization Management Program (FedRAMP) security assessment framework.
          • (D) International Organization for Standardization/International Electrotechnical Commission 27000 family – information security management systems.
        • (2) A covered entity is regulated by the federal or state government and complies with one (1) of the following standards as it applies to the covered entity:
          • (A) The federal USA Patriot Act (P.L. 107-56).
          • (B) Executive Order 13224.
          • (C) The federal Driver’s Privacy Protection Act (18 U.S.C. 2721 et seq.).
          • (D) The federal Fair Credit Reporting Act (15 U.S.C. 1681 et seq.).
          • (E) The federal Health Insurance Portability and Accountability Act (HIPAA) (P.L. 104-191).
        • (3) A covered entity complies with the current version of the payment card industry data security standard in place at the time of the breach of security of data, as published by the Payment Card Industry Security Standard Council.
      • The regulations further provide that if a data base owner can show “its data security plan was reasonably designed, implemented, and executed to prevent the breach of security of data” then it “will not be subject to a civil action from the office of the attorney general arising from the breach of security of data.”
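    • To make the structure of the proposal easier to see, here is a hypothetical sketch of the three alternative paths to the presumption of reasonableness. The framework and statute names mirror the rule text quoted above; the function and its inputs are illustrative only and not part of the rule.

```python
from typing import Optional

# Paragraph (1): a NIST Cybersecurity Framework program aligned to one of these standards.
FRAMEWORKS = {
    "NIST SP 800-171",
    "NIST SP 800-53",
    "FedRAMP",
    "ISO/IEC 27000 family",
}
# Paragraph (2): federal regimes whose requirements may apply to the covered entity.
REGULATED_REGIMES = {
    "USA PATRIOT Act",
    "Executive Order 13224",
    "Driver's Privacy Protection Act",
    "Fair Credit Reporting Act",
    "HIPAA",
}

def presumed_reasonable(follows_nist_csf: bool,
                        framework: Optional[str],
                        regulated_under: Optional[str],
                        current_pci_dss: bool) -> bool:
    """Return True if any one of the rule's three safe harbor paths is met."""
    path_1 = follows_nist_csf and framework in FRAMEWORKS
    path_2 = regulated_under in REGULATED_REGIMES
    path_3 = current_pci_dss  # paragraph (3): current PCI DSS at the time of the breach
    return path_1 or path_2 or path_3

# e.g. an entity maintaining a NIST CSF program aligned to SP 800-53:
print(presumed_reasonable(True, "NIST SP 800-53", None, False))  # True
```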
  • The Tech Transparency Project (TTP) is claiming that Apple “has removed apps in China at the government’s request” the majority of which “involve activities like illegal gambling and porn.” However, TTP is asserting that its analysis “suggests Apple is proactively blocking scores of other apps that are politically sensitive for Beijing.”

Coming Events

  • On 19 January, the Senate Intelligence Committee will hold a hearing on the nomination of Avril Haines to be the Director of National Intelligence.
  • The Senate Homeland Security and Governmental Affairs Committee will hold a hearing on the nomination of Alejandro N. Mayorkas to be Secretary of Homeland Security on 19 January.
  • On 19 January, the Senate Armed Services Committee will hold a hearing on the nomination of former General Lloyd Austin III to be Secretary of Defense.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.


UK and EU Defer Decision On Data Flows

The question of whether there will be an adequacy decision allowing the free flow of personal data under the GDPR from the EU to the recently departed UK has been punted. And, its recent status as an EU member state notwithstanding, the UK might not get one.

In reaching agreement on many aspects of the United Kingdom’s (UK) exit from the European Union (EU), negotiators did not settle whether the EU would permit the personal data of EU persons to continue flowing to the UK under the easiest means possible. Instead, the EU and UK agreed to let the status quo continue until an adequacy decision is made or six months lapse. Data flows between the UK and EU were valued at more than £100 billion in 2017, according to British estimates, with the majority of this trade flowing from the UK to the EU.

Under the General Data Protection Regulation (GDPR), the personal data of EU persons can be transferred to other nations for most purposes once the European Commission (EC) has found the other nation provides protection essentially equivalent to that granted in the EU. Of course, this has been an ongoing issue with data flows to the United States (U.S.), as two agreements (Safe Harbor and Privacy Shield) and their EC adequacy decisions were ruled illegal, in large part, because, according to the EU’s highest court, U.S. law does not provide EU persons with the same rights they have in the EU. Most recently, this occurred in 2020 when the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the EU-United States Privacy Shield (aka Schrems II). It bears note that transfers of personal data may occur through other means under the GDPR that may prove more resource intensive: standard data protection clauses (SCC), binding corporate rules (BCR), and others.
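To make the ordering of these mechanisms concrete, below is a minimal, illustrative sketch in Python of how an organisation might triage which GDPR transfer basis to examine first. The function, its inputs, and the abbreviated adequacy list are assumptions made only for this example; they are not drawn from the GDPR’s text or any official tool.

```python
# Illustrative only: a rough triage of the GDPR Chapter V transfer bases discussed above.
# The adequacy list is deliberately partial and the names are invented for this sketch.

ADEQUATE_DESTINATIONS = {"Japan", "New Zealand", "Switzerland"}  # partial, for illustration

def transfer_basis(destination: str, has_sccs_or_bcrs: bool, occasional_transfer: bool) -> str:
    """Return the first GDPR transfer basis an exporter would typically examine."""
    if destination in ADEQUATE_DESTINATIONS:
        return "Article 45 adequacy decision (personal data may flow freely)"
    if has_sccs_or_bcrs:
        # Post-Schrems II, Article 46 safeguards may also require supplementary measures.
        return "Article 46 appropriate safeguards (SCCs/BCRs), assessed case by case"
    if occasional_transfer:
        return "Article 49 derogation (narrow; occasional and non-repetitive transfers only)"
    return "No straightforward basis; the transfer should not proceed as planned"

# Example: a UK transfer without an adequacy decision falls back on Article 46 safeguards.
print(transfer_basis("United Kingdom", has_sccs_or_bcrs=True, occasional_transfer=False))
```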

Nevertheless, an adequacy decision is seen as the most desirable means of transfer, and the question of whether the UK’s laws are sufficient has lingered over the Brexit discussions, with some claiming that the nation’s membership in the Five Eyes surveillance alliance with the U.S. and others may disqualify the UK. Given the range of thorny issues the UK and EU punted (e.g. how to handle the border between Northern Ireland and Ireland), it is not surprising that the GDPR and data flows were also punted.

The UK-EU Trade and Cooperation Agreement (TCA) explained the terms of the data flow agreement and, as noted, in the short term, the status quo will continue with data flows to the UK being treated as if it were still part of the EU. This state will persist until the EC reaches an adequacy decision or for four months, with another two months of the status quo possible in the absence of an adequacy decision so long as neither the UK nor the EU objects. Moreover, these provisions are operative only so long as the UK has its GDPR-compliant data protection law (i.e. the UK Data Protection Act 2018) in place and does not exercise specified “designated powers.” The UK has also deemed EU, European Economic Area (EEA), and European Free Trade Association (EFTA) nations to be adequate for purposes of data transfers from the UK on a transitional basis.
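To make the timeline concrete, here is a short, illustrative sketch of the bridge-period arithmetic, assuming the specified period began when the TCA took provisional effect on 1 January 2021. The dates and helper names are assumptions made only for this example and simplify how the period is actually calculated.

```python
# Illustrative sketch of the TCA "specified period" (the bridge) described above.
# Dates assume the period began on 1 January 2021; the names are invented for this example.
from datetime import date

BRIDGE_START = date(2021, 1, 1)
INITIAL_END = date(2021, 4, 30)   # roughly four months after the start
EXTENDED_END = date(2021, 6, 30)  # two further months if neither the UK nor the EU objects

def data_flows_as_if_uk_in_eu(today: date, adequacy_adopted: bool, extension_objected: bool) -> bool:
    """EU-to-UK personal data keeps flowing as before while the bridge runs or once adequacy is granted."""
    if adequacy_adopted:
        return True
    cutoff = INITIAL_END if extension_objected else EXTENDED_END
    return BRIDGE_START <= today <= cutoff

# Example: mid-March 2021, no adequacy decision yet, no objection to the extension.
print(data_flows_as_if_uk_in_eu(date(2021, 3, 15), adequacy_adopted=False, extension_objected=False))
```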

Specifically, the TCA provides:

For the duration of the specified period, transmission of personal data from the Union to the United Kingdom shall not be considered as transfer to a third country under Union law, provided that the data protection legislation of the United Kingdom on 31 December 2020, as it is saved and incorporated into United Kingdom law by the European Union (Withdrawal) Act 2018 and as modified by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 (“the applicable data protection regime”), applies and provided that the United Kingdom does not exercise the designated powers without the agreement of the Union within the Partnership Council.

The UK also agreed to notify the EU if it “enters into a new instrument which can be relied on to transfer personal data to a third country under Article 46(2)(a) of the UK GDPR or section 75(1)(a) of the UK Data Protection Act 2018 during the specified period.” However, if the EU were to object, it appears from the terms of the TCA, all the EU could do is force the UK “to discuss the relevant object.” And yet, should the UK sign a treaty allowing personal data to flow to a nation the EU deems inadequate, this could obviously adversely affect the UK’s prospects of getting an adequacy decision.

Not surprisingly, the agreement also pertains to the continued flow of personal data as part of criminal investigations and law enforcement matters but not national security matters. Moreover, these matters fall outside the scope of the GDPR and would not be affected in many ways by an adequacy decision or a lack of one. In a British government summary, it is stated that the TCA

provide[s] for law enforcement and judicial cooperation between the UK, the EU and its Member States in relation to the prevention, investigation, detection and prosecution of criminal offences and the prevention of and fight against money laundering and financing of terrorism.

The text of the TCA makes clear that national security matters vis-à-vis data flows and information sharing are not covered:

This Part only applies to law enforcement and judicial cooperation in criminal matters taking place exclusively between the United Kingdom, on the one side, and the Union and the Member States, on the other side. It does not apply to situations arising between the Member States, or between Member States and Union institutions, bodies, offices and agencies, nor does it apply to the activities of authorities with responsibilities for safeguarding national security when acting in that field.

The TCA also affirms:

  • The cooperation provided for in this Part is based on the Parties’ long-standing commitment to ensuring a high level of protection of personal data.
  • To reflect that high level of protection, the Parties shall ensure that personal data processed under this Part is subject to effective safeguards in the Parties’ respective data protection regimes…

The United Kingdom’s data protection authority (DPA), the Information Commissioner’s Office (ICO), issued an explanation of how British law enforcement entities should act in light of the TCA. The ICO explained to British entities how law enforcement-related data transfers from the EU to the UK will now work:

  • We are now a ‘third country’ for EU data protection purposes. If you receive personal data from a law enforcement partner in the EU, this means the sender will need to comply with the transfer provisions under their national data protection law (which are likely to be similar to those in Part 3 of the DPA 2018).
  • This means the EU sender needs to make sure other appropriate safeguards are in place – probably through a contract or other binding legal instrument, or by making their own assessment of appropriate safeguards. The sender can take into account the protection provided by the DPA 2018 itself when making this assessment.
  • If you receive personal data from other types of organisations in the EU or EEA who are subject to the GDPR, the sender will need to comply with the transfer provisions of the UK GDPR. You may want to consider putting standard contractual clauses (SCCs) in place to ensure adequate safeguards in these cases. We have produced an interactive tool to help you use the SCCs.

The ICO explained for transfers from the UK to the EU (but not the EEA):

  • There is a transitional adequacy decision in place to cover transfers to EU member states and Gibraltar. This will not extend to EEA countries outside the EU, where you should continue to consider other safeguards.
  • This means you can continue to send personal data from the UK to your law enforcement partners in the EU, as long as you can show the transfer is necessary for law enforcement purposes. You can also transfer personal data to non-law enforcement bodies in the EU if you can meet some additional conditions, but you will need to notify the ICO.

Turning back to an adequacy decision and commercial transfers of personal data from the EU to the UK, in what may well be a preview of a world in which there is no adequacy decision between the UK and EU, the European Data Protection Board (EDPB) issued an “information note” in mid-December that spells out how the GDPR would be applied:

  • In the absence of an adequacy decision applicable to the UK as per Article 45 GDPR, such transfers will require appropriate safeguards (e.g., standard data protection clauses, binding corporate rules, codes of conduct…), as well as enforceable data subject rights and effective legal remedies for data subjects, in accordance with Article 46 GDPR.
  • Subject to specific conditions, it may still be possible to transfer personal data to the UK based on a derogation listed in Article 49 GDPR. However, Article 49 GDPR has an exceptional nature and the derogations it contains must be interpreted restrictively and mainly relate to processing activities that are occasional and non-repetitive.
  • Moreover, where personal data are transferred to the UK on the basis of Article 46 GDPR safeguards, supplementary measures might be necessary to bring the level of protection of the data transferred up to the EU standard of essential equivalence, in accordance with the Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data.

Regarding commercial data transfers, the ICO issued a statement urging British entities to start setting up “alternative transfer mechanisms” to ensure data continues to flow from the EU to the UK:

  • The Government has announced that the Treaty agreed with the EU will allow personal data to flow freely from the EU (and EEA) to the UK, until adequacy decisions have been adopted, for no more than six months.
  • This will enable businesses and public bodies across all sectors to continue to freely receive data from the EU (and EEA), including law enforcement agencies.
  • As a sensible precaution, before and during this period, the ICO recommends that businesses work with EU and EEA organisations who transfer personal data to them, to put in place alternative transfer mechanisms, to safeguard against any interruption to the free flow of EU to UK personal data.

However, even though these more restrictive means of transferring personal data to the UK exist, there will likely be legal challenges. It bears note that in light of Schrems II, EU DPAs are likely to apply a much higher level of scrutiny to SCCs, and challenges to the legality of using SCCs to transfer personal data to the U.S. have already commenced. It seems certain the legality of using SCCs to transfer data to the UK would be challenged as well.

However, returning to the preliminary issue of whether the EC will give the UK an adequacy decision, there may be a number of obstacles to a finding that the UK’s data protection and surveillance laws are indeed adequate under EU law[1]. Firstly, the UK’s surveillance practices may prove difficult for the EC to stomach in light of a recent set of CJEU rulings. In 2020, the CJEU handed down a pair of rulings (here and here) on the extent to which European Union (EU) nations may engage in bulk, indiscriminate collection of two types of data related to electronic communications. The CJEU found that while EU member nations may conduct these activities to combat crime or national security threats during periods limited by necessity and subject to oversight, nations may not generally require the providers of electronic communications to store and provide indiscriminate location data and traffic data in response to an actual or prospective national security danger. The CJEU combined three cases arising from the UK, France, and Belgium into two rulings to elucidate the reach of the Privacy and Electronic Communications Directive in relation to foundational EU laws.

The UK is, of course, one of the U.S.’s staunchest allies and partners when it comes to government surveillance of electronic communications. On this point, the CJEU summarized the beginning of the case out of the UK:

  • At the beginning of 2015, the existence of practices for the acquisition and use of bulk communications data by the various security and intelligence agencies of the United Kingdom, namely GCHQ, MI5 and MI6, was made public, including in a report by the Intelligence and Security Committee of Parliament (United Kingdom). On 5 June 2015, Privacy International, a non-governmental organisation, brought an action before the Investigatory Powers Tribunal (United Kingdom) against the Secretary of State for Foreign and Commonwealth Affairs, the Secretary of State for the Home Department and those security and intelligence agencies, challenging the lawfulness of those practices.

Secondly, the government of Prime Minister Boris Johnson may aspire to change data laws in ways the EU would not welcome. In media accounts, unnamed EC officials were critical of the UK’s 2020 “National Data Strategy,” particularly references to “legal barriers (real and perceived)” to accessing data that “must be addressed.”

Thirdly, it may become a matter of politics. The EU has incentives to make the UK’s exit from the EU difficult in order to dissuade other nations from following the same path. Moreover, having previously been the second largest economy in the EU as measured by GDP, the UK may prove a formidable economic competitor, lending more weight to the view that the EU may not want to help the UK’s businesses compete with the EU’s.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by succo from Pixabay


[1] European Union Parliament, “The EU-UK relationship beyond Brexit: options for Police Cooperation and Judicial Cooperation in Criminal Matters,” Page 8: Although the UK legal framework is currently broadly in line with the EU legal framework and the UK is a signatory to the European Convention on Human Rights (ECHR), there are substantial questions over whether the Data Protection Act fully incorporates the data protection elements required by the Charter of Fundamental Rights, concerning the use of the national security exemption from the GDPR used by the UK, the retention of data and bulk powers granted to its security services, and over its onward transfer of this data to third country security partners such as the ‘Five Eyes’ partners (Britain, the USA, Australia, New Zealand and Canada).

Further Reading, Other Developments, and Coming Events (14 December)

Further Reading

  • Russian Hackers Broke Into Federal Agencies, U.S. Officials Suspect” By David Sanger — The New York Times; “Russian government hackers are behind a broad espionage campaign that has compromised U.S. agencies, including Treasury and Commerce” By Ellen Nakashima and Craig Timberg — The Washington Post; “Suspected Russian hackers spied on U.S. Treasury emails – sources” By Chris Bing — Reuters. Apparently, Sluzhba vneshney razvedki Rossiyskoy Federatsii (SVR), the Russian Federation’s Foreign Intelligence Service, has exploited a vulnerability in SolarWinds’ update system used by many United States (U.S.) government agencies, Fortune 500 companies, and the U.S.’s ten largest telecommunications companies. Reportedly, APT29 (aka Cozy Bear) has had free rein in the email systems of the Departments of the Treasury and Commerce among other possible victims. The hackers may have also accessed a range of other entities around the world using the same SolarWinds software. Moreover, these penetrations may be related to the recently announced theft of hacking tools a private firm, FireEye, used to test clients’ systems.
  • Hackers steal Pfizer/BioNTech COVID-19 vaccine data in Europe, companies say” By Jack Stubbs — Reuters. The European Union’s (EU) agency that oversees and approves medications has been hacked, and documents related to one of the new COVID-19 vaccines may have been stolen. The European Medicines Agency (EMA) was apparently penetrated, and materials related to Pfizer and BioNTech’s vaccine were exfiltrated. The scope of the theft is not yet known, but this is the latest of many attempts to hack into the entities conducting research on the virus and potential vaccines.
  • The AI Girlfriend Seducing China’s Lonely Men” By Zhang Wanqing — Sixth Tone. A chat bot powered by artificial intelligence that some men in the People’s Republic of China (PRC) are using extensively raises all sorts of ethical and privacy issues. Lonely people have turned to this AI technology and have confided their deepest feelings, which are stored by the company. It seems like a matter of time until these data are mined for commercial value or hacked. Also, the chatbot has run afoul of PRC’s censorship policies. Finally, is this a preview of the world to come, much like the 2013 film, Her, in which humans have relationships with AI beings?
  • YouTube will now remove videos disputing Joe Biden’s election victory” By Makena Kelly — The Verge. The Google subsidiary announced that because the safe harbor deadline has been reached and a sufficient number of states have certified President-elect Joe Biden, the platform will begin taking down misleading election videos. This change in policy may have come about, in part, because of pressure from Democrats in Congress about what they see as Google’s lackluster efforts to find and remove lies, misinformation, and disinformation about the 2020 election.
  • Lots of people are gunning for Google. Meet the man who might have the best shot.” By Emily Birnbaum — Protocol. Colorado Attorney General Phil Weiser may be uniquely qualified to lead state attorneys general on a second antitrust and anti-competition action against Google given his background as a law professor steeped in antitrust and his background in the Department of Justice and White House during the Obama Administration.

Other Developments

  • Cybersecurity firm FireEye revealed it was “attacked by a highly sophisticated threat actor, one whose discipline, operational security, and techniques lead us to believe it was a state-sponsored attack” according to CEO Kevin Mandia. This hacking may be related to the vast penetration of United States (U.S.) government systems revealed over the weekend. Mandia stated FireEye has “found that the attacker targeted and accessed certain Red Team assessment tools that we use to test our customers’ security…[that] mimic the behavior of many cyber threat actors and enable FireEye to provide essential diagnostic security services to our customers.” Mandia claimed none of these tools were zero-day exploits. FireEye is “proactively releasing methods and means to detect the use of our stolen Red Team tools…[and] out of an abundance of caution, we have developed more than 300 countermeasures for our customers, and the community at large, to use in order to minimize the potential impact of the theft of these tools.”
    • Mandia added:
      • Consistent with a nation-state cyber-espionage effort, the attacker primarily sought information related to certain government customers. While the attacker was able to access some of our internal systems, at this point in our investigation, we have seen no evidence that the attacker exfiltrated data from our primary systems that store customer information from our incident response or consulting engagements, or the metadata collected by our products in our dynamic threat intelligence systems. If we discover that customer information was taken, we will contact them directly.
      • Based on my 25 years in cyber security and responding to incidents, I’ve concluded we are witnessing an attack by a nation with top-tier offensive capabilities. This attack is different from the tens of thousands of incidents we have responded to throughout the years. The attackers tailored their world-class capabilities specifically to target and attack FireEye. They are highly trained in operational security and executed with discipline and focus. They operated clandestinely, using methods that counter security tools and forensic examination. They used a novel combination of techniques not witnessed by us or our partners in the past.
      • We are actively investigating in coordination with the Federal Bureau of Investigation and other key partners, including Microsoft. Their initial analysis supports our conclusion that this was the work of a highly sophisticated state-sponsored attacker utilizing novel techniques.    
  • The United States’ (U.S.) Department of Justice filed suit against Facebook for “tactics that discriminated against U.S. workers and routinely preferred temporary visa holders (including H-1B visa holders) for jobs in connection with the permanent labor certification (PERM) process.” The DOJ is asking for an injunction to stop Facebook from engaging in the alleged conduct, as well as civil penalties and damages for workers harmed by this conduct.
    • The DOJ contended:
      • The department’s lawsuit alleges that beginning no later than Jan. 1, 2018 and lasting until at least Sept. 18, 2019, Facebook employed tactics that discriminated against U.S. workers and routinely preferred temporary visa holders (including H-1B visa holders) for jobs in connection with the PERM process. Rather than conducting a genuine search for qualified and available U.S. workers for permanent positions sought by these temporary visa holders, Facebook reserved the positions for temporary visa holders because of their immigration status, according to the complaint. The complaint also alleges that Facebook sought to channel jobs to temporary visa holders at the expense of U.S. workers by failing to advertise those vacancies on its careers website, requiring applicants to apply by physical mail only, and refusing to consider any U.S. workers who applied for those positions. In contrast, Facebook’s usual hiring process relies on recruitment methods designed to encourage applications by advertising positions on its careers website, accepting electronic applications, and not pre-selecting candidates to be hired based on a candidate’s immigration status, according to the lawsuit.
      • In its investigation, the department determined that Facebook’s ineffective recruitment methods dissuaded U.S. workers from applying to its PERM positions. The department concluded that, during the relevant period, Facebook received zero or one U.S. worker applicants for 99.7 percent of its PERM positions, while comparable positions at Facebook that were advertised on its careers website during a similar time period typically attracted 100 or more applicants each. These U.S. workers were denied an opportunity to be considered for the jobs Facebook sought to channel to temporary visa holders, according to the lawsuit. 
      • Not only do Facebook’s alleged practices discriminate against U.S. workers, they have adverse consequences on temporary visa holders by creating an employment relationship that is not on equal terms. An employer that engages in the practices alleged in the lawsuit against Facebook can expect more temporary visa holders to apply for positions and increased retention post-hire. Such temporary visa holders often have limited job mobility and thus are likely to remain with their company until they can adjust status, which for some can be decades.
      • The United States’ complaint seeks civil penalties, back pay on behalf of U.S. workers denied employment at Facebook due to the alleged discrimination in favor of temporary visa holders, and other relief to ensure Facebook stops the alleged violations in the future. According to the lawsuit, and based on the department’s nearly two-year investigation, Facebook’s discrimination against U.S. workers was intentional, widespread, and in violation of a provision of the Immigration and Nationality Act (INA), 8 U.S.C. § 1324b(a)(1), that the Department of Justice’s Civil Rights Division enforces. 
  • A trio of consumer protection regulators took the lead in reaching an agreement with Apple to add “a new section to each app’s product page in its App Store, containing key information about the data the app collects and an accessible summary of the most important information from the privacy policy.” The United Kingdom’s (UK) Competition and Markets Authority (CMA), the Netherlands Authority for Consumers and Markets, and the Norwegian Consumer Authority led the effort, which is part of “ongoing work from the International Consumer Protection and Enforcement Network (ICPEN), involving 27 of its consumer authority members across the world.” The three agencies explained:
    • Consumer protection authorities, including the CMA, became concerned that people were not being given clear information on how their personal data would be used before choosing an app, including on whether the app developer would share their personal data with a third party. Without this information, consumers are unable to compare and choose apps based on how they use personal data.
  • Australia’s Council of Financial Regulators (CFR) has released a Cyber Operational Resilience Intelligence-led Exercises (CORIE) framework “to test and demonstrate the cyber maturity and resilience of institutions within the Australian financial services industry.”

Coming Events

  • On 15 December, the Senate Judiciary Committee’s Intellectual Property Subcommittee will hold a hearing titled “The Role of Private Agreements and Existing Technology in Curbing Online Piracy” with these witnesses:
    • Panel I
      • Ms. Ruth Vitale, Chief Executive Officer, CreativeFuture
      • Mr. Probir Mehta, Head of Global Intellectual Property and Trade Policy, Facebook, Inc.
      • Mr. Mitch Glazier, Chairman and CEO, Recording Industry Association of America
      • Mr. Joshua Lamel, Executive Director, Re:Create
    • Panel II
      • Ms. Katherine Oyama, Global Director of Business Public Policy, YouTube
      • Mr. Keith Kupferschmid, Chief Executive Officer, Copyright Alliance
      • Mr. Noah Becker, President and Co-Founder, AdRev
      • Mr. Dean S. Marks, Executive Director and Legal Counsel, Coalition for Online Accountability
  • The Senate Armed Services Committee’s Cybersecurity Subcommittee will hold a closed briefing on Department of Defense Cyber Operations on 15 December with these witnesses:
    • Mr. Thomas C. Wingfield, Deputy Assistant Secretary of Defense for Cyber Policy, Office of the Under Secretary of Defense for Policy
    • Mr. Jeffrey R. Jones, Vice Director, Command, Control, Communications and Computers/Cyber, Joint Staff, J-6
    • Ms. Katherine E. Arrington, Chief Information Security Officer for the Assistant Secretary of Defense for Acquisition, Office of the Under Secretary of Defense for Acquisition and Sustainment
    • Rear Admiral Jeffrey Czerewko, United States Navy, Deputy Director, Global Operations, J39, J3, Joint Staff
  • The Senate Banking, Housing, and Urban Affairs Committee’s Economic Policy Subcommittee will conduct a hearing titled “US-China: Winning the Economic Competition, Part II” on 16 December with these witnesses:
    • The Honorable Will Hurd, Member, United States House of Representatives;
    • Derek Scissors, Resident Scholar, American Enterprise Institute;
    • Melanie M. Hart, Ph.D., Senior Fellow and Director for China Policy, Center for American Progress; and
    • Roy Houseman, Legislative Director, United Steelworkers (USW).
  • On 17 December the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency’s (CISA) Information and Communications Technology (ICT) Supply Chain Risk Management (SCRM) Task Force will convene for a virtual event, “Partnership in Action: Driving Supply Chain Security.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by stein egil liland from Pexels

Task Force Calls For Enhanced Digital Regulation in UK

The UK may soon reform its competition and consumer laws vis-à-vis digital markets.

A United Kingdom (UK) entity has recommended that Prime Minister Boris Johnson and his Conservative government remake digital regulation in the UK, especially with respect to competition policy. A task force has returned an extensive set of recommendations that would require legislation, increased coordination, and a new focus for existing regulators. The timeline for such action is not clear, and Downing Street would have to agree before anything happens. However, the UK’s new regulatory scheme and the European Union’s (EU) ongoing efforts to revamp its regulatory approach to large technology firms will both likely affect United States (U.S.) multinationals such as Facebook and Google. It may also serve as a template for the U.S. to remake its regulation of digital competition.

The United Kingdom’s Competition & Markets Authority (CMA) led the Digital Markets Taskforce, an effort that also included the Office of Communications (Ofcom) and the Information Commissioner’s Office (ICO). The Task Force follows the 2019 “Unlocking digital competition, Report of the Digital Competition Expert Panel”, an effort led by former Obama Administration Council of Economic Advisers Chair Jason Furman, and the more recent July 2020 “Online platforms and digital advertising market study.” In 2019, the CMA issued its “Digital Markets Strategy” that “sets out five strategic aims, and seven priority focus areas.”

The Task Force acknowledged its efforts in the UK were not unique. It referenced similar inquiries and plans to reform other nations’ regulation of digital markets in the U.S., the EU, Germany, Japan, and Australia.

The Task Force summarized its findings:

The accumulation and strengthening of market power by a small number of digital firms has the potential to cause significant harm to consumers and businesses that rely on them, to innovative competitors and to the economy and society more widely:

  • A poor deal for consumers and businesses who rely on them. These firms can exploit their powerful positions. For consumers this can mean they get a worse deal than they would in a more competitive market, for example having less protection or control of their data. For businesses this can mean they are, for example, charged higher listing fees or higher prices for advertising online. These higher prices for businesses can then feed through into higher prices for consumers for a wide range of products and services across the economy.
  • Innovative competitors face an unfair disadvantage. A powerful digital firm can extend its strong position in one market into other markets, ultimately giving itself an unfair advantage over its rivals. This means innovative competitors, even if they have a good idea, are likely to find it much harder to compete and grow their businesses. This can result in long-term harmful effects on innovation and the dynamism of UK markets.
  • A less vibrant digital economy. If powerful digital firms act to unfairly disadvantage their innovative competitors, these innovative firms will find it harder to enter and expand in new markets, meaning the ‘unicorns’ of tomorrow that will support jobs and the future digital economy will not emerge.

The Task Force calls for the establishment of a new Digital Markets Unit (DMU) that would be particularly focused on policing potential harm before it occurs. Thus, the Task Force is calling for a regulator that is proactive and nimble enough to address risks to competition and consumers before any harm happens. The DMU would oversee a new “Strategic Market Status” (SMS) regime, and the Task Force is recommending that the government and Parliament revisit and refresh consumer and competition laws. The Task Force stated that the “government should put in place a regulatory framework for the most powerful digital firms, alongside strengthening existing competition and consumer laws…[and] [i]n considering the design of this regulatory framework we have sought to strike the right balance between the following key principles:

  • Evidence driven and effective – regulation must be effective, and that means ensuring it is evidence based, but also that it can react swiftly enough to prevent and address harms. The activities undertaken by the most powerful digital firms are diverse and a ‘one size fits all’ approach could have damaging results.
  • Proportionate and targeted – regulation must be proportionate and targeted at addressing a particular problem, minimising the risk of any possible unintended consequences.
  • Open, transparent and accountable – across all its work the DMU should operate in an open and transparent manner. In reaching decisions it should consult a wide range of parties. It should clearly articulate why it has reached decisions and be held accountable for them.
  • Proactive and forward-looking – the DMU should be focused on preventing harm from occurring, rather than enforcing ex post. It should seek to understand how digital markets might evolve, the risks this poses to competition and innovation, and act proactively to assess and manage those risks.
  • Coherent – the DMU should seek to promote coherence with other regulatory regimes both domestically and internationally, in particular by working through the Digital Regulation Cooperation Forum which is already working to deliver a step change in coordination and cooperation between regulators in digital markets.

The Task Force provided more detail on the new SMS scheme:

The entry point to the SMS regime is an assessment of whether a firm has ‘strategic market status’. This should be an evidence-based economic assessment as to whether a firm has substantial, entrenched market power in at least one digital activity, providing the firm with a strategic position (meaning the effects of its market power are likely to be particularly widespread and/or significant). It is focused on assessing the very factors which may give rise to harm, and which motivate the need for regulatory intervention.

Those firms that are designated with SMS should be subject to the following three pillars of the regime:

  • An enforceable code of conduct that sets out clearly how an SMS firm is expected to behave in relation to the activity motivating its SMS designation. The aim of the code is to manage the effects of market power, for example by preventing practices which exploit consumers and businesses or exclude innovative competitors.
  • Pro-competitive interventions like personal data mobility, interoperability and data access which can be used to address the factors which are the source of an SMS firm’s market power in a particular activity. These interventions seek to drive longer-term dynamic changes in these activities, opening up opportunities for greater competition and innovation.
  • SMS merger rules to ensure closer scrutiny of transactions involving SMS firms, given the particular risks and potential consumer harm arising from these transactions.

The SMS regime should be an ex ante regime, focused on proactively preventing harm. Fostering a compliance culture within SMS firms will be crucial to its overall success. However, a key part of fostering compliance is credible deterrence and the DMU will need to be able to take tough action where harm does occur, requiring firms to change their behaviour, and with the ability to impose substantial penalties. The ability to take tough action sits alongside enabling resolution through a participative approach, whereby the DMU seeks to engage constructively with all affected parties to achieve fast and effective results.

The Task Force sketched its ideal timeline during which Parliament would enact its recommendations, which would be next year at the earliest:

We believe the case for an ex ante regime in digital markets has been made. We therefore welcome the government’s response to the CMA’s online platforms and digital advertising market study, and its commitment to establishing a DMU from April 2021 within the CMA. We also welcome government’s commitment to consult on proposals for a new pro-competition regime in early 2021 and to legislate to put the DMU on a statutory footing when parliamentary time allows. We urge government to move quickly in taking this legislation forward. As government rightly acknowledges, similar action is being pursued across the globe and there is a clear opportunity for the UK to lead the way in championing a modern pro-competition, pro-innovation regime.

The Task Force summarized its recommendations to the government:

A Digital Markets Unit

Recommendation 1: The government should set up a DMU which should seek to further the interests of consumers and citizens in digital markets, by promoting competition and innovation.

  • Recommendation 1a: The DMU should be a centre of expertise and knowledge in relation to competition in digital markets.
  • Recommendation 1b: The DMU should be proactive, seeking to foster compliance with regulatory requirements and taking swift action to prevent harm from occurring.

A pro-competition regime for the most powerful digital firms

Recommendation 2: The government should establish a pro-competition framework, to be overseen by the DMU, to pursue measures in relation to SMS firms which further the interests of consumers and citizens, by promoting competition and innovation.

Recommendation 3: The government should provide the DMU with the power to designate a firm with SMS.

  • Recommendation 3a: SMS should require a finding that the firm has substantial, entrenched market power in at least one digital activity, providing the firm with a strategic position.
  • Recommendation 3b: The DMU should set out in formal guidance its prioritisation rules for designation assessments. These should include the firm’s revenue (globally and within the UK), the activity undertaken by the firm and a consideration of whether a sector regulator is better placed to address the issues of concern.
  • Recommendation 3c: The designation process should be open and transparent with a consultation on the provisional decision and the assessment completed within a statutory deadline.
  • Recommendation 3d: A firm’s SMS designation should be set for a fixed period before being reviewed.
  • Recommendation 3e: When a firm meets the SMS test, the associated remedies should apply only to a subset of the firm’s activities, whilst the status should apply to the firm as a whole.

Recommendation 4: The government should establish the SMS regime such that when the SMS test is met, the DMU can establish an enforceable code of conduct for the firm in relation to its designated activities to prevent it from taking advantage of its power and position.

  • Recommendation 4a: A code should comprise high-level objectives supported by principles and guidance.
  • Recommendation 4b: The objectives of the code should be set out in legislation, with the remainder of the content of each code to be determined by the DMU, tailored to the activity, conduct and harms it is intended to address.
  • Recommendation 4c: The DMU should ensure the code addresses the concerns about the effect of the power and position of SMS firms when dealing with publishers, as identified by the Cairncross Review.
  • Recommendation 4d: The code of conduct should always apply to the activity or activities which are the focus of the SMS designation.
  • Recommendation 4e: The DMU should consult on and establish a code as part of the designation assessment. The DMU should be able to vary the code outside the designation review cycle.

Recommendation 5: SMS firms should have a legal obligation to ensure their conduct is compliant with the requirements of the code at all times and put in place measures to foster compliance.

Recommendation 6: The government should establish the SMS regime such that the DMU can impose pro-competitive interventions on an SMS firm to drive dynamic change as well as to address harms related to the designated activities.

  • Recommendation 6a: With the exception of ownership separation, the DMU should not be limited in the types of remedies it is able to apply.
  • Recommendation 6b: The DMU should be able to implement PCIs anywhere within an SMS firm in order to address a concern related to its substantial entrenched market power and strategic position in a designated activity.
  • Recommendation 6c: In implementing a PCI the DMU should demonstrate that it is an effective and proportionate remedy to an adverse effect on competition or consumers. A PCI investigation should be completed within a fixed statutory deadline.
  • Recommendation 6d: PCIs should be implemented for a limited duration and should be regularly reviewed.

Recommendation 7: The government should establish the SMS regime such that the DMU can undertake monitoring in relation to the conduct of SMS firms and has a range of tools available to resolve concerns.

  • Recommendation 7a: Where appropriate, the DMU should seek to resolve concerns using a participative approach, engaging with parties to deliver fast and effective resolution.
  • Recommendation 7b: The DMU should be able to open formal investigations into breaches of the code and where a breach is found, require an SMS firm to change its behaviour. These investigations should be completed within a fixed statutory deadline.
  • Recommendation 7c: The DMU should be able to impose substantial penalties for breaches of the code and for breaches of code and PCI orders.
  • Recommendation 7d: The DMU should be able to take action quickly on an interim basis where it suspects the code has been breached.
  • Recommendation 7e: The DMU should be able to undertake scoping assessments where it is concerned there is an adverse effect on competition or consumers in relation to a designated activity. The outcome of such assessments could include a code breach investigation, a pro-competitive intervention investigation, or variation to a code principle or guidance.

Recommendation 8: The government should establish the SMS regime such that the DMU can draw information from a wide range of sources, including by using formal information gathering powers, to gather the evidence it needs to inform its work.

Recommendation 9: The government should ensure the DMU’s decisions are made in an open and transparent manner and that it is held accountable for them.

  • Recommendation 9a: The DMU’s decisions should allow for appropriate internal scrutiny.
  • Recommendation 9b: The DMU should consult on its decisions.
  • Recommendation 9c: The DMU’s decisions should be timely, with statutory deadlines used to set expectations and deliver speedy outcomes.
  • Recommendation 9d: The DMU’s decisions should be judicially reviewable on ordinary judicial review principles and the appeals process should deliver robust outcomes at pace.

Recommendation 10: The government should establish the SMS regime such that SMS firms are subject to additional merger control requirements.

Recommendation 11: The government should establish the SMS merger control regime such that SMS firms are required to report all transactions to the CMA. In addition, transactions that meet clear-cut thresholds should be subject to mandatory notification, with completion prohibited prior to clearance. Competition concerns should be assessed using the existing substantive test but a lower and more cautious standard of proof.

A modern competition and consumer regime for digital markets

Recommendation 12: The government should provide the DMU with a duty to monitor digital markets to enable it to build a detailed understanding of how digital businesses operate, and to provide the basis for swifter action to drive competition and innovation and prevent harm.

Recommendation 13: The government should strengthen competition and consumer protection laws and processes to ensure they are better adapted for the digital age.

  • Recommendation 13a: The government should pursue significant reforms to the markets regime to ensure it can be most effectively utilised to promote competition and innovation across digital markets, for example by pursuing measures like data mobility and interoperability.
  • Recommendation 13b: The government should strengthen powers to tackle unlawful or illegal activity or content on digital platforms which could result in economic detriment to consumers and businesses.
  • Recommendation 13c: The government should take action to strengthen powers to enable effective consumer choice in digital markets, including by addressing instances where choice architecture leads to consumer harm.
  • Recommendation 13d: The government should provide for stronger enforcement of the Platform to Business Regulation.

A coherent regulatory landscape

Recommendation 14: The government should ensure the DMU is able to work closely with other regulators with responsibility for digital markets, in particular Ofcom, the ICO and the FCA.

  • Recommendation 14a: The DMU should be able to share information with other regulators and seek reciprocal arrangements.
  • Recommendation 14b: The government should consider, in consultation with Ofcom and the FCA, empowering these agencies with joint powers with the DMU in relation to the SMS regime, with the DMU being the primary authority.

Recommendation 15: The government should enable the DMU to work closely with regulators in other jurisdictions to promote a coherent regulatory landscape.

  • Recommendation 15a: The DMU should be able to share information with regulators in other jurisdictions and should seek reciprocal arrangements.
  • Recommendation 15b: The DMU should explore establishing a network of international competition and consumer agencies to facilitate better monitoring and action in relation to the conduct of SMS firms.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Free-Photos from Pixabay

Johnson’s Government Unveils Telecoms Bill

The UK seeks to remake its telecommunications sector, especially its security and supply chain risk aspects.

The government of Prime Minister Boris Johnson has released its telecommunications legislation that would delineate the United Kingdom’s (UK) approach to managing “high risk” companies such as Huawei. The Telecommunications (Security) Bill would reform how the UK regulates the security practices of telecommunications providers like Vodafone and also address risks to the nation’s telecommunications system.

The genesis of the legislation was the 2018 UK Telecoms Supply Chain Review, an inquiry launched “to address three key questions:

  1. How should we incentivise telecoms providers to improve security standards and practices in 5G and full fibre networks?
  2. How should we address the security challenges posed by vendors?
  3. How can we create sustainable diversity in the telecoms supply chain?”

A year later, the UK government “identified three areas of concern:

  1. Existing industry practices may have achieved good commercial outcomes but did not incentivise effective cyber security risk management.
  2. Policy and regulation in enforcing telecoms cyber security needed to be significantly strengthened to address these concerns.
  3. The lack of diversity across the telecoms supply chain creates the possibility of national dependence on single suppliers, which poses a range of risks to the security and resilience of UK telecoms networks.”

The Department for Digital, Culture, Media, and Sport (Department) asserted:

The Review recommended the establishment of a new security framework for the UK’s public telecoms providers, with its foundations set by new telecoms security requirements overseen by Ofcom and the government. It also recommended new national security powers for the government to control the presence of high risk vendors in UK networks.

Working in the background during this initiative were the pressure brought by the United States (U.S.) and the People’s Republic of China (PRC) over Huawei and 5G, and the pending exit from the European Union (EU). The Trump Administration was making claims about the security of Huawei’s 5G technology and equipment, arguing it would serve to allow the PRC’s security services to spy in any nation that installed the PRC technology giant’s systems. At first, the UK tried to manage the risks its security services turned up in reviewing Huawei’s technology and sought a middle path where Huawei would have a significant role in 5G in the UK, as it did for previous iterations of the nation’s wireless network.

However, this approach proved politically unfeasible when Conservative backbenchers indicated to Downing Street that they would amend a telecoms bill to ban Huawei. At this point, the Prime Minister changed tack and announced a ban on new Huawei technology in the UK’s 5G networks, with all Huawei equipment to be removed by the end of 2027. Johnson’s government nearly lost a vote in March on a different telecoms bill, sending his leadership team a signal they appear to have received. The reason provided for the UK’s change was U.S. sanctions on Huawei cutting off its access to semiconductors, which allegedly made it impossible to continue using the company for the 5G rollout. In a fact sheet, it was claimed:

  • on 14 July 2020 the Secretary of State for Digital, Culture, Media, and Sport (DCMS) announced in the House of Commons that UK telecoms providers should cease to procure any new 5G equipment from Huawei after 31 December 2020 and remove all Huawei equipment from 5G networks by the end of 2027.
  • The government advised full fibre telecoms providers to transition away from purchasing Huawei full fibre equipment affected by the US sanctions. For full fibre networks, we have held a technical consultation with industry on the transition away from Huawei equipment, in order to better understand supply chain alternatives. The conclusions of the consultation will be announced in due course.

The Department explained in one fact sheet, “[t]he Telecommunications (Security) Bill is in two parts:

  1. Clauses 1 to 14 introduce a stronger telecoms security framework. The Bill amends the Communications Act 2003 by placing strengthened telecoms security duties on public telecoms providers. To support these duties, the Bill will enable more specific security requirements to be set out in secondary legislation, underpinned by codes of practice providing guidance on the security measures to be taken to meet those requirements. The Bill gives the telecoms regulator, the Office of Communications (Ofcom), powers to monitor and enforce industry compliance with the duties and specific security requirements. It places new obligations on public telecoms providers to share information with Ofcom that is necessary to assess the security of their networks, including reporting duties in the event of a security compromise. It also places new duties on Ofcom to promote security and resilience of public telecoms providers. In addition, the Bill introduces financial penalties for non-compliance with the new duties and requirements placed on public telecoms providers.
  2. Clauses 15 to 23 introduce new national security powers for the government to manage risks posed by high risk vendors. The Bill creates new powers for the Secretary of State to designate vendors for the purpose of issuing directions to public communications providers imposing controls on their use of those designated vendors’ goods, services and facilities. Designation and the giving of directions can only take place where the Secretary of State considers it is necessary in the interests of national security. The Bill makes it a duty for providers to comply with the requirements set out in the directions and creates financial penalties for non-compliance. It also includes provisions to ensure the monitoring and enforcement of those requirements, including new powers for the Secretary of State to give monitoring directions to Ofcom requiring Ofcom to obtain information relating to a provider’s compliance with requirements in a direction, and to provide such information in a report to the Secretary of State.

In a different fact sheet, the Department described how telecommunications providers would be regulated under the new security framework: “strengthened overarching security duties, specific security requirements, and codes of practice.” The Department provided detail on each piece:

Security duties

The Bill introduces strengthened overarching security duties. These will require all telecoms providers to take appropriate and proportionate measures to identify and reduce the risks of security compromises occurring, as well as preparing for the occurrence of security compromises. Security compromises will include:

  • anything that compromises the availability, performance or functionality of a network or service
  • any unauthorised access to, interference with or exploitation of networks or services
  • anything that compromises the confidentiality of signals or data
  • anything that causes signals or data to be lost, unintentionally altered or altered without permission of the telecoms provider
  • anything occurring in connection with a network or service that causes a compromise on another network or service that belongs to another telecoms provider

Telecoms providers will also be required to take appropriate and proportionate action after a security compromise has occurred, to limit damage and take steps to remedy or mitigate the damage.

Secondary legislation

The Telecommunications (Security) Bill also allows the government to make secondary legislation to detail specific security requirements that providers must meet. This will include targeted action to make sure telecoms providers securely design, construct and maintain network equipment that handles sensitive data; reduce supply chain risks; carefully control access to sensitive parts of the network; and make sure the right processes are in place to understand the risks facing their company’s public networks and services.

These requirements will be enforced by Ofcom and may be updated in the future where new threats arise or technologies evolve. The government will engage with telecoms providers on the technical detail of secondary legislation before it is finalised, during passage of the Bill. This engagement will help to inform an impact assessment, which will be published alongside the secondary legislation to assess costs and benefits to businesses.

Codes of practice

Finally, the Bill provides the government with the powers to issue codes of practice to provide guidance on how, and to what timescale, certain telecoms providers should comply with their legal obligations. For example, it will set out the detailed technical measures that should be taken to segregate and control access to the areas of networks that process and manage customers’ data. Ofcom will take relevant codes into account when monitoring and enforcing the new security framework.

There are many different sized telecoms companies providing telecoms networks and services, and while their security and resilience is critical, it is important their differences are recognised. To ensure measures are applied proportionately, the government intends to define three tiers of telecoms provider in an initial code of practice, which will be finalised via public consultation:

  1. The code of practice will apply to the largest national-scale (‘Tier 1’) telecoms providers, whose availability and security is critical to people and businesses across the UK. These providers will also be subject to intensive Ofcom monitoring and oversight.
  2. The code of practice will also apply to medium-sized (‘Tier 2’) telecoms providers, who will be subject to some Ofcom oversight and monitoring. These providers are expected to have more time to implement the security measures set out in the code of practice.
  3. The smallest (‘Tier 3’) telecoms providers, including small businesses and micro enterprises, will need to comply with the law. It is not anticipated that the code of practice will be applied to Tier 3 providers, but these providers may be subject to some limited Ofcom oversight.

The Bill includes a requirement for the government to consult on any codes of practice. DCMS will issue a full public consultation on the approach to implementing the code of practice following Royal Assent, including the approach to tiering and implementation timetables.

Alongside acting as a tool to help regulatory compliance, the code of practice will serve as best practice security guidance to all UK telecoms providers (including private networks).

The Department explained the new penalty scheme (a short illustrative calculation follows the list below):

  • For contravention of a security duty (other than the duty to explain a failure to follow a code of practice) Ofcom may impose a penalty up to a maximum of ten percent of a provider’s ‘relevant turnover’ or (in the case of a continuing contravention) £100,000 per day.
  • For contravention of an information requirement or refusal to explain a failure to follow a code of practice, Ofcom may impose a penalty up to a maximum of £10 million or (in the case of a continuing contravention) £50,000 per day.
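
To make those caps concrete, here is a minimal sketch in Python, using entirely hypothetical turnover and contravention figures, of how the two maximums described in the fact sheet would be calculated. It only illustrates the stated caps; it is not a model of how Ofcom would actually set a penalty.

```python
# A minimal sketch, not drawn from the Bill's text, of the two penalty caps
# described in the fact sheet. Turnover figures and contravention lengths
# below are hypothetical.

def security_duty_cap(relevant_turnover_gbp, continuing_days=0):
    """Cap for contravening a security duty: up to 10% of 'relevant turnover',
    or, for a continuing contravention, up to £100,000 per day."""
    if continuing_days > 0:
        return 100_000 * continuing_days
    return 0.10 * relevant_turnover_gbp

def information_requirement_cap(continuing_days=0):
    """Cap for contravening an information requirement (or refusing to explain
    a failure to follow a code of practice): up to £10 million, or, for a
    continuing contravention, up to £50,000 per day."""
    if continuing_days > 0:
        return 50_000 * continuing_days
    return 10_000_000

# Hypothetical provider with £2 billion in relevant turnover:
print(security_duty_cap(2_000_000_000))        # 200,000,000.0 (one-off cap)
print(security_duty_cap(2_000_000_000, 30))    # 3,000,000 (30-day continuing contravention)
print(information_requirement_cap(30))         # 1,500,000
```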

The Department explained, in a different fact sheet, another part of the bill under which the Secretary of State would be empowered to address risks in the telecommunications system:

The Telecommunications (Security) Bill introduces new powers for the Secretary of State to manage the risks posed by high risk vendors. In the Bill, such vendors are referred to as ‘designated vendors’.

The Bill creates powers for the Secretary of State to:

  1. issue directions, in the interests of national security, to public communications providers placing controls on their use of goods, services or facilities supplied, provided or made available by designated vendors (‘designated vendor directions’)
  2. designate specific vendors, in the interests of national security, for the purpose of issuing the designated vendor directions (‘designated vendors’)

The Bill makes it a duty for public communications providers to comply with any requirements set out in a direction and introduces financial penalties for non-compliance. The Secretary of State will be responsible for assessing and enforcing compliance with any direction requirements. Ofcom may be tasked by the Secretary of State with gathering information relevant to the Secretary of State’s assessment of a provider’s compliance with a direction. Ofcom will provide such information to the Secretary of State in the form of a report, the frequency of which can be specified by the Secretary of State.

The Secretary of State will also be responsible for assessing and enforcing compliance with the requirements in the Bill relating to non-disclosure. The Bill enables the Secretary of State to impose requirements not to disclose particular information (such as in relation to a designated vendor direction or designation notice), where disclosure would be contrary to the interests of national security.

The Secretary of State will also be responsible for assessing and enforcing compliance with any requirements to provide information given under the information requirement power. These requirements can apply not just to telecoms providers but to anyone who appears to the Secretary of State to have information relevant to the exercise of the Secretary of State’s functions in relation to designation notices and designated vendor directions.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Development, and Coming Events (8 December)

Further Reading

  • Facebook failed to put fact-check labels on 60% of the most viral posts containing Georgia election misinformation that its own fact-checkers had debunked, a new report says” By Tyler Sonnemaker — Business Insider. Despite its vows to improve its handling of untrue and false content, the platform is not consistently taking down such material related to the runoffs for the Georgia Senate seats. The group behind this finding argues it is because Facebook does not want to. What is left unsaid is that engagement drives revenue, and so Facebook’s incentive is not to police all violations but rather to take down just enough to be able to say it is doing something.
  • Federal Labor Agency Says Google Wrongly Fired 2 Employees” By Kate Conger and Noam Scheiber — The New York Times. The National Labor Relations Board (NLRB) has reportedly sided with two employees Google fired for activities that are traditionally considered labor organizing. The two engineers had been dismissed for allegedly violating the company’s data security practices when they researched the company’s retention of a union-busting firm and sought to alert others about organizing. Even though Google is vowing to fight the action, which has not been finalized, it may well settle given the view of Big Tech in Washington these days. This action could also foretell how a Biden Administration NLRB may look at the labor practices of these companies.
  • U.S. states plan to sue Facebook next week: sources” By Diane Bartz — Reuters. We could see state and federal antitrust suits against Facebook this week. One investigation led by New York Attorney General Tish James could include 40 states, although the grounds for alleged violations have not been leaked at this point. It may be Facebook’s acquisition of potential rivals Instagram and WhatsApp that has allowed it to dominate the social messaging market. The Federal Trade Commission (FTC) may also file suit, and, again, the grounds are unknown. The European Commission (EC) is also investigating Facebook for possible violations of European Union (EU) antitrust law over the company’s use of the personal data it holds and uses and over its operation of its online marketplace.
  • The Children of Pornhub” By Nicholas Kristof — The New York Times. This column comprehensively traces the reprehensible recent history of Mindgeek, the Canadian conglomerate that owns Pornhub, where one can find reams of child and non-consensual pornography. Why Ottawa has not cracked down on this firm is a mystery. The passage and implementation of the “Allow States and Victims to Fight Online Sex Trafficking Act of 2017” (P.L. 115-164), which narrowed the liability shield under 47 USC 230, has forced the company to remove content, a significant change from its indifference before the statutory change. Kristof suggests some easy, common sense changes Mindgeek could implement to combat the presence of this illegal material, but it seems likely the company will do just enough to say it is acting without seriously reforming its platform. Why would it? There is too much money to be made. Additionally, those fighting against this sort of material have been pressuring payment platforms to stop doing business with Mindgeek. PayPal has forsworn any interaction, and due to pressure Visa and Mastercard are “reviewing” their relationships with Mindgeek and Pornhub. In a statement to a different news outlet, Pornhub claimed it is “unequivocally committed to combating child sexual abuse material (CSAM), and has instituted a comprehensive, industry-leading trust and safety policy to identify and eradicate illegal material from our community.” The company further claimed “[a]ny assertion that we allow CSAM is irresponsible and flagrantly untrue….[w]e have zero tolerance for CSAM.”
  • Amazon and Apple Are Powering a Shift Away From Intel’s Chips” By Don Clark — The New York Times. Two tech giants have chosen new, faster, cheaper chips, signaling a possible industry shift away from Intel, the firm that has been a significant player for decades. Intel will not go quietly, of course, and a key variable is whether must-have software and applications are rewritten to accommodate the new chips, which are based on designs from the British firm Arm.

Other Developments

  • The Government Accountability Office (GAO) and the National Academy of Medicine (NAM) have released a joint report on artificial intelligence in healthcare, consisting of GAO’s Technology Assessment: Artificial Intelligence in Health Care: Benefits and Challenges of Technologies to Augment Patient Care and NAM’s Special Publication: Advancing Artificial Intelligence in Health Settings Outside the Hospital and Clinic. GAO’s report “discusses three topics: (1) current and emerging AI tools available for augmenting patient care and their potential benefits, (2) challenges to the development and adoption of these tools, and (3) policy options to maximize benefits and mitigate challenges to the use of AI tools to augment patient care.” NAM’s “paper aims to provide an analysis of: 1) current technologies and future applications of AI in HSOHC, 2) the logistical steps and challenges involved in integrating AI-HSOHC applications into existing provider workflows, and 3) the ethical and legal considerations of such AI tools, followed by a brief proposal of potential key initiatives to guide the development and adoption of AI in health settings outside the hospital and clinic (HSOHC).”
    • The GAO “identified five categories of clinical applications where AI tools have shown promise to augment patient care: predicting health trajectories, recommending treatments, guiding surgical care, monitoring patients, and supporting population health management.” The GAO “also identified three categories of administrative applications where AI tools have shown promise to reduce provider burden and increase the efficiency of patient care: recording digital clinical notes, optimizing operational processes, and automating laborious tasks.” The GAO stated:
      • This technology assessment also identifies challenges that hinder the adoption and impact of AI tools to augment patient care, according to stakeholders, experts, and the literature. Difficulties accessing sufficient high-quality data may hamper innovation in this space. Further, some available data may be biased, which can reduce the effectiveness and accuracy of the tools for some people. Addressing bias can be difficult because the electronic health data do not currently represent the general population. It can also be challenging to scale tools up to multiple locations and integrate them into new settings because of differences in institutions and the patient populations they serve. The limited transparency of AI tools used in health care can make it difficult for providers, regulators, and others to determine whether an AI tool is safe and effective. A greater dispersion of data across providers and institutions can make securing patient data difficult. Finally, one expert described how existing case law does not specifically address AI tools, which can make providers and patients reticent to adopt them. Some of these challenges are similar to those identified previously by GAO in its first publication in this series, such as the lack of high-quality, structured data, and others are more specific to patient care, such as liability concerns.
    • The GAO “described six policy options:”
      • Collaboration. Policymakers could encourage interdisciplinary collaboration between developers and health care providers. This could result in AI tools that are easier to implement and use within an existing workflow.
      • Data Access. Policymakers could develop or expand high-quality data access mechanisms. This could help developers address bias concerns by ensuring data are representative, transparent, and equitable.
      • Best Practices. Policymakers could encourage relevant stakeholders and experts to establish best practices (such as standards) for development, implementation, and use of AI technologies. This could help with deployment and scalability of AI tools by providing guidance on data, interoperability, bias, and formatting issues.
      • Interdisciplinary Education. Policymakers could create opportunities for more workers to develop interdisciplinary skills. This could allow providers to use AI tools more effectively, and could be accomplished through a variety of methods, including changing medical curricula or grants.
      • Oversight Clarity. Policymakers could collaborate with relevant stakeholders to clarify appropriate oversight mechanisms. Predictable oversight could help ensure that AI tools remain safe and effective after deployment and throughout their lifecycle.
      • Status Quo. Policymakers could allow current efforts to proceed without intervention.
    • NAM claimed:
      • Numerous AI-powered health applications designed for personal use have been shown to improve patient outcomes, building predictions based on large volumes of granular, real-time, and individualized behavioral and medical data. For instance, some forms of telehealth, a technology that has been critical during the COVID-19 pandemic, benefit considerably from AI software focused on natural language processing, which enables efficient triaging of patients based on urgency and type of illness. Beyond patient-provider communication, AI algorithms relevant to diabetic and cardiac care have demonstrated remarkable efficacy in helping patients manage their blood glucose levels in their day-to-day lives and in detecting cases of atrial fibrillation. AI tools that monitor and synthesize longitudinal patient behaviors are also particularly useful in psychiatric care, where the exact timing of interventions is often critical. For example, smartphone-embedded sensors that track location and proximity of individuals can alert clinicians of possible substance use, prompting immediate intervention. On the population health level, these individual indicators of activity and health can be combined with environmental- and system-level data to generate predictive insight into local and global health trends. The most salient example of this may be the earliest warnings of the COVID-19 outbreak, issued in December 2019 by two private AI technology firms.
      • Successful implementation and widespread adoption of AI applications in HSOHC requires careful consideration of several key issues related to personal data, algorithm development, and health care insurance and payment. Chief among them are data interoperability, standardization, privacy, ameliorating systemic biases in algorithms, reimbursement of AI-assisted services, quality improvement, and integration of AI tools into provider workflows. Overcoming these challenges and optimizing the impact of AI tools on clinical outcomes will involve engaging diverse stakeholders, deliberately designing AI tools and interfaces, rigorously evaluating clinical and economic utility, and diffusing and scaling algorithms across different health settings. In addition to these potential logistical and technical hurdles, it is imperative to consider the legal and ethical issues surrounding AI, particularly as it relates to the fair and humanistic deployment of AI applications in HSOHC. Important legal considerations include the appropriate designation of accountability and liability of medical errors resulting from AI-assisted decisions for ensuring the safety of patients and consumers. Key ethical challenges include upholding the privacy of patients and their data—particularly with regard to non-HIPAA covered entities involved in the development of algorithms—building unbiased AI algorithms based on high-quality data from representative populations, and ensuring equitable access to AI technologies across diverse communities.
  • The National Institute of Standards and Technology (NIST) published a “new study of face recognition technology created after the onset of the COVID-19 pandemic [that] shows that some software developers have made demonstrable progress at recognizing masked faces.” In Ongoing Face Recognition Vendor Test (FRVT) Part 6B: Face Recognition Accuracy with Face Masks Using Post-COVID-19 Algorithms (NISTIR 8331), NIST stated the “report augments its predecessor with results for more recent algorithms provided to NIST after mid-March 2020.” NIST said that “[w]hile we do not have information on whether or not a particular algorithm was designed with face coverings in mind, the results show evidence that a number of developers have adapted their algorithms to support face recognition on subjects potentially wearing face masks.” NIST stated that
    • The following results represent observations on algorithms provided to NIST both before and after the COVID-19 pandemic to date. We do not have information on whether or not a particular algorithm was designed with face coverings in mind. The results documented capture a snapshot of algorithms submitted to the FRVT 1:1 in face recognition on subjects potentially wearing face masks.
      • False rejection performance: All algorithms submitted after the pandemic continue to give increased false non-match rates (FNMR) when the probes are masked. While a few pre-pandemic algorithms still remain within the most accurate on masked photos, some developers have submitted algorithms after the pandemic showing significantly improved accuracy and are now among the most accurate in our test.
      • Evolution of algorithms on face masks: We observe that a number of algorithms submitted since mid-March 2020 show notable reductions in error rates with face masks over their pre-pandemic predecessors. When comparing error rates for unmasked versus masked faces, the median FNMR across algorithms submitted since mid-March 2020 has been reduced by around 25% from the median pre-pandemic results. The figure below presents examples of developer evolution on both masked and unmasked datasets. For some developers, false rejection rates in their algorithms submitted since mid-March 2020 decreased by as much as a factor of 10 over their pre-pandemic algorithms, which is evidence that some providers are adapting their algorithms to handle facemasks. However, in the best cases, when comparing results for unmasked images to masked images, false rejection rates have increased from 0.3%-0.5% (unmasked) to 2.4%-5% (masked).
      • False acceptance performance: As most systems are configured with a fixed threshold, it is necessary to report both false negative and false positive rates for each group at that threshold. When comparing a masked probe to an unmasked enrollment photo, in most cases, false match rates (FMR) are reduced by masks. The effect is generally modest with reductions in FMR usually being smaller than a factor of two. This property is valuable in that masked probes do not impart adverse false match security consequences for verification.
      • Mask-agnostic face recognition: All 1:1 verification algorithms submitted to the FRVT test since the start of the pandemic are evaluated on both masked and unmasked datasets. The test is designed this way to mimic operational reality: some images will have masks, some will not (especially enrollment samples from a database or ID card). And to the extent that the use of protective masks will exist for some time, our test will continue to evaluate algorithmic capability on verifying all combinations of masked and unmasked faces.
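
To make the terminology in the NIST findings above concrete, the following is a minimal sketch, not NIST’s FRVT code, of how a false non-match rate (FNMR) and a false match rate (FMR) are computed at a fixed decision threshold. The similarity scores and the threshold are hypothetical.

```python
# A minimal sketch (not NIST's FRVT code) of how false non-match rate (FNMR)
# and false match rate (FMR) are computed at a fixed decision threshold.
# The similarity scores and the threshold below are hypothetical.

def fnmr(genuine_scores, threshold):
    """Share of genuine (same-person) comparisons rejected at the threshold."""
    return sum(score < threshold for score in genuine_scores) / len(genuine_scores)

def fmr(impostor_scores, threshold):
    """Share of impostor (different-person) comparisons accepted at the threshold."""
    return sum(score >= threshold for score in impostor_scores) / len(impostor_scores)

# Hypothetical similarity scores on a 0-1 scale; masked probes tend to depress
# genuine scores, which raises FNMR at an unchanged threshold.
genuine = [0.92, 0.55, 0.41, 0.95, 0.67]
impostor = [0.12, 0.33, 0.05, 0.61, 0.27]
threshold = 0.6

print(f"FNMR at {threshold}: {fnmr(genuine, threshold):.2f}")  # 0.40 in this toy example
print(f"FMR at {threshold}: {fmr(impostor, threshold):.2f}")   # 0.20 in this toy example
```
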
  • The government in London has issued a progress report on its current cybersecurity strategy that has another year to run. The Paymaster General assessed how well the United Kingdom (UK) has implemented the National Cyber Security Strategy 2016 to 2021 and pointed to goals yet to be achieved. This assessment comes in the shadow of the pending exit of the UK from the European Union (EU) and Prime Minister Boris Johnson’s plans to increase the UK’s role in select defense issues, including cyber operations. The Paymaster General stated:
    • The global landscape has changed significantly since the publication of the National Cyber Security Strategy Progress Report in May 2019. We have seen unprecedented levels of disruption to our way of life that few would have predicted. The COVID-19 pandemic has increased our reliance on digital technologies – for our personal communications with friends and family and our ability to work remotely, as well as for businesses and government to continue to operate effectively, including in support of the national response.
    • These new ways of living and working highlight the importance of cyber security, which is also underlined by wider trends. An ever greater reliance on digital networks and systems, more rapid advances in new technologies, a wider range of threats, and increasing international competition on underlying technologies and standards in cyberspace, emphasise the need for good cyber security practices for individuals, businesses and government.
    • Although the scale and international nature of these changes present challenges, there are also opportunities. With the UK’s departure from the European Union in January 2020, we can define and strengthen Britain’s place in the world as a global leader in cyber security, as an independent, sovereign nation.
    • The sustained, strategic investment and whole of society approach delivered so far through the National Cyber Security Strategy has ensured we are well placed to respond to this changing environment and seize new opportunities.
    • The Paymaster General asserted:
      • [The] report has highlighted growing risks, some accelerated by the COVID-19 pandemic, and longer-term trends that will shape the environment over the next decade:
      • Ever greater reliance on digital networks and systems as daily life moves online, bringing huge benefits but also creating new systemic and individual risks.
      • Rapid technological change and greater global competition, challenging our ability to shape the technologies that will underpin our future security and prosperity.
      • A wider range of adversaries as criminals gain easier access to commoditised attack capabilities and cyber techniques form a growing part of states’ toolsets.
      • Competing visions for the future of the internet and the risk of fragmentation, making consensus on norms and ethics in cyberspace harder to achieve.
      • In February 2020 the Prime Minister announced the Integrated Review of Security, Defence, Development and Foreign Policy. This will define the government’s ambition for the UK’s role in the world and the long-term strategic aims of our national security and foreign policy. It will set out the way in which the UK will be a problem-solving and burden-sharing nation, and a strong direction for recovery from COVID-19, at home and overseas.
      • This will help to shape our national approach and priorities on cyber security beyond 2021. Cyber security is a key element of our international, defence and security posture, as well as a driving force for our economic prosperity.
  • The University of Toronto’s Citizen Lab published a report on an Israeli surveillance firm that uses “[o]ne of the widest-used—but least appreciated” means of surveilling people (i.e., “leveraging of weaknesses in the global mobile telecommunications infrastructure to monitor and intercept phone calls and traffic”). Citizen Lab explained that an affiliate of the NSO Group, “Circles is known for selling systems to exploit Signaling System 7 (SS7) vulnerabilities, and claims to sell this technology exclusively to nation-states.” Citizen Lab noted that “[u]nlike NSO Group’s Pegasus spyware, the SS7 mechanism by which Circles’ product reportedly operates does not have an obvious signature on a target’s phone, such as the telltale targeting SMS bearing a malicious link that is sometimes present on a phone targeted with Pegasus.” Citizen Lab found the following (a simplified sketch of its hostname-scanning approach appears after these findings):
    • Circles is a surveillance firm that reportedly exploits weaknesses in the global mobile phone system to snoop on calls, texts, and the location of phones around the globe. Circles is affiliated with NSO Group, which develops the oft-abused Pegasus spyware.
    • Circles, whose products work without hacking the phone itself, says they sell only to nation-states. According to leaked documents, Circles customers can purchase a system that they connect to their local telecommunications companies’ infrastructure, or can use a separate system called the “Circles Cloud,” which interconnects with telecommunications companies around the world.
    • According to the U.S. Department of Homeland Security, all U.S. wireless networks are vulnerable to the types of weaknesses reportedly exploited by Circles. A majority of networks around the globe are similarly vulnerable.
    • Using Internet scanning, we found a unique signature associated with the hostnames of Check Point firewalls used in Circles deployments. This scanning enabled us to identify Circles deployments in at least 25 countries.
    • We determine that the governments of the following countries are likely Circles customers: Australia, Belgium, Botswana, Chile, Denmark, Ecuador, El Salvador, Estonia, Equatorial Guinea, Guatemala, Honduras, Indonesia, Israel, Kenya, Malaysia, Mexico, Morocco, Nigeria, Peru, Serbia, Thailand, the United Arab Emirates (UAE), Vietnam, Zambia, and Zimbabwe.
    • Some of the specific government branches we identify with varying degrees of confidence as being Circles customers have a history of leveraging digital technology for human rights abuses. In a few specific cases, we were able to attribute the deployment to a particular customer, such as the Internal Security Operations Command (ISOC) of the Royal Thai Army, which has allegedly tortured detainees.
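
The scanning method Citizen Lab describes, fingerprinting internet-facing hosts by a distinctive hostname pattern and then grouping matches by country, can be illustrated with a minimal sketch. Everything below (the scan records, the domain, and the regular expression) is hypothetical; the actual signature, data, and attribution methods are detailed in the Citizen Lab report.

```python
# A minimal sketch of the general fingerprint-and-group approach described
# above; it is NOT Citizen Lab's methodology or signature. The hostname
# pattern and scan records are entirely hypothetical.
import re
from collections import defaultdict

# Hypothetical scan records: (IP address, observed hostname, country of the IP)
scan_records = [
    ("203.0.113.10", "client-alpha.fw.example-vendor.net", "TH"),
    ("198.51.100.7", "mail.unrelated.example.org", "US"),
    ("192.0.2.55", "client-beta.fw.example-vendor.net", "AE"),
]

# Hypothetical signature: hostnames following a "client-<name>.fw.<vendor>" scheme.
SIGNATURE = re.compile(r"^client-[\w-]+\.fw\.example-vendor\.net$")

deployments = defaultdict(list)
for ip, hostname, country in scan_records:
    if SIGNATURE.match(hostname):
        deployments[country].append((ip, hostname))

for country, hosts in deployments.items():
    print(country, hosts)  # e.g., TH [('203.0.113.10', 'client-alpha.fw.example-vendor.net')]
```
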
  • Senators Ron Wyden (D-OR), Elizabeth Warren (D-MA), Edward J. Markey (D-MA), and Brian Schatz (D-HI) “announced that the Department of Homeland Security (DHS) will launch an inspector general investigation into Customs and Border Protection’s (CBP) warrantless tracking of phones in the United States following an inquiry from the senators earlier this year” per their press release.
    • The Senators added:
      • As revealed by public contracts, CBP has paid a government contractor named Venntel nearly half a million dollars for access to a commercial database containing location data mined from applications on millions of Americans’ mobile phones. CBP officials also confirmed the agency’s warrantless tracking of phones in the United States using Venntel’s product in a September 16, 2020 call with Senate staff.
      • In 2018, the Supreme Court held in Carpenter v. United States that the collection of significant quantities of historical location data from Americans’ cell phones is a search under the Fourth Amendment and therefore requires a warrant.
      • In September 2020, Wyden and Warren successfully pressed for an inspector general investigation into the Internal Revenue Service’s use of Venntel’s commercial location tracking service without a court order.
    • In a letter, the DHS Office of the Inspector General (OIG) explained:
      • We have reviewed your request and plan to initiate an audit that we believe will address your concerns. The objective of our audit is to determine if the Department of Homeland Security (DHS) and it [sic] components have developed, updated, and adhered to policies related to cell-phone surveillance devices. In addition, you may be interested in our audit to review DHS’ use and protection of open source intelligence. Open source intelligence, while different from cell phone surveillance, includes the Department’s use of information provided by the public via cellular devices, such as social media status updates, geo-tagged photos, and specific location check-ins.
    • In an October letter, these Senators plus Senator Sherrod Brown (D-OH) argued:
      • CBP is not above the law and it should not be able to buy its way around the Fourth Amendment. Accordingly, we urge you to investigate CBP’s warrantless use of commercial databases containing Americans’ information, including but not limited to Venntel’s location database. We urge you to examine what legal analysis, if any, CBP’s lawyers performed before the agency started to use this surveillance tool. We also request that you determine how CBP was able to begin operational use of Venntel’s location database without the Department of Homeland Security Privacy Office first publishing a Privacy Impact Assessment.
  • The American Civil Liberties Union (ACLU) has filed a lawsuit in a federal court in New York City, seeking an order to compel the United States (U.S.) Department of Homeland Security (DHS), U.S. Customs and Border Protection (CBP), and U.S. Immigration and Customs Enforcement (ICE) “to release records about their purchases of cell phone location data for immigration enforcement and other purposes.” The ACLU made these information requests after numerous media accounts showing that these and other U.S. agencies were buying location data and other sensitive information in ways intended to evade the bar in the Fourth Amendment against unreasonable searches.
    • In its press release, the ACLU asserted:
      • In February, The Wall Street Journal reported that this sensitive location data isn’t just for sale to commercial entities, but is also being purchased by U.S. government agencies, including by U.S. Immigrations and Customs Enforcement to locate and arrest immigrants. The Journal identified one company, Venntel, that was selling access to a massive database to the U.S. Department of Homeland Security, U.S. Customs and Border Protection, and ICE. Subsequent reporting has identified other companies selling access to similar databases to DHS and other agencies, including the U.S. military.
      • These practices raise serious concerns that federal immigration authorities are evading Fourth Amendment protections for cell phone location information by paying for access instead of obtaining a warrant. There’s even more reason for alarm when those agencies evade requests for information — including from U.S. senators — about such practices. That’s why today we asked a federal court to intervene and order DHS, CBP, and ICE to release information about their purchase and use of precise cell phone location information. Transparency is the first step to accountability.
    • The ACLU explained in the suit:
      • Multiple news sources have confirmed these agencies’ purchase of access to databases containing precise location information for millions of people—information gathered by applications (apps) running on their smartphones. The agencies’ purchases raise serious concerns that they are evading Fourth Amendment protections for cell phone location information by paying for access instead of obtaining a warrant. Yet, more than nine months after the ACLU submitted its FOIA request (“the Request”), these agencies have produced no responsive records. The information sought is of immense public significance, not only to shine a light on the government’s use of powerful location-tracking data in the immigration context, but also to assess whether the government’s purchase of this sensitive data complies with constitutional and legal limitations and is subject to appropriate oversight and control.
  • Facebook’s new Oversight Board announced “the first cases it will be deliberating and the opening of the public comment process” and “the appointment of five new trustees.” The cases were almost all referred by Facebook users, and the new board is asking for comments on the right way to manage what may be objectionable content. The Oversight Board explained it is “prioritizing cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.”
    • The new trustees are:
      • Kristina Arriaga is a globally recognized advocate for freedom of expression, with a focus on freedom of religion and belief. Kristina is president of the advisory firm Intrinsic.
      • Cherine Chalaby is an expert on internet governance, international finance and technology, with extensive board experience. As Chairman of ICANN, he led development of the organization’s five-year strategic plan for 2021 to 2025.
      • Wanda Felton has over 30 years of experience in the financial services industry, including serving as Vice Chair of the Board and First Vice President of the Export-Import Bank of the United States.
      • Kate O’Regan is a former judge of the Constitutional Court of South Africa and commissioner of the Khayelitsha Commission. She is the inaugural director of the Bonavero Institute of Human Rights at the University of Oxford.
      • Robert Post is an American legal scholar and Professor of Law at Yale Law School, where he formerly served as Dean. He is a leading scholar of the First Amendment and freedom of speech.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • The Senate Judiciary Committee will hold an executive session at which the “Online Content Policy Modernization Act” (S.4632), a bill to narrow the liability shield in 47 USC 230, may be marked up.
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (17 November)

Further Reading

  • How the U.S. Military Buys Location Data from Ordinary Apps” By Joseph Cox — Vice’s Motherboard. This article confirms the entirely foreseeable: the Department of Defense and its contractors are obtaining and using personal information from smartphones all over the world. Given this practice is common among United States (U.S.) law enforcement agencies, it is little surprise the U.S. military is doing the same. Perhaps the fact that the U.S. is doing this has been one of the animating forces behind the Trump Administration’s moves against applications from the People’s Republic of China (PRC)?
  • Regulators! Stand Back: Under a Biden administration, Big Tech is set for a field day” By Lizzie O’Shea — The Baffler. This piece argues that a Biden Administration may offer little more than a return to the Obama Administration’s favorable view of Big Tech and its largely laissez-faire regulatory approach. At least one expert worries the next administration may do enough on addressing big tech to appear to be doing something but not nearly enough to change the current market and societal dynamics.
  • Cheating-detection companies made millions during the pandemic. Now students are fighting back.” By Drew Harwell — The Washington Post. There are scores of problems with online testing platforms, including weak or easily compromised data security and privacy safeguards. Many students report getting flagged for stretching, looking off-screen, and even needing to go to the restroom. However, the companies in the market are in growth-mode and seem unresponsive to such criticisms.
  • Zuckerberg defends not suspending ex-Trump aide Bannon from Facebook: recording” By Katie Paul — Reuters. On an internal company call, Facebook CEO Mark Zuckerberg defended the platform’s decision not to deactivate former White House advisor Steve Bannon’s account after he “metaphorically” advocated for the beheadings of Federal Bureau of Investigation Director Christopher Wray and National Institute of Allergy and Infectious Diseases (NIAID) Director Anthony Fauci. Zuckerberg also reassured employees that a Biden Administration would not necessarily be entirely adversarial to Facebook.
  • How Trump uses Twitter to distract the media – new research” By Ullrich Ecker, Michael Jetter, and Stephan Lewandowsky — The Conversation. Research backs up the assertion that President Donald Trump has tweeted bizarre non-sequiturs to distract from what he perceived to be negative stories, and it worked because the media reported on the tweets almost every time. Trump is not the only politician or leader using this strategy.
  • Bumble Vulnerabilities Put Facebook Likes, Locations And Pictures Of 95 Million Daters At Risk” By Thomas Brewster — Forbes. Users of the dating app Bumble were at risk due to weak security that white-hat hacker researchers easily circumvented. Worse still, it took the company months to address and fix these vulnerabilities after being informed.

Other Developments

  • A number of United States (U.S.) election security stakeholders issued a statement, carefully and tactfully refuting claims by President Donald Trump and other Republicans that President-elect Joe Biden won the election only because of massive fraud. These officials declared “[t]he November 3rd election was the most secure in American history” and “[t]here is no evidence that any voting system deleted or lost votes, changed votes, or was in any way compromised.”
    • The officials seemed to flatly contradict Trump and others:
      • While we know there are many unfounded claims and opportunities for misinformation about the process of our elections, we can assure you we have the utmost confidence in the security and integrity of our elections, and you should too.
    • The members of Election Infrastructure Government Coordinating Council (GCC) Executive Committee – Cybersecurity and Infrastructure Security Agency (CISA) Assistant Director Bob Kolasky, U.S. Election Assistance Commission Chair Benjamin Hovland, National Association of Secretaries of State (NASS) President Maggie Toulouse Oliver, National Association of State Election Directors (NASED) President Lori Augino, and Escambia County (Florida) Supervisor of Elections David Stafford – and the members of the Election Infrastructure Sector Coordinating Council (SCC) – Chair Brian Hancock (Unisyn Voting Solutions), Vice Chair Sam Derheimer (Hart InterCivic), Chris Wlaschin (Election Systems & Software), Ericka Haas (Electronic Registration Information Center), and Maria Bianchi (Democracy Works) issued the statement.
  • President Donald Trump signed an executive order that would bar from the United States’ (U.S.) securities markets those companies from the People’s Republic of China (PRC) connected to the PRC’s “military-industrial complex.” This order would take effect on 11 January 2021 and seeks, as a matter of national security, to cut off access to U.S. capital for these PRC companies because “the PRC exploits United States investors to finance the development and modernization of its military.” Consequently, Trump declared a national emergency with respect to the PRC’s behavior, which places at the Administration’s disposal a host of powers to deny funds and access to the object of such an order. It remains to be seen whether the Biden Administration, which takes office ten days after the order takes effect, will rescind it or keep it in place. Nevertheless, Trump asserted:
    • that the PRC is increasingly exploiting United States capital to resource and to enable the development and modernization of its military, intelligence, and other security apparatuses, which continues to allow the PRC to directly threaten the United States homeland and United States forces overseas, including by developing and deploying weapons of mass destruction, advanced conventional weapons, and malicious cyber-enabled actions against the United States and its people.
  • Microsoft revealed it has “detected cyberattacks from three nation-state actors targeting seven prominent companies directly involved in researching vaccines and treatments for Covid-19.” Microsoft attributed these attacks to Russian and North Korean hackers and tied the announcement to its advocacy at the Paris Peace Forum, where the United States (U.S.) multinational reiterated its calls for “the world’s leaders to affirm that international law protects health care facilities and to take action to enforce the law.” Microsoft sought to position its cyber efforts within larger diplomatic efforts to define the norms of cyberspace and to bring cyber action into the body of international law. The company asserted:
    • In recent months, we’ve detected cyberattacks from three nation-state actors targeting seven prominent companies directly involved in researching vaccines and treatments for Covid-19. The targets include leading pharmaceutical companies and vaccine researchers in Canada, France, India, South Korea and the United States. The attacks came from Strontium, an actor originating from Russia, and two actors originating from North Korea that we call Zinc and Cerium.
    • Among the targets, the majority are vaccine makers that have Covid-19 vaccines in various stages of clinical trials. One is a clinical research organization involved in trials, and one has developed a Covid-19 test. Multiple organizations targeted have contracts with or investments from government agencies from various democratic countries for Covid-19 related work.
    • Strontium continues to use password spray and brute force login attempts to steal login credentials. These are attacks that aim to break into people’s accounts using thousands or millions of rapid attempts. Zinc has primarily used spear-phishing lures for credential theft, sending messages with fabricated job descriptions pretending to be recruiters. Cerium engaged in spear-phishing email lures using Covid-19 themes while masquerading as World Health Organization representatives. The majority of these attacks were blocked by security protections built into our products. We’ve notified all organizations targeted, and where attacks have been successful, we’ve offered help.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) announced a £1.25 million fine of Ticketmaster UK for failing “to put appropriate security measures in place to prevent a cyber-attack on a chat-bot installed on its online payment page” in violation of the General Data Protection Regulation (GDPR). The ICO explained:
    • The breach began in February 2018 when Monzo Bank customers reported fraudulent transactions. The Commonwealth Bank of Australia, Barclaycard, Mastercard and American Express all reported suggestions of fraud to Ticketmaster. But the company failed to identify the problem.
    • In total, it took Ticketmaster nine weeks from being alerted to possible fraud to monitoring the network traffic through its online payment page.
    • The ICO’s investigation found that Ticketmaster’s decision to include the chat-bot, hosted by a third party, on its online payment page allowed an attacker access to customers’ financial details.
    • Although the breach began in February 2018, the penalty only relates to the breach from 25 May 2018, when new rules under the GDPR came into effect. The chat-bot was completely removed from Ticketmaster UK Limited’s website on 23 June 2018.
    • The ICO added:
      • The data breach, which included names, payment card numbers, expiry dates and CVV numbers, potentially affected 9.4 million of Ticketmaster’s customers across Europe including 1.5 million in the UK.
      • Investigators found that, as a result of the breach, 60,000 payment cards belonging to Barclays Bank customers had been subjected to known fraud. Another 6,000 cards were replaced by Monzo Bank after it suspected fraudulent use.
      • The ICO found that Ticketmaster failed to:
        • Assess the risks of using a chat-bot on its payment page
        • Identify and implement appropriate security measures to negate the risks
        • Identify the source of suggested fraudulent activity in a timely manner
  • The Office of the Comptroller of the Currency, the Board of Governors of the Federal Reserve System, and the Federal Deposit Insurance Corporation issued an interagency paper titled “Sound Practices to Strengthen Operational Resilience.” The agencies stated the paper “generally describes standards for operational resilience set forth in the agencies’ existing rules and guidance for domestic banking organizations that have average total consolidated assets greater than or equal to (1) $250 billion or (2) $100 billion and have $75 billion or more in average cross-jurisdictional activity, average weighted short-term wholesale funding, average nonbank assets, or average off-balance-sheet exposure.” The agencies explained the paper also:
    • promotes a principles-based approach for effective governance, robust scenario analysis, secure and resilient information systems, and thorough surveillance and reporting.
    • includes an appendix focused on sound practices for managing cyber risk.
    • In the appendix, while the agencies stressed they could not “endorse the use of any particular tool,” they did state:
      • To manage cyber risk and assess cybersecurity preparedness of its critical operations, core business lines and other operations, services, and functions firms may choose to use standardized tools that are aligned with common industry standards and best practices. Some of the tools that firms can choose from include the Federal Financial Institutions Examination Council (FFIEC) Cybersecurity Assessment Tool, the National Institute of Standards and Technology Cybersecurity Framework (NIST), the Center for Internet Security Critical Security Controls, and the Financial Services Sector Coordinating Council Cybersecurity Profile.
  • A class action was filed in the United Kingdom (UK) against Facebook over the Cambridge Analytica scandal. Facebook You Owe Us announced its legal action “for the illegal use of one million users’ data in England and Wales.” The campaign claimed:
    • Group legal actions like Facebook You Owe Us will pave the way for consumers in the UK to gain redress and compensation for the persistent mass misuse of personal data by the world’s largest companies.  
    • Facebook has exhibited a pattern of unethical behaviour including allegations of election interference and failing to remove fake news. The Information Commissioners Office noted when issuing a £500,000 fine against Facebook for the Cambridge Analytica data breach that “protection of personal information and personal privacy is of fundamental importance, not only for the rights of individuals, but also as we now know, for the preservation of a strong democracy.” Facebook You Owe Us aims to fight back by holding the company to account for failing to protect Facebook users’ personal data and showing that Facebook is not above the law.  
    • The launch of Facebook You Owe Us follows Google You Owe Us’ victory in the Court of Appeal. The Google You Owe Us case has been appealed by Google and is now scheduled to be heard before the Supreme Court in April 2021. If successful, the case will demonstrate that personal data is of value to individuals and that companies cannot simply take it and profit from it illegally. Both cases are led by James Oldnall at Milberg London LLP, with Richard Lloyd, the former executive director of Which?. 

Coming Events

  • The Senate Homeland Security and Governmental Affairs Committee’s Regulatory Affairs and Federal Management Subcommittee will hold a hearing on how to modernize telework in light of what was learned during the COVID-19 pandemic on 18 November.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.
  • On 27 November, the European Data Protection Board “is organising a remote stakeholder workshop on the topic of Legitimate Interest.” The EDPB explained “[p]laces will be allocated on a first come, first served basis, depending on availability.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Further Reading, Other Developments, and Coming Events (9 November)

Further Reading

  • Facebook bans ‘STOP THE STEAL’ group Trump allies were using to organize protests against vote counting” By Tony Romm, Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. A significant portion of the online activity among those on the right wing alleging that the Biden Campaign and Democrats have stolen the election is traceable to right-wing media influencers rather than being an organic effort. Moreover, Facebook has apparently had a mixed record in locating and taking down material that seeks to spread lies about the integrity of the election and foment violence.
  • False News Targeting Latinos Trails the Election” By Patricia Mazzei and Nicole Perlroth — The New York Times. By the metrics used in the article (although it’s not clear exactly where the Times got its data), the disinformation in Spanish on social media in 2020 exceeded the Russian disinformation campaign in 2016. Apparently, Facebook, Twitter, and YouTube were not prepared or were not expecting the flood of lies, misinformation, and disinformation about President-elect Joe Biden or the Democrats generally, especially in South Florida where Republicans did much better than expected. Much of this content tied Biden to the former dictators of Cuba and Venezuela, Fidel Castro and Hugo Chavez.
  • Trump’s Tweeting Isn’t Crazy. It’s Strategic, Typos and All.” By Emily Dreyfuss — The New York Times. This piece traces the evolution of a campaign to paint the Biden family as engaged in criminal activity to both smear them and to blunt any criticism of the Trump family given the many and serious allegations of lawbreaking and unethical behavior.
  • TikTok invites UK lawmakers to review algorithm after being probed on China censorship concerns” By Sam Shead — CNBC. In testimony before the United Kingdom’s (UK) Parliament’s Business, Energy and Industrial Strategy Committee, TikTok’s head of policy in the UK said the platform used to censor content, but she then walked back the statement after the hearing. Prior to May 2019, the company hewed to the content wishes of the People’s Republic of China (PRC), and material on Tiananmen Square was not on the platform. However, she did claim that TikTok’s data is stored in the United States with backups in Singapore, none of which goes to the PRC.
  • The Disinformation Is Coming From Inside the White House” By Matthew Rosenberg, Jim Rutenberg and Nick Corasaniti — The New York Times. Turns out much of the disinformation about alleged but unproven vote fraud is coming directly from the President, his advisers, his allies, and his family. It may come to pass that domestic disinformation, misinformation, and lies will have a larger impact than similar efforts from overseas.

Other Developments

  • Representative Ro Khanna (D-CA) introduced “The 21st Century Jobs Package” (H.R.8693) that would establish a Federal Institute of Technology (FIT) and, according to his press release, “allocates $900 billion in research & development (R&D) funding for emerging technologies like Advanced Manufacturing, Synthetic Biology, Artificial intelligence, Biotechnology, and Cybersecurity.” In a summary, Khanna explained:
    • At the center of this proposal is the creation of a FIT, with presence in multiple locations around the country. These locations will initially take the form of additional facilities and faculty within or alongside existing universities and complementing ecosystems that are already dynamic. Over time, they will grow to include new stand-alone operations in areas without strong existing university bases. The vision, as in the past, is to marry federal resources and guidance with local initiative.
    • The proposed budget for this entire initiative is $900 billion over ten years. This would raise total public R&D spending to 1% of GDP by the end of the period, returning us to our role as an international leader. Most importantly, it would create as many as three million good new jobs per year. Many of these jobs would be in places that have fallen behind.
  • Australia’s Attorney-General has released an issues paper as a precursor to a possible rewrite of the country’s Privacy Act 1988 “to ensure privacy settings empower consumers, protect their data and best serve the Australian economy…as part of the government’s response to the Australian Competition and Consumer Commission’s Digital Platforms Inquiry” according to its press release. The Attorney-General explained:
    • The review will examine and, if needed, consider options for reform on matters including:
    • The scope and application of the Privacy Act including in relation to:
      • the definition of ‘personal information’
      • current exemptions, and
      • general permitted situations for the collection, use and disclosure of personal information.
    • Whether the Privacy Act effectively protects personal information and provides a practical and proportionate framework for promoting good privacy practices including in relation to:
      • notification requirements
      • consent requirements including default privacy settings
      • overseas data flows, and
      • erasure of personal information.
    • Whether individuals should have direct rights of action to enforce privacy obligations under the Privacy Act.
    • Whether a statutory tort for serious invasions of privacy should be introduced into Australian law.
    • The impact of the notifiable data breach scheme and its effectiveness in meeting its objectives.
    • The effectiveness of enforcement powers and mechanisms under the Privacy Act and the interaction with other Commonwealth regulatory frameworks.
    • The desirability and feasibility of an independent certification scheme to monitor and demonstrate compliance with Australian privacy laws.
  • The National Institute of Standards and Technology (NIST) has released for comment its “Draft Federal Information Processing Standard (FIPS) 201-3, Personal Identity Verification (PIV) of Federal Employees and Contractors (Standard).” NIST explained in the Federal Register notice:
    • This Standard defines common credentials and authentication mechanisms offering varying degrees of security for both logical and physical access applications. The draft revision proposes changes to FIPS 201-2, Standard for Personal Identity Verification of Federal Employees and Contractors to include: Expanding specification on the use of additional PIV credentials known as derived PIV credentials, procedures for supervised remote identity proofing, the use of federation as a means for a relying system to interoperate with PIV credentials issued by other agencies, alignment with the current practice/policy of the Federal Government and specific changes requested by Federal agencies and implementers. Before recommending these proposed changes to the Secretary of Commerce for review and approval, NIST invites comments from all interested parties.
    • In the draft document, NIST stated:
      • Authentication of an individual’s identity is a fundamental component of physical and logical access control. An access control decision must be made when an individual attempts to access security-sensitive buildings, information systems, and applications. An accurate determination of an individual’s identity supports making sound access control decisions.
      • This document establishes a standard for a Personal Identity Verification (PIV) system that meets the control and security objectives of Homeland Security Presidential Directive-12 [HSPD-12]. It is based on secure and reliable forms of identity credentials issued by the Federal Government to its employees and contractors. These credentials are used by mechanisms that authenticate individuals who require access to federally controlled facilities, information systems, and applications. This Standard addresses requirements for initial identity proofing, infrastructure to support interoperability of identity credentials, and accreditation of organizations and processes issuing PIV credentials.
  • The Federal Communications Commission (FCC) announced a $200 million settlement with T-Mobile “to resolve an investigation of its subsidiary Sprint’s compliance with the Commission’s rules regarding waste, fraud, and abuse in the Lifeline program for low-income consumers” according to the agency’s press release. The FCC explained:
    • The payment is the largest fixed-amount settlement the Commission has ever secured to resolve an investigation.  The settlement comes after an Enforcement Bureau investigation into reports that Sprint, prior to its merger with T-Mobile, was claiming monthly subsidies for serving approximately 885,000 Lifeline subscribers even though those subscribers were not using the service, in potential violation of the Commission’s “non-usage” rule.  The matter initially came to light as a result of an investigation by the Oregon Public Utility Commission.  In addition to paying a $200 million civil penalty, Sprint agreed to enter into a compliance plan to help ensure future adherence to the Commission’s rules for the Lifeline program.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.



Further Reading, Other Developments, and Coming Events (5 November)

Further Reading

  • Confusion and conflict stir online as Trump claims victory, questions states’ efforts to count ballots” By Craig Timberg, Tony Romm, Isaac Stanley-Becker and Drew Harwell — The Washington Post. When the post-mortem on the 2020 Election is written, it will likely conclude that foreign disinformation was not the primary threat. Rather, the greater danger may have been domestic interference, given the misinformation, disinformation, and lies circulating online despite the best efforts of social media platforms to label, take down, and block such material. However, if this article is accurate, much of it is coming from the right wing, including the President.
  • Polls close on Election Day with no apparent cyber interference” By Kevin Collier and Ken Dilanian — NBC News. Despite crowing from officials like the United States (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) Director Christopher Krebs and U.S. Cyber Command head General Paul Nakasone, it is not altogether clear that U.S. efforts, especially publicized offensive operations, are the reason there were no significant cyber attacks on Election Day. However, officials are cautioning the country is not out of the woods as vote counting is ongoing and opportunities for interference and mischief remain.
  • Russian hackers targeted California, Indiana Democratic parties” By Raphael Satter, Christopher Bing, Joel Schectman — Reuters. Apparently, Microsoft helped foil Russian efforts to hack two state Democratic parties and think tanks, some of which are allied with the Democratic Party. It appears none of the attempts, which occurred earlier this year, was successful. The article suggests, but does not claim, that increased cyber awareness and defenses foiled most of the attempts by the hacking group Fancy Bear.
  • LexisNexis to Pay $5 Million Class Action Settlement for Selling DMV Data” By Joseph Cox — Vice. Data broker LexisNexis is settling a suit alleging it violated the Drivers’ Privacy Protection Act (DPPA) by obtaining Department of Motor Vehicles (DMV) records on people for a purpose not authorized under the law. Vice has written a number of articles on the practice of DMVs selling people’s data, which has caught the attention of at least two Democratic Members of Congress who have said they will introduce legislation to tighten the circumstances under which these data may be shared or sold.
  • Spy agency ducks questions about ‘back doors’ in tech products” By Joseph Menn — Reuters. Senator Ron Wyden (D-OR) is demanding that the National Security Agency (NSA) reveal the guidelines put in place after former NSA contractor Edward Snowden revealed the agency’s practice of having backdoors built into United States (U.S.) technology that it could use later. This practice allowed the NSA to sidestep warrant requirements, but it also may have weakened technology that was later exploited by other governments, as the People’s Republic of China (PRC) allegedly did to Juniper in 2015. After Snowden divulged the NSA’s practice, reforms were supposedly put in place but never shared with Congress.

Other Developments

  • Australia’s Joint Committee on Intelligence and Security issued a new report into Australia’s mandatory data retention regime that makes 22 recommendations to “increase transparency around the use of the mandatory data retention and increase the threshold for when data can be accessed…[and] reduce the currently very broad access to telecommunications data under the Telecommunications Act.” The committee stated “[t]he report’s 22 recommendations include:
    • access to data kept under the mandatory data retention regime will only be available under specific circumstances
    • the Department of Home Affairs develop guidelines for data collection including an ability for enforcement agencies and Home Affairs to produce reports to oversight agencies or Parliament when requested
    • the repeal of section 280(1)(b) of the Telecommunications Act which allows for access where ‘disclosure or use is required or authorised by or under law.’ It is the broad language in this subsection that has allowed the access that concerned the committee
    • The committee explained:
      • The Parliamentary Joint Committee on Intelligence and Security (the Committee) is required by Part 5-1A of the Telecommunications (Interception and Access) Act 1979 (TIA Act) to undertake a review of the mandatory data retention regime (MDRR).
      • The mandatory data retention regime is a legislative framework which requires carriers, carriage service providers and internet service providers to retain a defined set of telecommunications data for two years, ensuring that such data remains available for law enforcement and national security investigations.
  • Senators Ron Wyden (D-OR) and Sherrod Brown (D-OH) wrote a letter “to trade associations urging them to take immediate action to ensure their members are not complicit in China’s state-directed human rights abuses, including by relocating production from the Xinjiang Uyghur Autonomous Region.” They stated:
    • We write to express our concerns over reports that the industries and companies that the U.S. Chamber of Commerce represents have supply chains that have been implicated in the state-sanctioned forced labor of Uyghurs and other Muslim groups in the Xinjiang Uyghur Autonomous Region of China (XUAR) and in sites where Uyghurs have been relocated.  The decision to operate or contract with production facilities overseas must be accompanied by high standards of supply chain accountability and transparency to ensure that no company’s products are made with forced labor.  We urge your members to take immediate action to ensure goods manufactured for them are not complicit in China’s state-directed human rights abuses, including by relocating production from the XUAR.  In addition, we ask your members to take critical, comprehensive steps to achieve the supply chain integrity and transparency American consumers and workers deserve.  It is past time for American multinational companies to be part of the solution, not part of the problem, on efforts to eradicate forced labor and end human rights abuses against workers in China.
  • The Federal Trade Commission (FTC) finalized a settlement resolving allegations of violations of the now-invalidated European Union-United States Privacy Shield. In its press release, the agency explained it had “alleged that NTT Global Data Centers Americas, Inc. (NTT), formerly known as RagingWire Data Centers, Inc., claimed in its online privacy policy and marketing materials that the company participated in the Privacy Shield framework and complied with the program’s requirements.” The FTC noted “the company’s certification lapsed in January 2018 and it failed to comply with certain Privacy Shield requirements while it was a participant in the framework.” The FTC stated:
    • Under the settlement, the company, among other things, is prohibited not just from misrepresenting its compliance with or participation in the Privacy Shield framework, but also any other privacy or data security program sponsored by the government or any self-regulatory or standard-setting organization. The company also must continue to apply the Privacy Shield requirements or equivalent protections to personal information it collected while participating in the framework or return or delete the information.
    • Although the European Court of Justice invalidated the Privacy Shield framework in July 2020, that decision does not affect the validity of the FTC’s decision and order relating to NTT’s misrepresentations about its participation in and compliance with the framework. The framework allowed participants to transfer data legally from the European Union to the United States.
  • The Commission nationale de l’informatique et des libertés (CNIL) issued a press release, explaining that France’s “Council of State acknowledges the existence of a risk of data transfer from the Health Data Hub to the United States and requests additional safeguards.” CNIL stated it “will advise the public authorities on appropriate measures and will ensure, for research authorization related to the health crisis, that there is a real need to use the platform.” This announcement follows from the Court of Justice of the European Union (CJEU) striking down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). CNIL summarized the “essentials:”
    • Fearing that some data might be transferred to the United States, some claimants lodged an appeal with the Council of State requesting the suspension of the “Health Data Hub”, the new platform designed to ultimately host all the health data of people who receive medical care in France.
    • The Court considers that a risk cannot be excluded with regard to the transfer of health data hosted on the Health Data Hub platform to the US intelligence.
    • Because of the usefulness of the Health Data Hub in managing the health crisis, it refuses to suspend the operation of the platform.
    • However, it requires the Health Data Hub to strengthen its contract with Microsoft on a number of points and to seek additional safeguards to better protect the data it hosts.
    • It is the responsibility of the CNIL to ensure, for authorization of research projects on the Health Data Hub in the context of the health crisis, that the use of the platform is technically necessary, and to advise public authorities on the appropriate safeguards.
    • These measures will have to be taken while awaiting a lasting solution that will eliminate any risk of access to personal data by the American authorities, as announced by the French Secretary of State for the Digital Agenda.
  • The United Kingdom’s (UK) National Cyber Security Centre (NCSC) has published its annual review that “looks back at some of the key developments and highlights from the NCSC’s work between 1 September 2019 and 31 August 2020.” In the foreword, new NCSC Chief Executive Officer Lindy Cameron provided an overview:
    • Expertise from across the NCSC has been surged to assist the UK’s response to the pandemic. More than 200 of the 723 incidents the NCSC handled this year related to coronavirus and we have deployed experts to support the health sector, including NHS Trusts, through cyber incidents they have faced. We scanned more than one million NHS IP addresses for vulnerabilities and our cyber expertise underpinned the creation of the UK’s coronavirus tracing app.
    • An innovative approach to removing online threats was created through the ‘Suspicious Email Reporting Service’ – leading to more than 2.3 million reports of malicious emails being flagged by the British public. Many of the 22,000 malicious URLs taken down as a result related to coronavirus scams, such as pretending to sell PPE equipment to hide a cyber attack. The NCSC has often been described as world-leading, and that has been evident over the last 12 months. Our innovative ‘Exercise in a Box’ tool, which supports businesses and individuals to test their cyber defences against realistic scenarios, was used in 125 countries in the last year.
    • Recognising the change in working cultures due to the pandemic, our team even devised a specific exercise on remote working, which has helped organisations to understand where current working practices may be presenting alternative cyber risks. Proving that cyber really is a team sport, none of this would be possible without strong partnerships internationally and domestically. We worked closely with law enforcement – particularly the National Crime Agency – and across government, industry, academia and, of course, the UK public.
    • The NCSC is also looking firmly ahead to the future of cyber security, as our teams work to understand both the risks and opportunities to the UK presented by emerging technologies. A prominent area of work this year was the NCSC’s reviews of high-risk vendors such as Huawei – and in particular the swift and thorough review of US sanctions against Huawei. The NCSC gave advice on the impact these changes would have in the UK, publishing a summary of the advice given to government as well as timely guidance for operators and the public.
  • Australia’s Department of Industry, Science, Energy and Resources has put out for comment a discussion paper titled “An AI Action Plan for all Australians” to “shape Australia’s vision for artificial intelligence (AI).” The department said it “is now consulting on the development of a whole-of-government AI Action Plan…[that] will help us maximise the benefits of AI for all Australians and manage the potential challenges.” The agency said the plan would “help to:
    • ensure the development and use of AI in Australia is responsible
    • coordinate government policy and national capability under a clear, common vision for AI in Australia
    • explore the actions needed for our AI future
    • The department explained:
      • Building on Australia’s AI Ethics Framework, the Australian Government is developing an AI Action Plan. It is a key component of the government’s vision to be a leading digital economy by 2030. It builds on almost $800 million invested in the 2020-21 Budget to enable businesses to take advantage of digital technologies to grow their businesses and create jobs. It is an opportunity to leverage AI as part of the Australian Government’s economic recovery plan. We must work together to ensure all Australians can benefit from advances in AI.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.