Further Reading, Other Developments, and Coming Events (13 and 14 January 2021)

Further Reading

  • “YouTube Suspends Trump’s Channel for at Least Seven Days” By Daisuke Wakabayashi — The New York Times. Even Google is wading further into the content moderation waters. Its YouTube platform flagged a video of President Donald Trump’s for inciting violence, and citing the “ongoing potential for violence,” it suspended the channel: Trump and his team will not be able to upload videos for seven days, and the comments section will be permanently disabled. YouTube has been the least inclined of the major platforms to moderate content and has somehow escaped the scrutiny and opprobrium Facebook and Twitter have faced even though those platforms have been more active in policing offensive content.
  • “Online misinformation that led to Capitol siege is ‘radicalization,’ say researchers” By Elizabeth Culliford — Reuters. Experts in online disinformation are saying that the different conspiracy movements that impelled followers to attack the United States (U.S.) Capitol are the result of radicalization. Online activities translated into real world violence, they say. They also decried the reactive posture of social media platforms, which waited for an insurrection before taking steps experts and others have long been begging them to take.
  • “Uganda orders all social media to be blocked – letter” — Reuters. In response to Facebook blocking a number of government-related accounts for “Coordinated Inauthentic Behaviour” (CIB), the Ugandan government has blocked all access to social media ahead of its elections. In a letter seen by Reuters, the Uganda Communications Commission directed telecommunications providers “to immediately suspend any access and use, direct or otherwise, of all social media platforms and online messaging applications over your network until further notice.” This may become standard practice for many regimes around the world if social media companies crack down on government propaganda.
  • “BlackBerry sells 90 patents to Huawei, covering key smartphone technology advances” By Sean Silcoff — The Globe and Mail. Critics of a deal to assign 90 key BlackBerry patents to Huawei are calling on the government of Prime Minister Justin Trudeau to be more involved in protecting Canadian intellectual property and innovations.
  • “‘Threat to democracy is real’: MPs call for social media code of conduct” By David Crowe and Nick Bonyhady — The Sydney Morning Herald. The response in Australia’s Parliament to social media platforms banning President Donald Trump after his role in inciting the violence at the United States (U.S.) Capitol has been mixed. Many agree with the platforms, some disagree strenuously in light of other inflammatory content that is not taken down, and many want greater rationality and transparency in how platforms make these decisions. And since Canberra has been among the most active governments in regulating technology, the debate may inform the drafting of its “Online Safety Bill,” which may place legal obligations on social media platforms.
  • “Poland plans to make censoring of social media accounts illegal” By Shaun Walker — The Guardian. Governments around the world continue to respond to a number of social media companies deciding to deplatform United States (U.S.) President Donald Trump. In Warsaw there is a draft bill that would make deplatforming a person illegal unless the offense is also contrary to Polish law. The catch is that the right-wing government in Warsaw is less interested in protecting free speech than in propagating the same grievances as the right wing in the United States. Therefore, this push in Poland may be more about messaging and trying to cow social media companies and less about protecting free speech, especially speech with which the government disagrees (e.g., advocates for LGBTQI rights have been silenced in Poland).
  • “Facebook, Twitter could face punishing regulation for their role in U.S. Capitol riot, Democrats say” By Tony Romm — The Washington Post. Democrats were already furious with social media companies for what they considered lax governance of content that clearly violated terms of service and policies. These companies are bracing for an expected barrage of hearings and legislation with the Democrats controlling the White House, House, and Senate.
  • “Georgia results sweep away tech’s regulatory logjam” By Margaret Harding McGill and Ashley Gold — Axios. This is a nice survey of possible policy priorities at the agencies and in the Congress over the next two years with the Democrats in control of both.
  • “The Capitol rioters put themselves all over social media. Now they’re getting arrested.” By Sara Morrison — Recode. Will the attack on the United States (U.S.) Capitol be the first time a major crime is solved largely by evidence provided by the accused? It is sure looking that way as law enforcement continues to use the posts of the rioters to apprehend, arrest, and charge them. Additionally, in the same way people who acted in racist and entitled ways were caught through crowd-sourced identification pushes (e.g., Amy Cooper threatening to call the police on an African American man in Central Park even though he had simply asked her to leash her dog), rioters are also being identified.
  • “CISA: SolarWinds Hackers Got Into Networks by Guessing Passwords” By Mariam Baksh — Nextgov. The Cybersecurity and Infrastructure Security Agency (CISA) has updated its alert on the SolarWinds hack to reflect its findings. CISA explained:
    • CISA incident response investigations have identified that initial access in some cases was obtained by password guessing [T1110.001], password spraying [T1110.003], and inappropriately secured administrative credentials [T1078] accessible via external remote access services [T1133]. Initial access root cause analysis is still ongoing in a number of response activities and CISA will update this section as additional initial vectors are identified.
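    • Password spraying differs from ordinary brute forcing in that the attacker tries one or a few common passwords across many accounts, staying under per-account lockout thresholds. As a rough illustration only (a hypothetical detection sketch with an assumed log format, not CISA’s or any vendor’s tooling), flagging sources that fail against many distinct accounts is one common way defenders surface it:

```python
# A minimal, hypothetical sketch of password-spraying detection (assumed log format;
# not CISA's tooling). Spraying evades per-account lockouts, so the signal is one
# source failing against many *different* accounts rather than one account many times.
from collections import defaultdict

def flag_spraying_sources(failed_logins, min_distinct_accounts=20):
    """failed_logins: iterable of (source_ip, account) pairs from authentication logs."""
    accounts_per_source = defaultdict(set)
    for source_ip, account in failed_logins:
        accounts_per_source[source_ip].add(account)
    return [ip for ip, accounts in accounts_per_source.items()
            if len(accounts) >= min_distinct_accounts]

# One IP failing once each against 25 accounts trips this check, even though no single
# account ever crosses a typical lockout threshold.
events = [("203.0.113.5", f"user{i}") for i in range(25)] + [("198.51.100.7", "admin")]
print(flag_spraying_sources(events))  # ['203.0.113.5']
```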
  • “A Facial Recognition Company Says That Viral Washington Times “Antifa” Story Is False” By Craig Silverman — BuzzFeed News. XRVision denied the Washington Times’ account that the company had identified antifa protestors among the rioters at the United States (U.S.) Capitol (archived here). The company said it had identified two Neo-Nazis and a QAnon adherent. Even though the story was retracted and a corrected version issued, some, such as Trump supporter Representative Matt Gaetz (R-FL), still claimed the original story had merit.

Other Developments

  • The United States (U.S.) Trade Representative (USTR) announced that it would not act on the basis of three completed reports on Digital Services Taxes (DST) three nations have put in place and also that it would not proceed with tariffs in retaliation against France, one of the first nations in the world to enact a DST. Last year, the Organization for Economic Co-operation and Development convened multilateral talks to resolve differences on how a global digital services tax should function, with most of the nations involved arguing for a 2% tax to be assessed in the nation where the transaction occurs as opposed to where the company is headquartered. European Union (EU) officials claimed an agreement was possible, but the U.S. negotiators walked away from the table. It will fall to the Biden Administration to act on these USTR DST investigations if it chooses.
    • In its press release, the USTR stated it would “suspend the tariff action in the Section 301 investigation of France’s Digital Services Tax (DST).”
      • The USTR added:
        • The additional tariffs on certain products of France were announced in July 2020, and were scheduled to go into effect on January 6, 2021.  The U.S. Trade Representative has decided to suspend the tariffs in light of the ongoing investigation of similar DSTs adopted or under consideration in ten other jurisdictions.  Those investigations have significantly progressed, but have not yet reached a determination on possible trade actions.  A suspension of the tariff action in the France DST investigation will promote a coordinated response in all of the ongoing DST investigations.
      • In its December 2019 report, the USTR determined “that France’s DST is unreasonable or discriminatory and burdens or restricts U.S. commerce, and therefore is actionable under sections 301(b) and 304(a) of the Trade Act (19 U.S.C. 2411(b) and 2414(a))” and proposed a range of measures in retaliation.
    • The USTR also “issued findings in Section 301 investigations of Digital Service Taxes (DSTs) adopted by India, Italy, and Turkey, concluding that each of the DSTs discriminates against U.S. companies, is inconsistent with prevailing principles of international taxation, and burden or restricts U.S. commerce.” The USTR stated it “is not taking any specific actions in connection with the findings at this time but will continue to evaluate all available options.” The USTR added:
      • The Section 301 investigations of the DSTs adopted by India, Italy, and Turkey were initiated in June 2020, along with investigations of DSTs adopted or under consideration by Austria, Brazil, the Czech Republic, the European Union, Indonesia, Spain, and the United Kingdom.  USTR expects to announce the progress or completion of additional DST investigations in the near future. 
  • The United Kingdom’s Competition and Markets Authority (CMA) has started investigating Google’s ‘Privacy Sandbox’ project to “assess whether the proposals could cause advertising spend to become even more concentrated on Google’s ecosystem at the expense of its competitors.” The CMA asserted:
    • Third party cookies currently play a fundamental role online and in digital advertising. They help businesses target advertising effectively and fund free online content for consumers, such as newspapers. But there have also been concerns about their legality and use from a privacy perspective, as they allow consumers’ behaviour to be tracked across the web in ways that many consumers may feel uncomfortable with and may find difficult to understand.
    • Google’s announced changes – known collectively as the ‘Privacy Sandbox’ project – would disable third party cookies on the Chrome browser and Chromium browser engine and replace them with a new set of tools for targeting advertising and other functionality that they say will protect consumers’ privacy to a greater extent. The project is already under way, but Google’s final proposals have not yet been decided or implemented. In its recent market study into online platforms and digital advertising, the CMA highlighted a number of concerns about their potential impact, including that they could undermine the ability of publishers to generate revenue and undermine competition in digital advertising, entrenching Google’s market power.
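    • The cross-site tracking the CMA describes rests on a simple mechanism: the same third-party cookie is sent to an advertising domain from every site that embeds its content, letting that one domain stitch together a browsing history. A minimal sketch, using made-up domains and a simplified model rather than any real ad-tech system:

```python
# Hypothetical illustration of third-party cookie tracking (made-up domains; not a real
# ad-tech API). One tracking domain embedded on many sites receives the same cookie
# identifier from a given browser, so it can link visits across otherwise unrelated sites.
from collections import defaultdict

class Tracker:
    """Stands in for an ad/analytics domain embedded on many publishers' pages."""
    def __init__(self):
        self.profiles = defaultdict(list)  # cookie id -> pages where the tracker loaded

    def handle_request(self, cookie_id: str, referring_page: str) -> None:
        # The browser attaches the tracker's cookie on every embedding site,
        # so one identifier accumulates a cross-site history.
        self.profiles[cookie_id].append(referring_page)

tracker = Tracker()
# The same browser (cookie "abc123") loads tracker resources from two unrelated sites.
tracker.handle_request("abc123", "https://news.example/politics")
tracker.handle_request("abc123", "https://shop.example/running-shoes")
print(tracker.profiles["abc123"])
# ['https://news.example/politics', 'https://shop.example/running-shoes']
```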
  • Facebook took down networks engaged in coordinated inauthentic behavior (CIB) originating from France and Russia that allegedly sought to influence nations in Africa and the Middle East. Facebook asserted:
    • Each of the networks we removed today targeted people outside of their country of origin, primarily targeting Africa, and also some countries in the Middle East. We found all three of them as a result of our proactive internal investigations and worked with external researchers to assess the full scope of these activities across the internet.
    • While we’ve seen influence operations target the same regions in the past, this was the first time our team found two campaigns — from France and Russia — actively engage with one another, including by befriending, commenting and criticizing the opposing side for being fake. It appears that this Russian network was an attempt to rebuild their operations after our October 2019 takedown, which also coincided with a notable shift in focus of the French campaign to begin to post about Russia’s manipulation campaigns in Africa.
    • Unlike the operation from France, both Russia-linked networks relied on local nationals in the countries they targeted to generate content and manage their activity across internet services. This is consistent with cases we exposed in the past, including in Ghana and the US, where we saw the Russian campaigns co-opt authentic voices to join their influence operations, likely to avoid detection and help appear more authentic. Despite these efforts, our investigation identified some links between these two Russian campaigns and also with our past enforcements.
  • Two of the top Democrats on the House Energy and Commerce Committee, along with another Democrat, wrote nine internet service providers (ISPs), “questioning their commitment to consumers amid ISPs raising prices and imposing data caps during the COVID-19 pandemic.” The letters were signed by Committee Chair Frank Pallone, Jr. (D-NJ), Communications and Technology Subcommittee Chairman Mike Doyle (D-PA), and Representative Jerry McNerney (D-CA).
    • Pallone, Doyle, and McNerney took issue with the companies raising prices and imposing data caps after having pledged not to do so at the behest of the Federal Communications Commission (FCC). They asked the companies to answer a series of questions:
      • Did the company participate in the FCC’s “Keep Americans Connected” pledge?
      • Has the company increased prices for fixed or mobile consumer internet service or fixed or mobile phone service since the start of the pandemic, or do they plan to raise prices on such plans within the next six months?
      • Prior to March 2020, did any of the company’s service plans impose a maximum data consumption threshold on its subscribers?
      • Since March 2020, has the company modified or imposed any new maximum data consumption thresholds on service plans, or do they plan to do so within the next six months? 
      • Did the company stop disconnecting customers’ internet or telephone service due to their inability to pay during the pandemic? 
      • Does the company offer a plan designed for low-income households, or a plan established in March or later to help students and families with connectivity during the pandemic?
      • Beyond service offerings for low-income customers, what steps is the company currently taking to assist individuals and families facing financial hardship due to circumstances related to COVID-19? 
  • The United States (U.S.) Department of Homeland Security (DHS) issued a “Data Security Business Advisory: Risks and Considerations for Businesses Using Data Services and Equipment from Firms Linked to the People’s Republic of China,” that “describes the data-related risks American businesses face as a result of the actions of the People’s Republic of China (PRC) and outlines steps that businesses can take to mitigate these risks.” DHS generally recommended:
    • Businesses and individuals that operate in the PRC or with PRC firms or entities should scrutinize any business relationship that provides access to data—whether business confidential, trade secrets, customer personally identifiable information (PII), or other sensitive information. Businesses should identify the sensitive personal and proprietary information in their possession. To the extent possible, they should minimize the amount of at-risk data being stored and used in the PRC or in places accessible by PRC authorities. Robust due diligence and transaction monitoring are also critical for addressing potential legal exposure, reputation risks, and unfair advantage that data and intellectual property theft would provide competitors. Businesses should seek to acquire a thorough understanding of the ownership of data service providers, location of data infrastructure, and any tangential foreign business relationships and significant foreign investors.
  • The Federal Communications Commission (FCC) is asking for comments on the $3.2 billion Emergency Broadband Benefit Program established in the “Consolidated Appropriations Act, 2021” (H.R. 133). Comments are due by 16 February 2021. The FCC noted “eligible households may receive a discount off the cost of broadband service and certain connected devices during an emergency period relating to the COVID-19 pandemic, and participating providers can receive a reimbursement for such discounts.” The FCC explained the program in further detail:
    • Pursuant to the Consolidated Appropriations Act, the Emergency Broadband Benefit Program will use available funding from the Emergency Broadband Connectivity Fund to support participating providers’ provision of certain broadband services and connected devices to qualifying households.
    • To participate in the program, a provider must elect to participate and either be designated as an eligible telecommunications carrier or be approved by the Commission. Participating providers will make available to eligible households a monthly discount off the standard rate for an Internet service offering and associated equipment, up to $50.00 per month.
    • On Tribal lands, the monthly discount may be up to $75.00 per month. Participating providers will receive reimbursement from the Emergency Broadband Benefit Program for the discounts provided.
    • Participating providers that also supply an eligible household with a laptop, desktop computer, or tablet (connected device) for use during the emergency period may receive a single reimbursement of up to $100.00 for the connected device, if the charge to the eligible household for that device is more than $10.00 but less than $50.00.  An eligible household may receive only one supported device.  Providers must submit certain certifications to the Commission to receive reimbursement from the program, and the Commission is required to adopt audit requirements to ensure provider compliance and prevent waste, fraud, and abuse.
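    • To make the reimbursement mechanics above concrete, here is a minimal sketch of the dollar caps as described in the FCC’s notice; it is an illustration only, with hypothetical helper functions, and the statute and the FCC’s rules govern how amounts below the caps are actually computed:

```python
# Illustrative sketch of the Emergency Broadband Benefit caps described above.
# Hypothetical helper functions; not the FCC's actual reimbursement rules or forms.
def monthly_service_discount(standard_rate: float, on_tribal_lands: bool = False) -> float:
    """Discount off the standard monthly rate, capped at $50 ($75 on Tribal lands)."""
    cap = 75.00 if on_tribal_lands else 50.00
    return min(standard_rate, cap)

def device_reimbursement_cap(household_charge: float) -> float:
    """A single connected-device reimbursement, capped at $100, is available only if
    the household is charged more than $10 but less than $50 for the device."""
    return 100.00 if 10.00 < household_charge < 50.00 else 0.00

print(monthly_service_discount(65.00))                        # 50.0
print(monthly_service_discount(65.00, on_tribal_lands=True))  # 65.0
print(device_reimbursement_cap(25.00))                        # 100.0 (eligible)
print(device_reimbursement_cap(5.00))                         # 0.0 (charge too low)
```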
  • The Biden-Harris transition team named the National Security Agency’s (NSA) Director of Cybersecurity, Anne Neuberger, as the Biden White House’s Deputy National Security Advisor for Cyber and Emerging Technology. Neuberger’s portfolio at the NSA included “lead[ing] NSA’s cybersecurity mission, including emerging technology areas like quantum-resistant cryptography.” At the National Security Council, Neuberger will work to coordinate cybersecurity and emerging technology policy across agencies and funnel policy options up to the full NSC and ultimately the President. It is not clear how Neuberger’s portfolio will interact with the newly created National Cyber Director, a position that, thus far, has remained without a nominee.
    • The transition noted “[p]rior to this role, she led NSA’s Election Security effort and served as Assistant Deputy Director of NSA’s Operations Directorate, overseeing foreign intelligence and cybersecurity operations…[and] also previously served as NSA’s first Chief Risk Officer, as Director of NSA’s Commercial Solutions Center, as Director of the Enduring Security Framework cybersecurity public-private partnership, as the Navy’s Deputy Chief Management Officer, and as a White House Fellow.” The transition stated that “[p]rior to joining government service, Neuberger was Senior Vice President of Operations at American Stock Transfer & Trust Company (AST), where she directed technology and operations.”
  • The Federal Communications Commission (FCC) published a final rule in response to the United States (U.S.) Court of Appeals for the District of Columbia Circuit’s decision remanding three aspects of the FCC’s rollback of net neutrality, the “Restoring Internet Freedom Order.” The FCC explained the final rule:
    • responds to a remand from the U.S. Court of Appeals for the D.C. Circuit directing the Commission to assess the effects of the Commission’s Restoring Internet Freedom Order on public safety, pole attachments, and the statutory basis for broadband internet access service’s inclusion in the universal service Lifeline program. This document also amends the Commission’s rules to remove broadband internet service from the list of services supported by the universal service Lifeline program, while preserving the Commission’s authority to fund broadband internet access service through the Lifeline program.
    • In 2014, the U.S. Court of Appeals for the District of Columbia Circuit struck down a 2010 FCC net neutrality order in Verizon v. FCC, but the court did suggest a path forward. The court held the FCC “reasonably interpreted section 706 to empower it to promulgate rules governing broadband providers’ treatment of Internet traffic, and its justification for the specific rules at issue here—that they will preserve and facilitate the “virtuous circle” of innovation that has driven the explosive growth of the Internet—is reasonable and supported by substantial evidence.” The court added that “even though the Commission has general authority to regulate in this arena, it may not impose requirements that contravene express statutory mandates…[and] [g]iven that the Commission has chosen to classify broadband providers in a manner that exempts them from treatment as common carriers, the Communications Act expressly prohibits the Commission from nonetheless regulating them as such.” However, in 2016, the same court upheld the 2015 net neutrality regulations in U.S. Telecom Association v. FCC, and then, in Mozilla v. FCC, upheld most of the Trump Administration FCC’s repeal of its earlier net neutrality rules.
    • However, the D.C. Circuit declined to accept the FCC’s attempt to preempt all contrary state laws and struck down this part of the FCC’s rulemaking. Consequently, states and local jurisdictions may now be free to enact regulations of internet services along the lines of the FCC’s now repealed Open Internet Order. The D.C. Circuit also sent the case back to the FCC for further consideration on three points.
    • In its request for comments on how to respond to the remand, the FCC summarized the three issues: public safety, pole attachments, and the Lifeline Program:
      • Public Safety.  First, we seek to refresh the record on how the changes adopted in the Restoring Internet Freedom Order might affect public safety. In the Restoring Internet Freedom Order, the Commission predicted, for example, that permitting paid prioritization arrangements would “increase network innovation,” “lead[] to higher investment in broadband capacity as well as greater innovation on the edge provider side of the market,” and “likely . . . be used to deliver enhanced service for applications that need QoS [i.e., quality of service] guarantees.” Could the network improvements made possible by prioritization arrangements benefit public safety applications—for example, by enabling the more rapid, reliable transmission of public safety-related communications during emergencies? 
      • Pole Attachments.  Second, we seek to refresh the record on how the changes adopted in the Restoring Internet Freedom Order might affect the regulation of pole attachments in states subject to federal regulation.  To what extent are ISPs’ pole attachments subject to Commission authority in non-reverse preemption states by virtue of the ISPs’ provision of cable or telecommunications services covered by section 224?  What impact would the inapplicability of section 224 to broadband-only providers have on their access to poles?  Have pole owners, following the Order, “increase[d] pole attachment rates or inhibit[ed] broadband providers from attaching equipment”?  How could we use metrics like increases or decreases in broadband deployment to measure the impact the Order has had on pole attachment practices?  Are there any other impacts on the regulation of pole attachments from the changes adopted in the Order?  Finally, how do any potential considerations about pole attachments bear on the Commission’s underlying decision to classify broadband as a Title I information service?
      • Lifeline Program.  Third, we seek to refresh the record on how the changes adopted in the Restoring Internet Freedom Order might affect the Lifeline program.  In particular, we seek to refresh the record on the Commission’s authority to direct Lifeline support to eligible telecommunications carriers (ETCs) providing broadband service to qualifying low-income consumers.  In the 2017 Lifeline NPRM, the Commission proposed that it “has authority under Section 254(e) of the Act to provide Lifeline support to ETCs that provide broadband service over facilities-based broadband-capable networks that support voice service,” and that “[t]his legal authority does not depend on the regulatory classification of broadband Internet access service and, thus, ensures the Lifeline program has a role in closing the digital divide regardless of the regulatory classification of broadband service.”  How, if at all, does the Mozilla decision bear on that proposal, and should the Commission proceed to adopt it? 
  • The Federal Trade Commission (FTC) reached a settlement with a photo app company that allegedly did not tell users their photos would be subject to the company’s facial recognition technology. The FTC deemed this a deceptive business practice in violation of Section 5 of the FTC Act and negotiated a settlement the Commissioners approved in a 5-0 vote. The consent order includes interesting, perhaps even novel, language requiring the company “to delete models and algorithms it developed by using the photos and videos uploaded by its users” according to the FTC’s press release.
    • In the complaint, the FTC asserted:
      • Since 2015, Everalbum has provided Ever, a photo storage and organization application, to consumers.
      • In February 2017, Everalbum launched its “Friends” feature, which operates on both the iOS and Android versions of the Ever app. The Friends feature uses face recognition to group users’ photos by faces of the people who appear in the photos. The user can choose to apply “tags” to identify by name (e.g., “Jane”) or alias (e.g., “Mom”) the individuals who appear in their photos. These tags are not available to other Ever users. When Everalbum launched the Friends feature, it enabled face recognition by default for all users of the Ever mobile app. At that time, Everalbum did not provide users of the Ever mobile app an option to turn off or disable the feature.
      • However, prior to April 2019, Ever mobile app users who were located anywhere other than Texas, Illinois, Washington, and the European Union did not need to, and indeed could not, take any affirmative action to “let[ Everalbum] know” that it should apply face recognition to the users’ photos. In fact, for those users, face recognition was enabled by default and the users lacked the ability to disable it. Thus, the article was misleading for Ever mobile app users located outside of Texas, Illinois, Washington, and the European Union.
      • Between September 2017 and August 2019, Everalbum combined millions of facial images that it extracted from Ever users’ photos with facial images that Everalbum obtained from publicly available datasets in order to create four new datasets to be used in the development of its face recognition technology. In each instance, Everalbum used computer scripts to identify and compile from Ever users’ photos images of faces that met certain criteria (i.e., not associated with a deactivated Ever account, not blurry, not too small, not a duplicate of another image, associated with a specified minimum number of images of the same tagged identity, and, in three of the four instances, not identified by Everalbum’s machines as being an image of someone under the age of thirteen).
      • The FTC summarized its settlement:
        • The proposed settlement requires Everalbum to delete:
          • the photos and videos of Ever app users who deactivated their accounts;
          • all face embeddings—data reflecting facial features that can be used for facial recognition purposes—the company derived from the photos of Ever users who did not give their express consent to their use; and
          • any facial recognition models or algorithms developed with Ever users’ photos or videos.
        • In addition, the proposed settlement prohibits Everalbum from misrepresenting how it collects, uses, discloses, maintains, or deletes personal information, including face embeddings created with the use of facial recognition technology, as well as the extent to which it protects the privacy and security of personal information it collects. Under the proposed settlement, if the company markets software to consumers for personal use, it must obtain a user’s express consent before using biometric information it collected from the user through that software to create face embeddings or develop facial recognition technology.
      • FTC Commissioner Rohit Chopra issued a statement, explaining his view on facial recognition technology and the settlement:
        • As outlined in the complaint, Everalbum made promises that users could choose not to have facial recognition technology applied to their images, and that users could delete the images and their account. In addition to those promises, Everalbum had clear evidence that many of the photo app’s users did not want to be roped into facial recognition. The company broke its promises, which constitutes illegal deception according to the FTC’s complaint. This matter and the FTC’s proposed resolution are noteworthy for several reasons.
        • First, the FTC’s proposed order requires Everalbum to forfeit the fruits of its deception. Specifically, the company must delete the facial recognition technologies enhanced by any improperly obtained photos. Commissioners have previously voted to allow data protection law violators to retain algorithms and technologies that derive much of their value from ill-gotten data. This is an important course correction.
        • Second, the settlement does not require the defendant to pay any penalty. This is unfortunate. To avoid this in the future, the FTC needs to take further steps to trigger penalties, damages, and other relief for facial recognition and data protection abuses. Commissioners have voted to enter into scores of settlements that address deceptive practices regarding the collection, use, and sharing of personal data. There does not appear to be any meaningful dispute that these practices are illegal. However, since Commissioners have not restated this precedent into a rule under Section 18 of the FTC Act, we are unable to seek penalties and other relief for even the most egregious offenses when we first discover them.
        • Finally, the Everalbum matter makes it clear why it is important to maintain states’ authority to protect personal data. Because the people of Illinois, Washington, and Texas passed laws related to facial recognition and biometric identifiers, Everalbum took greater care when it came to these individuals in these states. The company’s deception targeted Americans who live in states with no specific state law protections.
  • The Trump Administration issued the “National Maritime Cybersecurity Plan” that “sets forth how the United States government will defend the American economy through enhanced cybersecurity coordination, policies and practices, aimed at mitigating risks to the maritime sub-sector, promoting prosperity through information and intelligence sharing, and preserving and increasing the nation’s cyber workforce” according to National Security Advisor Robert O’Brien. It will be up to the Biden Administration to implement, revise, or discard this strategy, but strategy documents such as this that contain anodyne recommendations tend to stay in place for the short term, at least. It bears note that the uneven margins of the columns in the document suggest a rush to issue it before the end of the Trump Administration. Nevertheless, O’Brien added:
    • President [Donald] Trump designated the cybersecurity of the Maritime Transportation System (MTS) as a top priority for national defense, homeland security, and economic competitiveness in the 2017 National Security Strategy. The MTS contributes to one quarter of all United States gross domestic product, or approximately $5.4 trillion. MTS operators are increasingly reliant on information technology (IT) and operational technology (OT) to maximize the reliability and efficiency of maritime commerce. This plan articulates how the United States government can buy down the potential catastrophic risks to our national security and economic prosperity created by technology innovations to strengthen maritime commerce efficiency and reliability.
    • The strategy lists a number of priority actions for the executive branch, including:
      • The United States will de-conflict government roles and responsibilities.
      • The United States will develop risk modeling to inform maritime cybersecurity standards and best practices.
      • The United States will strengthen cybersecurity requirements in port services contracts and leasing.
      • The United States will develop procedures to identify, prioritize, mitigate, and investigate cybersecurity risks in critical ship and port systems.
      • Exchange United States government information with the maritime industry.
      • Share cybersecurity intelligence with appropriate non-government entities.
      • Prioritize maritime cybersecurity intelligence collection.
  • The National Security Agency’s (NSA) Cybersecurity Directorate has issued its very first annual review, the “2020 NSA Cybersecurity Year in Review,” which encapsulates the first year of operation for the newly created part of the NSA.
    • Highlights include:
      • In 2020, NSA focused on modernizing encryption across the Department of Defense (DOD). It began with a push to eliminate cryptography that is at risk from attack due to adversarial computational advances. This applied to several systems commonly used by the Armed Services today to provide command and control, critical communications, and battlefield awareness. It also applied to operational practices concerning the handling of cryptographic keys and the implementation of modern suites of cryptography in network communications devices.
      • 2020 was notable for the number of Cybersecurity Advisories (CSAs) and other products NSA cybersecurity produced and released. These products are intended to alert network owners, specifically National Security System (NSS), Department of Defense (DOD), and Defense Industrial Base (DIB), of cyber threats and enable defenders to take immediate action to secure their systems.
      • 2020 was notable not just because it was the NSA Cybersecurity Directorate’s first year nor because of COVID-19, but also because it was an election year in the United States. Drawing on lessons learned from the 2016 presidential election and the 2018 mid-term elections, NSA was fully engaged in whole-of-government efforts to protect 2020 election from foreign interference and influence. Cybersecurity was a foundational component of NSA’s overall election defense effort.
      • This past year, NSA cybersecurity prioritized public-private collaboration, invested in cybersecurity research, and made a concerted effort to build trusted partnerships with the cybersecurity community.
      • The NSA touted the following achievements:
        • In November 2019, NSA began laying the groundwork to conduct a pilot with the Defense Cyber Crime Center and five DIB companies to monitor and block malicious network traffic based on continuous automated analysis of the domain names these companies’ networks were contacting. The pilot’s operational phase commenced in March 2020. Over six months, the Protective Domain Name Service (PDNS) examined more than 4 billion DNS queries to and from these companies. The PDNS provider identified callouts to 3,519 malicious domains and blocked upwards of 13 million connections to those domains. The pilot proved the value of DoD expanding the PDNS service to all DIB entities at scale.
        • How cyber secure is cyber “ready” for combat? In response to legislation that recognized the imperative of protecting key weapons and space systems from adversary cyber intrusions, NSA partnered closely with the DoD CIO, Joint Staff, Undersecretary of Defense for Acquisition & Sustainment, and the Military Services to structure, design, and execute a new cybersecurity program, focused on the most important weapons and space systems, known as the Strategic Cybersecurity Program (SCP), with the mindset of “stop assessing and start addressing.” The program initially identified 12 key weapons and space systems that must be evaluated for cybersecurity vulnerabilities that need to be mitigated. This is either due to the existence of intelligence indicating they are being targeted by cyber adversaries or because the systems are particularly important to warfighting. These systems cover all warfighting domains (land, sea, air, cyber, and space). Under the auspices of the SCP, NSA and military service partners will conduct cybersecurity evaluations, and, most importantly, maintain cyber risk scoreboards and mitigation plans to drive accountability in reducing cyber risk to acceptable levels.
      • The NSA sees the following issues on the horizon:
        • In October 2020, NSA launched an expansive effort across the Executive Branch to understand how we can better inform, drive, and understand the activities of NSS owners to prevent, or respond to, critical cybersecurity events, and cultivate an operationally-aligned community resilient against the most advanced threats. These efforts across the community will come to fruition during the first quarter of 2021 and are expected to unify disparate elements across USG for stronger cybersecurity at scale.
        • NSA Cybersecurity is also focused on combating ransomware, a significant threat to NSS and critical infrastructure. Ransomware activity has become more destructive and impactful in nature and scope. Malicious actors target critical data and propagate ransomware across entire networks, alarmingly focusing recent attacks against U.S. hospitals. In 2020, NSA formed multiple working groups with U.S. Government agencies and other partners to identify ways to make ransomware operations more difficult for our adversaries, less scalable, and less lucrative. While the ransomware threat remains significant, NSA will continue to develop innovative ways to keep the activity at bay.
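    • The Protective Domain Name Service pilot described in the achievements above works by refusing to resolve domains that threat analysis has flagged as malicious, cutting off the “callouts” infected machines make to attacker infrastructure. A minimal sketch of the idea, with a made-up blocklist and a stand-in resolver rather than the actual PDNS provider’s system:

```python
# Hypothetical sketch of protective DNS filtering (made-up blocklist and domains;
# not the PDNS provider's implementation). Queries for known-bad domains are refused
# and logged, so compromised hosts cannot reach attacker-controlled infrastructure.
from typing import Callable, Optional

BLOCKLIST = {"evil-c2.example", "phish-login.example"}  # assumed threat-intelligence feed

def protective_resolve(domain: str, upstream: Callable[[str], str]) -> Optional[str]:
    """Answer a DNS query only if the domain is not on the blocklist."""
    if domain.lower().rstrip(".") in BLOCKLIST:
        print(f"BLOCKED callout to {domain}")  # surfaced to defenders; query is sinkholed
        return None
    return upstream(domain)

# Example with a stand-in upstream resolver:
fake_upstream = lambda d: "192.0.2.10"
print(protective_resolve("intranet.example", fake_upstream))  # 192.0.2.10
print(protective_resolve("evil-c2.example", fake_upstream))   # BLOCKED ... then None
```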
  • This week, Parler sued Amazon after the company rescinded web hosting services to the social media platform, which is billed as the conservative, unbiased alternative to Twitter. Amazon has responded with an extensive list of the inflammatory, inciting material upon which it based its decision.
    • In its 11 January complaint, Parler asked a federal court “for injunctive relief, including a temporary restraining order and preliminary injunctive relief, and damages” mainly because “AWS’s decision to effectively terminate Parler’s account is apparently motivated by political animus…[and] is also apparently designed to reduce competition in the microblogging services market to the benefit of Twitter” in violation of federal antitrust law.
    • In its 12 January response, Amazon disagreed:
      • This case is not about suppressing speech or stifling viewpoints. It is not about a conspiracy to restrain trade. Instead, this case is about Parler’s demonstrated unwillingness and inability to remove from the servers of Amazon Web Services (“AWS”) content that threatens the public safety, such as by inciting and planning the rape, torture, and assassination of named public officials and private citizens. There is no legal basis in AWS’s customer agreements or otherwise to compel AWS to host content of this nature. AWS notified Parler repeatedly that its content violated the parties’ agreement, requested removal, and reviewed Parler’s plan to address the problem, only to determine that Parler was both unwilling and unable to do so. AWS suspended Parler’s account as a last resort to prevent further access to such content, including plans for violence to disrupt the impending Presidential transition.
    • Amazon offered a sampling of the content on Parler that caused AWS to pull the plug on the platform:
      • “Fry’em up. The whole fkn crew. #pelosi #aoc #thesquad #soros #gates #chuckschumer #hrc #obama #adamschiff #blm #antifa we are coming for you and you will know it.”
      • “#JackDorsey … you will die a bloody death alongside Mark Suckerturd [Zuckerberg]…. It has been decided and plans are being put in place. Remember the photographs inside your home while you slept? Yes, that close. You will die a sudden death!”
      • “We are going to fight in a civil War on Jan.20th, Form MILITIAS now and acquire targets.”
      • “On January 20th we need to start systematicly [sic] assassinating [sic] #liberal leaders, liberal activists, #blm leaders and supporters, members of the #nba #nfl #mlb #nhl #mainstreammedia anchors and correspondents and #antifa. I already have a news worthy event planned.”
      • “Shoot the police that protect these shitbag senators right in the head then make the senator grovel a bit before capping they ass.”

Coming Events

  • On 13 January, the Federal Communications Commission (FCC) will hold its monthly open meeting, and the agency has placed the following items on its tentative agenda: “Bureau, Office, and Task Force leaders will summarize the work their teams have done over the last four years in a series of presentations”:
    • Panel One. The Commission will hear presentations from the Wireless Telecommunications Bureau, International Bureau, Office of Engineering and Technology, and Office of Economics and Analytics.
    • Panel Two. The Commission will hear presentations from the Wireline Competition Bureau and the Rural Broadband Auctions Task Force.
    • Panel Three. The Commission will hear presentations from the Media Bureau and the Incentive Auction Task Force.
    • Panel Four. The Commission will hear presentations from the Consumer and Governmental Affairs Bureau, Enforcement Bureau, and Public Safety and Homeland Security Bureau.
    • Panel Five. The Commission will hear presentations from the Office of Communications Business Opportunities, Office of Managing Director, and Office of General Counsel.
  • On 15 January, the Senate Intelligence Committee will hold a hearing on the nomination of Avril Haines to be the Director of National Intelligence.
  • The Senate Homeland Security and Governmental Affairs Committee will hold a hearing on the nomination of Alejandro N. Mayorkas to be Secretary of Homeland Security on 19 January.
  • On 19 January, the Senate Armed Services Committee will hold a hearing on the nomination of former General Lloyd Austin III to be Secretary of Defense.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

Further Reading, Other Developments, and Coming Events (7 December)

Further Reading

  • “Facebook steps up campaign to ban false information about coronavirus vaccines” By Elizabeth Dwoskin — The Washington Post. In its latest step to find and remove lies, misinformation, and disinformation, the social media giant is now committing to removing and blocking untrue material about COVID-19 vaccines, especially from the anti-vaccine community. Will the next step be to take on anti-vaccination proponents generally?
  • “Comcast’s 1.2 TB data cap seems like a ton of data—until you factor in remote work” By Rob Pegoraro — Fast Company. Despite many people and children working and learning from home, Comcast is reimposing a 1.2 terabyte limit on data for homes. Sounds like quite a lot until you factor in video meetings, streaming, etc. So far, other providers have not set a cap.
  • “Google’s star AI ethics researcher, one of a few Black women in the field, says she was fired for a critical email” By Drew Harwell and Nitasha Tiku — The Washington Post. Timnit Gebru, a top-flight artificial intelligence (AI) computer scientist, was fired for questioning Google’s review of a paper, likely critical of the company’s AI projects, that she wanted to present at an AI conference. Google claims she resigned, but Gebru says she was fired. She has long been an advocate for women and minorities in tech and AI, and her ouster will likely only increase scrutiny of and questions about Google’s commitment to diversity and an ethical approach to the development and deployment of AI. It will also probably deepen employee disenchantment with the company, following earlier protests over Google’s involvement with the United States Department of Defense’s Project Maven and its hiring of former United States Department of Homeland Security chief of staff Miles Taylor, who was involved with the policies that resulted in caging children and separating families at the southern border of the United States.
  • “Humans Can Help Clean Up Facebook and Twitter” By Greg Bensinger — The New York Times. In this opinion piece, the argument is made that if social media platforms redeployed their human monitors to the accounts that violate terms of service most frequently (e.g., President Donald Trump) and more aggressively labeled and removed untrue or inflammatory content, they would have a greater impact on lies, misinformation, and disinformation.
  • “Showdown looms over digital services tax” By Ashley Gold — Axios. Because the Organization for Economic Cooperation and Development (OECD) has not reached a deal on digital services taxes, a number of United States (U.S.) allies could move forward with taxes on U.S. multinationals like Amazon, Google, and Apple. The Trump Administration has variously taken an adversarial position, threatening to retaliate against countries like France, which has enacted a tax that has not been collected during the OECD negotiations. The U.S. also withdrew from talks. It is probable the Biden Administration will be more willing to work in a multilateral fashion and may strike a deal on an issue that is not going away, as the United Kingdom, Italy, and Canada also have plans for a digital tax.
  • “Trump’s threat to veto defense bill over social-media protections is heading to a showdown with Congress” By Karoun Demirjian and Tony Romm — The Washington Post. I suppose I should mention the President’s demand that the FY 2021 National Defense Authorization Act (NDAA) contain a repeal of 47 U.S.C. 230 (Section 230 of the Communications Act), which came at the eleventh hour and fifty-ninth minute of negotiations on a final version of the bill. Via Twitter, Donald Trump threatened to veto the bill, which has been passed annually for decades. Republicans were not having it, however, even if they agreed with Trump’s desire to remove liability protection for technology companies. And yet, if Trump continues to insist on a repeal, Republicans may find themselves in a bind and the bill could conceivably get pulled until President-elect Joe Biden is sworn in. On the other hand, Trump’s veto threats about renaming military bases currently bearing the names of Confederate figures have not been renewed even though the final version of the bill contains language instituting a process to do just that.

Other Developments

  • The Senate Judiciary Committee held over its most recent bill to narrow 47 U.S.C. 230 (Section 230 of the Communications Act), which provides liability protection for technology companies for third-party material posted on their platforms and any decisions to edit, alter, or remove such content. The committee opted to hold the “Online Content Policy Modernization Act” (S.4632), which may mean the bill’s chances of making it to the Senate floor are low. What’s more, even if the Senate passes Section 230 legislation, it is not clear there will be sufficient agreement with Democrats in the House to get a final bill to the President before the end of this Congress. On 1 October, the committee also decided to hold over the bill to try to reconcile the fifteen amendments submitted for consideration. The Committee could soon meet again to formally mark up and report out this legislation.
    • At the earlier hearing, Chair Lindsey Graham (R-SC) submitted an amendment revising the bill’s reforms to Section 230 that incorporates some of the below amendments but includes new language. For example, the bill includes a definition of “good faith,” a term not currently defined in Section 230. This term would be construed as a platform taking down or restricting content only according to its publicly available terms of service, not as a pretext, and equally to all similarly situated content. Moreover, good faith would require alerting the user and giving him or her an opportunity to respond, subject to certain exceptions. The amendment also makes clear that certain existing means of suing are still available to users (e.g., suing for breach of contract).
    • Senator Mike Lee (R-UT) offered a host of amendments:
      • EHF20913 would remove “user[s]” from the reduced liability shield that online platforms would receive under the bill. Consequently, users would still not be legally liable for the content posted by another user.
      • EHF20914 would revise the language regarding the type of content platforms could take down with legal protection to make clear it would not just be “unlawful” content but rather content “in violation of a duly enacted law of the United States,” possibly meaning federal laws and not state laws. Or, more likely, the intent would be to foreclose the possibility a platform would say it is acting in concert with a foreign law and still assert immunity.
      • EHF20920 would add language making clear that taking down material that violates terms of service or use according to an objectively reasonable belief would be shielded from liability.
      • OLL20928 would expand legal protection to platforms for removing or restricting spam.
      • OLL20929 would bar the Federal Communications Commission (FCC) from a rulemaking on Section 230.
      • OLL20930 adds language making clear if part of the revised Section 230 is found unconstitutional, the rest of the law would still be applicable.
      • OLL20938 revises the definition of an “information content provider,” the term of art in Section 230 that identifies a platform, to expand when platforms may be responsible for the creation or development of information and consequently liable for a lawsuit.
    • Senator Josh Hawley (R-MO) offered an amendment that would create a new right of action for people to sue large platforms for taking down their content if not done in “good faith.” The amendment limits this right only to “edge providers,” which are platforms with more than 30 million users in the U.S., 300 million users worldwide, and with revenues of more than $1.5 billion. This would likely exclude all platforms except for Twitter, Facebook, Instagram, TikTok, Snapchat, and a select group of a few others.
    • Senator John Kennedy (R-LA) offered an amendment that removes all Section 230 legal immunity from platforms that collect personal data and then use an “automated function” to deliver targeted or tailored content to a user unless a user “knowingly and intentionally elect[s]” to receive such content.
  • The Massachusetts Institute of Technology’s (MIT) Work of the Future Task Force issued its final report and drew the following conclusions:
    • Technological change is simultaneously replacing existing work and creating new work. It is not eliminating work altogether.
    • Momentous impacts of technological change are unfolding gradually.
    • Rising labor productivity has not translated into broad increases in incomes because labor market institutions and policies have fallen into disrepair.
    • Improving the quality of jobs requires innovation in labor market institutions.
    • Fostering opportunity and economic mobility necessitates cultivating and refreshing worker skills.
    • Investing in innovation will drive new job creation, speed growth, and meet rising competitive challenges.
    • The Task Force stated:
      • In the two-and-a-half years since the Task Force set to work, autonomous vehicles, robotics, and AI have advanced remarkably. But the world has not been turned on its head by automation, nor has the labor market. Despite massive private investment, technology deadlines have been pushed back, part of a normal evolution as breathless promises turn into pilot trials, business plans, and early deployments — the diligent, if prosaic, work of making real technologies work in real settings to meet the demands of hard-nosed customers and managers.
      • Yet, if our research did not confirm the dystopian vision of robots ushering workers off of factory floors or artificial intelligence rendering superfluous human expertise and judgment, it did uncover something equally pernicious: Amidst a technological ecosystem delivering rising productivity, and an economy generating plenty of jobs (at least until the COVID-19 crisis), we found a labor market in which the fruits are so unequally distributed, so skewed towards the top, that the majority of workers have tasted only a tiny morsel of a vast harvest.
      • As this report documents, the labor market impacts of technologies like AI and robotics are taking years to unfold. But we have no time to spare in preparing for them. If those technologies deploy into the labor institutions of today, which were designed for the last century, we will see similar effects to recent decades: downward pressure on wages, skills, and benefits, and an increasingly bifurcated labor market. This report, and the MIT Work of the Future Task Force, suggest a better alternative: building a future for work that harvests the dividends of rapidly advancing automation and ever-more powerful computers to deliver opportunity and economic security for workers. To channel the rising productivity stemming from technological innovations into broadly shared gains, we must foster institutional innovations that complement technological change.
  • The European Data Protection Supervisor (EDPS) Wojciech Wiewiorówski published his “preliminary opinion on the European Commission’s (EC) Communication on “A European strategy for data” and the creation of a common space in the area of health, namely the European Health Data Space (EHDS).” The EDPS lauded the goal of the EHDS, “the prevention, detection and cure of diseases, as well as for evidence-based decisions in order to enhance effectiveness, accessibility and sustainability of the healthcare systems.” However, Wiewiorówski articulated his concern that the EC needs to think through the applicability of the General Data Protection Regulation (GDPR), among other European Union (EU) laws, before it can legally move forward. The EDPS stated:
    • The EDPS calls for the establishment of a thought-through legal basis for the processing operations under the EHDS in line with Article 6(1) GDPR and also recalls that such processing must comply with Article 9 GDPR for the processing of special categories of data.
    • Moreover, the EDPS highlights that due to the sensitivity of the data to be processed within the EHDS, the boundaries of what constitutes a lawful processing and a compatible further processing of the data must be crystal-clear for all the stakeholders involved. Therefore, the transparency and the public availability of the information relating to the processing on the EHDS will be key to enhance public trust in the EHDS.
    • The EDPS also calls on the Commission to clarify the roles and responsibilities of the parties involved and to clearly identify the precise categories of data to be made available to the EHDS. Additionally, he calls on the Member States to establish mechanisms to assess the validity and quality of the sources of the data.
    • The EDPS underlines the importance of vesting the EHDS with a comprehensive security infrastructure, including both organisational and state-of-the-art technical security measures to protect the data fed into the EHDS. In this context, he recalls that Data Protection Impact Assessments may be a very useful tool to determine the risks of the processing operations and the mitigation measures that should be adopted.
    • The EDPS recommends paying special attention to the ethical use of data within the EHDS framework, for which he suggests taking into account existing ethics committees and their role in the context of national legislation.
    • The EDPS is convinced that the success of the EHDS will depend on the establishment of a strong data governance mechanism that provides for sufficient assurances of a lawful, responsible, ethical management anchored in EU values, including respect for fundamental rights. The governance mechanism should regulate, at least, the entities that will be allowed to make data available to the EHDS, the EHDS users, the Member States’ national contact points/ permit authorities, and the role of DPAs within this context.
    • The EDPS is interested in policy initiatives to achieve ‘digital sovereignty’ and has a preference for data being processed by entities sharing European values, including privacy and data protection. Moreover, the EDPS calls on the Commission to ensure that the stakeholders taking part in the EHDS, and in particular, the controllers, do not transfer personal data unless data subjects whose personal data are transferred to a third country are afforded a level of protection essentially equivalent to that guaranteed within the European Union.
    • The EDPS calls on Member States to guarantee the effective implementation of the right to data portability specifically in the EHDS, together with the development of the necessary technical requirements. In this regard, he considers that a gap analysis might be required regarding the need to integrate the GDPR safeguards with other regulatory safeguards, provided e.g. by competition law or ethical guidelines.
  • The Office of Management and Budget (OMB) extended a guidance memorandum directing agencies to consolidate data centers after Congress pushed back the sunset date for the program. OMB Memorandum M-19-19, Update to Data Center Optimization Initiative (DCOI), now runs through 30 September 2022 and applies “to the 24 Federal agencies covered by the Chief Financial Officers (CFO) Act of 1990, which includes the Department of Defense.” The DCOI was codified in the “Federal Information Technology Acquisition Reform” (FITARA) (P.L. 113-291) and extended in 2018 until October 1, 2020; this sunset was pushed back another two years in the FY 2020 National Defense Authorization Act (NDAA) (P.L. 116-92).
    • In March 2020, the Government Accountability Office (GAO) issued another of its periodic assessments of the DCOI, started in 2012 by the Obama Administration to shrink the federal government’s footprint of data centers, increase efficiency and security, save money, and reduce energy usage.
    • The GAO found that 23 of the 24 agencies participating in the DCOI met or planned to meet their FY 2019 goals to close 286 of the 2,727 data centers considered part of the DCOI. This latter figure deserves some discussion, for the Trump Administration changed the definition of what counts as a data center to exclude smaller ones (so-called non-tiered data centers). GAO asserted that “recent OMB DCOI policy changes will reduce the number of data centers covered by the policy and both OMB and agencies may lose important visibility over the security risks posed by these facilities.” Nonetheless, these agencies are projecting savings of $241.5 million when all 286 data centers planned for closure in FY 2019 actually close. It bears noting that the GAO admitted in a footnote it “did not independently validate agencies’ reported cost savings figures,” so these numbers may not be reliable.
    • In terms of how to improve the DCOI, the GAO stated that “[i]n addition to reiterating our prior open recommendations to the agencies in our review regarding their need to meet DCOI’s closure and savings goals and optimization metrics, we are making a total of eight new recommendations—four to OMB and four to three of the 24 agencies. Specifically:
      • The Director of the Office of Management and Budget should (1) require that agencies explicitly document annual data center closure goals in their DCOI strategic plans and (2) track those goals on the IT Dashboard. (Recommendation 1)
      • The Director of the Office of Management and Budget should require agencies to report in their quarterly inventory submissions those facilities previously reported as data centers, even if those facilities are not subject to the closure and optimization requirements of DCOI. (Recommendation 2)
      • The Director of the Office of Management and Budget should document OMB’s decisions on whether to approve individual data centers when designated by agencies as either a mission critical facility or as a facility not subject to DCOI. (Recommendation 3)
      • The Director of the Office of Management and Budget should take action to address the key performance measurement characteristics missing from the DCOI optimization metrics, as identified in this report. (Recommendation 4)
  • Australia’s Inspector-General of Intelligence and Security (IGIS) released its first report on how well the nation’s security services did in observing the law with respect to COVID app data. The IGIS “is satisfied that the relevant agencies have policies and procedures in place and are taking reasonable steps to avoid intentional collection of COVID app data.” The IGIS revealed that “[i]ncidental collection in the course of the lawful collection of other data has occurred (and is permitted by the Privacy Act); however, there is no evidence that any agency within IGIS jurisdiction has decrypted, accessed or used any COVID app data.” The IGIS is also “satisfied that the intelligence agencies within IGIS jurisdiction which have the capability to incidentally collect at least some types of COVID app data:
    • Are aware of their responsibilities under Part VIIIA of the Privacy Act and are taking active steps to minimise the risk that they may collect COVID app data.
    • Have appropriate policies and procedures in place to respond to any incidental collection of COVID app data that they become aware of.
    • Are taking steps to ensure any COVID app data is not accessed, used or disclosed.
    • Are taking steps to ensure any COVID app data is deleted as soon as practicable.
    • Have not decrypted any COVID app data.
    • Are applying the usual security measures in place in intelligence agencies such that a ‘spill’ of any data, including COVID app data, is unlikely.
  • New Zealand’s Government Communications Security Bureau’s National Cyber Security Centre (NCSC) has released its annual Cyber Threat Report that found that “nationally significant organisations continue to be frequently targeted by malicious cyber actors of all types…[and] state-sponsored and non-state actors targeted public and private sector organisations to steal information, generate revenue, or disrupt networks and services.” The NCSC added:
    • Malicious cyber actors have shown their willingness to target New Zealand organisations in all sectors using a range of increasingly advanced tools and techniques. Newly disclosed vulnerabilities in products and services, alongside the adoption of new services and working arrangements, are rapidly exploited by state-sponsored actors and cyber criminals alike. A common theme this year, which emerged prior to the COVID-19 pandemic, was the exploitation of known vulnerabilities in internet-facing applications, including corporate security products, remote desktop services and virtual private network applications.
  • The former Director of the United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) wrote an opinion piece disputing President Donald Trump’s claims that the 2020 Presidential Election was fraudulent. Christopher Krebs asserted:
    • While I no longer regularly speak to election officials, my understanding is that in the 2020 results no significant discrepancies attributed to manipulation have been discovered in the post-election canvassing, audit and recount processes.
    • This point cannot be emphasized enough: The secretaries of state in Georgia, Michigan, Arizona, Nevada and Pennsylvania, as well as officials in Wisconsin, all worked overtime to ensure there was a paper trail that could be audited or recounted by hand, independent of any allegedly hacked software or hardware.
    • That’s why Americans’ confidence in the security of the 2020 election is entirely justified. Paper ballots and post-election checks ensured the accuracy of the count. Consider Georgia: The state conducted a full hand recount of the presidential election, a first of its kind, and the outcome of the manual count was consistent with the computer-based count. Clearly, the Georgia count was not manipulated, resoundingly debunking claims by the president and his allies about the involvement of CIA supercomputers, malicious software programs or corporate rigging aided by long-gone foreign dictators.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Daniel Schludi on Unsplash

Further Reading, Other Developments, and Coming Events (11 November)

Further Reading

  • ICE, IRS Explored Using Hacking Tools, New Documents Show” By Joseph Cox — Vice. Federal agencies other than the Federal Bureau of Investigation (FBI) and the Intelligence Community (IC) appear to be interested in utilizing some of the capabilities offered by the private sector to access devices or networks in the name of investigating cases.
  • China’s tech industry relieved by Biden win – but not relaxed” By Josh Horwitz and Yingzhi Yang — Reuters. While a Biden Administration will almost certainly lower the temperature between Beijing and Washington, the People’s Republic of China is intent on addressing the pressure points used by the Trump Administration to inflict pain on its technology industry.
  • Trump Broke the Internet. Can Joe Biden Fix It?” By Gilad Edelman — WIRED. This piece provides a view of the waterfront in technology policy under a Biden Administration.
  • YouTube is awash with election misinformation — and it isn’t taking it down” By Rebecca Heilweil — Recode. For unexplained reasons, YouTube has avoided the scrutiny facing Facebook and Twitter over their content moderation policies. Whether that lack of scrutiny is part of the explanation is unclear, but the Google-owned platform hosted far more election-related misinformation than the other social media platforms.
  • Frustrated by internet service providers, cities and schools push for more data” By Cyrus Farivar — NBC News. Internet service providers are not helping cities and states identify families eligible for low-cost internet to help children attend school virtually. They have claimed these data are proprietary, so jurisdictions have gotten creative about identifying such families.

Other Developments

  • The Consumer Product Safety Commission’s (CPSC) Office of the Inspector General (OIG) released its annual Federal Information Security Modernization Act (FISMA) audit and found “that although management continues to make progress in implementing the FISMA requirements much work remains to be done.” More particularly, it was “determined that the CPSC has not implemented an effective information security program and practices in accordance with FISMA requirements.” The OIG asserted:
    • The CPSC information security program was not effective because the CPSC has not developed a holistic formal approach to manage information security risks or to effectively utilize information security resources to address previously identified information security deficiencies. Although the CPSC has begun to develop an Enterprise Risk Management (ERM) program to guide risk management practices at the CPSC, explicit guidance and processes to address information security risks and integrate those risks into the broader agency-wide ERM program has not been developed.
    • In addition, the CPSC has not leveraged the relevant information security risk management guidance prescribed by NIST to develop an approach to manage information security risk.
    • Further, as asserted by CPSC personnel, the CPSC has limited resources to operate the information security program and to address the extensive FISMA requirements and related complex cybersecurity challenges.
    • Therefore, the CPSC has not dedicated the resources necessary to fully address these challenges and requirements. The CPSC began addressing previously identified information security deficiencies but was not able to address all deficiencies in FY 2020.
  • The United States (U.S.) Department of Justice (DOJ) announced the seizure of 27 websites allegedly used by Iran’s Islamic Revolutionary Guard Corps (IRGC) “to further a global covert influence campaign…in violation of U.S. sanctions targeting both the Government of Iran and the IRGC.” The DOJ contended:
    • Four of the domains purported to be genuine news outlets but were actually controlled by the IRGC and targeted audiences in the United States, to covertly influence United States policy and public opinion, in violation of the Foreign Agents Registration Act (FARA). The remainder targeted audiences in other parts of the world.  This seizure warrant follows an earlier seizure of 92 domains used by the IRGC for similar purposes.
  • The United Nations (UN) Special Rapporteur on the right to privacy Joseph Cannataci issued his annual report that “constitutes a preliminary assessment as the evidence base required to reach definitive conclusions on whether privacy-intrusive, anti-COVID-19 measures are necessary and proportionate in a democratic society is not yet available.” Cannataci added “[a] more definitive report is planned for mid-2021, when 16 months of evidence will be available to allow a more accurate assessment.” He “addresse[d] two particular aspects of the impact of COVID-19 on the right to privacy: data protection and surveillance.” The Special Rapporteur noted:
    • While the COVID-19 pandemic has generated much debate about the value of contact tracing and reliance upon technology that track citizens and those they encounter, the use of information and technology is not new in managing public health emergencies. What is concerning in some States are reports of how technology is being used and the degree of intrusion and control being exerted over citizens –possibly to little public health effect.
    • The Special Rapporteur concluded:
      • It is far too early to assess definitively whether some COVID-19-related measures might be unnecessary or disproportionate. The Special Rapporteur will continue to monitor the impact of surveillance in epidemiology on the right to privacy and report to the General Assembly in 2021. The main privacy risk lies in the use of non-consensual methods, such as those outlined in the section on hybrid systems of surveillance, which could result in function creep and be used for other purposes that may be privacy intrusive.
      • Intensive and omnipresent technological surveillance is not the panacea for pandemic situations such as COVID-19. This has been especially driven home by those countries in which the use of conventional contact-tracing methods, without recourse to smartphone applications, geolocation or other technologies, has proven to be most effective in countering the spread of COVID-19.
      • If a State decides that technological surveillance is necessary as a response to the global COVID-19 pandemic, it must make sure that, after proving both the necessity and proportionality of the specific measure, it has a law that explicitly provides for such surveillance measures (as in the example of Israel).
      • A State wishing to introduce a surveillance measure for COVID-19 purposes, should not be able to rely on a generic provision in law, such as one stating that the head of the public health authority may “order such other action be taken as he [or she] may consider appropriate”. That does not provide explicit and specific safeguards which are made mandatory both under the provisions of Convention 108 and Convention 108+, and based on the jurisprudence of the European Court of Human Rights. Indeed, if the safeguard is not spelled out in sufficient detail, it cannot be considered an adequate safeguard.
  • The University of Toronto’s Citizen Lab issued its submission to the Government of Canada’s “public consultation on the renewal of its Responsible Business Conduct (RBC) strategy, which is intended to provide guidance to the Government of Canada and Canadian companies active abroad with respect to their business activities.” Citizen Lab addressed “Canadian technology companies and the threat they pose to human rights abroad” and noted two of its reports on Canadian companies whose technologies were used to violate human rights:
    • In 2018, the Citizen Lab released a report documenting Netsweeper installations on public IP networks in ten countries that each presented widespread human rights concerns. This research revealed that Netsweeper technology was used to block: (1) political content sites, including websites linked to political groups, opposition groups, local and foreign news, and regional human rights issues in Bahrain, Kuwait, Yemen, and UAE; (2) LGBTQ content as a result of Netsweeper’s pre-defined ‘Alternative Lifestyles’ content category, as well as Google searches for keywords relating to LGBTQ content (e.g., the words “gay” or “lesbian”) in the UAE, Bahrain, and Yemen; (3) non-pornographic websites under the mis-categorization of sites like the World Health Organization and the Center for Health and Gender Equity as “pornography”; (4) access to news reporting on the Rohingya refugee crisis and violence against Muslims from multiple news outlets for users in India; (5) Blogspot-hosted websites in Kuwait by categorizing them as “viruses” as well as a range of political content from local and foreign news and a website that monitors human rights issues in the region; and (6) websites like Date.com, Gay.com (the Los Angeles LGBT Center), Feminist.org, and others through categorizing them as “web proxies.” 
    • In 2018, the Citizen Lab released a report documenting the use of Sandvine/Procera devices to redirect users in Turkey and Syria to spyware, as well as the use of such devices to hijack Internet users’ connections in Egypt, redirecting them to revenue-generating content. These examples highlight some of the ways in which this technology can be used for malicious purposes. The report revealed how Citizen Lab researchers identified a series of devices on the networks of Türk Telekom—a large and previously state-owned ISP in Turkey—being used to redirect requests from users in Turkey and Syria who attempted to download certain common Windows applications like antivirus software and web browsers. Through the use of Sandvine/Procera technology, these users were instead redirected to versions of those applications that contained hidden malware. (A brief illustrative sketch of one common mitigation, verifying downloads against published checksums, appears after the Other Developments items below.)
    • Citizen Lab made a number of recommendations:
      • Reform Canadian export law:  
        • Clarify that all Canadian exports are subject to the mandatory analysis set out in section 7.3(1) and section 7.4 of the Export and Import Permits Act (EIPA). 
        • Amend section 3(1) of the EIPA such that the human rights risks of an exported good or technology provide an explicit basis for export control.
        • Amend the EIPA to include a ‘catch-all’ provision that subjects cyber-surveillance technology to export control, even if not listed on the Export Control List, when there is evidence that the end-use may be connected with internal repression and/or the commission of serious violations of international human rights or international humanitarian law. 
      • Implement mandatory human rights due diligence legislation:
        • Similar to the French duty of vigilance law, impose a human rights due diligence requirement on businesses such that they are required to perform human rights risk assessments, develop mitigation strategies, implement an alert system, and develop a monitoring and public reporting scheme. 
        • Ensure that the mandatory human rights due diligence legislation provides a statutory mechanism for liability where a company fails to conform with the requirements under the law. 
      • Expand and strengthen the Canadian Ombudsperson for Responsible Enterprise (CORE): 
        • Expand the CORE’s mandate to cover technology sector businesses operating abroad.
        • Expand the CORE’s investigatory mandate to include the power to compel companies and executives to produce testimony, documents, and other information for the purposes of joint and independent fact-finding.
        • Strengthen the CORE’s powers to hold companies to account for human rights violations abroad, including the power to impose fines and penalties and to impose mandatory orders.
        • Expand the CORE’s mandate to assist victims to obtain legal redress for human rights abuses. This could include the CORE helping enforce mandatory human rights due diligence requirements, imposing penalties and/or additional statutory mechanisms for redress when requirements are violated.
        • Increase the CORE’s budgetary allocations to ensure that it can carry out its mandate.
  • A week before the United States’ (U.S.) election, the White House’s Office of Science and Technology Policy (OSTP) issued a report titled “Advancing America’s Global Leadership in Science and Technology: Highlights from the Trump Administration’s First Term: 2017-2020,” recounting the Administration’s purported achievements. OSTP claimed:
    • Over the past four years, President Trump and the entire Administration have taken decisive action to help the Federal Government do its part in advancing America’s global science and technology (S&T) preeminence. The policies enacted and investments made by the Administration have equipped researchers, health professionals, and many others with the tools to tackle today’s challenges, such as the COVID-19 pandemic, and have prepared the Nation for whatever the future holds.
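
The Sandvine/Procera findings summarized above involved on-path injection that swapped legitimate software downloads for malware-laden copies. The snippet below is not taken from the Citizen Lab report or any of the documents above; it is a minimal Python sketch of one standard mitigation, comparing the SHA-256 digest of a downloaded file against a checksum the vendor publishes separately. The URL and the expected digest are hypothetical placeholders.

    # Minimal sketch (not from the Citizen Lab report): verify a downloaded
    # installer against a vendor-published SHA-256 checksum. The URL and the
    # expected digest below are hypothetical placeholders.
    import hashlib
    import urllib.request

    INSTALLER_URL = "https://example.org/downloads/installer.exe"  # hypothetical
    EXPECTED_SHA256 = "0" * 64  # hypothetical vendor-published digest

    def sha256_of_url(url: str) -> str:
        """Stream a file over HTTPS and return its SHA-256 hex digest."""
        digest = hashlib.sha256()
        with urllib.request.urlopen(url) as response:
            for chunk in iter(lambda: response.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    if __name__ == "__main__":
        actual = sha256_of_url(INSTALLER_URL)
        if actual == EXPECTED_SHA256:
            print("Digest matches the published checksum.")
        else:
            print("Digest mismatch: the download may have been altered in transit.")

Checksum verification only helps if the published digest travels over a channel an on-path attacker cannot also tamper with, which is why HTTPS delivery and code signing mattered in the injections Citizen Lab documented.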

Coming Events

  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.


Photo by Brett Sayles from Pexels

Further Reading and Other Developments (11 July)

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

Other Developments

  • The United States District Court for the District of Maine denied a motion by a number of telecommunications trade associations to enjoin enforcement of a new Maine law instituting privacy requirements for internet service providers (ISPs) in the state that limit information collection and processing. The plaintiffs claimed the 2017 repeal of the Federal Communications Commission’s (FCC) 2016 ISP Privacy Order preempted states from implementing their own privacy rules for ISPs. In its decision, the court denied the plaintiffs’ motion and will proceed to decide the merits of the case.
  • The European Data Protection Board (EDPB) has debuted a “One-Stop-Shop” register “containing decisions taken by national supervisory authorities following the One-Stop-Shop cooperation procedure (Art. 60 GDPR).” The EDPB explained “[u]nder the GDPR, Supervisory Authorities have a duty to cooperate on cases with a cross-border component to ensure a consistent application of the regulation – the so-called one-stop-shop (OSS) mechanism…[and] [u]nder the OSS, the Lead Supervisory Authority (LSA) is in charge of preparing the draft decisions and works together with the concerned SAs to reach consensus.” Hence this new repository will contain the decisions on which EU data protection authorities have cooperated in addressing alleged GDPR violations that reach across the borders of EU nations.
  • The chair of the House Energy and Commerce Committee and three subcommittee chairs wrote Facebook, Google, and Twitter asking the companies “provide the Committee with monthly reports similar in scope to what you are providing the European Commission regarding your COVID-19 disinformation efforts as they relate to United States users of your platform.” They are also asking that the companies brief them and staff on 22 July on these efforts. Given the Committee’s focus on disinformation, it is quite possible these monthly reports and the briefing could be the basis of more hearings and/or legislation. Chair Frank Pallone, Jr. (D-NJ), Oversight and Investigations Subcommittee Chair Diana DeGette (D-CO), Communications and Technology Subcommittee Chair Mike Doyle (D-PA) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) signed the letters.
  • Reports indicate the Federal Trade Commission (FTC) and Department of Justice (DOJ) are reviewing the February 2019 $5.7 million settlement between the FTC and TikTok for violating the Children’s Online Privacy Protection Act (COPPA). In May 2020, a number of public advocacy groups filed a complaint with the FTC, asking whether the agency has “complied with the consent decree.” If TikTok has violated the order, it could face huge fines as the FTC and DOJ could seek a range of financial penalties. This seems to be another front in the escalating conflict between the United States and the People’s Republic of China.
  • Tech Inquiry, an organization that “seek[s] to combat abuses in the tech industry through coupling concerned tech workers with relevant members of civil society,” revealed “an in-depth analysis of all public US federal (sub)contracting data over the last four and a half years to estimate the rankings of tech companies, both in and out of Silicon Valley, as contractors with the military, law enforcement, and diplomatic arms of the United States.” Tech Inquiry claimed “[o]ur analysis shows a diversity of contracting postures (see Tables 2 and 3), not a systemic divide from Washington. Within a substantial list of namebrand tech companies, only Facebook, Apple, and Twitter look to be staying out of major military and law enforcement contracts.” (A brief sketch of the kind of recipient-level aggregation such an analysis involves appears after this list.)
  • The United States Secret Service announced the formation of a new Cyber Fraud Task Force (CFTF) which merges “its Electronic Crimes Task Forces (ECTFs) and Financial Crimes Task Forces (FCTFs) into a single unified network.” The rationale given for the merger is “the line between cyber and financial crimes has steadily blurred, to the point today where the two – cyber and financial crimes – cannot be effectively disentangled.”
  • The United States Election Assistance Commission (EAC) held a virtual public hearing, “Lessons Learned from the 2020 Primary Elections” “to discuss the administration of primary elections during the coronavirus pandemic.”
  • The National Council of Statewide Interoperability Coordinators (NCSWIC), a program administered by the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), released its “NCSWIC Strategic Plan and Implementation Guide,” “a stakeholder-driven, multi-jurisdictional, and multi-disciplinary plan to enhance interoperable and emergency communications.” NCSWIC contended “[t]he plan is a critical mid-range (three-year) tool to help NCSWIC and its partners prioritize resources, strengthen governance, identify future investments, and address interoperability gaps.”
  • Access Now is pressing “video conferencing platforms” other than Zoom to issue “regular transparency reports… clarifying exactly how they protect personal user data and enforce policies related to freedom of expression.”
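
Tech Inquiry’s rankings rest on aggregating public federal award data by recipient. The sketch below is not Tech Inquiry’s code or methodology; it is a minimal Python illustration that assumes a bulk CSV export of prime contract awards (for example, from USAspending.gov) with columns named recipient_name and federal_action_obligation, both of which are assumptions made here for illustration.

    # Minimal sketch (not Tech Inquiry's methodology): total contract
    # obligations by recipient from an assumed bulk CSV export of award data.
    # The file name and column names are illustrative assumptions.
    import csv
    from collections import defaultdict

    totals = defaultdict(float)
    with open("contract_awards.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            name = (row.get("recipient_name") or "").strip().upper()
            try:
                amount = float(row.get("federal_action_obligation") or 0)
            except ValueError:
                continue  # skip rows with a malformed amount field
            if name:
                totals[name] += amount

    # Print the ten largest recipients by total obligations.
    for name, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:10]:
        print(f"{name}: ${total:,.0f}")

A real ranking would also have to fold in subaward data and reconcile vendor names across subsidiaries and resellers, which is where most of the analytical effort lies.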

Further Reading

  • India bans 59 Chinese apps, including TikTok and WeChat, after deadly border clash” – South China Morning Post. As a seeming extension of the military skirmish between India and the People’s Republic of China (PRC), a number of PRC apps have been banned by the Indian government, raising the question of whether there will be further escalation between the world’s two most populous nations. India is TikTok’s biggest market, with more than 120 million users in the South Asian country, and a range of other apps and platforms also have millions of users there. Most of the smartphones used in India are made by PRC entities. Moreover, if New Delhi joins Washington’s war on Huawei, ZTE, and other PRC companies, the cumulative effect could significantly affect the PRC’s global technological ambitions.
  • Huawei data flows under fire in German court case” – POLITICO. A former Huawei employee in Germany has sued the company alleging violations of the General Data Protection Regulation (GDPR) through the company’s use of standard contractual clauses. This person requested the data the company had collected from him and the reasons for doing so. Huawei claimed it had deleted the data. A German court’s decision that Huawei had violated the GDPR is being appealed. However, the case raises bigger issues, including growing unease within the European Union that People’s Republic of China firms may be illegally transferring and processing EU citizens’ data, and a case before Europe’s highest court in which the legality of standard contractual clauses may be decided as early as this month.
  • Deutsche Telekom under pressure after reports on Huawei reliance” – Politico. A German newspaper reported on confidential documents showing that Deutsche Telekom deepened its relationship with Huawei as the United States’ government was pressuring its allies and other nations to stop using the equipment and services of the company. The German telecommunications company denied the claims, and a number of German officials expressed surprise and dismay, opining that the government of Chancellor Angela Merkel should act more swiftly to implement legislation to secure Germany’s networks.
  • Inside the Plot to Kill the Open Technology Fund” – Vice. According to critics, the Trump Administration’s remaking of the United States (US) Agency for Global Media (USAGM) is threatening the mission and effectiveness of the Open Technology Fund (OTF), a US government non-profit designed to help dissidents and endangered populations throughout the world. The OTF has funded a number of open technology projects, including the Signal messaging app, but the new USAGM head, Michael Pack, is pressing for closed-source technology.
  • How Police Secretly Took Over a Global Phone Network for Organized Crime” – Vice. European law enforcement agencies penetrated and compromised an encrypted messaging service in Europe, leading to a number of arrests and seizures of drugs. Encrochat had billed itself as completely secure, but hackers with the French government broke into the system and laid bare the details of numerous crimes. And, this is only the latest encrypted app that is marketed to criminals, meaning others will soon step into the void created when Encrochat shut down.
  • Virus-Tracing Apps Are Rife With Problems. Governments Are Rushing to Fix Them.” – The New York Times. In numerous nations around the world, the rush to design and distribute contact tracing apps to fight COVID-19 has resulted in a host of problems predicted by information technology professionals and privacy, civil liberties and human rights advocates. Some apps collect too much information, many are not secure, and some do not seem to perform their intended tasks. Moreover, without mass adoption, the utility of an app is questionable at best. Some countries have sought to improve and perfect their apps in response to criticism, but others are continuing to use and even mandate their citizens and residents use them.
  • Hong Kong Security Law Sets Stage for Global Internet Fight” – The New York Times. After the People’s Republic of China (PRC) passed a new law that strips many of the protections Hong Kong enjoyed, technology companies are caught in a bind, for now Hong Kong may well start demanding they hand over data on people living in Hong Kong or employees could face jail time. Moreover, the data demands made of companies like Google or Facebook could pertain to people anywhere in the world. Companies that comply with Beijing’s wishes would likely face turbulence in Washington and vice versa. TikTok said it would withdraw from Hong Kong altogether.


Image by Gino Crescoli from Pixabay