Further Reading, Other Developments, and Coming Events (16 February 2021)

Further Reading

  • “India cuts internet around New Delhi as protesting farmers clash with police” By Esha Mitra and Julia Hollingsworth — CNN; “Twitter Temporarily Blocked Accounts Critical Of The Indian Government” By Pranav Dixit — BuzzFeed News. Prime Minister Narendra Modi’s government again shut down the internet as a way of managing unrest or discontent with government policies. The parties out of power have registered their opposition, but the majority seems intent on using this tactic time and again. One advocacy organization named India as the nation with the most shutdowns in 2019, by far. The government in New Delhi also pressed Twitter to take down tweets and accounts critical of the proposed changes in agricultural law. Twitter complied per its own policies and Indian law and then later restored the accounts and tweets.
  • “Lacking a Lifeline: How a federal effort to help low-income Americans pay their phone bills failed amid the pandemic” By Tony Romm — The Washington Post. An excellent overview of this Federal Communications Commission (FCC) program and its shortcomings. The Trump-era FCC blunted and undid Obama-era FCC reforms designed to make the eligibility of potential users easier to discern, among other changes. At the end of the day, many enrollees are left with a fixed number of minutes for phone calls and 4GB of data a month, or roughly what my daughter often uses in a day.
  • “She exposed tech’s impact on people of color. Now, she’s on Biden’s team.” By Emily Birnbaum — Protocol. The new Deputy Director for Science and Society in the Office of Science and Technology Policy (OSTP) is a former academic and researcher who often focused her studies on the intersection of race and technology, usually how the latter failed minorities. This is part of the Biden Administration’s fulfillment of its campaign pledges to establish a more inclusive White House. It remains to be seen how the administration will balance the views of those critical of big technology with those hailing from big technology, as a number of former high-ranking employees have already joined or are rumored to be joining the Biden team.
  • “Vaccine scheduling sites are terrible. Can a new plan help Chicago fix them?” By Issie Lapowsky — Protocol. As should surprise no one, many jurisdictions across the country have problematic interfaces for signing up for vaccination against COVID-19. It sounds reminiscent of the problems that plagued the Obamacare exchanges rollout, in that potentially well-thought-out policy was marred by a barely-thought-out public face.
  • “Google launches News Showcase in Australia in sign of compromise over media code” By Josh Taylor — The Guardian; “Cracks in media code opposition as Microsoft outflanks Google and Facebook” By Lisa Visentin — The Sydney Morning Herald. Both Google and Canberra seem to be softening their positions as the company signed up a number of major media outlets for its News Showcase, a feature that will be made available in Australia and will compensate the news organizations at an undisclosed level. However, a few major players, Nine, News Corp., and the Australian Broadcasting Corporation, have not joined, with Nine saying it will not. Google’s de-escalation of rhetoric and tactics will likely allow Prime Minister Scott Morrison’s government to relax the proposed legislation that would mandate Google and Facebook compensate Australian news media (i.e., the News Media and Digital Platforms Mandatory Bargaining Code.) Microsoft’s theoretical entrance into the Australian market through Bing, if Google and Facebook actually leave or limit their presence, seems to argue against the latter two companies’ position that the new code is unworkable. It is not clear if Microsoft is acting earnestly or floating a possible scenario in order to cast the other companies in a bad light. In any event, critics of the platforms say the fight is not about the technical feasibility of compensating news media but rather about establishing a precedent of paying for content the platforms now get essentially for free. Other content creators and entities could start demanding payment, too. An interesting tidbit from the second article: Canada may soon join Australia and the European Union in enacting legislation requiring Big Tech to pay media companies for using their content (i.e., “a more equitable digital regulatory framework across platforms and news media” according to a minister.)

Other Developments

  • The Maryland legislature overrode Governor Larry Hogan’s (R) veto, making the “Taxation – Tobacco Tax, Sales and Use Tax, and Digital Advertising Gross Revenues Tax” (HB0732) the first tax on digital advertising enacted in the United States. The measure imposes a tax on digital advertising in the state and may fall outside a federal bar on certain taxes on internet services. Now that the veto has been overridden, there will inevitably be legal challenges, and quite likely a push in Congress to enact a federal law preempting such digital taxes. Additionally, the primary sponsor of the legislation has introduced another bill barring companies from passing along the costs of the tax to Maryland businesses and consumers. (A rough, illustrative calculation of how the tax would work appears after this item.)
    • In a bill analysis, the legislature asserted about HB0732:
      • The bill imposes a tax on the annual gross revenues of a person derived from digital advertising services in the State. The bill provides for the filing of the tax returns and making tax payments. The part of the annual gross revenues of a person derived from digital advertising services in the State are to be determined using an apportionment fraction based on the annual gross revenues of a person derived from digital advertising services in the State and the annual gross revenues of a person derived from digital advertising services in the United States. The Comptroller must adopt regulations that determine the state from which revenues from digital advertising services are derived.
      • The digital advertising gross revenues tax is imposed at the following rates:
        • 2.5% of the assessable base for a person with global annual gross revenues of $100.0 million through $1.0 billion;
        • 5% of the assessable base for a person with global annual gross revenues of $1.0 billion through $5.0 billion;
        • 7.5% of the assessable base for a person with global annual gross revenues of $5.0 billion through $15.0 billion; and
        • 10% of the assessable base for a person with global annual gross revenues exceeding $15.0 billion.
    • In his analysis, Maryland’s Attorney General explained:
      • House Bill 732 would enact a new “digital advertising gross revenues tax.” The tax would be “imposed on annual gross revenues of a person derived from digital advertising services in the State.” Digital advertising services are defined in the bill to include “advertisement services on a digital interface, including advertisements in the form of banner advertising, search engine advertising, interstitial advertising, and other comparable advertising services.” The annual gross revenues derived from digital advertising services is set out in a formula in the bill.
      • Attorney General Brian Frosh conceded there will be legal challenges to the new Maryland tax: there are “three grounds on which there is some risk that a reviewing court would find that the tax is unconstitutional: (1) preemption under the federal Internet Tax Freedom Act; (2) the Commerce Clause; and, (3) the First Amendment.”
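    • To make the rate schedule and apportionment fraction quoted above concrete, here is a minimal, purely illustrative sketch in Python. The revenue figures, the function name, and the treatment of the tier boundaries are my own assumptions for demonstration, not the statutory text; the regulations the Comptroller must adopt will ultimately govern how digital advertising revenue is sourced to Maryland.

```python
# Purely illustrative sketch of the HB0732 mechanics described in the bill analysis.
# All revenue figures, the tier-boundary handling, and the function name are
# assumptions for demonstration, not a reading of the statute or regulations.

def estimate_md_digital_ad_tax(global_revenue: float,
                               us_digital_ad_revenue: float,
                               md_digital_ad_revenue: float) -> float:
    """Rough estimate of the tax owed under the rate tiers quoted above."""
    if global_revenue < 100_000_000:
        return 0.0  # The tiers start at $100.0 million in global annual gross revenues.

    # The rate is keyed to global annual gross revenues (per the quoted tiers);
    # the bill analysis's "through" boundaries are treated here as inclusive upper bounds.
    if global_revenue <= 1_000_000_000:
        rate = 0.025
    elif global_revenue <= 5_000_000_000:
        rate = 0.05
    elif global_revenue <= 15_000_000_000:
        rate = 0.075
    else:
        rate = 0.10

    # Assessable base: the portion of digital advertising revenue attributed to Maryland
    # via the apportionment fraction (Maryland digital ad revenue / U.S. digital ad revenue).
    apportionment_fraction = md_digital_ad_revenue / us_digital_ad_revenue
    assessable_base = us_digital_ad_revenue * apportionment_fraction
    return assessable_base * rate

# Example: $2 billion in global revenue, $500 million in U.S. digital ad revenue,
# $10 million of it attributable to Maryland -> 5% of $10 million = $500,000.
print(estimate_md_digital_ad_tax(2_000_000_000, 500_000_000, 10_000_000))
```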
  • Democratic Members introduced the “Secure Data and Privacy for Contact Tracing Act” (H.R.778/S.199) in both the House and Senate, legislation that “would provide grants to states that choose to use technology as part of contact tracing efforts for COVID-19 if they agree to adopt strong privacy protections for users” per their press release. Representatives Jackie Speier (D-CA) and Debbie Dingell (D-MI) introduced the House bill and Senators Brian Schatz (D-HI) and Tammy Baldwin (D-WI) the Senate version. Speier, Dingell, Schatz, and Baldwin contended “[t]he Secure Data and Privacy for Contact Tracing Act provides grant funding for states to responsibly develop digital contact tracing technologies consistent with the following key privacy protections:
    • Digital contact tracing tech must be strictly voluntary and provide clear information on intended use.
    • Data requested must be minimized and proportionate to what is required to achieve contact tracing objectives.
    • Data must be deleted after contact tracing processing is complete, or at the end of the declaration of emergency.
    • States must develop a plan for how their digital contact tracing technology complements more traditional contact tracing efforts and describe efforts to ensure their technology will be interoperable with other states.
    • States must establish procedures for independent security assessments of digital contact tracing infrastructure and remediate vulnerabilities. 
    • Information gathered must be used strictly for public health functions authorized by the state and cannot be used for punitive measures, such as criminal prosecution or immigration enforcement.
    • Digital contact tracing tech must have robust detection capabilities consistent with CDC guidance on exposure. 
    • Digital contact tracing technology must ensure anonymity, allowing only authorized public health authorities or other authorized parties to have access to personally identifiable information.
  • The chair and ranking member of the Senate Intelligence Committee wrote the heads of the agencies leading the response to the Russian hack of the United States (U.S.) government and private sector entities through SolarWinds, taking them to task for their thus far cloistered, siloed approach. In an unusually blunt letter, Chair Mark Warner (D-VA) and Ranking Member Marco Rubio (R-FL) asked the agencies to name a leader for the response that former President Donald Trump triggered under the system established in Presidential Policy Directive-41 because “[t]he federal government’s response so far has lacked the leadership and coordination warranted by a significant cyber event, and we have little confidence that we are on the shortest path to recovery.” Warner and Rubio directed this request to Director of National Intelligence Avril Haines, National Security Agency and Cyber Command head General Paul Nakasone, Federal Bureau of Investigation (FBI) Director Christopher Wray, and Cybersecurity and Infrastructure Security Agency (CISA) Acting Director Brandon Wales. Warner and Rubio further asserted:
    • The briefings we have received convey a disjointed and disorganized response to confronting the breach. Taking a federated rather than a unified approach means that critical tasks that are outside the central roles of your respective agencies are likely to fall through the cracks. The threat our country still faces from this incident needs clear leadership to develop and guide a unified strategy for recovery, in particular a leader who has the authority to coordinate the response, set priorities, and direct resources to where they are needed. The handling of this incident is too critical for us to continue operating the way we have been.
  • Huawei filed suit challenging the Federal Communications Commission’s (FCC) decision to “designate Huawei, as well as its parents, affiliates, and subsidiaries, as companies posing a national security threat to the integrity of our nation’s communications networks and the communications supply chain” through “In the Matter of Protecting Against National Security Threats to the Communications Supply Chain Through FCC Programs – Huawei Designation.” In the petition filed with the United States Court of Appeals for the Fifth Circuit, Huawei said it is “seek[ing] review of the Final Designation Order on the grounds that it exceeds the FCC’s statutory authority; violates federal law and the Constitution; is arbitrary, capricious, and an abuse of discretion, and not supported by substantial evidence, within the meaning of the Administrative Procedure Act, 5 U.S.C. § 701 et seq.; was adopted through a process that failed to provide Petitioners with the procedural protections afforded by the Constitution and the Administrative Procedure Act; and is otherwise contrary to law.”
  • According to unnamed sources, the Biden Administration has decided to postpone indefinitely the Trump Administration’s effort to force ByteDance to sell TikTok as required by a Trump Administration executive order. Last September, it appeared that Oracle and Walmart had reached a deal in principle with ByteDance that quickly raised more questions than it settled (see here for more details and analysis.) There are reports of ByteDance working with the Committee on Foreign Investment in the United States (CFIUS), the inter-agency review group that ordered ByteDance to spin off TikTok. TikTok and CFIUS are reportedly talking about what an acceptable divestment would look like, but of course, under recently implemented measures, the People’s Republic of China (PRC) would also have to sign off. Nonetheless, White House Press Secretary Jen Psaki remarked at a press conference “[t]here is a rigorous CFIUS process that is ongoing.”
  • The Biden Administration has asked two federal appeals courts to pause lawsuits brought to stop the United States (U.S.) government from enforcing the Trump Administration executive order banning TikTok from the United States (see here for more analysis.)
    • In the status report filed with the United States Court of Appeals for the District of Columbia Circuit, TikTok and the Department of Justice (DOJ) explained:
      • Defendants’ counsel informed Plaintiffs’ counsel regarding the following developments: As the Biden Administration has taken office, the Department of Commerce has begun a review of certain recently issued agency actions, including the Secretary’s prohibitions regarding the TikTok mobile application at issue in this case. In relation to those prohibitions, the Department plans to conduct an evaluation of the underlying record justifying those prohibitions. The government will then be better positioned to determine whether the national security threat described in the President’s August 6, 2020 Executive Order, and the regulatory purpose of protecting the security of Americans and their data, continue to warrant the identified prohibitions. The Department of Commerce remains committed to a robust defense of national security as well as ensuring the viability of our economy and preserving individual rights and data privacy.
    • In its unopposed motion, the DOJ asked the United States Court of Appeals for the Third Circuit to “hold this case in abeyance, with status reports due at 60-day intervals.” The DOJ used exactly the same language as in the filing in the D.C. Circuit.
  • The Trump Administration’s President’s Council of Advisors on Science and Technology (PCAST) issued a report at the tail end of the administration, “Industries of the Future Institutes: A New Model for American Science and Technology Leadership,” that “follows up on a recommendation from PCAST’s report, released June 30, 2020, involving the formation of a new type of multi-sector research and development organization: Industries of the Future Institutes (IotFIs)…[and] provides a framework to inform the design of IotFIs and thus should be used as preliminary guidance by funders and as a starting point for discussion among those considering participation.”
    • PCAST “propose[d] a revolutionary new paradigm for multi-sector collaboration—Industries of the Future Institutes (IotFIs)—to address some of the greatest societal challenges of our time and to ensure American science and technology (S&T) leadership for decades to come.” PCAST stated “[b]y driving research and development (R&D) at the intersection of two or more IotF areas, these Institutes not only will advance knowledge in the individual IotF topics, but they also will spur new research questions and domains of inquiry at their confluence.” PCAST added:
      • By engaging multiple disciplines and each sector of the U.S. R&D ecosystem—all within the same agile organizational framework—IotFIs will span the spectrum from discovery research to the development of new products and services at scale. Flexible intellectual property terms will incentivize participation of all sectors, and reduced administrative and regulatory burdens will optimize researcher time for creativity and productivity while maintaining appropriate safety, transparency, integrity, and accountability. IotFIs also will serve as a proving ground for new, creative approaches to organizational structure and function; broadening participation; workforce development; science, technology, engineering, and math education; and methods for engaging all sectors of the American research ecosystem. Ultimately, the fruits of IotFIs will sustain American global leadership in S&T, improve quality of life, and help ensure national and economic security for the future.
  • Per the European Commission’s (EC) request, the European Data Protection Board (EDPB) issued clarifications on the consistent application of the General Data Protection Regulation (GDPR) with a focus on health research. The EDPB explained:
    • The following response of the EDPB to the questions of the European Commission should be considered as a first attempt to take away some of the misunderstandings and misinterpretations as to the application of the GDPR to the domain of scientific health research. Generally speaking, most of these questions call for more time for in-depth analysis and/or a search for examples and best practices and can as yet not be completely answered.
    • In its guidelines (currently in preparation and due in 2021) on the processing of personal data for scientific research purposes, the EDPB will elaborate further on these issues while also aiming to provide a more comprehensive interpretation of the various provisions in the GDPR that are relevant for the processing of personal data for scientific research purposes.
    • This will also entail a clarification of the extent and scope of the ‘special derogatory regime’ for the processing of personal data for scientific research purposes in the GDPR. It is important that this regime is not perceived as to imply a general exemption to all requirements in the GDPR in case of processing data for scientific research purposes. It should be taken into account that this regime only aims to provide for exceptions to specific requirements in specific situations and that the use of such exceptions is made dependent on ‘additional safeguards’ (Article 89(1) GDPR) to be in place.
  • The Government Accountability Office (GAO) has assessed how well the Federal Communications Commission (FCC) has rolled out and implemented its Lifeline National Verifier (referred to as the Verifier by the GAO) to aid low-income people in accessing telecommunications benefits. The Verifier was established in 2016 to address claims that allowing telecommunications carriers to make eligibility determinations for the program, which helps people obtain lower-cost communications, had led to waste, fraud, and abuse. House Energy and Commerce Committee Chair Frank Pallone Jr. (D-NJ), Communications and Technology Subcommittee Chair Mike Doyle (D-PA), and six Democratic colleagues on the committee asked the GAO “to review FCC’s implementation of the Verifier.” The GAO explained “[t]his report examines (1) the status of the Verifier; (2) the extent to which FCC coordinated with state and federal stakeholders, educated consumers, and facilitated involvement of tribal stakeholders; and (3) the extent to which the Verifier is meeting its goals.” The GAO concluded:
    • The Lifeline program is an important tool that helps low-income Americans afford vital voice and broadband services. In creating the Lifeline National Verifier, FCC sought to facilitate eligible Americans’ access to Lifeline support while protecting the program from waste, fraud, and abuse. Although USAC, under FCC’s oversight, has made progress to implement the Verifier, many eligible consumers are unaware of it and may be unable to use it. Additionally, tribal governments and organizations do not have the information they need from FCC to effectively assist residents of tribal lands in using the Verifier to enroll in Lifeline, even though Lifeline support is critical to increasing access to affordable telecommunications services on tribal lands. Without FCC developing a plan to educate consumers about the Verifier and empowering tribal governments to assist residents of tribal lands with the Verifier, eligible consumers, especially those on tribal lands, will continue to lack awareness of the Verifier and the ability to use it.
    • Further, without measures and information to assess progress toward some of its goals, FCC lacks information it needs to refine and improve the Verifier. While it is too soon to determine if the Verifier is protecting against fraud, FCC has measures in place to monitor fraud moving forward. However, FCC lacks measures to track the Verifier’s progress toward the intent of its second goal of delivering value to Lifeline consumers. FCC also lacks information to help it assess and improve its efforts to meet the third goal of improving the consumer experience. Additionally, consumers may experience challenges with the Verifier’s online application, such as difficulty identifying the Verifier as a government service, and may be uncomfortable providing sensitive information to a website that does not use a “.gov” domain. Unless FCC identifies and addresses challenges with the Verifier’s manual review process and its online application, it will be limited in its ability to improve the consumer experience. As a result, some eligible consumers may abandon their applications and go without the support they need to access crucial telecommunications services. Given that a majority of Lifeline subscribers live in states without state database connections and therefore must undergo manual review more frequently, ensuring that challenges with the manual review process are resolved is particularly important.
    • The GAO recommended:
      • The Chairman of FCC should develop and implement a plan to educate eligible consumers about the Lifeline program and Verifier requirements that aligns with key practices for consumer education planning. (Recommendation 1)
      • The Chairman of FCC should provide tribal organizations with targeted information and tools, such as access to the Verifier, that equip them to assist residents of tribal lands with their Verifier applications. (Recommendation 2)
      • The Chairman of FCC should identify and use performance measures to track the Verifier’s progress in delivering value to consumers. (Recommendation 3)
      • The Chairman of FCC should ensure that it has quality information on consumers’ experience with the Verifier’s manual review process, and should use that information to improve the consumer experience to meet the Verifier’s goals. (Recommendation 4)
      • The Chairman of FCC should ensure that the Verifier’s online application and support website align with characteristics for leading federal website design, including that they are accurate, clear, understandable, easy to use, and contain a mechanism for users to provide feedback. (Recommendation 5)
      • The Chairman of FCC should convert the Verifier’s online application, checklifeline.org, to a “.gov” domain. (Recommendation 6)

Coming Events

  • The House Appropriations Committee’s Financial Services and General Government Subcommittee will hold an oversight hearing on the Election Assistance Commission (EAC) on 16 February with EAC Chair Benjamin Hovland.
  • On 17 February, the House Energy and Commerce Committee’s Communications and Technology Subcommittee will hold a hearing titled “Connecting America: Broadband Solutions to Pandemic Problems” with these witnesses:
    • Free Press Action Vice President of Policy and General Counsel Matthew F. Wood
    • Topeka Public Schools Superintendent Dr. Tiffany Anderson
    • Communications Workers of America President Christopher M. Shelton
    • Wireless Infrastructure Association President and CEO Jonathan Adelstein
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 18 February, the House Financial Services Committee will hold a hearing titled “Game Stopped? Who Wins and Loses When Short Sellers, Social Media, and Retail Investors Collide” with Reddit Co-Founder and Chief Executive Officer Steve Huffman testifying along with other witnesses.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (2 February 2021)

Further Reading

  • “I checked Apple’s new privacy ‘nutrition labels.’ Many were false.” By Geoffrey Fowler — The Washington Post. It turns out the blue check mark in Apple’s App Store signifying that an app does not collect personal data is based on the honor system. As the Post’s technology columnist learned, Apple tells users this in very small print: “This information has not been verified by Apple.” And so, as Fowler explains, this would seem contrary to the company’s claims of making user privacy a core value. Also, Apple’s definition of tracking is narrow, suggesting the company may be defining its way to being a champion of privacy. Finally, Apple’s practices, set against the coming iOS changes meant to defeat Facebook’s and others’ tracking of people across digital space, seem to belie the company’s PR and branding. It would seem the Federal Trade Commission (FTC) and its overseas counterparts would be interested in such deceptive and unfair practices.
  • “Lawmakers Take Aim at Insidious Digital ‘Dark Patterns’” By Tom Simonite — WIRED. Language in the “California Privacy Rights Act” (CPRA) makes consent gained through the use of “dark patterns” (i.e., all those cognitive tricks online and real-life entities use to slant the playing field against consumers) invalid. However, lest one celebrate that policymakers are addressing these underhanded means of gaining consent or selling things, the yet-to-be-established California Privacy Protection Agency will need to define what dark patterns are and write the regulations barring whatever those will be. In Washington state, the sponsors of the Washington Privacy Act (SB 5062) copied the CPRA language, setting up the possibility Washington state could follow California. It remains to be seen how, or even if, federal privacy legislation proposals deal with dark patterns. They well may, considering that Senators Mark Warner (D-VA) and Deb Fischer (R-NE) introduced the “Deceptive Experiences To Online Users Reduction (DETOUR) Act” (S.1084) in 2019. Moreover, as with the previous article, one might think the Federal Trade Commission (FTC) and its overseas counterparts would be interested in policing dark patterns.
  • “A PR screwup draws unwanted attention to Google’s Saudi data centers” By Issie Lapowsky — Protocol. The best case scenario is that Google and Snap merely misstated what cloud infrastructure and content are in the Kingdom of Saudi Arabia, and that privacy and civil liberties groups are unfairly pouncing on the companies for garbling the truth. On the other hand, it may turn out that the companies are routing traffic and content through the repressive regime, allowing a government with an abysmal human rights record to access people’s data. Time may tell what is actually happening, but the two companies are furiously telling the world that there’s nothing to see here.
  • “China’s Leader Attacks His Greatest Threat” By John Pomfret — The Atlantic. Xi Jinping, President of the People’s Republic of China (PRC) and Chairman of the Chinese Communist Party (CCP), has accelerated a crackdown on the entrepreneurs and technology companies that flourished under his predecessors. This would ultimately impair the PRC’s ambitions of becoming the world’s dominant power through technological superiority.
  • “Why Is Big Tech Policing Speech? Because the Government Isn’t” By Emily Bazelon — The New York Times. The First Amendment to the United States (U.S.) Constitution is invariably cited in the online speech debate both as a reason why people cannot be silenced and as a reason why social media platforms can silence whom they like. This is an interesting survey of this right in the U.S. and how democracies in Europe have arrived at a different understanding of permissible speech.

Other Developments

  • In a recent press conference, White House Press Secretary Jen Psaki shed light on how the Biden Administration will change United States (U.S.) policy towards the People’s Republic of China (PRC). In response to a question about how the U.S. government will deal with TikTok and the PRC generally, Psaki stated:
    • I think our approach to China remains what it has been since — for the last months, if not longer.  We’re in a serious competition with China.  Strategic competition with China is a defining feature of the 21st century.  China is engaged in conduct that hurts American workers, blunts our technological edge, and threatens our alliances and our influence in international organizations.
    • What we’ve seen over the last few years is that China is growing more authoritarian at home and more assertive abroad.  And Beijing is now challenging our security, prosperity, and values in significant ways that require a new U.S. approach. 
    • And this is one of the reasons, as we were talking about a little bit earlier, that we want to approach this with some strategic patience, and we want to conduct reviews internally, through our interagency….We wanted to engage more with Republicans and Democrats in Congress to discuss the path forward.  And most importantly, we want to discuss this with our allies. 
    • We believe that this moment requires a strategic and a new approach forward.
    • [T]echnology, as I just noted, is, of course, at the center of the U.S.-China competition.  China has been willing to do whatever it takes to gain a technological advantage — stealing intellectual property, engaging in industrial espionage, and forcing technology transfer.
    • Our view — the President’s view is we need to play a better defense, which must include holding China accountable for its unfair and illegal practices and making sure that American technologies aren’t facilitating China’s military buildup.
    • So he’s firmly committed to making sure that Chinese companies cannot misappropriate and misuse American data.  And we need a comprehensive strategy, as I’ve said, and a more systematic approach that actually addresses the full range of these issues.
    • So there is, again, an ongoing review of a range of these issues.  We want to look at them carefully, and we’ll be committed to approaching them through the lens of ensuring we’re protecting U.S. data and America’s technological edge. 
  • The top Republican on the House Foreign Affairs Committee is calling on Senate Republicans to block Governor Gina Raimondo’s nomination to be the Secretary of Commerce until the White House indicates whether it will keep Huawei on a list of entities to which the United States (U.S.) restricts exports. Ranking Member Michael McCaul (R-TX) asserted:
    • It is incredibly alarming the Biden Administration has refused to commit to keeping Huawei on the Department of Commerce’s Entity List. Huawei is not a normal telecommunications company – it is a CCP military company that threatens 5G security in our country, steals U.S. intellectual property, and supports the Chinese Communist Party’s genocide in Xinjiang and their human rights abuses across the country. We need a Commerce Department with strong national security credentials and a Secretary with a clear understanding of the CCP threat. Saying people should not use Huawei and actually keeping them on the Entity List are two very different things that result in very different outcomes. I again strongly urge the Biden Administration to reconsider this dangerous position. Until they make their intentions clear on whether they will keep Huawei on the Entity List, I urge my Senate colleagues to hold Ms. Raimondo’s confirmation.
    • McCaul added this background:
      • After the Biden Administration’s nominee for Commerce Secretary, Gina Raimondo, caused heads to turn by refusing to commit to keeping Huawei on the Entity List, White House Press Secretary Jen Psaki seemed to double down by declining on two separate occasions when directly asked to say where President Biden stood on the issue.
      • Huawei was placed on the Commerce Department’s Entity List in August of 2019. Their addition to the Entity List was also one of the recommendations of the [House Republican’s] China Task Force Report.
  • The National Highway Traffic Safety Administration (NHTSA), an agency of the United States (U.S.) Department of Transportation (DOT), is asking for comment “on the Agency’s updated draft cybersecurity best practices document titled Cybersecurity Best Practices for the Safety of Modern Vehicles” according to the notice published in the Federal Register. Comments are due by 15 March 2021. NHTSA explained:
    • In October 2016, NHTSA issued its first best practices document focusing on the cybersecurity of motor vehicles and motor vehicle equipment. Cybersecurity Best Practices for Modern Vehicles (“2016 Best Practices”) was the culmination of years of extensive engagement with public and private stakeholders and NHTSA research on vehicle cybersecurity and methods of enhancing vehicle cybersecurity industry-wide. As explained in the accompanying Federal Register document, NHTSA’s 2016 Best Practices was released with the goal of supporting industry-led efforts to improve the industry’s cybersecurity posture and provide the Agency’s views on how the automotive industry could develop and apply sound risk-based cybersecurity management processes during the vehicle’s entire lifecycle.
    • The 2016 Best Practices leveraged existing automotive domain research as well as non-automotive and IT-focused standards such as the National Institute of Standards and Technology (NIST) Cybersecurity Framework and the Center for Internet Security’s Critical Security Controls framework. NHTSA considered these sources to be reasonably applicable and appropriate to augment the limited industry-specific guidance that was available at the time. At publication, NHTSA noted that the 2016 Best Practices were intended to be updated with new information, research, and other cybersecurity best practices related to the automotive industry. NHTSA invited comments from stakeholders and interested parties in response to the document.
    • NHTSA is docketing a draft update to the agency’s 2016 Best Practices, titled Cybersecurity Best Practices for the Safety of Modern Vehicles (2020 Best Practices) for public comments. This update builds upon agency research and industry progress since 2016, including emerging voluntary industry standards, such as the ISO/SAE Draft International Standard (DIS) 21434, “Road Vehicles—Cybersecurity Engineering.” In addition, the draft update references a series of industry best practice guides developed by the Auto-ISAC through its members.
    • The 2020 Best Practices also reflect findings from NHTSA’s continued research in motor vehicle cybersecurity, including over-the-air updates, encryption methods, and building our capability in cybersecurity penetration testing and diagnostics, and the new learnings obtained through researcher and stakeholder engagement. Finally, the updates included in the 2020 Best Practices incorporate insights gained from public comments received in response to the 2016 guidance and from information obtained during the annual SAE/NHTSA Vehicle Cybersecurity Workshops.
  • Ireland’s Data Protection Commission (DPC) has released a draft of its Fundamentals for a Child-Oriented Approach to Data Processing (Fundamentals) for consultation until 31 March 2021. The DPC asserted the
    • Fundamentals have been drawn up by the Data Protection Commission (DPC) to drive improvements in standards of data processing. They introduce child-specific data protection interpretative principles and recommended measures that will enhance the level of protection afforded to children against the data processing risks posed to them by their use of/ access to services in both an online and offline world. In tandem, the Fundamentals will assist organisations that process children’s data by clarifying the principles, arising from the high-level obligations under the GDPR, to which the DPC expects such organisations to adhere.
    • The DPC “identified the following 14 Fundamentals that organisations should follow to enhance protections for children in the processing of their personal data:
      • 1. FLOOR OF PROTECTION: Online service providers should provide a “floor” of protection for all users, unless they take a risk-based approach to verifying the age of their users so that the protections set out in these Fundamentals are applied to all processing of children’s data (Section 1.4 “Complying with the Fundamentals”).
      • 2. CLEAR-CUT CONSENT: When a child has given consent for their data to be processed, that consent must be freely given, specific, informed and unambiguous, made by way of a clear statement or affirmative action (Section 2.4 “Legal bases for processing children’s data”).
      • 3. ZERO INTERFERENCE: Online service providers processing children’s data should ensure that the pursuit of legitimate interests do not interfere with, conflict with or negatively impact, at any level, the best interests of the child (Section 2.4 “Legal bases for processing children’s data”).
      • 4. KNOW YOUR AUDIENCE: Online service providers should take steps to identify their users and ensure that services directed at/ intended for or likely to be accessed by children have child-specific data protection measures in place (Section 3.1 “Knowing your audience”)
      • 5. INFORMATION IN EVERY INSTANCE: Children are entitled to receive information about the processing of their own personal data irrespective of the legal basis relied on and even if consent was given by a parent on their behalf to the processing of their personal data (Section 3 “Transparency and children”).
      • 6. CHILD-ORIENTED TRANSPARENCY: Privacy information about how personal data is used must be provided in a concise, transparent, intelligible and accessible way, using clear and plain language that is comprehensible and suited to the age of the child (Section 3 “Transparency and children”).
      • 7. LET CHILDREN HAVE THEIR SAY: Online service providers shouldn’t forget that children are data subjects in their own right and have rights in relation to their personal data at any age. The DPC considers that a child may exercise these rights at any time, as long as they have the capacity to do so and it is in their best interests. (Section 4.1 “The position of children as rights holders”)
      • 8. CONSENT DOESN’T CHANGE CHILDHOOD: Consent obtained from children or from the guardians/ parents should not be used as a justification to treat children of all ages as if they were adults (Section 5.1 “Age of digital consent”).
      • 9. YOUR PLATFORM, YOUR RESPONSIBILITY: Companies who derive revenue from providing or selling services through digital and online technologies pose particular risks to the rights and freedoms of children. Where such a company uses age verification and/ or relies on parental consent for processing, the DPC will expect it to go the extra mile in proving that its measures around age verification and verification of parental consent are effective. (Section 5.2 “Verification of parental consent”)
      • 10. DON’T SHUT OUT CHILD USERS OR DOWNGRADE THEIR EXPERIENCE: If your service is directed at, intended for, or likely to be accessed by children, you can’t bypass your obligations simply by shutting them out or depriving them of a rich service experience. (Section 5.4 “Age verification and the child’s user experience”)
      • 11. MINIMUM USER AGES AREN’T AN EXCUSE: Theoretical user age thresholds for accessing services don’t displace the obligations of organisations to comply with the controller obligations under the GDPR and the standards and expectations set out in these Fundamentals where “underage” users are concerned. (Section 5.5 “Minimum user ages”)
      • 12. PROHIBITION ON PROFILING: Online service providers should not profile children and/ or carry out automated decision making in relation to children, or otherwise use their personal data, for marketing/advertising purposes due to their particular vulnerability and susceptibility to behavioural advertising, unless they can clearly demonstrate how and why it is in the best interests of the child to do so (Section 6.2 “Profiling and automated decision making”).
      • 13. DO A DPIA: Online service providers should undertake data protection impact assessments to minimise the data protection risks of their services, and in particular the specific risks to children which arise from the processing of their personal data. The principle of the best interests of the child must be a key criterion in any DPIA and must prevail over the commercial interests of an organisation in the event of a conflict between the two sets of interests (Section 7.1 “Data Protection Impact Assessments”).
      • 14. BAKE IT IN: Online service providers that routinely process children’s personal data should, by design and by default, have a consistently high level of data protection which is “baked in” across their services (Section 7.2 “Data Protection by Design and Default”)
  • The United Kingdom’s (UK) Competition and Markets Authority (CMA) “is now seeking evidence from academics and industry experts on the potential harms to competition and consumers caused by the deliberate or unintended misuse of algorithms…[and] is also looking for intelligence on specific issues with particular firms that the CMA could examine and consider for future action.” CMA stated “[t]he research and feedback will inform the CMA’s future work in digital markets, including its programme on analysing algorithms and the operation of the new Digital Markets Unit (DMU), and the brand-new regulatory regime that the DMU will oversee.” The CMA stated:
    • Algorithms can be used to personalise services in ways that are difficult to detect, leading to search results that can be manipulated to reduce choice or artificially change consumers’ perceptions. An example of this is misleading messages which suggest a product is in short supply.
    • Companies can also use algorithms to change the way they rank products on websites, preferencing their own products and excluding competitors. More complex algorithms could aid collusion between businesses without firms directly sharing information. This could lead to sustained higher prices for products and services.
    • The majority of algorithms used by private firms online are currently subject to little or no regulatory oversight and the research concludes that more monitoring and action is required by regulators, including the CMA. The CMA has already considered the impact of algorithms on competition and consumers in previous investigations, for example monitoring the pricing practices of online travel agents.
    • In the algorithms paper, the CMA explained:
      • The publication of this paper, and the accompanying call for information mark the launch of a new CMA programme of work on analysing algorithms, which aims to develop our knowledge and help us better identify and address harms. This paper reviews the potential harms to competition and consumers from the use of algorithms, focussing on those the CMA or other national competition or consumer authorities may be best placed to address.
      • We first describe direct harms to consumers, many of which involve personalisation. Personalisation can be harmful because it is difficult to detect either by consumers or others, targets vulnerable consumers or has unfair distributive effects. These harms often occur through the manipulation of consumer choices, without the awareness of the consumer.
      • The paper then explores how the use of algorithms can exclude competitors and so reduce competition (for example, a platform preferencing its own products). We outline the most recent developments in the algorithmic collusion literature; collusion appears an increasingly significant risk if the use of more complex pricing algorithms becomes widespread. We also describe how using ineffective algorithms to oversee platform activity fails to prevent harm.
      • Next, we summarise techniques that could be used to analyse algorithmic systems. Potentially problematic systems can be identified even without access to underlying algorithms and data. However, to understand fully how an algorithmic system works and whether consumer or competition law is being breached, regulators need appropriate methods to audit the system. We finally discuss the role of regulators. Regulators can help to set standards and facilitate better accountability of algorithmic systems, including support for the development of ethical approaches, guidelines, tools and principles. They can also use their information gathering powers to identify and remedy harms on either a case-by-case basis or as part of an ex-ante regime overseen by a regulator of technology firms, such as the proposed Digital Markets Unit (DMU) in the UK.
  • The National Institute of Standards and Technology (NIST) is making available for comment a draft of NIST Special Publication (SP) 800-47 Revision 1, Managing the Security of Information Exchanges, that “provides guidance on identifying information exchanges; risk-based considerations for protecting exchanged information before, during, and after the exchange; and example agreements for managing the protection of the exchanged information.” NIST is accepting comments through 12 March 2021. The agency stated:
    • Rather than focus on any particular type of technology-based connection or information access, this draft publication has been updated to define the scope of information exchange, describe the benefits of securely managing the information exchange, identify types of information exchanges, discuss potential security risks associated with information exchange, and detail a four-phase methodology to securely manage information exchange between systems and organizations. Organizations are expected to further tailor the guidance to meet specific organizational needs and requirements.
    • NIST is specifically interested in feedback on:
      • Whether the agreements addressed in the draft publication represent a comprehensive set of agreements needed to manage the security of information exchange.
      • Whether the matrix provided to determine what types of agreements are needed is helpful in determining appropriate agreement types.
      • Whether additional agreement types are needed, as well as examples of additional agreements.
      • Additional resources to help manage the security of information exchange.

Coming Events

  • On 3 February, the Senate Commerce, Science, and Transportation Committee will consider the nomination of Rhode Island Governor Gina Raimondo to be the Secretary of Commerce.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.



Trump Issues Cloud EO; Its Fate Is Unclear

On the last full day of his administration, then-President Donald Trump issued an executive order (EO) “to address the use of United States Infrastructure as a Service (IaaS) products by foreign malicious cyber actors.” This EO follows an Obama Administration EO that set up a formal structure to sanction foreign entities for “significant malicious cyber activities” and seeks to further use the emergency powers granted to the President in the 1970s to address the threats allegedly posed by malicious actors using IaaS (such as cloud computing) to inflict significant harm on the United States (U.S.).

Given the review the Biden Administration is undertaking, particularly of so-called “midnight” regulations and directives, this EO will undoubtedly be scrutinized by the new White House and possibly modified or even withdrawn, so its implementation is not certain. That is especially true because the operative parts of the EO require notice and comment rulemaking, which will proceed only as the Biden Administration sees fit. And one can be sure that Amazon, Google, Microsoft, and other cloud and IaaS providers have lobbied and will continue to lobby the White House, the agencies charged with implementing the EO, and Congress in hopes that Capitol Hill exerts pressure of its own.

This EO is almost certainly aimed at nation-state hackers, groups affiliated with nation-states, non-criminal hackers, and criminal hackers. The U.S. government is implicitly asserting that these malicious actors are exploiting the relative anonymity and infrastructure currently available through IaaS. The EO requires IaaS providers to keep more complete records on all foreign users, a mandate industry stakeholders will undoubtedly portray as onerous, and it allows the U.S. to condition or even prohibit service to certain countries or people if it determines the risk of malicious activity is too high.

The executive order, if implemented as planned, would require cloud and other IaaS providers to collect more information on foreign users and possibly limit or shut down service for potential malicious cyber-enabled actors.

Executive Order 13984 “Taking Additional Steps To Address the National Emergency With Respect to Significant Malicious Cyber-Enabled Activities” lays out the policy rationale for its issuance:

IaaS products provide persons the ability to run software and store data on servers offered for rent or lease without responsibility for the maintenance and operating costs of those servers. Foreign malicious cyber actors aim to harm the United States economy through the theft of intellectual property and sensitive data and to threaten national security by targeting United States critical infrastructure for malicious cyber-enabled activities. Foreign actors use United States IaaS products for a variety of tasks in carrying out malicious cyber-enabled activities, which makes it extremely difficult for United States officials to track and obtain information through legal process before these foreign actors transition to replacement infrastructure and destroy evidence of their prior activities; foreign resellers of United States IaaS products make it easier for foreign actors to access these products and evade detection.

The EO defines “Infrastructure as a Service Product” as:

any product or service offered to a consumer, including complimentary or “trial” offerings, that provides processing, storage, networks, or other fundamental computing resources, and with which the consumer is able to deploy and run software that is not predefined, including operating systems and applications. The consumer typically does not manage or control most of the underlying hardware but has control over the operating systems, storage, and any deployed applications. The term is inclusive of “managed” products or services, in which the provider is responsible for some aspects of system configuration or maintenance, and “unmanaged” products or services, in which the provider is only responsible for ensuring that the product is available to the consumer. The term is also inclusive of “virtualized” products and services, in which the computing resources of a physical machine are split between virtualized computers accessible over the internet (e.g., “virtual private servers”), and “dedicated” products or services in which the total computing resources of a physical machine are provided to a single person (e.g., “bare-metal” servers);

And so, the EO requires the Department of Commerce (Commerce) to “propose for notice and comment regulations that require United States IaaS providers to verify the identity of a foreign person that obtains an Account.” The EO suggests the criteria and metrics Commerce should consider using but largely leaves this determination to the agency. And, in the list of considerations, the first leverage point industry will likely use appears: the discretionary authority Commerce will have to exempt certain U.S. IaaS providers, which may include a finding that a provider “complies with security best practices to otherwise deter abuse of IaaS products.” Commerce will also need to propose regulations through a notice and comment rulemaking requiring U.S. IaaS providers to take “special measures” if Commerce determines there are reasonable grounds to conclude a foreign jurisdiction has either a significant number of resellers or people offering U.S. IaaS for “malicious cyber-enabled activities” or a significant number of people who are, in fact, using IaaS for these activities. Commerce may do the same if a foreign person, group of people, or entity is offering or using IaaS for malicious cyber-enabled activities. Another leverage point for U.S. IaaS providers and other stakeholders appears in this section because Commerce must consider:

(i) whether the imposition of any special measure would create a significant competitive disadvantage, including any undue cost or burden associated with compliance, for United States IaaS providers;

(ii) the extent to which the imposition of any special measure or the timing of the special measure would have a significant adverse effect on legitimate business activities involving the particular foreign jurisdiction or foreign person; and

(iii) the effect of any special measure on United States national security, law enforcement investigations, or foreign policy.

Commerce’s “special measures” consist of prohibiting or conditioning the use of U.S. IaaS. What these may look like should be spelled out in the draft regulations Commerce is required to propose.

The Departments of Justice (DOJ) and Homeland Security (DHS) must study how to increase information sharing among IaaS providers, other stakeholders, and U.S. agencies with the aim of decreasing malicious cyber-enabled activities. Thereafter, DOJ and DHS would submit a report to the President identifying gaps in authority, including protection from legal liability, statutes and regulations that could foster greater sharing of information, and the current landscape of threats posed by the use of IaaS for malicious cyber-enabled activities.

Commerce also needs to work with some U.S. agencies to identify “funding requirements to support the efforts described in this order and incorporate such requirements into its annual budget submissions to the Office of Management and Budget” (OMB). In other words, agencies will need to fashion their budget requests to OMB to prioritize resources for the execution of this EO, which is not to say this will become their top policy priority. But, depending on the buy-in from OMB, this White House office could exert pressure on agencies to follow through in setting aside funds and executing this EO.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Magda Ehlers from Pexels

Further Reading, Other Developments, and Coming Events (20 and 21 January 2021)

Further Reading

  • Amazon’s Ring Neighbors app exposed users’ precise locations and home addresses” By Zack Whittaker — TechCrunch. Once again, Amazon’s home security platform suffers problems by way of user data being exposed or inadequately protected.
  • Harassment of Chinese dissidents was warning signal on disinformation” By Shawna Chen and Bethany Allen-Ebrahimian — Axios. An example of how malicious online activities can spill into the real world: a number of Chinese dissidents were set upon by protestors.
  • How Social Media’s Obsession with Scale Supercharged Disinformation” By Joan Donovan — Harvard Business Review. Companies like Facebook and Twitter emphasized scale over safety in trying to grow as quickly as possible. This led to a proliferation of fake accounts and proved fertile ground for the seeds of misinformation.
  • The Moderation War Is Coming to Spotify, Substack, and Clubhouse” By Alex Kantrowitz — OneZero. The same issues with objectionable and abusive content plaguing Twitter, Facebook, YouTube and others will almost certainly become an issue for the newer platforms, and in fact already are.
  • Mexican president mounts campaign against social media bans” By Mark Stevenson — The Associated Press. Mexico’s leftist President Andrés Manuel López Obrador is vowing to lead international efforts to stop social media companies from censoring what he considers free speech. Whether this materializes into something substantial is not clear.
  • As Trump Clashes With Big Tech, China’s Censored Internet Takes His Side” By Li Yuan — The New York Times. The government in Beijing is framing the ban of former President Donald Trump after the attempted insurrection by social media platforms as proof there is no untrammeled freedom of speech. This position helps bolster the oppressive policing of online content the People’s Republic of China (PRC) wages against its citizens. And quite separately many Chinese people (or what appear to be actual people) are questioning what is often deemed the censoring of Trump in the United States (U.S.), a nation ostensibly committed to free speech. There is also widespread misunderstanding about the First Amendment rights of social media platforms not to host content with which they disagree and the power of platforms to make such determinations without fear that the U.S. government will punish them as is often the case in the PRC.
  • Trump admin slams China’s Huawei, halting shipments from Intel, others – sources” By Karen Freifeld and Alexandra Alper — Reuters. On its way out of the proverbial door, the Trump Administration delivered parting shots at Huawei and the People’s Republic of China by revoking one license, and denying other applications, to sell semiconductors to the PRC tech giant. Whether the Biden Administration will reverse or stand by these actions remains to be seen. The companies, including Intel, could appeal. Additionally, there is an estimated $400 million worth of applications for similar licenses pending at the Department of Commerce that are now the domain of the new regime in Washington. It is too early to discern how the Biden Administration will maintain or modify Trump Administration policy towards the PRC.
  • Behind a Secret Deal Between Google and Facebook” By Daisuke Wakabayashi and Tiffany Hsu — The New York Times. The newspaper got its hands on an unredacted copy of the antitrust suit Texas Attorney General Ken Paxton and other attorneys general filed against Google, and it has details on the deal Facebook and Google allegedly struck to divide the online advertising world. Not only did Facebook ditch an effort launched by publishers to defeat Google’s overwhelming advantages in online advertising bidding, it joined Google’s rival effort with a guarantee that it would win a specified number of bids and more time to bid on ads. Google and Facebook naturally deny any wrongdoing.
  • Biden and Trump Voters Were Exposed to Radically Different Coverage of the Capitol Riot on Facebook” By Colin Lecher and Jon Keegan — The Markup. Using a browser tool the organization pays Facebook users to install, The Markup can track the type of material they see in their feeds. Facebook’s algorithm fed people material about the 6 January 2021 attempted insurrection based on their political views. Many have pointed out that this very dynamic creates filter bubbles that poison democracy and public discourse.
  • Banning Trump won’t fix social media: 10 ideas to rebuild our broken internet – by experts” By Julia Carrie Wong — The Guardian. There are some fascinating proposals in this piece that could help address the problems of social media.
  • Misinformation dropped dramatically the week after Twitter banned Trump and some allies” By Elizabeth Dwoskin and Craig Timberg — The Washington Post. Research showed that lies, misinformation, and disinformation about election fraud dropped by three-quarters after former President Donald Trump was banned from Twitter and other platforms. Other research showed that a small group of conservatives were responsible for up to 20% of misinformation on this and other conspiracies.
  • This Was WhatsApp’s Plan All Along” By Shoshana Wodinsky — Gizmodo. This piece does a great job of breaking down into plain English the proposed changes to terms of service on WhatsApp that so enraged users that competitors Signal and Telegram have seen record-breaking downloads. Basically, it is all about reaping advertising dollars for Facebook through businesses and third-party partners using user data from business-related communications. Incidentally, WhatsApp has delayed changes until March because of the pushback.
  • Brussels eclipsed as EU countries roll out their own tech rules” By Laura Kayali and Mark Scott — Politico EU. The European Union (EU) had a hard enough task in trying to reach final language on a Digital Services Act and Digital Markets Act without nations like France, Germany, Poland, and others picking and choosing text from draft bills and enacting them into law. Brussels is not happy with this trend.

Other Developments

  • Federal Trade Commission (FTC) Chair Joseph J. Simons announced his resignation from the FTC effective on 29 January 2021 in keeping with tradition and past practice. This resignation clears the way for President Joe Biden to name the chair of the FTC, and along with FTC Commissioner Rohit Chopra’s nomination to head the Consumer Financial Protection Bureau (CFPB), the incoming President will get to nominate two Democratic FTC Commissioners, tipping the political balance of the FTC and likely ushering in a period of more regulation of the technology sector.
    • Simons also announced the resignation of senior staff: General Counsel Alden F. Abbott; Bureau of Competition Director Ian Conner; Bureau of Competition Deputy Directors Gail Levine and Daniel Francis; Bureau of Consumer Protection Director Andrew Smith; Bureau of Economics Director Andrew Sweeting; Office of Public Affairs Director Cathy MacFarlane; and Office of Policy Planning Director Bilal Sayyed.
  • In a speech delivered last week before he was sworn in, President Joe Biden announced his $1.9 trillion American Rescue Plan, and according to a summary, Biden will ask Congress to provide $10 billion for a handful of government-facing programs to improve technology. Notably, Biden “is calling on Congress to launch the most ambitious effort ever to modernize and secure federal IT and networks.” Biden is proposing to dramatically increase funding for a fund that would allow agencies to borrow and then pay back funds to update their technology. Moreover, Biden is looking to push more money to a program to aid officials at agencies who oversee technology development and procurement.
    • Biden stated “[t]o remediate the SolarWinds breach and boost U.S. defenses, including of the COVID-19 vaccine process, President-elect Biden is calling on Congress to:
      • Expand and improve the Technology Modernization Fund. ​A $9 billion investment will help the U.S. launch major new IT and cybersecurity shared services at the Cyber Security and Information Security Agency (CISA) and the General Services Administration and complete modernization projects at federal agencies. ​In addition, the president-elect is calling on Congress to change the fund’s reimbursement structure in order to fund more innovative and impactful projects.
      • Surge cybersecurity technology and engineering expert hiring​. Providing the Information Technology Oversight and Reform fund with $200 million will allow for the rapid hiring of hundreds of experts to support the federal Chief Information Security Officer and U.S. Digital Service.
      • Build shared, secure services to drive transformational projects. Investing $300 million in no-year funding for Technology Transformation Services in the General Services Administration will drive secure IT projects forward without the need of reimbursement from agencies.
      • Improving security monitoring and incident response activities. ​An additional $690M for CISA will bolster cybersecurity across federal civilian networks, and support the piloting of new shared security and cloud computing services.
  • The United States (U.S.) Department of Commerce issued an interim final rule pursuant to an executive order (EO) issued by former President Donald Trump to secure the U.S. information and communications supply chain. This rule will undoubtedly be reviewed by the Biden Administration and may be withdrawn or modified depending on the fate of the EO on which the rule relies.
    • In the interim final rule, Commerce explained:
      • These regulations create the processes and procedures that the Secretary of Commerce will use to identify, assess, and address certain transactions, including classes of transactions, between U.S. persons and foreign persons that involve information and communications technology or services designed, developed, manufactured, or supplied, by persons owned by, controlled by, or subject to the jurisdiction or direction of a foreign adversary; and pose an undue or unacceptable risk. While this interim final rule will become effective on March 22, 2021, the Department of Commerce continues to welcome public input and is thus seeking additional public comment. Once any additional comments have been evaluated, the Department is committed to issuing a final rule.
      • On November 27, 2019, the Department of Commerce (Department) published a proposed rule to implement the terms of the Executive Order. (84 FR 65316). The proposed rule set forth processes for (1) how the Secretary would evaluate and assess transactions involving ICTS to determine whether they pose an undue risk of sabotage to or subversion of the ICTS supply chain, or an unacceptable risk to the national security of the United States or the security and safety of U.S. persons; (2) how the Secretary would notify parties to transactions under review of the Secretary’s decision regarding the ICTS Transaction, including whether the Secretary would prohibit or mitigate the transaction; and (3) how parties to transactions reviewed by the Secretary could comment on the Secretary’s preliminary decisions. The proposed rule also provided that the Secretary could act without complying with the proposed procedures where required by national security. Finally, the Secretary would establish penalties for violations of mitigation agreements, the regulations, or the Executive Order.
      • In addition to seeking general public comment, the Department requested comments from the public on five specific questions: (1) Whether the Secretary should consider categorical exclusions or whether there are classes of persons whose use of ICTS cannot violate the Executive Order; (2) whether there are categories of uses or of risks that are always capable of being reliably and adequately mitigated; (3) how the Secretary should monitor and enforce any mitigation agreements applied to a transaction; (4) how the terms, “transaction,” “dealing in,” and “use of” should be clarified in the rule; and (5) whether the Department should add record-keeping requirements for information related to transactions.
      • The list of “foreign adversaries” consists of the following foreign governments and non-government persons: The People’s Republic of China, including the Hong Kong Special Administrative Region (China); the Republic of Cuba (Cuba); the Islamic Republic of Iran (Iran); the Democratic People’s Republic of Korea (North Korea); the Russian Federation (Russia); and Venezuelan politician Nicolás Maduro (Maduro Regime).
  • The Federal Trade Commission (FTC) adjusted its penalty amounts for inflation, including a boost to the per-violation penalty that virtually all the privacy bills introduced in the last Congress would allow the agency to wield against first-time violators. The penalty for certain unfair and deceptive acts or practices increased from $43,280 to $43,792.
  • The United States (U.S.) Department of State stood up its new Bureau of Cyberspace Security and Emerging Technologies (CSET) as it has long planned. At the beginning of the Trump Administration, the Department of State dismantled the Cyber Coordinator Office and gave its cybersecurity portfolio to the Bureau of Economic Affairs, which displeased Congressional stakeholders. In 2019, the department notified Congress of its plan to establish CSET. The department asserted:
    • The need to reorganize and resource America’s cyberspace and emerging technology security diplomacy through the creation of CSET is critical, as the challenges to U.S. national security presented by China, Russia, Iran, North Korea, and other cyber and emerging technology competitors and adversaries have only increased since the Department notified Congress in June 2019 of its intent to create CSET.
    • The CSET bureau will lead U.S. government diplomatic efforts on a wide range of international cyberspace security and emerging technology policy issues that affect U.S. foreign policy and national security, including securing cyberspace and critical technologies, reducing the likelihood of cyber conflict, and prevailing in strategic cyber competition.  The Secretary’s decision to establish CSET will permit the Department to posture itself appropriately and engage as effectively as possible with partners and allies on these pressing national security concerns.
    • The Congressional Members of the Cyberspace Solarium Commission made clear their disapproval of the decision. Senators Angus King (I-ME) and Ben Sasse (R-NE) and Representatives Mike Gallagher (R-WI) and Jim Langevin (D-RI) said:
      • In our report, we emphasize the need for a greater emphasis on international cyber policy at State. However, unlike the bipartisan Cyber Diplomacy Act, the State Department’s proposed Bureau will reinforce existing silos and […] hinder the development of a holistic strategy to promote cyberspace stability on the international stage. We urge President-elect Biden to pause this reorganization when he takes office in two weeks and work with Congress to enact meaningful reform to protect our country in cyberspace.
  • The Australian Cyber Security Centre (ACSC) released the Risk Identification Guidance, “developed to assist organisations in identifying risks associated with their use of suppliers, manufacturers, distributors and retailers (i.e. businesses that constitute their cyber supply chain),” and the Risk Management Guidance because “[c]yber supply chain risk management can be achieved by identifying the cyber supply chain, understanding cyber supply chain risk, setting cyber security expectations, auditing for compliance, and monitoring and improving cyber supply chain security practices.”
  • The United Kingdom’s Surveillance Camera Commissioner (SCC) issued “best practice guidance, ‘Facing the Camera’, to all police forces in England and Wales.” The SCC explained that “[t]he provisions of this document only apply to the use of facial recognition technology and the inherent processing of images by the police where such use is integral to a surveillance camera system being operated in ‘live time’ or ‘near real time’ operational scenarios.” Last summer, a British appeals court overturned a decision that had found a police force’s use of facial recognition technology, in a pilot program that utilized live footage, to be legal. The appeals court found the use of this technology by the South Wales Police Force violated “the right to respect for private life under Article 8 of the European Convention on Human Rights, data protection legislation, and the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010.” The SCC stated:
    • The SCC considers surveillance to be an intrusive investigatory power where it is conducted by the police which impacts upon those fundamental rights and freedoms of people, as set out by the European Convention of Human Rights (ECHR) and the Human Rights Act 1998. In the context of surveillance camera systems which make use of facial recognition technology, the extent of state intrusion in such matters is significantly increased by the capabilities of algorithms which are in essence, integral to the surveillance conduct seeking to harvest information, private information, metadata, data, personal data, intelligence and evidence. Each of the aforementioned are bound by laws and rules which ought to be separately and jointly considered and applied in a manner which is demonstrably lawful and ethical and engenders public trust and confidence.
    • Whenever the police seek to use technology in pursuit of a legitimate aim, the key question arises as to whether the degree of intrusion which is caused to the fundamental freedoms of citizens by the police surveillance conduct using surveillance algorithms (biometric or otherwise) is necessary in a democratic society when considered alongside the legality and proportionality of their endeavours and intent. The type of equipment/technology/modality which they choose to use to that end (e.g. LFR, ANPR, thermal imaging, gait analysis, movement sensors etc), the manner in which such technological means are deployed, (such as using static cameras at various locations, used with body worn cameras or other mobile means), and whether such technology is used overtly alongside or networked with other surveillance technologies, are all factors which may significantly influence the depth of intrusion caused by police conduct upon citizen’s rights.
  • The Senate confirmed the nomination of Avril Haines to be the new Director of National Intelligence by an 89-10 vote after Senator Tom Cotton (R-AR) removed his hold on her nomination. However, Senator Josh Hawley (R-MO) placed a hold on the nomination of Alejandro Mayorkas to be the next Secretary of Homeland Security and explained his action this way:
    • On Day 1 of his administration, President-elect Biden has said he plans to unveil an amnesty plan for 11 million immigrants in this nation illegally. This comes at a time when millions of American citizens remain out of work and a new migrant caravan has been attempting to reach the United States. Mr. Mayorkas has not adequately explained how he will enforce federal law and secure the southern border given President-elect Biden’s promise to roll back major enforcement and security measures. Just today, he declined to say he would enforce the laws Congress has already passed to secure the border wall system. Given this, I cannot consent to skip the standard vetting process and fast-track this nomination when so many questions remain unanswered.
  • Former Trump White House Cyber Coordinator Rob Joyce will replace Anne Neuberger as the National Security Agency’s (NSA) Director of Cybersecurity; Neuberger has been named the Biden White House’s Deputy National Security Advisor for Cyber and Emerging Technology. Neuberger’s portfolio at the NSA included “lead[ing] NSA’s cybersecurity mission, including emerging technology areas like quantum-resistant cryptography,” and presumably Joyce will take on the same responsibilities. Joyce was purged when former National Security Advisor John Bolton restructured the NSC in 2018, forcing out Joyce and his boss, former Homeland Security Advisor Tom Bossert. At the National Security Council, Neuberger will work to coordinate cybersecurity and emerging technology policy across agencies and funnel policy options up to the full NSC and ultimately the President. This work would include Joyce.
  • The Supreme Court of the United States (SCOTUS) heard oral arguments on whether the Federal Trade Commission (FTC) Act gives the agency the power to seek monetary damages and restitution alongside permanent injunctions under Section 13(b). In AMG Capital Management, LLC v. FTC, the parties opposing the FTC argue the plain language of the statute does not allow for the seeking of restitution and monetary damages under this specific section of the FTC Act, while the agency argues long-accepted past practice and Congressional intent do, in fact, allow this relief to be sought when the FTC is seeking to punish violators of Section 5. The FTC is pursuing a separate track to get a fix from Congress, which could rewrite the FTC Act to make clear this sort of relief is legal. However, some stakeholders in the debate over privacy legislation may be using the case as leverage.
    • In October 2020, the FTC wrote the House and Senate committees with jurisdiction over the agency, asking for language to resolve the litigation over the power to seek and obtain restitution for victims of those who have violated Section 5 of the FTC Act and disgorgement of ill-gotten gains. The FTC is also asking that Congress clarify that the agency may act against violators even if their conduct has stopped as it has for more than four decades. Two federal appeals courts have ruled in ways that have limited the FTC’s long used powers, and now the Supreme Court of the United States is set to rule on these issues sometime next year. The FTC is claiming, however, that defendants are playing for time in the hopes that the FTC’s authority to seek and receive monetary penalties will ultimately be limited by the United States (U.S.) highest court. Judging by language tucked into a privacy bill introduced by the former chair of one of the committees, Congress may be willing to act soon.
    • The FTC asked the House Energy and Commerce and Senate Commerce, Science, and Transportation Committees “to take quick action to amend Section 13(b) [of the FTC Act i.e. 15 U.S.C. § 53(b)] to make clear that the Commission can bring actions in federal court under Section 13(b) even if conduct is no longer ongoing or impending when the suit is filed and can obtain monetary relief, including restitution and disgorgement, if successful.” The agency asserted “[w]ithout congressional action, the Commission’s ability to use Section 13(b) to provide refunds to consumer victims and to enjoin illegal activity is severely threatened.” All five FTC Commissioners signed the letter.
    • The FTC explained that adverse rulings by two federal appeals courts are constraining the agency from seeking relief for victims and punishment for violators of the FTC Act in federal courts below those two specific courts, but elsewhere defendants are either asking courts for a similar ruling or using delaying tactics in the hopes the Supreme Court upholds the two federal appeals courts:
      • …[C]ourts of appeals in the Third and Seventh Circuits have recently ruled that the agency cannot obtain any monetary relief under Section 13(b). Although review in the Supreme Court is pending, these lower court decisions are already inhibiting our ability to obtain monetary relief under 13(b). Not only do these decisions already prevent us from obtaining redress for consumers in the circuits where they issued, prospective defendants are routinely invoking them in refusing to settle cases with agreed-upon redress payments.
      • Moreover, defendants in our law enforcement actions pending in other circuits are seeking to expand the rulings to those circuits and taking steps to delay litigation in anticipation of a potential Supreme Court ruling that would allow them to escape liability for any monetary relief caused by their unlawful conduct. This is a significant impediment to the agency’s effectiveness, its ability to provide redress to consumer victims, and its ability to prevent entities who violate the law from profiting from their wrongdoing.
  • The United Kingdom’s Information Commissioner’s Office (ICO) issued guidance for British entities that may be affected by the massive SolarWinds hack that has compromised many key systems in the United States. The ICO advised (a brief illustrative sketch of the version check appears after the list of developments below):
    • Organisations should immediately check whether they are using a version of the software that has been compromised. These are versions 2019.4 HF 5, 2020.2 with no hotfix installed, and 2020.2 HF 1.
    • Organisations must also determine if the personal data they hold has been affected by the cyber-attack. If a reportable personal data breach is found, UK data controllers are required to inform the ICO within 72 hours of discovering the breach. Reports can be submitted online or organisations can call the ICO’s personal data breach helpline for advice on 0303 123 1113, option 2.
    • Organisations subject to the NIS Regulation will also need to determine if this incident has led to a “substantial impact on the provision” of its digital services and report to the ICO.
  • Europol announced the takedown of “the world’s largest illegal marketplace on the dark web” in an operation coordinated by the following nations: “Germany, Australia, Denmark, Moldova, Ukraine, the United Kingdom (the National Crime Agency), and the USA (DEA, FBI, and IRS).” Europol added:
    • The Central Criminal Investigation Department in the German city of Oldenburg arrested an Australian citizen who is the alleged operator of DarkMarket near the German-Danish border over the weekend. The investigation, which was led by the cybercrime unit of the Koblenz Public Prosecutor’s Office, allowed officers to locate and close the marketplace, switch off the servers and seize the criminal infrastructure – more than 20 servers in Moldova and Ukraine supported by the German Federal Criminal Police office (BKA). The stored data will give investigators new leads to further investigate moderators, sellers, and buyers. 
  • The Enforcement Bureau (Bureau) of the Federal Communications Commission (FCC) issued an enforcement advisory intended to remind people that using amateur and personal radios to commit crimes is itself a criminal offense that could warrant prosecution. The notice was issued because the FCC claims it is aware of discussions by some about how these means of communication may be superior to social media, which has been cracking down on extremist material since the attempted insurrection at the United States Capitol on 6 January. The Bureau stated:
    • The Bureau has become aware of discussions on social media platforms suggesting that certain radio services regulated by the Commission may be an alternative to social media platforms for groups to communicate and coordinate future activities.  The Bureau recognizes that these services can be used for a wide range of permitted purposes, including speech that is protected under the First Amendment of the U.S. Constitution.  Amateur and Personal Radio Services, however, may not be used to commit or facilitate crimes. 
    • Specifically, the Bureau reminds amateur licensees that they are prohibited from transmitting “communications intended to facilitate a criminal act” or “messages encoded for the purpose of obscuring their meaning.” Likewise, individuals operating radios in the Personal Radio Services, a category that includes Citizens Band radios, Family Radio Service walkie-talkies, and General Mobile Radio Service, are prohibited from using those radios “in connection with any activity which is against Federal, State or local law.” Individuals using radios in the Amateur or Personal Radio Services in this manner may be subject to severe penalties, including significant fines, seizure of the offending equipment, and, in some cases, criminal prosecution.
  • The European Data Protection Board (EDPB) issued its “Strategy for 2021-2023” in order “[t]o be effective in confronting the main challenges ahead.” The EDPB cautioned:
    • This Strategy does not provide an exhaustive overview of the work of the EDPB in the years to come. Rather it sets out the four main pillars of our strategic objectives, as well as set of key actions to help achieve those objectives. The EDPB will implement this Strategy within its Work Program, and will report on the progress achieved in relation to each Pillar as part of its annual reports.
    • The EDPB listed and explained the four pillars of its strategy:
      • PILLAR 1: ADVANCING HARMONISATION AND FACILITATING COMPLIANCE. The EDPB will continue to strive for a maximum degree of consistency in the application of data protection rules and limit fragmentation among Member States. In addition to providing practical, easily understandable and accessible guidance, the EDPB will develop and promote tools that help to implement data protection into practice, taking into account practical experiences of different stakeholders on the ground.
      • PILLAR 2: SUPPORTING EFFECTIVE ENFORCEMENT AND EFFICIENT COOPERATION BETWEEN NATIONAL SUPERVISORY AUTHORITIES. The EDPB is fully committed to support cooperation between all national supervisory authorities that work together to enforce European data protection law. We will streamline internal processes, combine expertise and promote enhanced coordination. We intend not only to ensure a more efficient functioning of the cooperation and consistency mechanisms, but also to strive for the development of a genuine EU-wide enforcement culture among supervisory authorities.
      • PILLAR 3: A FUNDAMENTAL RIGHTS APPROACH TO NEW TECHNOLOGIES. The protection of personal data helps to ensure that technology, new business models and society develop in accordance with our values, such as human dignity, autonomy and liberty. The EDPB will continuously monitor new and emerging technologies and their potential impact on the fundamental rights and daily lives of individuals. Data protection should work for all people, particularly in the face of processing activities presenting the greatest risks to individuals’ rights and freedoms (e.g. to prevent discrimination). We will help to shape Europe’s digital future in line with our common values and rules. We will continue to work with other regulators and policymakers to promote regulatory coherence and enhanced protection for individuals.
      • PILLAR 4: THE GLOBAL DIMENSION. The EDPB is determined to set and promote high EU and global standards for international data transfers to third countries in the private and the public sector, including in the law enforcement sector. We will reinforce our engagement with the international community to promote EU data protection as a global model and to ensure effective protection of personal data beyond EU borders.
  • The United Kingdom’s (UK) Information Commissioner’s Office (ICO) revealed that all but one of the videoconferencing platforms that received its and other data protection authorities’ (DPA) July 2020 letter, which urged them to “adopt principles to guide them in addressing some key privacy risks,” have responded. The ICO explained:
    • Microsoft, Cisco, Zoom and Google replied to the open letter. The joint signatories thank these companies for engaging on this important matter and for acknowledging and responding to the concerns raised. In their responses the companies highlighted various privacy and security best practices, measures, and tools that they advise are implemented or built-in to their video teleconferencing services.
    • The information provided by these companies is encouraging. It is a constructive foundation for further discussion on elements of the responses that the joint signatories feel would benefit from more clarity and additional supporting information.
    • The ICO stated:
      • The joint signatories have not received a response to the open letter from Houseparty. They strongly encourage Houseparty to engage with them and respond to the open letter to address the concerns raised.
  • The European Union Agency for Cybersecurity (ENISA) “launched a public consultation, which runs until 7 February 2021, on its draft of the candidate European Union Cybersecurity Certification Scheme on Cloud Services (EUCS)…[that] aims to further improve the Union’s internal market conditions for cloud services by enhancing and streamlining the services’ cybersecurity guarantees.” ENISA stated:
    • There are challenges to the certification of cloud services, such as a diverse set of market players, complex systems and a constantly evolving landscape of cloud services, as well as the existence of different schemes in Member States. The draft EUCS candidate scheme tackles these challenges by calling for cybersecurity best practices across three levels of assurance and by allowing for a transition from current national schemes in the EU. The draft EUCS candidate scheme is a horizontal and technological scheme that intends to provide cybersecurity assurance throughout the cloud supply chain, and form a sound basis for sectoral schemes.
    • More specifically, the draft EUCS candidate scheme:
      • Is a voluntary scheme;
      • The scheme’s certificates will be applicable across the EU Member States;
      • Is applicable for all kinds of cloud services – from infrastructure to applications;
      • Boosts trust in cloud services by defining a reference set of security requirements;
      • Covers three assurance levels: ‘Basic’, ‘Substantial’ and ‘High’;
      • Proposes a new approach inspired by existing national schemes and international standards;
      • Defines a transition path from national schemes in the EU;
      • Grants a three-year certification that can be renewed;
      • Includes transparency requirements such as the location of data processing and storage.
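
The ICO’s first recommendation above boils down to checking each installed copy of SolarWinds Orion against the compromised releases it lists. The snippet below is a minimal sketch of that check, assuming a Python environment and a hypothetical inventory mapping hostnames to reported Orion version strings; it is not an official ICO or SolarWinds tool, and the example data is invented for illustration.

    # Minimal sketch: flag hosts running SolarWinds Orion versions the ICO lists as
    # compromised (2019.4 HF 5, 2020.2 with no hotfix installed, and 2020.2 HF 1).
    COMPROMISED_VERSIONS = {"2019.4 HF 5", "2020.2", "2020.2 HF 1"}

    def flag_compromised(inventory):
        """Return hostnames whose reported Orion version appears on the compromised list."""
        return [host for host, version in inventory.items()
                if version.strip() in COMPROMISED_VERSIONS]

    if __name__ == "__main__":
        # Hypothetical inventory: hostname -> reported Orion version.
        inventory = {"orion-prod-01": "2020.2 HF 1", "orion-dev-02": "2020.4"}
        for host in flag_compromised(inventory):
            print(f"{host}: version on the ICO's compromised list; assess whether personal data was affected")

Any host flagged by a check like this would then trigger the ICO’s remaining steps: determining whether personal data has been affected and, if a reportable breach is found, notifying the ICO within 72 hours.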

Coming Events

  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on the nomination of Gina Raimondo to be the Secretary of Commerce on 26 January.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Peggy und Marco Lachmann-Anke from Pixabay

Preview of Senate Democratic Chairs

It’s not clear who will end up where, but new Senate chairs will change the focus and agenda of committees and debate over the next two years.

With the victories of Senators-elect Raphael Warnock (D-GA) and Jon Ossoff (D-GA), control of the United States Senate will tip to the Democrats once Vice President-elect Kamala Harris (D) is sworn in and can break the 50-50 tie in the chamber in favor of the Democrats. With the shift in control, new chairs will take over committees key to setting the agenda over the next two years in the Senate. However, given the filibuster, and the fact that Senate Republicans will exert maximum leverage through its continued use, Democrats will be hamstrung and forced to work with Republicans on matters such as federal privacy legislation, artificial intelligence (AI), the Internet of Things (IOT), cybersecurity, data flows, surveillance, etc. just as Republicans have had to work with Democrats over the six years they controlled the chamber. Having said that, Democrats will be in a stronger position than they had been and will have the power to set the agenda in committee hearings, being empowered to call the lion’s share of witnesses and to control the floor agenda. What’s more, Democrats will be poised to confirm President-elect Joe Biden’s nominees at agencies like the Federal Communications Commission (FCC), Federal Trade Commission (FTC), the Department of Justice (DOJ), and others, giving the Biden Administration a free hand in many areas of technology policy.

All of that being said, this is not meant to be an exhaustive look at all the committees of jurisdiction and possible chairs. Rather, it seeks to survey likely chairs on selected committees and some of their priorities for the next two years. Subcommittee chairs will also be important, but until the cards get shuffled among the chairs, it will not be possible to see where they land at the subcommittee level.

When considering the possible Democratic chairs of committees, one must keep in mind it is often a matter of musical chairs with the most senior members getting first choice. And so, with Senator Patrick Leahy (D-VT) as the senior-most Democratic Senator, he may well choose to leave the Appropriations Committee and move back to assume the gavel of the Judiciary Committee. Leahy has long been a stakeholder on antitrust, data security, privacy, and surveillance legislation and would be in a position to influence what bills on those and other matters before the Senate look like. If Leahy does not move to the chair on Judiciary, he may still be entitled to chair a subcommittee and exert influence.

If Leahy stays put, then current Senate Minority Whip Dick Durbin (D-IL) would be poised to leapfrog Senator Dianne Feinstein (D-CA) to chair Judiciary after Feinstein was persuaded to step aside on account of her lackluster performance in a number of high-profile hearings in 2020. Durbin has also been active on privacy, data security, and surveillance issues. The Judiciary Committee will be central to a number of technology policies, including Foreign Intelligence Surveillance Act reauthorization, privacy legislation, Section 230 reform, antitrust, and others. On the Republican side of the dais, Senator Lindsey Graham (R-SC) is leaving the top post because of term limit restrictions imposed by Republicans, and Senator Charles Grassley (R-IA) is set to replace him. How this changes the 47 USC 230 (Section 230) debate is not immediately clear. And yet, Grassley and three colleagues recently urged the Trump Administration in a letter to omit language in a trade agreement with the United Kingdom (UK) that mirrors the liability protection of Section 230. Senators Rob Portman (R-OH), Mark R. Warner (D-VA), Richard Blumenthal (D-CT), and Grassley argued to U.S. Trade Representative Ambassador Robert Lighthizer that a “safe harbor” like the one provided to technology companies for hosting or moderating third party content is outdated, not needed in a free trade agreement, contrary to the will of both the Congress and UK Parliament, and likely to be changed legislatively in the near future. It is likely, however, Grassley will fall in with other Republicans propagating the narrative that social media is unfairly biased against conservatives, particularly in light of the recent purge of President Donald Trump for his many, repeated violations of policy.

The Senate Judiciary Committee will be central in any policy discussions of antitrust and anticompetition in the technology realm. But it bears note the filibuster (and the very low chances Senate Democrats would “go nuclear” and remove all vestiges of the functional supermajority requirement to pass legislation) will give Republicans leverage to block some of the more ambitious reforms Democrats might like to enact (e.g. the House Judiciary Committee’s October 2020 final report that calls for nothing less than a complete remaking of United States (U.S.) antitrust policy and law; see here for more analysis.)

It seems Senator Sherrod Brown (D-OH) will be the next chair of the Senate Banking, Housing, and Urban Affairs Committee, which has jurisdiction over cybersecurity, data security, privacy, and other issues in the financial services sector, making it a player on any legislation designed to encompass the whole of the United States economy. Having said that, it may again be the case that sponsors of, say, privacy legislation decide to cut the Gordian knot of jurisdictional turf battles by cutting out certain committees. For example, many of the privacy bills had provisions making clear they would deem financial services entities in compliance with the Financial Services Modernization Act of 1999 (P.L. 106-102) (aka Gramm-Leach-Bliley) to be in compliance with the new privacy regime. I suppose these provisions may have been included on the basis of the very high privacy and data security standards Gramm-Leach-Bliley has brought about (e.g. the Experian hack), or sponsors of federal privacy legislation made the strategic calculation to circumvent the Senate Banking Committee as much as they can. Nonetheless, this committee has sought to insert itself into the policymaking process on privacy, as Brown and outgoing Chair Mike Crapo (R-ID) requested “feedback” in February 2019 “from interested stakeholders on the collection, use and protection of sensitive information by financial regulators and private companies.” Additionally, Brown released what may be the most expansive privacy bill from the perspective of privacy and civil liberties advocates, the “Data Accountability and Transparency Act of 2020,” in June 2020 (see here for my analysis.) Therefore, Brown may continue to push for a role in federal privacy legislation with a gavel in his hands.

In a similar vein, Senator Patty Murray (D-WA) will likely take over the Senate Health, Education, Labor, and Pensions (HELP) Committee, which has jurisdiction over health information privacy and data security through the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH Act). Again, as with the Senate Banking Committee and Gramm-Leach-Bliley, most of the privacy bills exempt HIPAA-compliant entities. And yet, even if her committee is cut out of a direct role in privacy legislation, Murray will still likely exert influence through oversight of, and possible legislation changing, HIPAA regulations and the Department of Health and Human Services (HHS) enforcement and rewriting of these standards for most of the healthcare industry. For example, HHS is rushing a rewrite of the HIPAA regulations at the tail end of the Trump Administration, and Murray could be in a position to inform how the Biden Administration and Secretary of Health and Human Services-designate Xavier Becerra handle this rulemaking. Additionally, Murray may push the Office of Civil Rights (OCR), the arm of HHS that writes and enforces these regulations, to prioritize matters differently.

Senator Maria Cantwell (D-WA) appears to be the next chair of the Senate Commerce, Science, and Transportation Committee, which has arguably the largest technology portfolio in the Senate. It is the primary committee of jurisdiction for the FCC, FTC, National Telecommunications and Information Administration (NTIA), the National Institute of Standards and Technology (NIST), and the Department of Commerce. Cantwell may exert influence on which people are nominated to head and staff those agencies and others. Her committee is also the primary committee of jurisdiction for domestic and international privacy and data protection matters. And so, federal privacy legislation will likely be drafted by this committee, and legislative changes so the U.S. can enter into a new personal data sharing agreement with the European Union (EU) would also likely involve her and her committee.

Cantwell and likely next Ranking Member Roger Wicker (R-MS) agree on many elements of federal privacy law but were at odds last year on federal preemption and whether people could sue companies for privacy violations. Between them, they circulated three privacy bills. In September 2020, Wicker and three Republican colleagues introduced the “Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act” (S.4626) (see here for more analysis). Wicker had put out for comment a discussion draft, the “Consumer Data Privacy Act of 2019” (CDPA) (see here for analysis), in November 2019, shortly after Cantwell and other Democrats had introduced their privacy bill, the “Consumer Online Privacy Rights Act“ (COPRA) (S.2968) (see here for more analysis).

Cantwell could also take a leading role on Section 230, but her focus, of late, seems to be on how technology companies are wreaking havoc on traditional media. She released a report that she mentioned during her opening statement at the 23 September hearing aimed at trying to revive data privacy legislation. She and her staff investigated the decline and financial troubles of local media outlets, which are facing a cumulative loss in advertising revenue of up to 70% since 2000. And since advertising revenue has long been the lifeblood of print journalism, this has devastated local media, with many outlets shutting their doors or radically cutting their staff. This trend has been exacerbated by consolidation in the industry, often in concert with private equity or hedge funds looking to wring the last dollars of value from bargain-basement-priced newspapers. Cantwell also claimed that the overwhelming online advertising dominance of Google and Facebook has further diminished advertising revenue and other possible sources of funding through a variety of means. She intimates that much of this conduct may be illegal under U.S. law, and the FTC may well be able to use its Section 5 powers against unfair and deceptive acts and its antitrust authority to take action (see here for more analysis and context.) In this vein, Cantwell will want her committee to play a role in any antitrust policy changes, likely knowing massive changes in U.S. law are not possible in a split Senate with entrenched party positions and discipline.

Senator Jack Reed (D-RI) will take over the Senate Armed Services Committee and its portfolio over national security technology policy, which includes the cybersecurity, data protection, and supply chains of national security agencies and their contractors, AI, offensive and defensive U.S. cyber operations, and other realms. Many of the changes Reed and his committee will seek to make will come through the annual National Defense Authorization Act (NDAA) (see here and here for the many technology provisions in the FY 2021 NDAA.) Reed may also prod the Department of Defense (DOD) to implement or enforce the Cybersecurity Maturity Model Certification (CMMC) Framework differently than envisioned and designed by the Trump Administration. In December 2020, a new rule took effect designed to drive better cybersecurity among U.S. defense contractors. This rule brings together two different lines of effort to require the Defense Industrial Base (DIB) to employ better cybersecurity given the risks they face by holding and using classified information, Federal Contract Information (FCI), and Controlled Unclassified Information (CUI). The Executive Branch has long wrestled with how best to push contractors to secure their systems, and Congress and the White House have opted for using federal contract requirements in that contractors must certify compliance. However, the most recent initiative, the CMMC Framework, will require contractors to be certified by third-party assessors. And yet, it is not clear the DOD has wrestled with the often-misaligned incentives present in third-party certification schemes.

Reed’s committee will undoubtedly delve deep into the recent SolarWinds hack and implement policy changes to avoid a recurrence. Doing so may lead the Senate Armed Services Committee back to reconsidering the Cyberspace Solarium Commission’s (CSC) March 2020 final report and follow-up white papers, especially their views embodied in “Building a Trusted ICT Supply Chain.”

Senator Mark Warner (D-VA) will likely take over the Senate Intelligence Committee. Warner has long been a stakeholder on a number of technology issues and would be able to exert influence on the national security components of such issues. He and his committee will almost certainly play a role in the Congressional oversight of and response to the SolarWinds hack. Likewise, his committee shares jurisdiction over FISA with the Senate Judiciary Committee and over national security technology policy with the Armed Services Committee.

Senator Amy Klobuchar (D-MN) would be the Senate Democratic point person on election security from her perch at the Senate Rules and Administration Committee, which may enable her to push more forcefully for the legislative changes she has long advocated. In May 2019, Klobuchar and other Senate Democrats introduced the “Election Security Act” (S. 1540), the Senate version of the stand-alone measure introduced in the House that was taken from the larger package, the “For the People Act” (H.R. 1) passed by the House.

In August 2018, the Senate Rules and Administration Committee postponed indefinitely a markup on a compromise bill to provide states additional assistance in securing elections from interference, the “The Secure Elections Act” (S.2593). Reportedly, there was concern among state officials that a provision requiring audits of election results would be in effect an unfunded mandate even though this provision was softened at the insistence of Senate Republican leadership. However, a Trump White House spokesperson indicated in a statement that the Administration opposed the bill, which may have posed an additional obstacle to Committee action. However, even if the Senate had passed its bill, it was unlikely that the Republican controlled House would have considered companion legislation (H.R. 6663).

Senator Gary Peters (D-MI) may be the next chair of the Senate Homeland Security and Governmental Affairs Committee, and if so, he will continue to face the rock on which many a bark of cybersecurity legislation has been dashed: Senator Ron Johnson (R-WI). So significant has Johnson’s opposition been to bipartisan cybersecurity legislation from the House that some House Republican stakeholders have said so in media accounts without bothering to hide behind anonymity. And so, whatever Peters’ ambitions may be to shore up the cybersecurity of the federal government as his committee plays a role in investigating and responding to the Russian hack of SolarWinds and many federal agencies, he will be limited by whatever Johnson and other Republicans will allow to move through the committee and through the Senate. Of course, Peters’ purview would include the Department of Homeland Security and the Cybersecurity and Infrastructure Security Agency (CISA) and its remit to police the cybersecurity practices of the federal government. Peters would also have in his portfolio the information technology (IT) practices of the federal government, some $90 billion annually across all agencies.

Finally, whether it be Leahy or Durbin at the Senate Appropriations Committee, this post allows for immense influence in funding and programmatic changes in all federal programs through the power of the purse Congress holds.

Further Reading, Other Developments, and Coming Events (12 January 2021)

Further Reading

  • Biden’s NSC to focus on global health, climate, cyber and human rights, as well as China and Russia” By Karen DeYoung — The Washington Post. Like almost every incoming White House, the Biden team has announced a restructuring of the National Security Council (NSC) to better effectuate the President-elect’s policy priorities. To no one’s surprise, the volume on cybersecurity policy will be turned up. Another notable change is plans to take “cross-cutting” approaches to issues that will likely meld foreign and domestic and national security and civil issues, meaning there could be a new look at offensive cyber operations, for example. It is possible President Biden decides to put the genie back in the bottle, so to speak, by re-imposing an interagency decision-making process as opposed to the Trump Administration’s approach of delegating discretion to the National Security Agency/Cyber Command head. Also, the NSC will focus on emerging technology, a likely response to the technology arms race the United States finds itself in against the People’s Republic of China.
  • Exclusive: Pandemic relief aid went to media that promoted COVID misinformation” By Caitlin Dickson — yahoo! news. The consulting firm Alethea Group and the nonprofit Global Disinformation Index are claiming the COVID stimulus Paycheck Protection Program (PPP) provided loans and assistance to five firms that “were publishing false or misleading information about the pandemic, thus profiting off the infodemic” according to an Alethea Group vice president. This report follows an NBC News article claiming that 14 white supremacist and racist organizations have also received PPP loans. The Alethea Group and Global Disinformation Index named five entities who took PPP funds and kept spreading pandemic misinformation: Epoch Media Group, Newsmax Media, The Federalist, Liftable Media, and Prager University.
  • Facebook shuts Uganda accounts ahead of vote” — France24. The social media company shuttered a number of Facebook and Instagram accounts related to government officials in Uganda ahead of an election on account of “Coordinated Inauthentic Behaviour” (CIB). This follows the platform shutting down accounts related to the French Army and Russia that sought to influence events in Africa. These and other actions may indicate the platform is starting to pay the non-western world the same attention it pays elsewhere; at least one former employee has argued the platform was negligent at best and reckless at worst in not properly resourcing efforts to police CIB throughout the Third World.
  • China tried to punish European states for Huawei bans by adding eleventh-hour rule to EU investment deal” By Finbarr Bermingham — South China Morning Post. At nearly the end of talks on a People’s Republic of China (PRC)-European Union (EU) trade deal, PRC negotiators tried slipping in language that would have barred entry to the PRC’s cloud computing market to any country or company from a country that restricts Huawei’s services and products. This is alternately being seen as either standard Chinese negotiating tactics or an attempt to avenge the thwarting of the crown jewel in its telecommunications ambitions.
  • Chinese regulators to push tech giants to share consumer credit data – sources” By Julie Zhu — Reuters. Ostensibly in a move to better manage the risks of too much unsafe lending, tech giants in the People’s Republic of China (PRC) will soon need to share data on consumer loans. It seems inevitable that such data will be used by Beijing to further crack down on undesirable people and elements within the PRC.
  • The mafia turns social media influencer to reinforce its brand” By Miles Johnson — The Financial Times. Even Italy’s feared ’Ndrangheta is creating and curating a social media presence.

Other Developments

  • President Donald Trump signed an executive order (EO) that bans eight applications from the People’s Republic of China on much the same grounds as the EOs prohibiting TikTok and WeChat. If this EO is not rescinded by the Biden Administration, federal courts may block its implementation as has happened with the TikTok and WeChat EOs to date. Notably, courts have found that the Trump Administration exceeded its authority under the International Emergency Economic Powers Act (IEEPA), which may also be an issue in the proposed prohibition on Alipay, CamScanner, QQ Wallet, SHAREit, Tencent QQ, VMate, WeChat Pay, and WPS Office. Trump found:
    • that additional steps must be taken to deal with the national emergency with respect to the information and communications technology and services supply chain declared in Executive Order 13873 of May 15, 2019 (Securing the Information and Communications Technology and Services Supply Chain).  Specifically, the pace and pervasiveness of the spread in the United States of certain connected mobile and desktop applications and other software developed or controlled by persons in the People’s Republic of China, to include Hong Kong and Macau (China), continue to threaten the national security, foreign policy, and economy of the United States.  At this time, action must be taken to address the threat posed by these Chinese connected software applications.
    • Trump directed that within 45 days of issuance of the EO, there shall be a prohibition on “any transaction by any person, or with respect to any property, subject to the jurisdiction of the United States, with persons that develop or control the following Chinese connected software applications, or with their subsidiaries, as those transactions and persons are identified by the Secretary of Commerce (Secretary) under subsection (e) of this section: Alipay, CamScanner, QQ Wallet, SHAREit, Tencent QQ, VMate, WeChat Pay, and WPS Office.”
  • The Government Accountability Office (GAO) issued its first statutorily required annual assessment of how well the United States Department of Defense (DOD) is managing its major information technology (IT) procurements. The DOD spent more than $36 billion of the $90 billion the federal government was provided for IT in FY 2020. The GAO was tasked with assessing how well the DOD did in using iterative development, managing costs and schedules, and implementing cybersecurity measures. The GAO found progress in the first two realms but a continued lag in deploying long-recommended best practices to ensure the security of the IT the DOD buys or builds. The GAO focused on 15 major IT acquisitions that qualify as administrative (i.e., “business”) and communications and information security (i.e., “non-business”). While there were no explicit recommendations made, the GAO found:
    • Ten of the 15 selected major IT programs exceeded their planned schedules, with delays ranging from 1 month for the Marine Corps’ CAC2S Inc 1 to 5 years for the Air Force’s Defense Enterprise Accounting and Management System-Increment 1.
    • …eight of the 10 selected major IT programs that had tested their then-current technical performance targets reported having met all of their targets…. As of December 2019, four programs had not yet conducted testing activities—Army’s ACWS, Air Force’s AFIPPS Inc 1, Air Force’s MROi, and Navy ePS. Testing data for one program, Air Force’s ISPAN Inc 4, were classified.
    • …officials from the 15 selected major IT programs we reviewed reported using software development approaches that may help to limit risks to cost and schedule outcomes. For example, major business IT programs reported using COTS software. In addition, most programs reported using an iterative software development approach and using a minimum deployable product. With respect to cybersecurity practices, all the programs reported developing cybersecurity strategies, but programs reported mixed experiences with respect to conducting cybersecurity testing. Most programs reported using operational cybersecurity testing, but less than half reported conducting developmental cybersecurity testing. In addition, programs that reported conducting cybersecurity vulnerability assessments experienced fewer increases in planned program costs and fewer schedule delays. Programs also reported a variety of challenges associated with their software development and cybersecurity staff.
    • 14 of the 15 programs reported using an iterative software development approach which, according to leading practices, may help reduce cost growth and deliver better results to the customer. However, programs also reported using an older approach to software development, known as waterfall, which could introduce risk for program cost growth because of its linear and sequential phases of development that may be implemented over a longer period of time. Specifically, two programs reported using a waterfall approach in conjunction with an iterative approach, while one was solely using a waterfall approach.
    • With respect to cybersecurity, programs reported mixed implementation of specific practices, contributing to program risks that might impact cost and schedule outcomes. For example, all 15 programs reported developing cybersecurity strategies, which are intended to help ensure that programs are planning for and documenting cybersecurity risk management efforts.
    • In contrast, only eight of the 15 programs reported conducting cybersecurity vulnerability assessments—systematic examinations of an information system or product intended to, among other things, determine the adequacy of security measures and identify security deficiencies. These eight programs experienced fewer increases in planned program costs and fewer schedule delays relative to the programs that did not report using cybersecurity vulnerability assessments.
  • The United States (U.S.) Department of Energy gave notice of a “Prohibition Order prohibiting the acquisition, importation, transfer, or installation of specified bulk-power system (BPS) electric equipment that directly serves Critical Defense Facilities (CDFs), pursuant to Executive Order 13920.” (See here for analysis of the executive order.) The Department explained:
    • Executive Order No. 13920 of May 1, 2020, Securing the United States Bulk-Power System (85 FR 26595 (May 4, 2020)) (E.O. 13920) declares that threats by foreign adversaries to the security of the BPS constitute a national emergency. A current list of such adversaries is provided in a Request for Information (RFI), issued by the Department of Energy (Department or DOE) on July 8, 2020 seeking public input to aid in its implementation of E.O. 13920. The Department has reason to believe, as detailed below, that the government of the People’s Republic of China (PRC or China), one of the listed adversaries, is equipped and actively planning to undermine the BPS. The Department has thus determined that certain BPS electric equipment or programmable components subject to China’s ownership, control, or influence, constitute undue risk to the security of the BPS and to U.S. national security. The purpose of this Order is to prohibit the acquisition, importation, transfer, or subsequent installation of such BPS electric equipment or programmable components in certain sections of the BPS.
  • The United States’ (U.S.) Department of Commerce’s Bureau of Industry and Security (BIS) added the People’s Republic of China’s (PRC) Semiconductor Manufacturing International Corporation (SMIC) to its Entity List in a move intended to starve the company of key U.S. technology needed to manufacture high end semiconductors. Therefore, any U.S. entity wishing to do business with SMIC will need a license which the Trump Administration may not be likely to grant. The Department of Commerce explained in its press release:
    • The Entity List designation limits SMIC’s ability to acquire certain U.S. technology by requiring U.S. exporters to apply for a license to sell to the company.  Items uniquely required to produce semiconductors at advanced technology nodes—10 nanometers or below—will be subject to a presumption of denial to prevent such key enabling technology from supporting China’s military-civil fusion efforts.
    • BIS also added more than sixty other entities to the Entity List for actions deemed contrary to the national security or foreign policy interest of the United States.  These include entities in China that enable human rights abuses, entities that supported the militarization and unlawful maritime claims in the South China Sea, entities that acquired U.S.-origin items in support of the People’s Liberation Army’s programs, and entities and persons that engaged in the theft of U.S. trade secrets.
    • As explained in the Federal Register notice:
      • SMIC is added to the Entity List as a result of China’s military-civil fusion (MCF) doctrine and evidence of activities between SMIC and entities of concern in the Chinese military industrial complex. The Entity List designation limits SMIC’s ability to acquire certain U.S. technology by requiring exporters, reexporters, and in-country transferors of such technology to apply for a license to sell to the company. Items uniquely required to produce semiconductors at advanced technology nodes 10 nanometers or below will be subject to a presumption of denial to prevent such key enabling technology from supporting China’s military modernization efforts. This rule adds SMIC and the following ten entities related to SMIC: Semiconductor Manufacturing International (Beijing) Corporation; Semiconductor Manufacturing International (Tianjin) Corporation; Semiconductor Manufacturing International (Shenzhen) Corporation; SMIC Semiconductor Manufacturing (Shanghai) Co., Ltd.; SMIC Holdings Limited; Semiconductor Manufacturing South China Corporation; SMIC Northern Integrated Circuit Manufacturing (Beijing) Co., Ltd.; SMIC Hong Kong International Company Limited; SJ Semiconductor; and Ningbo Semiconductor International Corporation (NSI).
  • The United States’ (U.S.) Department of Commerce’s Bureau of Industry and Security (BIS) amended its Export Administration Regulations “by adding a new ‘Military End User’ (MEU) List, as well as the first tranche of 103 entities, which includes 58 Chinese and 45 Russian companies” per its press release. The Department asserted:
    • The U.S. Government has determined that these companies are ‘military end users’ for purposes of the ‘military end user’ control in the EAR that applies to specified items for exports, reexports, or transfers (in-country) to China, Russia, and Venezuela when such items are destined for a prohibited ‘military end user.’
  • The Australian Competition and Consumer Commission (ACCC) rolled out another piece of the Consumer Data Right (CDR) scheme under the Competition and Consumer Act 2010, specifically accreditation guidelines “to provide information and guidance to assist applicants with lodging a valid application to become an accredited person” to whom Australians may direct data holders to share their data. The ACCC explained:
    • The CDR aims to give consumers more access to and control over their personal data.
    • Being able to easily and efficiently share data will improve consumers’ ability to compare and switch between products and services and encourage competition between service providers, leading to more innovative products and services for consumers and the potential for lower prices.
    • Banking is the first sector to be brought into the CDR.
    • Accredited persons may receive a CDR consumer’s data from a data holder at the request and consent of the consumer. Any person, in Australia or overseas, who wishes to receive CDR data to provide products or services to consumers under the CDR regime, must be accredited
  • Australia’s government has released its “Data Availability and Transparency Bill 2020” that “establishes a new data sharing scheme for federal government data, underpinned by strong safeguards to mitigate risks and simplified processes to make it easier to manage data sharing requests” according to the summary provided in Parliament by the government’s point person. In the accompanying “Explanatory Memorandum,” the following summary was provided:
    • The Bill establishes a new data sharing scheme which will serve as a pathway and regulatory framework for sharing public sector data. ‘Sharing’ involves providing controlled access to data, as distinct from open release to the public.
    • To oversee the scheme and support best practice, the Bill creates a new independent regulator, the National Data Commissioner (the Commissioner). The Commissioner’s role is modelled on other regulators such as the Australian Information Commissioner, with whom the Commissioner will cooperate.
    • The data sharing scheme comprises the Bill and disallowable legislative instruments (regulations, Minister-made rules, and any data codes issued by the Commissioner). The Commissioner may also issue non-legislative guidelines that participating entities must have regard to, and may release other guidance as necessary.
    • Participants in the scheme are known as data scheme entities:
      • Data custodians are Commonwealth bodies that control public sector data, and have the right to deal with that data.
      • Accredited users are entities accredited by the Commissioner to access to public sector data. To become accredited, entities must satisfy the security, privacy, infrastructure and governance requirements set out in the accreditation framework.
      • Accredited data service providers (ADSPs) are entities accredited by the Commissioner to perform data services such as data integration. Government agencies and users will be able to draw upon ADSPs’ expertise to help them to share and use data safely.
    • The Bill does not compel sharing. Data custodians are responsible for assessing each sharing request, and deciding whether to share their data if satisfied the risks can be managed.
    • The data sharing scheme contains robust safeguards to ensure sharing occurs in a consistent and transparent manner, in accordance with community expectations. The Bill authorises data custodians to share public sector data with accredited users, directly or through an ADSP, where:
      • Sharing is for a permitted purpose – government service delivery, informing government policy and programs, or research and development;
      • The data sharing principles have been applied to manage the risks of sharing; and
      • The terms of the arrangement are recorded in a data sharing agreement.
    • Where the above requirements are met, the Bill provides limited statutory authority to share public sector data, despite other Commonwealth, State and Territory laws that prevent sharing. This override of non-disclosure laws is ‘limited’ because it occurs only when the Bill’s requirements are met, and only to the extent necessary to facilitate sharing.
  • The United Kingdom’s Competition and Markets Authority (CMA) is asking interested parties to provide input on the proposed acquisition of a British semiconductor company by a United States (U.S.) company before it launches a formal investigation later this year. However, the CMA is limited to competition considerations, and any national security aspects of the proposed deal would need to be investigated by Prime Minister Boris Johnson’s government. The CMA stated:
    • US-based chip designer and producer NVIDIA Corporation (NVIDIA) plans to purchase the Intellectual Property Group business of UK-based Arm Limited (Arm) in a deal worth $40 billion. Arm develops and licenses intellectual property (IP) and software tools for chip designs. The products and services supplied by the companies support a wide range of applications used by businesses and consumers across the UK, including desktop computers and mobile devices, game consoles and vehicle computer systems.
    • CMA added:
      • The CMA will look at the deal’s possible effect on competition in the UK. The CMA is likely to consider whether, following the takeover, Arm has an incentive to withdraw, raise prices or reduce the quality of its IP licensing services to NVIDIA’s rivals.
  • The Israeli firm NSO Group has been accused by an entity associated with a British university of using real-time cell phone data to sell its COVID-19 contact tracing app, Fleming, in ways that may have broken the laws of a handful of nations. Forensic Architecture, a research agency based at Goldsmiths, University of London, argued:
    • In March 2020, with the rise of COVID-19, Israeli cyber-weapons manufacturer NSO Group launched a contact-tracing technology named ‘Fleming’. Two months later, a database belonging to NSO’s Fleming program was found unprotected online. It contained more than five hundred thousand datapoints for more than thirty thousand distinct mobile phones. NSO Group denied there was a security breach. Forensic Architecture received and analysed a sample of the exposed database, which suggested that the data was based on ‘real’ personal data belonging to unsuspecting civilians, putting their private information in risk
    • Forensic Architecture added:
      • Leaving a database with genuine location data unprotected is a serious violation of the applicable data protection laws. That a surveillance company with access to personal data could have overseen this breach is all the more concerning.
      • This could constitute a violation of the General Data Protection Regulation (GDPR) based on where the database was discovered as well as the laws of the nations where NSO Group allegedly collected personal data
    • The NSO Group denied the claims and was quoted by Tech Crunch:
      • “We have not seen the supposed examination and have to question how these conclusions were reached. Nevertheless, we stand by our previous response of May 6, 2020. The demo material was not based on real and genuine data related to infected COVID-19 individuals,” said an unnamed spokesperson. (NSO’s earlier statement made no reference to individuals with COVID-19.)
      • “As our last statement details, the data used for the demonstrations did not contain any personally identifiable information (PII). And, also as previously stated, this demo was a simulation based on obfuscated data. The Fleming system is a tool that analyzes data provided by end users to help healthcare decision-makers during this global pandemic. NSO does not collect any data for the system, nor does NSO have any access to collected data.”

Coming Events

  • On 13 January, the Federal Communications Commission (FCC) will hold its monthly open meeting, and the agency has placed the following items on its tentative agenda, noting that “Bureau, Office, and Task Force leaders will summarize the work their teams have done over the last four years in a series of presentations”:
    • Panel One. The Commission will hear presentations from the Wireless Telecommunications Bureau, International Bureau, Office of Engineering and Technology, and Office of Economics and Analytics.
    • Panel Two. The Commission will hear presentations from the Wireline Competition Bureau and the Rural Broadband Auctions Task Force.
    • Panel Three. The Commission will hear presentations from the Media Bureau and the Incentive Auction Task Force.
    • Panel Four. The Commission will hear presentations from the Consumer and Governmental Affairs Bureau, Enforcement Bureau, and Public Safety and Homeland Security Bureau.
    • Panel Five. The Commission will hear presentations from the Office of Communications Business Opportunities, Office of Managing Director, and Office of General Counsel.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Judith Scharnowski from Pixabay

Further Action On TikTok Divestment and Ban But No Changes

TikTok sues to block the CFIUS order that it divest and the Trump Administration files an appeal of an injunction.

Even though the Trump Administration’s efforts to implement its ban of TikTok have gone nowhere as numerous courts have enjoined the enforcement of the orders, TikTok filed suit against the related order that the company divest Musical.ly primarily on the grounds that the technology that supposedly threatens United States (U.S.) national security is unrelated to the acquisition. Moreover, the day after this suit was filed, a key U.S. agency announced a delay of the divestment order. In a related action, the Trump Administration filed to appeal one of the injunctions blocking it from moving forward on banning the People’s Republic of China (PRC) app. Depending on how long it takes for the federal court to resolve this suit, a Biden Administration Department of Justice (DOJ) may take a different tack than the Trump DOJ.

The day before the divestment order was set to take effect, TikTok asked the United States Court of Appeals for the District of Columbia Circuit to review “the Presidential Order Regarding the Acquisition of Musical.ly by ByteDance Ltd., 85 Fed. Reg. 51,297 (Aug. 14, 2020) (the “Divestment Order”), and the related action of the Committee on Foreign Investment in the United States (CFIUS), including its determination to reject mitigation, truncate its review and investigation, and refer the matter to the President.” TikTok asserted:

The Divestment Order and the CFIUS Action seek to compel the wholesale divestment of TikTok, a multi-billion-dollar business built on technology developed by Petitioner ByteDance Ltd. (“ByteDance”), based on the government’s purported national security review of a three-year-old transaction that involved a different business. This attempted taking exceeds the authority granted to Respondents under Section 721, which authorizes CFIUS to review and the President to, at most, prohibit a specified “covered transaction” to address risks to national security created by that transaction. Here, that covered transaction was ByteDance’s acquisition of the U.S. business of another Chinese-headquartered company, Musical.ly—a transaction that did not include the core technology or other aspects of the TikTok business that have made it successful and yet which the Divestment Order now seeks to compel ByteDance to divest.

TikTok also claimed that CFIUS violated the Due Process Clause of the Fifth Amendment, violated the Administrative Procedure Act, and is proposing a “taking” that is illegal under the Fifth Amendment.

And yet, the Department of the Treasury, the lead agency in the CFIUS process, issued a statement, explaining that the deadline for divestiture had been pushed back by 15 days:

The President’s August 14 Order requires ByteDance and TikTok Inc. to undertake specific divestments and other measures to address the national security risk arising from ByteDance’s acquisition of Musical.ly.  Consistent with the Order, the Committee on Foreign Investment in the United States (CFIUS) has granted ByteDance a 15-day extension of the original November 12, 2020 deadline.  This extension will provide the parties and the Committee additional time to resolve this case in a manner that complies with the Order.   

The Trump Administration may successfully argue that a delay of the order means the court cannot rule on TikTok’s suit. Consequently, this suit may well get pushed into a Biden Administration.

TikTok issued this statement along with the filing of its suit:

For a year, TikTok has actively engaged with CFIUS in good faith to address its national security concerns, even as we disagree with its assessment. In the nearly two months since the president gave his preliminary approval to our proposal to satisfy those concerns, we have offered detailed solutions to finalize that agreement—but have received no substantive feedback on our extensive data privacy and security framework.

Of course, because of the CFIUS divestment order, ByteDance seems to have reached an agreement with Oracle and Walmart, but what exactly they agreed to remains an open question.

In mid-September, the Trump Administration paused its notice for implementing the Executive Order (EO) against TikTok because of an agreement in principle on a deal that would permit Oracle and Walmart to control a certain percentage of TikTok in the U.S. However, the details of which entity would control what remain murky, with ByteDance arguing that U.S. entities will not control TikTok and its U.S. partners asserting the opposite. In the weekend before the EO was set to take effect, it appeared Oracle and Walmart would be able to take a collective 20% stake in a new entity, TikTok Global, that would operate in the U.S. Walmart had been partnering with Microsoft, but when the tech giant failed in its bid, Walmart began talks with Oracle. ByteDance would have a stake in the company but not majority control, according to some sources. However, ByteDance began pushing back on that narrative as President Donald Trump declared after word of a deal leaked, “if we find that [Oracle and Walmart] don’t have total control, then we’re not going to approve the deal.” Moreover, $5 billion would be used for some sort of educational fund. However, it is hard to tell what exactly would occur and whether this is supposed to be the “finder’s fee” of sorts Trump had said the U.S. would deserve from the deal.

On 19 September, the U.S. Department of Commerce issued a statement pushing back the effective date of the order against TikTok from 20 September to 27 September because of “recent positive developments.” The same day, the U.S. Department of the Treasury released a statement, explaining:

The President has reviewed a deal among Oracle, Walmart, and TikTok Global to address the national security threat posed by TikTok’s operations. Oracle will be responsible for key technology and security responsibilities to protect all U.S. user data. Approval of the transaction is subject to a closing with Oracle and Walmart and necessary documentation and conditions to be approved by Committee on Foreign Investment in the United States (CFIUS). 

TikTok also released a statement, asserting:

We’re pleased that today we’ve confirmed a proposal that resolves the Administration’s security concerns and settles questions around TikTok’s future in the US. Our plan is extensive and consistent with previous CFIUS resolutions, including working with Oracle, who will be our trusted cloud and technology provider responsible for fully securing our users’ data. We are committed to protecting our users globally and providing the highest levels of security. Both Oracle and Walmart will take part in a TikTok Global pre-IPO financing round in which they can take up to a 20% cumulative stake in the company. We will also maintain and expand the US as TikTok Global’s headquarters while bringing 25,000 jobs across the country.

Walmart issued its own statement on 19 September:

While there is still work to do on final agreements, we have tentatively agreed to purchase 7.5% of TikTok Global as well as enter into commercial agreements to provide our ecommerce, fulfillment, payments and other omnichannel services to TikTok Global. Our CEO, Doug McMillon, would also serve as one of five board members of the newly created company. In addition, we would work toward an initial public offering of the company in the United States within the next year to bring even more ownership to American citizens. The final transaction will need to be approved by the relevant U.S. government agencies.

The same day, Oracle and Walmart released a joint statement:

  • The President has announced that ByteDance has received tentative approval for an agreement with the U.S. Government to resolve the outstanding issues, which will now include Oracle and Walmart together investing to acquire 20% of the newly formed TikTok Global business.
  • As a part of the deal, TikTok is creating a new company called TikTok Global that will be responsible for providing all TikTok services to users in United States and most of the users in the rest of the world. Today, the administration has conditionally approved a landmark deal where Oracle becomes TikTok’s secure cloud provider.
  • TikTok Global will be majority owned by American investors, including Oracle and Walmart. TikTok Global will be an independent American company, headquartered in the U.S., with four Americans out of the five member Board of Directors.
  • All the TikTok technology will be in possession of TikTok Global, and comply with U.S. laws and privacy regulations. Data privacy for 100 million American TikTok users will be quickly established by moving all American data to Oracle’s Generation 2 Cloud data centers, the most secure cloud data centers in the world.
  • In addition to its equity position, Walmart will bring its omnichannel retail capabilities including its Walmart.com assortment, eCommerce marketplace, fulfillment, payment and measurement-as-a-service advertising service.
  • TikTok Global will create more than 25,000 new jobs in the United States and TikTok Global will pay more than $5 billion in new tax dollars to the U.S. Treasury.
  • TikTok Global, together with Oracle, SIG, General Atlantic, Sequoia, Walmart and Coatue will create an educational initiative to develop and deliver an AI-driven online video curriculum to teach children from inner cities to the suburbs, a variety of courses from basic reading and math to science, history and computer engineering.
  • TikTok Global will have an Initial Public Offering (IPO) in less than 12 months and be listed on a U.S. Exchange. After the IPO, U.S. ownership of TikTok Global will increase and continue to grow over time.

A day later, Oracle went further in a statement to the media claiming, “ByteDance will have no ownership in TikTok Global,” which is a different message than the one ByteDance was sending. For example, in a blog post, ByteDance stated “[t]he current plan does not involve the transfer of any algorithms or technology…[but] Oracle has the authority to check the source code of TikTok USA.”

On a related note, the DOJ filed a notice of appeal of an injunction, issued in late October, barring the implementation of the TikTok ban. Three TikTok influencers had filed suit and lost their motion for a preliminary injunction. However, after the District Court for the District of Columbia granted TikTok’s request to stop the Department of Commerce from enforcing the first part of the order implementing the ban, the three influencers revised their motion and refiled.

Judge Wendy Beetlestone found that the Trump Administration exceeded its powers under the International Emergency Economic Powers Act (IEEPA) in issuing part of its TikTok order effectuating the ban set to take effect on 12 November:

  • Any provision of internet hosting services, occurring on or after 11:59 p.m. eastern standard time on November 12, 2020, enabling the functioning or optimization of the TikTok mobile application[;]
  • Any provision of content delivery network services, occurring on or after 11:59 p.m. eastern standard time on November 12, 2020, enabling the functioning or optimization of the TikTok mobile application[;]
  • Any provision of directly contracted or arranged internet transit or peering services, occurring on or after 11:59 p.m. eastern standard time on November 12, 2020, enabling the functioning or optimization of the TikTok mobile application[; and]
  • Any utilization, occurring on or after 11:59 p.m. eastern standard time on November 12, 2020, of the TikTok mobile application’s constituent code, functions, or services in the functioning of software or services developed and/or accessible within the land and maritime borders of the United States and its territories.

Beetlestone found that the limit on the use of IEEPA powers to regulate information is clearly implicated by Commerce’s order, which proposes to do just that. Consequently, this is not a legal use of IEEPA powers. The judge also found the plaintiffs would be irreparably harmed through a loss of their audiences and brand sponsorships. The judge summarized the plaintiffs’ claims:

Plaintiffs challenge the Commerce Identification on both statutory and constitutional grounds. First, they contend that the Commerce Identification violates both the First and Fifth Amendments to the U.S. Constitution. They then contend that the Commerce Identification violates the Administrative Procedure Act, 5 U.S.C. § 701 et seq., as it is both arbitrary and capricious, see id. § 706(2)(A), and ultra vires, see id. § 706(2)(C). Plaintiffs’ ultra vires claim consists of three separate arguments: (1) the Commerce Identification contravenes IEEPA’s “informational materials” exception, 50 U.S.C. § 1702(b)(3); (2) the Commerce Identification contravenes IEEPA’s prohibition on the regulation of “personal communication[s] . . . not involv[ing] a transfer of anything of value,” id. § 1702(b)(1), and (3) the Commerce Identification is not responsive to the national emergency declared in the ICTS Executive Order, and therefore requires the declaration of a new national emergency to take effect, see id. § 1701(b).

In the first injunction granted against the TikTok ban, the court found TikTok likely to succeed on its claims that the ban misused IEEPA, 50 U.S.C. §§ 1701–08, the primary authority President Donald Trump relied on in his executive order banning the app. The court conceded “IEEPA contains a broad grant of authority to declare national emergencies and to prohibit certain transactions with foreign countries or foreign nationals that pose risks to the national security of the United States.” But, the court noted “IEEPA also contains two express limitations relevant here: the “authority granted to the President . . . does not include the authority to regulate or prohibit, directly or indirectly” either (a) the importation or exportation of “information or informational materials”; or (b) “personal communication[s], which do[] not involve a transfer of anything of value.” The court concluded:

In sum, the TikTok Order and the Secretary’s prohibitions will have the intended effect of stopping U.S. users from communicating (and thus sharing data) on TikTok. To be sure, the ultimate purpose of those prohibitions is to protect the national security by preventing China from accessing that data and skewing content on TikTok. And the government’s actions may not constitute direct regulations or prohibitions of activities carved out by 50 U.S.C. 1702(b). But Plaintiffs have demonstrated that they are likely to succeed on their claim that the prohibitions constitute indirect regulations of “personal communication[s]” or the exchange of “information or informational materials.”

After considering the risks of irreparable harm to TikTok and the equities and public interest, the court decided:

Weighing these interests together with Plaintiffs’ likelihood of succeeding on their IEEPA claim and the irreparable harm that Plaintiffs (and their U.S. users) will suffer absent an injunction, the Court concludes that a preliminary injunction is appropriate.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Olivier Bergeron from Pexels

Courts Further Block TikTok and WeChat Bans

Two Trump Administration measures to strike at the PRC remain unimplemented.

The Trump Administration has suffered more setbacks in its efforts to move forward with its bans on applications from the People’s Republic of China (PRC). United States’ (U.S.) courts continue to block enforcement of orders prohibiting TikTok and WeChat on national security grounds. Courts have been skeptical of the rationale and reasons offered by the Trump Administration and have not allowed a single portion of either order to take effect.

In a decision late last month, a magistrate judge in San Francisco rejected the Trump Administration’s request to essentially reverse the injunction on the ban of WeChat. Magistrate Judge Laurel Beeler explained:

The government moved to stay the preliminary injunction, and it submitted additional information (that it could not have reasonably submitted earlier) that the Secretary of Commerce considered in identifying the prohibited transactions. The plaintiffs submitted additional information too. On this record, the court denies the motion to stay. The government’s additional evidence does not alter the court’s previous holding that the plaintiffs are entitled to a preliminary injunction.

Beeler said the Department of Commerce presented additional evidence on the threats to national security posed by WeChat and Tencent, chiefly because of the PRC’s access to the data amassed and used by PRC companies, and the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency presented evidence on potential threats the app poses, including as a vehicle for PRC misinformation, a means of introducing malicious code, and the exposure of Americans’ data to the PRC.

One of the plaintiffs’ experts identified “four targeted measures to address the government’s concerns about WeChat:

  • First, WeChat could partner with a U.S. cloud provider to store data, which would allow a relatively secure place for user data and easy audits to detect unauthorized access to data.
  • Second, regular compliance audits would mitigate data-security risks.
  • Third, it is industry best practice to have stringent corporate or external oversight over management and personnel with access to user data.
  • Fourth, WeChat could use end-to-end encryption. These measures do not eliminate all risks of data leaks to the Chinese government, but they meet the industry’s current standards.

This expert further noted

that the government’s concern about WeChat’s surveillance capabilities could be addressed by an independent third party’s review and audit of WeChat’s source codes. Banning WeChat downloads is dangerous because it increases security risks to users: software needs updates to fix bugs, and if bugs are not fixed, WeChat users’ devices and data are subject to attack. Security concerns about government employees are addressed through narrower bans of those employees’ use of the WeChat app. Otherwise, data protection generally requires best practices such as end-to-end encryption, protecting consumer data and metadata (in the manner of Europe’s General Data Protection Regulation or California’s Consumer Privacy Act), and supporting research into making traffic analysis more difficult. Tech companies such as Facebook and Google, which collect data and sell it to data brokers, also pose surveillance concerns. If China wants U.S. users’ private information, it can buy it from those data brokers. Effectively banning WeChat does not protect U.S. user data from criminals or China.
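
The end-to-end encryption the expert points to is the measure that most directly limits what an app operator (and any government able to compel that operator) can read: content is encrypted on the sender’s device and can only be decrypted on the recipient’s device, so servers relaying the traffic see only ciphertext. The short Python sketch below, using the open-source PyNaCl library, illustrates the principle; it is purely illustrative of the concept and says nothing about how WeChat, or any mitigation the expert proposed, would actually be implemented.

    from nacl.public import PrivateKey, Box

    # Each party generates a keypair on their own device; only public keys are exchanged.
    sender_key = PrivateKey.generate()
    recipient_key = PrivateKey.generate()

    # The sender encrypts with their private key and the recipient's public key.
    # A relaying service sees only the resulting ciphertext.
    sender_box = Box(sender_key, recipient_key.public_key)
    ciphertext = sender_box.encrypt(b"draft statement for review")

    # Only the recipient, holding the matching private key, can decrypt.
    recipient_box = Box(recipient_key, sender_key.public_key)
    assert recipient_box.decrypt(ciphertext) == b"draft statement for review"

Under such a design, a provider that only relays ciphertext has little of substance to hand over, though, as the expert concedes, no single measure eliminates every risk (metadata, device compromise, and key management remain), which is why the other measures are offered as complements rather than substitutes.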

Beeler found:

The government’s new evidence does not meaningfully alter its earlier submissions. The court’s assessment of the First Amendment analysis and the risks to national security — on this record — are unchanged.

[T]he record does not support the conclusion that the government has “narrowly tailored” the prohibited transactions to protect its national-security interests. Instead, the record, on balance, supports the conclusion that the restrictions “burden substantially more speech than is necessary to further the government’s legitimate interests.”

Consequently, Beeler denied the motion to lift the injunction against the WeChat order.

Moreover, the United States Court of Appeals for the Ninth Circuit declined to stay Beeler’s injunction barring implementation of the WeChat ban, as requested by the Trump Administration.

On 19 September, Beeler granted a preliminary injunction against the Trump Administration’s implementation of the WeChat order. As explained in a footnote, “[t]he plaintiffs are U.S. WeChat Users Alliance, a nonprofit formed to challenge the WeChat Executive Order, and individual and business users.” In short, they contended that the WeChat ban

(1) violates the First Amendment to the U.S. Constitution,

(2) violates the Fifth Amendment,

(3) violates the Religious Freedom Restoration Act, 42 U.S.C. § 2000bb(1)(a),

(4) was not a lawful exercise of the President’s and the Secretary’s authority under IEEPA— which allows the President to prohibit “transactions” in the interest of national security — because the IEEPA, 50 U.S.C. § 1702(b)(1), does not allow them to regulate personal communications, and

(5) violates the Administrative Procedure Act (“APA”) because the Secretary exceeded his authority under the IEEPA and should have promulgated the rule through the notice-and-comment rulemaking procedures in 5 U.S.C. § 553(b).

The judge granted the motion for a preliminary injunction “on the ground that the plaintiffs have shown serious questions going to the merits of the First Amendment claim, the balance of hardships tips in the plaintiffs’ favor, and the plaintiffs establish sufficiently the other elements for preliminary-injunctive relief.” The judge seemed most persuaded by this claim and summarized the plaintiffs’ argument:

  • First, they contend, effectively banning WeChat — which serves as a virtual public square for the Chinese-speaking and Chinese-American community in the United States and is (as a practical matter) their only means of communication — forecloses meaningful access to communication in their community and thereby operates as a prior restraint on their right to free speech that does not survive strict scrutiny.
  • Second, even if the prohibited transactions are content-neutral time-place-or-manner restrictions, they do not survive intermediate scrutiny because the complete ban is not narrowly tailored to address the government’s significant interest in national security.

In a decision from last week, a second federal court found reason to block the TikTok ban on new grounds. Three TikTok influencers had filed suit and lost their motion for a preliminary injunction. However, after the District Court for the District of Columbia granted TikTok’s request to stop the Department of Commerce from enforcing the first part of the order implementing the ban, the three influencers revised their motion and refiled.

Judge Wendy Beetlestone found that the Trump Administration exceeded its powers under the International Emergency Economic Powers Act (IEEPA) in issuing part of its TikTok order effectuating the ban set to take effect on 12 November:

  • Any provision of internet hosting services, occurring on or after 11:59 p.m. eastern standard time on November 12, 2020, enabling the functioning or optimization of the TikTok mobile application[;]
  • Any provision of content delivery network services, occurring on or after 11:59 p.m. eastern standard time on November 12, 2020, enabling the functioning or optimization of the TikTok mobile application[;]
  • Any provision of directly contracted or arranged internet transit or peering services, occurring on or after 11:59 p.m. eastern standard time on November 12, 2020, enabling the functioning or optimization of the TikTok mobile application[; and]
  • Any utilization, occurring on or after 11:59 p.m. eastern standard time on November 12, 2020, of the TikTok mobile application’s constituent code, functions, or services in the functioning of software or services developed and/or accessible within the land and maritime borders of the United States and its territories.

Beetlestone found that the limit on the use of IEEPA powers to regulate information is clearly implicated by Commerce’s order, which proposes to do just that. Consequently, this is not a legal use of IEEPA powers. The judge also found the plaintiffs would be irreparably harmed through a loss of their audiences and brand sponsorships. The judge summarized the plaintiffs’ claims:

Plaintiffs challenge the Commerce Identification on both statutory and constitutional grounds. First, they contend that the Commerce Identification violates both the First and Fifth Amendments to the U.S. Constitution. They then contend that the Commerce Identification violates the Administrative Procedure Act, 5 U.S.C. § 701 et seq., as it is both arbitrary and capricious, see id. § 706(2)(A), and ultra vires, see id. § 706(2)(C). Plaintiffs’ ultra vires claim consists of three separate arguments: (1) the Commerce Identification contravenes IEEPA’s “informational materials” exception, 50 U.S.C. § 1702(b)(3); (2) the Commerce Identification contravenes IEEPA’s prohibition on the regulation of “personal communication[s] . . . not involv[ing] a transfer of anything of value,” id. § 1702(b)(1), and (3) the Commerce Identification is not responsive to the national emergency declared in the ICTS Executive Order, and therefore requires the declaration of a new national emergency to take effect, see id. § 1701(b).

In the first injunction granted against the TikTok ban, the court found TikTok likely to succeed on its claims that the ban misused IEEPA, 50 U.S.C. §§ 1701–08, the primary authority President Donald Trump relied on in his executive order banning the app. The court conceded “IEEPA contains a broad grant of authority to declare national emergencies and to prohibit certain transactions with foreign countries or foreign nationals that pose risks to the national security of the United States.” But, the court noted “IEEPA also contains two express limitations relevant here: the “authority granted to the President . . . does not include the authority to regulate or prohibit, directly or indirectly” either (a) the importation or exportation of “information or informational materials”; or (b) “personal communication[s], which do[] not involve a transfer of anything of value.” The court concluded:

In sum, the TikTok Order and the Secretary’s prohibitions will have the intended effect of stopping U.S. users from communicating (and thus sharing data) on TikTok. To be sure, the ultimate purpose of those prohibitions is to protect the national security by preventing China from accessing that data and skewing content on TikTok. And the government’s actions may not constitute direct regulations or prohibitions of activities carved out by 50 U.S.C. 1702(b). But Plaintiffs have demonstrated that they are likely to succeed on their claim that the prohibitions constitute indirect regulations of “personal communication[s]” or the exchange of “information or informational materials.”

After considering the risks of irreparable harm to TikTok and the equities and public interest, the court decided:

Weighing these interests together with Plaintiffs’ likelihood of succeeding on their IEEPA claim and the irreparable harm that Plaintiffs (and their U.S. users) will suffer absent an injunction, the Court concludes that a preliminary injunction is appropriate.

On 18 September, the U.S. Department of Commerce (Commerce) issued orders barring TikTok and WeChat pursuant to the “Executive Order on Addressing the Threat Posed by TikTok” and the “Executive Order on Addressing the Threat Posed by WeChat,” which bar any transactions with the companies that make, distribute, and operate TikTok and WeChat, respectively.

In a press release, Commerce explained:

As of September 20, 2020, the following transactions are prohibited:

  1. Any provision of service to distribute or maintain the WeChat or TikTok mobile applications, constituent code, or application updates through an online mobile application store in the U.S.;
  2. Any provision of services through the WeChat mobile application for the purpose of transferring funds or processing payments within the U.S.

As of September 20, 2020, for WeChat and as of November 12, 2020, for TikTok, the following transactions are prohibited:

  1. Any provision of internet hosting services enabling the functioning or optimization of the mobile application in the U.S.;
  2. Any provision of content delivery network services enabling the functioning or optimization of the mobile application in the U.S.;
  3. Any provision directly contracted or arranged internet transit or peering services enabling the function or optimization of the mobile application within the U.S.;
  4. Any utilization of the mobile application’s constituent code, functions, or services in the functioning of software or services developed and/or accessible within the U.S.

Commerce added:

Any other prohibitive transaction relating to WeChat or TikTok may be identified at a future date. Should the U.S. Government determine that WeChat’s or TikTok’s illicit behavior is being replicated by another app somehow outside the scope of these executive orders, the President has the authority to consider whether additional orders may be appropriate to address such activities. The President has provided until November 12 for the national security concerns posed by TikTok to be resolved. If they are, the prohibitions in this order may be lifted.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by iXimus from Pixabay

Commerce White Paper on Schrems II

The U.S. tries to lay out the reasons why data can still be transferred from the EU

The Trump Administration has released a crib sheet it hopes United States (U.S.) multinationals will have success in using to argue to data protection authorities (DPA) in the European Union that their Standard Contractual Clauses (SCC), Binding Corporate Rules (BCR), and U.S. law satisfy the European Court of Justice’s ruling that struck down the EU-U.S. Privacy Shield. And, the Trump Administration is basically arguing, sure, we spy, but most EU citizens’ data is not surveilled, and EU governments themselves often share in the proceeds of the surveillance we conduct. Moreover, there are plenty of safeguards and means of redress in the U.S. system because, you know, we say so. It is unlikely this analysis will be very persuasive in the EU, especially since these broad arguments do not go to the criticisms the EU has had under Privacy Shield about U.S. surveillance and privacy rights nor to the basis for the Court of Justice of the European Union’s (CJEU) ruling.

In a summary of its decision in Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems, Case C-311/18 (Schrems II), the CJEU explained:

The General Data Protection Regulation (GDPR) provides that the transfer of such data to a third country may, in principle, take place only if the third country in question ensures an adequate level of data protection. According to the GDPR, the Commission may find that a third country ensures, by reason of its domestic law or its international commitments, an adequate level of protection. In the absence of an adequacy decision, such transfer may take place only if the personal data exporter established in the EU has provided appropriate safeguards, which may arise, in particular, from standard data protection clauses adopted by the Commission, and if data subjects have enforceable rights and effective legal remedies. Furthermore, the GDPR details the conditions under which such a transfer may take place in the absence of an adequacy decision or appropriate safeguards.

Ultimately, the CJEU found the United States (U.S.) lacks the requisite safeguards needed under EU law, and so the general means of transferring the data of EU citizens from the EU to the U.S. was essentially struck down. This marked the second time in the last five years such an agreement had been found to violate EU law.

Needless to say, the Trump Administration did not care for this ruling nor did the multinationals using Privacy Shield. And while those entities using SCCs and BCRs may have been relieved that the CJEU did not strike down those means of transferring data under the GDPR to the U.S., the court made clear that DPAs will need to go through these agreements on a case-by-case basis to see if they comport with EU law, too. Hence, this White Paper. The United States Department of Commerce (hereafter Commerce) explained the rationale for the White Paper as “in an effort to assist organizations in assessing whether their transfers offer appropriate data protection in accordance with the [CJEU’s] ruling, the U.S. Government has prepared the attached White Paper, which outlines the robust limits and safeguards in the United States pertaining to government access to data.”

Commerce made the rather obvious assertion that “[l]ike European nations and other countries, the United States conducts intelligence gathering activities to ensure that national security and foreign policy decision makers have access to timely, accurate, and insightful information on the threats posed by terrorists, criminals, cyber hackers, and other malicious actors.” Comparing U.S. surveillance to that of other nations is a bit like saying Jeff Bezos and I both made money this year. That is true, of course, but Bezos out-earned me and everyone else by orders of magnitude. Moreover, whether EU nations conduct surveillance is beside the point. The CJEU took issue with U.S. surveillance and the rights afforded to EU residents for redress, not surveillance generally. It found the U.S.’s regime violated EU law.

Commerce touted “the extensive U.S. surveillance reforms since 2013” which were, of course, the result of former National Security Agency (NSA) contractor Edward Snowden revealing the massive NSA surveillance programs that were hoovering up data around the world. Nonetheless, after omitting this crucial bit, Commerce claimed “the U.S. legal framework for foreign intelligence collection provides clearer limits, stronger safeguards, and more rigorous independent oversight than the equivalent laws of almost all other countries.” And yet, the CJEU somehow disagreed with this claim.

Commerce summarized its “key points”:

(1)  Most U.S. companies do not deal in data that is of any interest to U.S. intelligence agencies, and have no grounds to believe they do. They are not engaged in data transfers that present the type of risks to privacy that appear to have concerned the ECJ in Schrems II.

(2)  The U.S. government frequently shares intelligence information with EU Member States, including data disclosed by companies in response to FISA 702 orders, to counter threats such as terrorism, weapons proliferation, and hostile foreign cyber activity. Sharing of FISA 702 information undoubtedly serves important EU public interests by protecting the governments and people of the Member States.

(3) There is a wealth of public information about privacy protections in U.S. law concerning government access to data for national security purposes, including information not recorded in Decision 2016/1250, new developments that have occurred since 2016, and information the ECJ neither considered nor addressed. Companies may wish to take this information into account in any assessment of U.S. law post-Schrems II.

Again, even if all this were true (and that is a stretch with some of these claims), these arguments are irrelevant in the eyes of the CJEU. Let’s take a look at what the CJEU found so objectionable in the European Commission’s adequacy decision with respect to U.S. surveillance and the rights afforded to EU residents:

  • It is thus apparent that Section 702 of the FISA does not indicate any limitations on the power it confers to implement surveillance programmes for the purposes of foreign intelligence or the existence of guarantees for non-US persons potentially targeted by those programmes. In those circumstances and as the Advocate General stated, in essence, in points 291, 292 and 297 of his Opinion, that article cannot ensure a level of protection essentially equivalent to that guaranteed by the Charter, as interpreted by the case-law set out in paragraphs 175 and 176 above, according to which a legal basis which permits interference with fundamental rights must, in order to satisfy the requirements of the principle of proportionality, itself define the scope of the limitation on the exercise of the right concerned and lay down clear and precise rules governing the scope and application of the measure in question and imposing minimum safeguards.            
  • According to the findings in the Privacy Shield Decision, the implementation of the surveillance programmes based on Section 702 of the FISA is, indeed, subject to the requirements of PPD‑28. However, although the Commission stated, in recitals 69 and 77 of the Privacy Shield Decision, that such requirements are binding on the US intelligence authorities, the US Government has accepted, in reply to a question put by the Court, that PPD‑28 does not grant data subjects actionable rights before the courts against the US authorities. Therefore, the Privacy Shield Decision cannot ensure a level of protection essentially equivalent to that arising from the Charter, contrary to the requirement in Article 45(2)(a) of the GDPR that a finding of equivalence depends, inter alia, on whether data subjects whose personal data are being transferred to the third country in question have effective and enforceable rights.
  • As regards the monitoring programmes based on E.O. 12333, it is clear from the file before the Court that that order does not confer rights which are enforceable against the US authorities in the courts either.
  • It should be added that PPD‑28, with which the application of the programmes referred to in the previous two paragraphs must comply, allows for ‘“bulk” collection … of a relatively large volume of signals intelligence information or data under circumstances where the Intelligence Community cannot use an identifier associated with a specific target … to focus the collection’, as stated in a letter from the Office of the Director of National Intelligence to the United States Department of Commerce and to the International Trade Administration from 21 June 2016, set out in Annex VI to the Privacy Shield Decision. That possibility, which allows, in the context of the surveillance programmes based on E.O. 12333, access to data in transit to the United States without that access being subject to any judicial review, does not, in any event, delimit in a sufficiently clear and precise manner the scope of such bulk collection of personal data.   
  • It follows therefore that neither Section 702 of the FISA, nor E.O. 12333, read in conjunction with PPD‑28, correlates to the minimum safeguards resulting, under EU law, from the principle of proportionality, with the consequence that the surveillance programmes based on those provisions cannot be regarded as limited to what is strictly necessary.
  • In those circumstances, the limitations on the protection of personal data arising from the domestic law of the United States on the access and use by US public authorities of such data transferred from the European Union to the United States, which the Commission assessed in the Privacy Shield Decision, are not circumscribed in a way that satisfies requirements that are essentially equivalent to those required, under EU law, by the second sentence of Article 52(1) of the Charter.

A stroll down memory lane is also helpful, for EU authorities have been flagging these issues for years. The European Data Protection Board (EDPB or Board) released its most recent annual assessment of the Privacy Shield in December 2019 and again found both the agreement itself and its implementation wanting. There was some overlap between the concerns of the EDPB and those of the European Commission (EC) as detailed in the EC’s recently released third assessment of the Privacy Shield, but the EDPB discussed areas that were either omitted from or downplayed in the EC’s report. The EDPB’s authority with respect to the Privacy Shield is persuasive rather than binding, but it carries weight with the EC; the Board’s concerns as detailed in previous annual reports have pushed the EC to demand changes, including, but not limited to, pressing the Trump Administration to nominate members to the Privacy and Civil Liberties Oversight Board (PCLOB) and to appoint a new Ombudsperson to handle complaints about how the U.S. Intelligence Community handles the personal data of EU citizens.

In January 2019, in the “EU-U.S. Privacy Shield – Second Annual Joint Review,” the EDPB took issue with a number of shortcomings in U.S. implementation. Notably, the EDPB found problems with the assurances provided by the U.S. government regarding the collection and use of personal data by national security and law enforcement agencies. The EDPB also found problems with how the Department of Commerce and the Federal Trade Commission (FTC) are enforcing the Privacy Shield against commercial entities in the U.S.

The EDPB also took issue with U.S. law enforcement and national security treatment of EU citizens’ personal data. The Board asserted that nothing had changed in the legal landscape in the U.S. since the previous year’s review but recounted its concerns, chiefly that under Title VII of the Foreign Intelligence Surveillance Act (FISA) and Executive Order (EO) 12333 indiscriminate data collection from and analysis of EU citizens could occur with minimal oversight and little to no redress, contrary to EU law. The EDPB also decried how the standing requirements in federal courts have effectively blunted the redress available to EU citizens under the Privacy Act of 1974. The Board also enumerated its concerns about the Ombudsperson mechanism, which “provides the only way for EU individuals to ask for a verification that the relevant authorities have complied with the requirements of this instrument by asking the Ombudsperson to refer the matter to the competent authorities, which include the Inspector General, to check the internal policies of these authorities.” The EDPB was concerned about the impartiality and independence of the current Ombudsperson, Under Secretary of State for Economic Growth, Energy, and the Environment Keith Krach, and asserted it “still doubts that the powers of the Ombudsperson to remedy non-compliance vis-a-vis the intelligence authorities are sufficient, as his “power” seems to be limited to decide not to confirm compliance towards the petitioner.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (30 September)

Coming Events

  • On October 1, the House Judiciary Committee’s Antitrust, Commercial, and Administrative Law Subcommittee will hold a hearing as part of its series on online competition at which it may unveil its proposal on how to reform antitrust enforcement for the digital age. The hearing is titled “Proposals to Strengthen the Antitrust Laws and Restore Competition Online.”
  • On 1 October, the Senate Commerce, Science, and Transportation Committee may hold a markup to authorize subpoenas to compel the attendance of the technology CEOs for a hearing on 47 U.S.C. 230 (aka Section 230). Ranking Member Maria Cantwell (D-WA) has said:
    • “Taking the extraordinary step of issuing subpoenas is an attempt to chill the efforts of these companies to remove lies, harassment, and intimidation from their platforms. I will not participate in an attempt to use the committee’s serious subpoena power for a partisan effort 40 days before an election,” indicating a vote, should one occur, may well be along party lines.
    • Nonetheless, the Committee may subpoena the following CEOs:
      • Mr. Jack Dorsey, Chief Executive Officer, Twitter
      • Mr. Sundar Pichai, Chief Executive Officer, Alphabet Inc., Google
      • Mr. Mark Zuckerberg, Chief Executive Officer, Facebook
  • The Senate Judiciary Committee will mark up the “Online Content Policy Modernization Act” (S.4632), a bill to reform 47 U.S.C. 230 (aka Section 230), which provides many technology companies with protection from lawsuits for third party content posted on their platforms and for moderating and removing such content.
  • On October 1, the Senate Armed Services Committee’s Readiness and Management Support Subcommittee will hold a hearing on supply chain integrity with Under Secretary of Defense for Acquisition and Sustainment Ellen Lord testifying. Undoubtedly, implementation of the ban on Huawei, ZTE, and other People’s Republic of China (PRC) equipment and services as required by Section 889 of the “John S. McCain National Defense Authorization Act (NDAA) for FY 2019” (P.L. 115-232) will be discussed. The Cybersecurity Maturity Model Certification (CMMC) program will likely be discussed as well.
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • October 7: Defending our Democracy
    • One can register for the event here.
  • On October 29, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”

Other Developments

  • On 29 September, the House passed the following bills as summarized by the House Energy and Commerce Committee:
    • The “Consumer Product Safety Inspection Enhancement Act” (H.R. 8134) that “would amend the Consumer Product Safety Act to enhance the Consumer Product Safety Commission’s (CPSC) ability to identify unsafe consumer products entering the United States, especially e-commerce shipments entering under the de minimis value exemption. Specifically, the bill would require the CPSC to enhance the targeting, surveillance, and screening of consumer products. The bill also would require electronic filing of certificates of compliance for all consumer products entering the United States.”
      • The bill directs the CPSC to: 1) examine a sampling of de minimis shipments and shipments coming from China; 2) detail plans and timelines to effectively address targeting and screening of de minimis shipments; 3) establish metrics by which to evaluate the effectiveness of the CPSC’s efforts in this regard; 4) assess projected technology, resources, and staffing necessary; and 5) submit a report to Congress regarding such efforts. The bill further directs the CPSC to hire at least 16 employees every year until staffing needs are met to help identify violative products at ports.
    • The “AI for Consumer Product Safety Act” (H.R. 8128) that “would direct the Consumer Product Safety Commission (CPSC) to establish a pilot program to explore the use of artificial intelligence for at least one of the following purposes: 1) tracking injury trends; 2) identifying consumer product hazards; 3) monitoring the retail marketplace for the sale of recalled consumer products; or 4) identifying unsafe imported consumer products.” The revised bill passed by the committee “changes the title of the bill to the “Consumer Safety Technology Act”, and adds the text based on the Blockchain Innovation Act (H.R. 8153) and the Digital Taxonomy Act (H.R. 2154)…[and] adds sections that direct the Department of Commerce (DOC), in consultation with the Federal Trade Commission (FTC), to conduct a study and submit to Congress a report on the state of blockchain technology in commerce, including its use to reduce fraud and increase security.” The revised bill “would also require the FTC to submit to Congress a report and recommendations on unfair or deceptive acts or practices relating to digital tokens.”
    • The “American Competitiveness Of a More Productive Emerging Tech Economy Act” or the “American COMPETE Act” (H.R. 8132) “directs the DOC and the FTC to study and report to Congress on the state of the artificial intelligence, quantum computing, blockchain, and the new and advanced materials industries in the U.S…[and] would also require the DOC to study and report to Congress on the state of the Internet of Things (IoT) and IoT manufacturing industries as well as the three-dimensional printing industry” involving “among other things: 1) listing industry sectors that develop and use each technology and public-private partnerships focused on promoting the adoption and use of each such technology; 2) establishing a list of federal agencies asserting jurisdiction over such industry sectors; and 3) assessing risks and trends in the marketplace and supply chain of each technology.”
      • The bill would direct the DOC to study and report on the effect of unmanned delivery services on U.S. businesses conducting interstate commerce. In addition to these report elements, the bill would require the DOC to examine safety risks and effects on traffic congestion and jobs of unmanned delivery services.
      • Finally, the bill would require the FTC to study and report to Congress on how artificial intelligence may be used to address online harms, including scams directed at senior citizens, disinformation or exploitative content, and content furthering illegal activity.
    • The “Cyber Sense Act of 2019” (H.R. 360) requires the Secretary of Energy to establish the Cyber Sense Program. This voluntary program would identify cyber-secure products that could be used in the bulk-power system.
    • The “Enhancing Grid Security through Public-Private Partnerships Act” (H.R. 359) directs the Secretary of Energy – in consultation with States, other Federal agencies, and industry stakeholders – to create and implement a program to enhance the physical and cybersecurity of electric utilities. The bill also requires an update to the Interruption Cost Estimate (ICE) Calculator, an electric reliability planning tool for estimating electricity interruption costs and the benefits of reliability improvements, at least once every two years.
    • The “Energy Emergency Leadership Act” (H.R. 362) creates a new Department of Energy Assistant Secretary position with jurisdiction over all energy emergency and security functions related to energy supply, infrastructure, and cybersecurity.
  • The Federal Bureau of Investigation (FBI) and the Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) released a trio of public service announcements to dispel myths about threats to voting while also casting light on the realistic risks that might disrupt the 2020 Election:
    • In “False Claims of Hacked Voter Information Likely Intended to Cast Doubt on Legitimacy of U.S. Elections,” the FBI and CISA issued the “announcement to raise awareness of the potential threat posed by attempts to spread disinformation regarding cyberattacks on U.S. voter registration databases or voting systems.” The agencies added:
      • During the 2020 election season, foreign actors and cyber criminals are spreading false and inconsistent information through various online platforms in an attempt to manipulate public opinion, sow discord, discredit the electoral process, and undermine confidence in U.S. democratic institutions. These malicious actors could use these forums to also spread disinformation suggesting successful cyber operations have compromised election infrastructure and facilitated the “hacking” and “leaking” of U.S. voter registration data.
      • In reality, much U.S. voter information can be purchased or acquired through publicly available sources. While cyber actors have in recent years obtained voter registration information, the acquisition of this data did not impact the voting process or election results.
      • In addition, the FBI and CISA have no information suggesting any cyberattack on U.S. election infrastructure has prevented an election from occurring, prevented a registered voter from casting a ballot, compromised the accuracy of voter registration information, or compromised the integrity of any ballots cast.
    • In “Cyber Threats to Voting Processes Could Slow But Not Prevent Voting,” the agencies wanted “to inform the public that attempts by cyber actors to compromise election infrastructure could slow but not prevent voting.” The FBI and CISA asserted they
      • have not identified any threats, to date, capable of preventing Americans from voting or changing vote tallies for the 2020 Elections. Any attempts tracked by FBI and CISA have remained localized and were blocked, minimal, or easily mitigated.
      • have no reporting to suggest cyberactivity has prevented a registered voter from casting a ballot, compromised the integrity of any ballots cast, or affected the accuracy of voter registration information. However, even if actors did achieve such an impact, the public should be aware that election officials have multiple safeguards and plans in place—such as provisional ballots to ensure registered voters can cast ballots, paper backups, and backup pollbooks—to limit the impact and recover from a cyber incident with minimal disruption to voting.
      • continue to assess that attempts to manipulate votes at scale would be difficult to conduct undetected.
    • In “Foreign Actors and Cybercriminals Likely to Spread Disinformation Regarding 2020 Election Results,” the FBI and CISA explained the announcement aims “to raise awareness of the potential threat posed by attempts to spread disinformation regarding the results of the 2020 elections.” The agencies explained:
      • Foreign actors and cybercriminals could create new websites, change existing websites, and create or share corresponding social media content to spread false information in an attempt to discredit the electoral process and undermine confidence in U.S. democratic institutions. State and local officials typically require several days to weeks to certify elections’ final results in order to ensure every legally cast vote is accurately counted. The increased use of mail-in ballots due to COVID-19 protocols could leave officials with incomplete results on election night.
      • Foreign actors and cybercriminals could exploit the time required to certify and announce elections’ results by disseminating disinformation that includes reports of voter suppression, cyberattacks targeting election infrastructure, voter or ballot fraud, and other problems intended to convince the public of the elections’ illegitimacy.
      • The FBI and CISA urged “the American public to critically evaluate the sources of the information they consume and to seek out reliable and verified information from trusted sources, such as state and local election officials” and stated “[t]he public should also be aware that if foreign actors or cyber criminals were able to successfully change an election-related website, the underlying data and internal systems would remain uncompromised.”
  • The Government Accountability Office (GAO) evaluated the United States (U.S.) Department of State’s proposed reorganization to create an office that would have cybersecurity issues in its portfolio. However, the proposal fell short of what the chair and ranking member of the House Foreign Affairs Committee had envisioned in legislation marked up and reported out of committee. The GAO found that the Department of State failed to coordinate with other agencies that have international cybersecurity responsibilities, setting up the possibility that the new office will work at cross purposes with them, thus limiting the effectiveness of U.S. cyber diplomacy.
    • The GAO stated
      • In 2019, members of Congress introduced the Cyber Diplomacy Act of 2019, which would establish a new office to lead State’s international cyberspace efforts that would consolidate cross-cutting efforts on international cybersecurity, digital economy, and internet freedom, among other cyber diplomacy issues. In June 2019, State notified Congress of its intent to establish a new Bureau of Cyberspace Security and Emerging Technologies (CSET) that would focus more narrowly on cyberspace security and the security aspects of emerging technologies. According to State officials, Members of Congress raised objections to State’s plan, which has not been implemented as of August 2020.
      • [House Foreign Affairs Committee Chair Eliot Engel (D-NY) and Ranking Member Michael McCaul (R-TX)] asked us to review State’s efforts to advance U.S. interests in cyberspace, including State’s planning process for establishing a new bureau to lead its international cyber mission. This report examines the extent to which State involved other federal agencies in the development of its plan for establishing CSET. As part of our ongoing work on this topic, we are also continuing to monitor and review State’s overall planning process for establishing this new bureau.
      • Under State’s proposal, CSET would not focus on the economic and human rights aspects of cyber diplomacy issues. According to State officials, while the department recognized the challenges posed by cyberspace, it considered efforts related to digital economy and internet freedom to be separate and distinct from CSET’s cyberspace security focus. In contrast, under H.R. 739, State would consolidate cyber diplomacy activities, such as those related to international cybersecurity, digital economy, and internet freedom, in a new office.
    • The GAO concluded
      • State has not initiated a process to involve other federal agencies in the development of its plans for the new CSET bureau. As a result, State has not addressed key practices for involving stakeholders in the development of reforms. State officials told us that they were not obligated to consult with other agencies before completing the CSET plan because it was an internal decision. These officials added that they were not consulted by these agencies when they established offices or bureaus responsible for cyber issues. While State is not legally obligated to involve other agencies in the development of its plans for the new bureau, our prior work on government reforms and reorganizations has shown that it is important for agencies to directly and continuously involve key stakeholders, including agencies supporting similar goals, to develop proposed reforms, such as State’s plan for establishing CSET.
      • Without addressing the key reform practice of involving other agencies in its plans for a new cyber diplomacy bureau, State lacks assurance that it will effectively achieve its goals for establishing CSET. Furthermore, because multiple agencies contribute to cyber diplomacy efforts and are engaged in similar activities, State increases the potential for negative effects from fragmentation, overlap, and duplication of efforts if it does not involve agency partners in the development of its plans to reorganize its cyber diplomacy efforts. Potential negative effects include increased costs or inefficiencies from unnecessary overlap or duplication of efforts.
  • The United States Department of Housing and Urban Development’s (HUD) information security and privacy practices were called into question by the Government Accountability Office (GAO) in an assessment of how effectively the agency is “protecting sensitive information exchanged with external entities.” The GAO performed this evaluation because the House Appropriations Committee directed it to do so. Most alarmingly, the GAO found “HUD was not fully able to identify external entities that process, store, or share sensitive information with its systems used to support housing, community investment, or mortgage loan programs.”
    • The GAO concluded:
      • HUD had minimally addressed the leading practices for requiring the implementation of risk-based security and privacy controls, identifying and tracking corrective actions, and monitoring progress in implementing controls when sharing information with external entities. Moreover, the department had not taken steps to make sure that independent assessments are performed to ensure controls are implemented by external entities. Among the reasons for these weaknesses was HUD’s failure to make it a priority to update and improve IT security and privacy policies. Without leading practices for protecting sensitive information shared with external entities in place, HUD lacks assurance that sensitive information shared with external entities is being protected.
      • Further, HUD had a limited ability to identify external entities that process, store, or share sensitive information with its systems. Until the department has access to better quality information and takes action to improve its inventory of systems that share sensitive information with external entities, HUD will face greater risk that it is falling short in working to protect privacy and sensitive data.
    • The GAO made five recommendations to HUD:
      • The Secretary of Housing and Urban Development should direct the Chief Information Officer, Senior Agency Official for Privacy, and Chief Privacy Officer to review and revise department-level security and privacy policies to ensure that they require the implementation of risk-based security and privacy controls for external entities that process, store, or share sensitive information with HUD. (Recommendation 1)
      • The Secretary of Housing and Urban Development should direct the Chief Information Officer, Senior Agency Official for Privacy, and Chief Privacy Officer to review and revise department-level security and privacy policies to ensure that they require independent assessments of external entities that process, store, or share sensitive information with HUD to ensure controls are implemented. (Recommendation 2)
      • The Secretary of Housing and Urban Development should direct the Chief Information Officer, Senior Agency Official for Privacy, and Chief Privacy Officer to review and revise department-level security and privacy policies to ensure that they require identifying and tracking corrective action needed by external entities that process, store, or share sensitive information with HUD. (Recommendation 3)
      • The Secretary of Housing and Urban Development should direct the Chief Information Officer, Senior Agency Official for Privacy, and Chief Privacy Officer to review and revise department-level security and privacy policies to ensure that they require monitoring of progress in implementing controls/corrective actions by external entities that process, store, or share sensitive information with HUD. (Recommendation 4)
      • The Secretary of Housing and Urban Development should direct the Chief Information Officer, Senior Agency Official for Privacy, and Chief Privacy Officer to develop and maintain a comprehensive systems inventory that incorporates sufficient, reliable information about the external entities with which HUD program information is shared and the extent to which each external entity has access to PII and other sensitive information. (Recommendation 5)
  • Amnesty International’s Security Lab followed up on a March 2019 report on the use of German spyware to surveil human rights activists, dissidents, and journalists in a number of countries. Amnesty International explained:
    • FinSpy is a full-fledged surveillance software suite, capable of intercepting communications, accessing private data, and recording audio and video, from the computer or mobile devices it is silently installed on. FinSpy is produced by Munich-based company FinFisher Gmbh and sold to law enforcement and government agencies around the world.
    • In September 2019, Amnesty International discovered samples of FinFisher’s spyware distributed by malicious infrastructure tied to the attacker group commonly known as NilePhish, which is likely state sponsored. These attacks took place amid an unprecedented crackdown on independent civil society and any critical voices. Over the years, numerous research reports, including by Amnesty International, detailed NilePhish’s campaigns targeting Egyptian civil society organizations. Further technical investigation by Amnesty’s Security Lab led to the discovery of additional previously unknown samples for Linux and Mac OS computers, provided with extensive interception capabilities.
    • With this report, Amnesty’s Security Lab shares new insights into the capabilities of the NilePhish attacker group, as well as provides detailed analysis of newly discovered variants of FinSpy in order to enable cybersecurity researchers to further investigate and develop protection mechanisms. In addition, we hope to raise awareness among Human Rights Defenders (HRDs) on the evolution of digital attack techniques and help address common misconceptions that Linux and Mac computers are safer against spyware attacks.
  • In advance of Palantir’s initial public offering, Amnesty International published an issue brief, “Failing to Do Right: The Urgent Need for Palantir to Respect Human Rights,” in which the human rights organization “concludes that Palantir is failing to conduct human rights due diligence around its contracts with Immigration and Customs Enforcement (ICE), and that there is a high risk that Palantir is contributing to human rights violations of asylum-seekers and migrants through the ways the company’s technology facilitates ICE operations.” In the report, Amnesty International stated
    • Through Palantir’s contracts with DHS/ICE for products and services for the Homeland Security Investigations (HSI) division of ICE, Amnesty International has determined there is a high risk that Palantir is contributing to serious human rights violations of migrants and asylum-seekers by the U.S. government, which Amnesty International has thoroughly documented for years. In particular, Palantir’s contracts to provide its Integrated Case Management System (ICM) and FALCON analytical platforms to ICE risk contributing to human rights violations of asylum-seekers and migrants who are separated from family members, subject to workplace raids, detained, and face deportation by ICE.

Further Reading

  • Making a Phone Call from Behind Bars Shouldn’t Send Your Family into Debt” By Sylvia A. Harvey — Politico. This piece summarizes the shamefully high rates many inmates are charged for phone calls in prison. The Federal Communications Commission (FCC) and Congress are both working, as a matter of public policy, to end the usurious rates charged by the duopoly that controls the majority of this market.
  • Ring’s latest security camera is a drone that flies around inside your house” By Dan Seifert — The Verge. Amazon appears to be expanding its home security offerings at the potential price of one’s privacy.
  • Exclusive: China preparing an antitrust investigation into Google – sources” By Cheng Leng, Keith Zhai, David Kirton — Reuters. Google may be facing yet another antitrust investigation but one from a country that may be seeking to even up the score with the United States (U.S.). The People’s Republic of China is reportedly considering whether to bring an action that would focus on Google and its Android operating system with the rub that the scrutiny is being caused by U.S. moves to harm and limit PRC companies like Huawei, TikTok, and WeChat. The PRC is apparently examining the European Union’s case against Google that resulted in a € 4.3 billion fine in 2018.
  • Scars, Tattoos, And License Plates: This Is What Palantir And The LAPD Know About You” By Caroline Haskins — BuzzFeed News. Ahead of its initial public offering (IPO), Palantir’s history and usage by the Los Angeles Police Department seems to lead one to the conclusion that artificial intelligence and big data are being used to confirm existing practices and biases in policing. However, millions of federal, state, and local dollars went to the company to pay for a few different iterations of a predictive policing system that seemed to violate rights and produce little in the way of tangible benefits.
  • How Amazon hid its safety crisis” By Will Evans — The Center for Investigative Reporting. As revealed in leaked company records, Amazon’s record on injuries for workers in its warehouses keeps getting worse. This has been exacerbated by Prime Day, a sale that now rivals the holidays, and the move to robots in some warehouses that has radically increased the number of packages workers are supposed to process per hour. Amazon’s response has been to massage the injury numbers in a variety of ways.
  • Justice Dept. Case Against Google Is Said to Focus on Search Dominance” By Cecilia Kang, Katie Benner, Steve Lohr and Daisuke Wakabayashi — The New York Times. As has been long rumored, the United States Department of Justice has indeed narrowed its case against Google to just its online search engine. This approach may well lead to Democratic state attorneys general filing a different, broader antitrust case against Google related to its online advertising business and online search practices that disadvantage rivals. Meanwhile, Texas Attorney General Ken Paxton is ready to file an antitrust case focused just on Google’s online advertising business.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.