Further Reading, Other Developments, and Coming Events (16 September)

Coming Events

  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7”:
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The House Homeland Security Committee will hold a hearing titled “Worldwide Threats to the Homeland” on 17 September with the following witnesses:
    • Chad Wolf, Acting Secretary, Department of Homeland Security
    • Christopher Wray, Director, Federal Bureau of Investigation
    • Christopher Miller, Director, National Counterterrorism Center (NCTC)
  • On 17 September, the House Energy and Commerce Committee’s Communications & Technology Subcommittee will hold a hearing titled “Trump FCC: Four Years of Lost Opportunities.”
  • The House Armed Services Committee’s Intelligence and Emerging Threats and Capabilities Subcommittee will hold a hearing titled “Interim Review of the National Security Commission on Artificial Intelligence Effort and Recommendations” on 17 September with these witnesses:
    • Dr. Eric Schmidt, Chairman, National Security Commission on Artificial Intelligence
    • HON Robert Work, Vice Chairman, National Security Commission on Artificial Intelligence
    • HON Mignon Clyburn, Commissioner, National Security Commission on Artificial Intelligence
    • Dr. José-Marie Griffiths, Commissioner, National Security Commission on Artificial Intelligence
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” The agency has released its agenda and explained:
    • The workshop will also feature four panel discussions that will focus on: case studies on data portability rights in the European Union, India, and California; case studies on financial and health portability regimes; reconciling the benefits and risks of data portability; and the material challenges and solutions to realizing data portability’s potential.
  • The Senate Judiciary Committee’s Intellectual Property Subcommittee will hold a hearing titled “Examining Threats to American Intellectual Property: Cyber-attacks and Counterfeits During the COVID-19 Pandemic” with these witnesses:
    • Adam Hickey, Deputy Assistant Attorney General, National Security Division, Department of Justice
    • Clyde Wallace, Deputy Assistant Director, Cyber Division, Federal Bureau of Investigation
    • Steve Francis, Assistant Director, HSI Global Trade Investigations Division, and Director, National Intellectual Property Rights Center, U.S. Immigration and Customs Enforcement, Department of Homeland Security
    • Bryan S. Ware, Assistant Director for Cybersecurity, Cybersecurity and Infrastructure Security Agency, Department of Homeland Security
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September and has made available its agenda with these items:
    • Facilitating Shared Use in the 3.1-3.55 GHz Band. The Commission will consider a Report and Order that would remove the existing non-federal allocations from the 3.3-3.55 GHz band as an important step toward making 100 megahertz of spectrum in the 3.45-3.55 GHz band available for commercial use, including 5G, throughout the contiguous United States. The Commission will also consider a Further Notice of Proposed Rulemaking that would propose to add a co-primary, non-federal fixed and mobile (except aeronautical mobile) allocation to the 3.45-3.55 GHz band as well as service, technical, and competitive bidding rules for flexible-use licenses in the band. (WT Docket No. 19-348)
    • Expanding Access to and Investment in the 4.9 GHz Band. The Commission will consider a Sixth Report and Order that would expand access to and investment in the 4.9 GHz (4940-4990 MHz) band by providing states the opportunity to lease this spectrum to commercial entities, electric utilities, and others for both public safety and non-public safety purposes. The Commission also will consider a Seventh Further Notice of Proposed Rulemaking that would propose a new set of licensing rules and seek comment on ways to further facilitate access to and investment in the band. (WP Docket No. 07-100)
    • Improving Transparency and Timeliness of Foreign Ownership Review Process. The Commission will consider a Report and Order that would improve the timeliness and transparency of the process by which it seeks the views of Executive Branch agencies on any national security, law enforcement, foreign policy, and trade policy concerns related to certain applications filed with the Commission. (IB Docket No. 16-155)
    • Promoting Caller ID Authentication to Combat Spoofed Robocalls. The Commission will consider a Report and Order that would continue its work to implement the TRACED Act and promote the deployment of caller ID authentication technology to combat spoofed robocalls. (WC Docket No. 17-97)
    • Combating 911 Fee Diversion. The Commission will consider a Notice of Inquiry that would seek comment on ways to dissuade states and territories from diverting fees collected for 911 to other purposes. (PS Docket Nos. 20-291, 09-14)
    • Modernizing Cable Service Change Notifications. The Commission will consider a Report and Order that would modernize requirements for notices cable operators must provide subscribers and local franchising authorities. (MB Docket Nos. 19-347, 17-105)
    • Eliminating Records Requirements for Cable Operator Interests in Video Programming. The Commission will consider a Report and Order that would eliminate the requirement that cable operators maintain records in their online public inspection files regarding the nature and extent of their attributable interests in video programming services. (MB Docket No. 20-35, 17-105)
    • Reforming IP Captioned Telephone Service Rates and Service Standards. The Commission will consider a Report and Order, Order on Reconsideration, and Further Notice of Proposed Rulemaking that would set compensation rates for Internet Protocol Captioned Telephone Service (IP CTS), deny reconsideration of previously set IP CTS compensation rates, and propose service quality and performance measurement standards for captioned telephone services. (CG Docket Nos. 13-24, 03-123)
    • Enforcement Item. The Commission will consider an enforcement action.

Other Developments

  • The United States House of Representatives took up and passed two technology bills on 14 September. One of the bills, the “Internet of Things (IoT) Cybersecurity Improvement Act of 2020” (H.R. 1668), was discussed in yesterday’s Technology Policy Update as part of an outlook on Internet of Things (IoT) legislation (see here for analysis). The House passed a revised version by voice vote, but its fate in the Senate may lie with the Senate Homeland Security & Governmental Affairs Committee, whose chair, Senator Ron Johnson (R-WI), has blocked a number of technology bills during his tenure, to the chagrin of some House stakeholders. The House also passed the “AI in Government Act of 2019” (H.R. 2575), which would establish within the General Services Administration an AI Center of Excellence that would
    • “(1) advise and promote the efforts of the Federal Government in developing innovative uses of artificial intelligence by the Federal Government to the benefit of the public; and
    • (2) improve cohesion and competency in the use of artificial intelligence.”
    • Also, this bill would direct the Office of Management and Budget (OMB) to “issue a memorandum to the head of each agency that shall—
      • inform the development of artificial intelligence governance approaches by those agencies regarding technologies and applications that—
        • are empowered or enabled by the use of artificial intelligence within that agency; and
        • advance the innovative use of artificial intelligence for the benefit of the public while upholding civil liberties, privacy, and civil rights;
      • consider ways to reduce barriers to the use of artificial intelligence in order to promote innovative application of those technologies for the benefit of the public, while protecting civil liberties, privacy, and civil rights;
      • establish best practices for identifying, assessing, and mitigating any bias on the basis of any classification protected under Federal nondiscrimination laws or other negative unintended consequence stemming from the use of artificial intelligence systems; and
      • provide a template of the required contents of the agency Governance Plans.”
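The bias-mitigation mandate above is stated abstractly. As a purely illustrative sketch (the metric, data, and function name below are hypothetical and not anything the bill prescribes), one common starting point for the kind of bias assessment the provision contemplates is comparing positive-outcome rates across protected groups:

```python
# Hypothetical illustration of a basic fairness check an agency might run
# under such a governance plan. The data and metric choice are invented.

def demographic_parity_gap(outcomes, groups):
    """Return the largest difference in positive-outcome rates between
    any two groups (0.0 means perfectly equal rates)."""
    rates = {}
    for outcome, group in zip(outcomes, groups):
        approved, total = rates.get(group, (0, 0))
        rates[group] = (approved + outcome, total + 1)
    positive_rates = [approved / total for approved, total in rates.values()]
    return max(positive_rates) - min(positive_rates)

# Toy decisions from a hypothetical automated benefits screener (1 = approved)
outcomes = [1, 1, 0, 1, 0, 0, 1, 0]
groups   = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(f"demographic parity gap: {demographic_parity_gap(outcomes, groups):.2f}")
# Group A approval rate 0.75, group B 0.25, so the gap is 0.50
```

A large gap does not by itself prove unlawful bias, but it is the sort of quantitative signal that the "identifying, assessing, and mitigating" language would require agencies to investigate.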
    • The House Energy and Commerce Committee marked up and reported out more than 30 bills last week including:
      • The “Consumer Product Safety Inspection Enhancement Act” (H.R. 8134) that “would amend the Consumer Product Safety Act to enhance the Consumer Product Safety Commission’s (CPSC) ability to identify unsafe consumer products entering the United States, especially e-commerce shipments entering under the de minimis value exemption. Specifically, the bill would require the CPSC to enhance the targeting, surveillance, and screening of consumer products. The bill also would require electronic filing of certificates of compliance for all consumer products entering the United States.”
      • The bill directs the CPSC to: 1) examine a sampling of de minimis shipments and shipments coming from China; 2) detail plans and timelines to effectively address targeting and screening of de minimis shipments; 3) establish metrics by which to evaluate the effectiveness of the CPSC’s efforts in this regard; 4) assess projected technology, resources, and staffing necessary; and 5) submit a report to Congress regarding such efforts. The bill further directs the CPSC to hire at least 16 employees every year until staffing needs are met to help identify violative products at ports.
      • The “AI for Consumer Product Safety Act” (H.R. 8128) that “would direct the Consumer Product Safety Commission (CPSC) to establish a pilot program to explore the use of artificial intelligence for at least one of the following purposes: 1) tracking injury trends; 2) identifying consumer product hazards; 3) monitoring the retail marketplace for the sale of recalled consumer products; or 4) identifying unsafe imported consumer products.” The revised bill passed by the committee “changes the title of the bill to the “Consumer Safety Technology Act”, and adds the text based on the Blockchain Innovation Act (H.R. 8153) and the Digital Taxonomy Act (H.R. 2154)…[and] adds sections that direct the Department of Commerce (DOC), in consultation with the Federal Trade Commission (FTC), to conduct a study and submit to Congress a report on the state of blockchain technology in commerce, including its use to reduce fraud and increase security.” The revised bill “would also require the FTC to submit to Congress a report and recommendations on unfair or deceptive acts or practices relating to digital tokens.”
      • The “American Competitiveness Of a More Productive Emerging Tech Economy Act” or the “American COMPETE Act” (H.R. 8132) “directs the DOC and the FTC to study and report to Congress on the state of the artificial intelligence, quantum computing, blockchain, and the new and advanced materials industries in the U.S…[and] would also require the DOC to study and report to Congress on the state of the Internet of Things (IoT) and IoT manufacturing industries as well as the three-dimensional printing industry” involving “among other things: 1) listing industry sectors that develop and use each technology and public-private partnerships focused on promoting the adoption and use of each such technology; 2) establishing a list of federal agencies asserting jurisdiction over such industry sectors; and 3) assessing risks and trends in the marketplace and supply chain of each technology.”
      • The bill would direct the DOC to study and report on the effect of unmanned delivery services on U.S. businesses conducting interstate commerce. In addition to these report elements, the bill would require the DOC to examine safety risks and effects on traffic congestion and jobs of unmanned delivery services.
      • Finally, the bill would require the FTC to study and report to Congress on how artificial intelligence may be used to address online harms, including scams directed at senior citizens, disinformation or exploitative content, and content furthering illegal activity.
  • The National Institute of Standards and Technology (NIST) issued NIST Interagency or Internal Report 8272, “Impact Analysis Tool for Interdependent Cyber Supply Chain Risks,” designed to help public and private sector entities better address complex supply chain risks. NIST stated “[t]his publication describes how to use the Cyber Supply Chain Risk Management (C-SCRM) Interdependency Tool that has been developed to help federal agencies identify and assess the potential impact of cybersecurity events in their interconnected supply chains.” NIST explained:
    • More organizations are becoming aware of the importance of identifying cybersecurity risks associated with extensive, complicated supply chains. Several solutions have been developed to help manage supply chains; most focus on contract management or compliance. There is a need to provide organizations with a systematic and more usable way to evaluate the potential impacts of cyber supply chain risks relative to an organization’s risk appetite. This is especially important for organizations with complex supply chains and highly interdependent products and suppliers.
    • This publication describes one potential way to visualize and measure these impacts: a Cyber Supply Chain Risk Management (C-SCRM) Interdependency Tool (hereafter “Tool”), which is designed to provide a basic measurement of the potential impact of a cyber supply chain event. The Tool is not intended to measure the risk of an event, where risk is defined as a function of threat, vulnerability, likelihood, and impact. Research conducted by the authors of this publication found that, at the time of publication, existing cybersecurity risk tools and research focused on threats, vulnerabilities, and likelihood, but impact was frequently overlooked. Thus, this Tool is intended to bridge that gap and enable users and tool developers to create a more complete understanding of an organization’s risk by measuring impact in their specific environments.
    • The Tool also provides the user greater visibility over the supply chain and the relative importance of particular projects, products, and suppliers (hereafter referred to as “nodes”) compared to others. This can be determined by examining the metrics that contribute to a node’s importance, such as the amount of access a node has to the acquiring organization’s IT network, physical facilities, and data. By understanding which nodes are the most important in their organization’s supply chain, the user can begin to understand the potential impact a disruption of that node may cause on business operations. The user can then prioritize the completion of risk mitigating actions to reduce the impact a disruption would cause to the organization’s supply chain and overall business.
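The node-importance idea NIST describes can be pictured with a small sketch. This is not the Tool's actual implementation; the weights, metric names, and supplier examples below are invented solely to illustrate the concept of scoring each supply chain "node" by its access to networks, facilities, and data, then ranking nodes to prioritize mitigation:

```python
# Illustrative-only sketch of weighted node-importance scoring in the
# spirit of NISTIR 8272. All weights and node data here are hypothetical.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    network_access: float   # 0-1: degree of access to the acquirer's IT network
    physical_access: float  # 0-1: degree of access to physical facilities
    data_access: float      # 0-1: degree of access to sensitive data

# Invented weights; a real assessment would calibrate these to the organization.
WEIGHTS = {"network": 0.4, "physical": 0.2, "data": 0.4}

def importance(node: Node) -> float:
    """Weighted sum of a node's access metrics; higher = more important."""
    return (WEIGHTS["network"] * node.network_access
            + WEIGHTS["physical"] * node.physical_access
            + WEIGHTS["data"] * node.data_access)

suppliers = [
    Node("Managed service provider", 0.9, 0.1, 0.8),
    Node("HVAC contractor", 0.2, 0.9, 0.1),
    Node("Component vendor", 0.1, 0.1, 0.2),
]

# Rank nodes so mitigation effort can be focused on the most impactful ones
for n in sorted(suppliers, key=importance, reverse=True):
    print(f"{n.name}: {importance(n):.2f}")
```

The ranking, not the absolute scores, is what matters: a disruption at the highest-scoring node would plausibly cause the greatest business impact, which is the gap in threat- and likelihood-focused tools that NIST says this Tool is meant to fill.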
  • In a blog post, Microsoft released its findings on the escalating threats to political campaigns and figures during the run-up to the United States’ (U.S.) election. The warning also served as an advertisement for Microsoft’s security products, but, be that as it may, these findings echo what U.S. security services have been saying for months. Microsoft stated:
    • In recent weeks, Microsoft has detected cyberattacks targeting people and organizations involved in the upcoming presidential election, including unsuccessful attacks on people associated with both the Trump and Biden campaigns, as detailed below. We have and will continue to defend our democracy against these attacks through notifications of such activity to impacted customers, security features in our products and services, and legal and technical disruptions. The activity we are announcing today makes clear that foreign activity groups have stepped up their efforts targeting the 2020 election as had been anticipated, and is consistent with what the U.S. government and others have reported. We also report here on attacks against other institutions and enterprises worldwide that reflect similar adversary activity.
    • We have observed that:
      • Strontium, operating from Russia, has attacked more than 200 organizations including political campaigns, advocacy groups, parties and political consultants
      • Zirconium, operating from China, has attacked high-profile individuals associated with the election, including people associated with the Joe Biden for President campaign and prominent leaders in the international affairs community
      • Phosphorus, operating from Iran, has continued to attack the personal accounts of people associated with the Donald J. Trump for President campaign
    • The majority of these attacks were detected and stopped by security tools built into our products. We have directly notified those who were targeted or compromised so they can take action to protect themselves. We are sharing more about the details of these attacks today, and where we’ve named impacted customers, we’re doing so with their support.
    • What we’ve seen is consistent with previous attack patterns that not only target candidates and campaign staffers but also those they consult on key issues. These activities highlight the need for people and organizations involved in the political process to take advantage of free and low-cost security tools to protect themselves as we get closer to election day. At Microsoft, for example, we offer AccountGuard threat monitoring, Microsoft 365 for Campaigns and Election Security Advisors to help secure campaigns and their volunteers. More broadly, these attacks underscore the continued importance of work underway at the United Nations to protect cyberspace and initiatives like the Paris Call for Trust and Security in Cyberspace.
  • The European Data Protection Supervisor (EDPS) has reiterated and expanded upon his calls for caution, prudence, and adherence to European Union (EU) law and principles in the use of artificial intelligence, especially as the EU looks to revamp its approach to AI and data protection. In a blog post, EDPS Wojciech Wiewiórowski stated:
    • The expectations of the increasing use of AI and the related economic advantages for those who control the technologies, as well as its appetite for data, have given rise to fierce competition about technological leadership. In this competition, the EU strives to be a frontrunner while staying true to its own values and ideals.
    • AI comes with its own risks and is not an innocuous, magical tool, which will heal the world harmlessly. For example, the rapid adoption of AI by public administrations in hospitals, utilities and transport services, financial supervisors, and other areas of public interest is considered in the EC White Paper ‘essential’, but we believe that prudency is needed. AI, like any other technology, is a mere tool, and should be designed to serve humankind. Benefits, costs and risks should be considered by anyone adopting a technology, especially by public administrations who process great amounts of personal data.
    • The increase in adoption of AI has not been (yet?) accompanied by a proper assessment of what the impact on individuals and on our society as a whole will likely be. Think especially of live facial recognition (remote biometric identification in the EC White Paper). We support the idea of a moratorium on automated recognition in public spaces of human features in the EU, of faces but also and importantly of gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioural signals.
    • Let’s not rush AI, we have to get it straight so that it is fair and that it serves individuals and society at large.
    • The context in which the consultation for the Data Strategy was conducted gave a prominent place to the role of data in matters of public interest, including combating the virus. This is good and right as the GDPR was crafted so that the processing of personal data should serve humankind. There are existing conditions under which such “processing for the public good” could already take place, and without which the necessary trust of data subjects would not be possible.
    • However, there is a substantial persuasive power in the narratives nudging individuals to ‘volunteer’ their data to address highly moral goals. Concepts such as ‘data altruism’ or ‘data donation’ and their added value are not entirely clear and there is a need to better define and lay down their scope, and possible purposes, for instance, in the context of scientific research in the health sector. The fundamental right to the protection of personal data cannot be ‘waived’ by the individual concerned, be it through a ‘donation’ or through a ‘sale’ of personal data. The data controller is fully bound by the personal data rules and principles, such as purpose limitation, even when processing data that have been ‘donated’, i.e. when consent to the processing had been given by the individual.

Further Reading

  • “Peter Thiel Met With The Racist Fringe As He Went All In On Trump” By Rosie Gray and Ryan Mac — BuzzFeed News. A fascinating article about one of the technology world’s more interesting figures. As part of his decision to ally himself with Donald Trump during the 2016 campaign, Peter Thiel also met with avowed white supremacists. However, it appears that the alliance is no longer worthy of his financial assistance or his public support, as he was supposedly disturbed by the Administration’s response to the pandemic. Meanwhile, Palantir, his company, has flourished during the Trump Administration and may go public right before matters could change under a Biden Administration.
  • “TikTok’s Proposed Deal Seeks to Mollify U.S. and China” By David McCabe, Ana Swanson and Erin Griffith — The New York Times. ByteDance is apparently trying to mollify both Washington and Beijing by bringing Oracle onboard as a “trusted technology partner,” an arrangement that may be acceptable to both nations under their export control and national security regimes. Oracle handling and safeguarding TikTok user data would seem to address the Trump Administration’s concerns, while not selling the company nor permitting Oracle to access its recommendation algorithm would seem to appease the People’s Republic of China (PRC). Moreover, United States (U.S.) investors would hold control over TikTok even though PRC investors would maintain their stakes. Such an arrangement may satisfy the Committee on Foreign Investment in the United States (CFIUS), which has ordered ByteDance to sell the app that is an integral part of TikTok. The wild card, as always, is where President Donald Trump ultimately comes out on the deal.
  • “Oracle’s courting of Trump may help it land TikTok’s business and coveted user data” By Jay Greene and Ellen Nakashima — The Washington Post. This piece dives into why Oracle at first blush seems like an unlikely suitor for TikTok, but its eroding business position vis-à-vis cloud companies like Amazon explains its desire to diversify. Also, Oracle’s role as a data broker makes all the user data available from TikTok very attractive.
  • “Chinese firm harvests social media posts, data of prominent Americans and military” By Gerry Shih — The Washington Post. Another view on Shenzhen Zhenhua Data Technology, the entity from the People’s Republic of China (PRC) exposed for collecting the personal data of more than 2.4 million westerners, many of whom hold positions of power and influence. This article quotes a number of experts allowed to examine what was leaked of the database, who are of the view that the PRC has very little in the way of actionable intelligence at this point. The country is leveraging publicly available big data from a variety of sources and may ultimately make something useful from these data.
  • “‘This is f—ing crazy’: Florida Latinos swamped by wild conspiracy theories” By Sabrina Rodriguez and Marc Caputo — Politico. A number of sources are spreading rumors about former Vice President Joe Biden and the Democrats generally in order to curb support among a key demographic the party will need to carry overwhelmingly to win Florida.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Alexander Sinn on Unsplash

Pending Legislation In U.S. Congress, Part V

Congress may well pass IoT legislation this year, and the two bills under consideration take different approaches.

Continuing our look at bills Congress may pass this year leads us to an issue area that has received attention but no legislative action: the Internet of Things (IoT). Many Members are aware of and concerned about the lax or nonexistent security standards for many such devices, which leave them open to attack or to being conscripted into a larger botnet that attacks other internet-connected devices. Two bills have significant odds of being enacted, one better than the other because it is more modest and has been attached to the Senate’s FY 2021 National Defense Authorization Act. However, the other bill is finally coming to the House floor today, which may shake loose its companion bill in the Senate.

As the United States (U.S.) Departments of Commerce and Homeland Security explained in “A Report to the President on Enhancing the Resilience of the Internet and Communications Ecosystem Against Botnets and Other Automated, Distributed Threats,” insecure IoT poses huge threats to the rest of the connected world:

The Distributed Denial of Service (DDoS) attacks launched from the Mirai botnet in the fall of 2016, for example, reached a level of sustained traffic that overwhelmed many common DDoS mitigation tools and services, and even disrupted a Domain Name System (DNS) service that was a commonly used component in many DDoS mitigation strategies. This attack also highlighted the growing insecurities in—and threats from—consumer-grade IoT devices. As a new technology, IoT devices are often built and deployed without important security features and practices in place. While the original Mirai variant was relatively simple, exploiting weak device passwords, more sophisticated botnets have followed; for example, the Reaper botnet uses known code vulnerabilities to exploit a long list of devices, and one of the largest DDoS attacks seen to date recently exploited a newly discovered vulnerability in the relatively obscure MemCacheD software.
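The report's point about Mirai exploiting weak device passwords can be illustrated from the defensive side. A minimal sketch, assuming an invented device inventory and a tiny sample of the kinds of well-known factory-default credential pairs Mirai-style malware tries:

```python
# Defensive-only sketch: flag IoT devices still using a known default
# credential pair. The device names and credential list are invented
# examples, not a real inventory or the actual Mirai dictionary.

KNOWN_DEFAULTS = {("admin", "admin"), ("root", "root"),
                  ("root", "12345"), ("admin", "password")}

def flag_default_credentials(devices):
    """Return the names of devices whose (user, password) pair
    appears on the known-default list."""
    return [name for name, user, password in devices
            if (user, password) in KNOWN_DEFAULTS]

inventory = [
    ("camera-01", "admin", "admin"),      # factory default -> vulnerable
    ("router-02", "admin", "x7#kQ9!Lp"),  # changed password -> fine
    ("dvr-03", "root", "12345"),          # factory default -> vulnerable
]

print(flag_default_credentials(inventory))  # ['camera-01', 'dvr-03']
```

The original Mirai needed nothing more sophisticated than a loop like this run in reverse against internet-exposed devices, which is why auditing for unchanged default credentials is among the cheapest mitigations available.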

Later in the report, as part of one of the proposed goals, the departments asserted:

When market incentives encourage manufacturers to feature security innovations as a balanced complement to functionality and performance, it increases adoption of tools and processes that result in more secure products. As these security features become more popular, increased demand will drive further research.

However, I would argue there are no such market incentives at this point, for most people looking to buy and use IoT are not even thinking about security except in the most superficial ways. Moreover, manufacturers and developers of IoT have not experienced the sort of financial liability or regulatory action that might change the incentive structure. In May, the Federal Trade Commission (FTC) reached “a settlement with a Canadian company related to allegations it falsely claimed that its Internet-connected smart locks were designed to be ‘unbreakable’ and that it took reasonable steps to secure the data it collected from users.”

As mentioned, one of the two major IoT bills stands a better chance of enactment. The “Developing Innovation and Growing the Internet of Things Act” (DIGIT Act) (S. 1611) would establish the beginnings of a statutory regime for regulating IoT at the federal level. The bill is sponsored by Senators Deb Fischer (R-NE), Cory Gardner (R-CO), Brian Schatz (D-HI), and Cory Booker (D-NJ) and is substantially similar to legislation (S. 88) the Senate passed unanimously in the last Congress but the House never took up. In January, the Senate passed S. 1611 by unanimous consent, but the House has yet to act on it. The bill was then added as an amendment to the “National Defense Authorization Act for Fiscal Year 2021” (S. 4049) in July. Its inclusion in an NDAA passed by one chamber of Congress dramatically increases its chances of enactment. However, it is possible the House stakeholders who have stopped this bill from advancing may yet succeed in stripping it out of a final NDAA.

Under this bill, the Secretary of Commerce must “convene a working group of Federal stakeholders for the purpose of providing recommendations and a report to Congress relating to the aspects of the Internet of Things, including”

  • identify any Federal regulations, statutes, grant practices, budgetary or jurisdictional challenges, and other sector-specific policies that are inhibiting, or could inhibit, the development or deployment of the Internet of Things;

  • consider policies or programs that encourage and improve coordination among Federal agencies that have responsibilities that are relevant to the objectives of this Act;
  • consider any findings or recommendations made by the steering committee and, where appropriate, act to implement those recommendations;
  • examine—
    • how Federal agencies can benefit from utilizing the Internet of Things;
    • the use of Internet of Things technology by Federal agencies as of the date on which the working group performs the examination;
    • the preparedness and ability of Federal agencies to adopt Internet of Things technology as of the date on which the working group performs the examination and in the future; and
    • any additional security measures that Federal agencies may need to take to—
      • safely and securely use the Internet of Things, including measures that ensure the security of critical infrastructure; and
      • enhance the resiliency of Federal systems against cyber threats to the Internet of Things.

S.1611 requires this working group to have representatives from specified agencies such as the National Telecommunications and Information Administration, the National Institute of Standards and Technology, the Department of Homeland Security, the Office of Management and Budget, the Federal Trade Commission, and others. Nongovernmental stakeholders would also be represented on this body. Moreover, a steering committee would be established inside the Department of Commerce to advise this working group on a range of legal, policy, and technical issues. Within 18 months of enactment of S.1611, the working group would need to submit its recommendations to Congress that would then presumably inform additional legislation regulating IoT.  Finally, the Federal Communications Commission (FCC) would report to Congress on “future spectrum needs to enable better connectivity relating to the Internet of Things” after soliciting input from interested parties.

As noted, there is another IoT bill in Congress that may make it to the White House. In June 2019 the Senate and House committees of jurisdiction marked up their versions of the “Internet of Things (IoT) Cybersecurity Improvement Act of 2019” (H.R. 1668/S. 734), legislation that would tighten the federal government’s standards for buying and using IoT. In what may augur enactment of this legislation, the House will take up its version today. However, new language in the amended bill coming to the floor, making clear that the IoT standards for the federal government would not apply to “national security systems” (i.e., most Department of Defense, Intelligence Community, and similar systems), suggests the roadblock that may have stalled this legislation for 15 months. It is reasonable to deduce that those agencies made the case to the bill’s sponsors or allies in Congress that these IoT standards would somehow harm national security if made applicable to defense IoT.

The bill text as released in March was identical in both chambers, signaling agreement between the two chambers’ sponsors, but the process of marking up the bills has resulted in different versions, requiring negotiation on a final bill. The House Oversight and Reform Committee marked up and reported out H.R. 1668 after adopting an amendment in the nature of a substitute that narrowed the scope of the bill and is more directive than the bill as initially introduced in March. The Senate Homeland Security and Governmental Affairs Committee marked up S. 734 a week later, making its own changes from the March bill. The March version of the legislation unified two similar bills of the same title from the 115th Congress: the “Internet of Things (IoT) Cybersecurity Improvement Act of 2017” (S. 1691) and the “Internet of Things (IoT) Federal Cybersecurity Improvement Act of 2018” (H.R. 7283).

Per the Committee Report for S. 734, the purpose of the bill

is to proactively mitigate the risks posed by inadequately-secured IoT devices through the establishment of minimum security standards for IoT devices purchased by the Federal Government. The bill codifies the ongoing work of the National Institute of Standards and Technology (NIST) to develop standards and guidelines, including minimum-security requirements, for the use of IoT devices by Federal agencies. The bill also directs the Office of Management and Budget (OMB), in consultation with the Department of Homeland Security (DHS), to issue the necessary policies and principles to implement the NIST standards and guidelines on IoT security and management. Additionally, the bill requires NIST, in consultation with cybersecurity researchers and industry experts, to publish guidelines for the reporting, coordinating, publishing, and receiving of information about Federal agencies’ security vulnerabilities and the coordinate resolutions of the reported vulnerabilities. OMB will provide the policies and principles and DHS will develop and issue the procedures necessary to implement NIST’s guidelines on coordinated vulnerability disclosure for Federal agencies. The bill includes a provision allowing Federal agency heads to waive the IoT use and management requirements issued by OMB for national security, functionality, alternative means, or economic reasons.

In general, this bill seeks to leverage the federal government’s ability to set standards through acquisition processes to, ideally, drive the development of more secure IoT across the U.S. The legislation would require the National Institute of Standards and Technology (NIST), the Office of Management and Budget (OMB), and the Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) to work together to institute standards for IoT owned or controlled by most federal agencies. As mentioned, the latest version of this bill explicitly excludes “national security systems.” These standards would need to focus on secure development, identity management, patching, and configuration management, and would be made part of the Federal Acquisition Regulation (FAR), making them part of the federal government’s approach to buying and utilizing IoT. Thereafter, civilian federal agencies and contractors would need to use and buy IoT that meets the new security standards. Moreover, NIST would need to create and implement a process for the reporting of vulnerabilities in information systems owned or operated by agencies, naturally including IoT. The bill would also seem to make contractors and subcontractors providing IoT responsible for sharing vulnerabilities upon discovery and then distributing fixes and patches once developed. And yet, this would seem to overlap with the recently announced Trump Administration vulnerabilities disclosure process (see here for more analysis), and language in the bill could be read as enshrining in statute the basis for that recently launched initiative, even though future Administrations would have flexibility to modify or revamp it as necessary.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by lea hope bonzer from Pixabay

Pending Legislation In U.S. Congress, Part IV

There is an even chance that Congress further narrows the Section 230 liability shield given criticism of how tech companies have wielded this language.

This year, Congress increased its focus on Section 230 of the Communications Act of 1934, which gives companies like Facebook, Twitter, Google, and others blanket immunity from litigation based on the content others post. Additionally, these platforms cannot be sued for “good faith” actions to take down or restrict material considered “to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” Many Republicans claim both that these platforms are biased against conservative content (a claim not borne out by the evidence we have) and that they are not doing enough to find and remove material that exploits children. Many Democrats argue the platforms are not doing enough to remove right-wing hate speech and agree, in some part, regarding material that exploits children.

Working in the background of any possible legislation to narrow Section 230 is an executive order issued by the President directing two agencies to investigate “online censorship,” even though the Supreme Court of the United States has long held that a person or entity does not have First Amendment rights vis-à-vis private entities. Finally, the debate over encryption is also edging its way into Section 230 by a variety of means, as the Trump Administration, especially the United States Department of Justice (DOJ), has been pressuring tech companies to address end-to-end encryption on devices and in apps. One means of pressure is threatening to remove Section 230 liability protection to garner compliance on encryption issues.

In late July, the Senate Judiciary Committee met, amended, and reported out the “Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020” (EARN IT Act of 2020) (S.3398), a bill that would change 47 USC 230 by narrowing the liability shield and potentially exposing online platforms to criminal and civil actions for having child sexual abuse materials on their platforms. The bill as introduced in March was changed significantly when a manager’s amendment was released and then changed further at the markup. The Committee reported out the bill unanimously, sending it to the full Senate and perhaps signaling the breadth of support for the legislation. It is possible this could come before the full Senate this year. If passed, the EARN IT Act of 2020 would be the second piece of legislation to change Section 230 in the last two years, following enactment of the “Allow States and Victims to Fight Online Sex Trafficking Act of 2017” (P.L. 115-164). There is, at present, no House companion bill.

In advance of the markup, two of the sponsors, Judiciary Committee Chair Lindsey Graham (R-SC) and Senator Richard Blumenthal (D-CT) released a manager’s amendment to the EARN IT Act. The bill would still establish a National Commission on Online Child Sexual Exploitation Prevention (Commission) that would design and recommend voluntary “best practices” applicable to technology companies such as Google, Facebook, and many others to address “the online sexual exploitation of children.”

Moreover, instead of creating a process under which the DOJ, the Department of Homeland Security (DHS), and the Federal Trade Commission (FTC) would accept or reject these standards, as in the original bill, the DOJ would merely have to publish them in the Federal Register. Likewise, the language establishing a fast-track process for Congress to codify these best practices has been stricken, as have the provisions requiring certain technology companies to certify compliance with the best practices.

Moreover, the revised bill also lacks the safe harbor against lawsuits based on having “child sexual abuse material” on a platform for those following the Commission’s best practices. Instead of encouraging technology companies to use the best practices in exchange for continuing to enjoy liability protection, the language creating this safe harbor in the original bill has been stricken. Now the manager’s amendment strips liability protection under 47 USC 230 for these materials except when a platform is acting as a Good Samaritan in removing them. Consequently, should a Facebook or Google fail to find and take down these materials in an expeditious fashion, it would face civil and criminal lawsuits under federal and state law.

However, the Committee adopted an amendment offered by Senator Patrick Leahy (D-VT) that would change 47 USC 230 by making clear that the use of end-to-end encryption does not make providers liable under child sexual exploitation and abuse material laws. Specifically, no liability would attach because the provider

  • utilizes full end-to-end encrypted messaging services, device encryption, or other encryption services;
  • does not possess the information necessary to decrypt a communication; or
  • fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services.

Moreover, in advance of the first hearing to markup the EARN IT Act of 2020, key Republican stakeholders released a bill that would require device manufacturers, app developers, and online platforms to decrypt data if a federal court issues a warrant based on probable cause. Critics of the EARN IT Act of 2020 claimed the bill would force big technology companies to choose between weakening encryption or losing their liability protection under Section 230. They likely see this most recent bill as another shot across the bow of technology companies, many of which continue to support and use end-to-end encryption even though the United States government and close allies are pressuring them on the issue. However, unlike the EARN IT Act of 2020, this latest bill does not have any Democratic cosponsors.

Graham and Senators Tom Cotton (R-AR) and Marsha Blackburn (R-TN) introduced the “Lawful Access to Encrypted Data Act” (S.4051) that would require the manufacturers of devices such as smartphones, app makers, and platforms to decrypt a user’s data if a federal court issues a warrant to search a device, app, or operating system.

The assistance covered entities must provide includes:

  • isolating the information authorized to be searched;
  • decrypting or decoding information on the electronic device or remotely stored electronic information that is authorized to be searched, or otherwise providing such information in an intelligible format, unless the independent actions of an unaffiliated entity make it technically impossible to do so; and
  • providing technical support as necessary to ensure effective execution of the warrant for the electronic devices particularly described by the warrant.


The DOJ would be able to issue “assistance capability directives” requiring the recipient to prepare or maintain the ability to aid a law enforcement agency that obtained a warrant and needs technical assistance to access data. Recipients of such orders can file a petition in federal court in Washington, DC to modify or set aside the order on only three grounds: it is illegal, it does not meet the requirements of the new federal regulatory structure, or “it is technically impossible for the person to make any change to the way the hardware, software, or other property of the person behaves in order to comply with the directive.” If a court rules against the recipient of such an order, it must comply, and if any recipient does not comply, a court may find it in contempt, allowing for a range of punishments until the contempt is cured. The bill also amends the “Foreign Intelligence Surveillance Act” (FISA) to require the same decryption and assistance in FISA activities, which mostly involve surveillance of people outside the United States. The bill would apply to device manufacturers that sell more than 1 million devices and to platforms and apps with more than 1 million users, obviously meaning companies like Apple, Facebook, Google, and others. The bill also tasks the DOJ with conducting a prize competition “to incentivize and encourage research and innovation into solutions providing law enforcement access to encrypted data pursuant to legal process.”

In response to the EARN IT Act, a bicameral group of Democrats released legislation to dramatically increase funding for the United States government to combat the online exploitation of children, offered as an alternative to a bill critics claim would force technology companies to give way on encryption under pain of losing the Section 230 liability shield. The “Invest in Child Safety Act” (H.R.6752/S.3629) would provide $5 billion in funding outside the appropriations process to bolster current efforts to fight online exploitation and abuse. This bill was introduced roughly two months after the EARN IT Act of 2020, and in their press release, Senators Ron Wyden (D-OR), Kirsten Gillibrand (D-NY), Bob Casey (D-PA), and Sherrod Brown (D-OH) stated

The Invest in Child Safety Act would direct $5 billion in mandatory funding to investigate and target the pedophiles and abusers who create and share child sexual abuse material online. And it would create a new White House office to coordinate efforts across federal agencies, after DOJ refused to comply with a 2008 law requiring coordination and reporting of those efforts. It also directs substantial new funding for community-based efforts to prevent children from becoming victims in the first place.  

Representatives Anna Eshoo (D-CA), Kathy Castor (D-FL), Ann M. Kuster (D-NH), Eleanor Holmes Norton (D-DC), Alcee L. Hastings (D-FL), and Deb Haaland (D-NM) introduced the companion bill in the House.

The bill would establish in the Executive Office of the President an Office to Enforce and Protect Against Child Sexual Exploitation headed by a Senate confirmed Director who would coordinate efforts across the U.S. government to fight child exploitation. Within six months of the appointment of the first Director, he or she would need to submit to Congress “an enforcement and protection strategy” and thereafter send an annual report as well. The DOJ and Federal Bureau of Investigation would receive additional funding to bolster and improve their efforts in this field.

In June, Senator Josh Hawley (R-MO) introduced the “Limiting Section 230 Immunity to Good Samaritans Act” (S.3983), cosponsored by Senators Marco Rubio (R-FL), Kelly Loeffler (R-GA), Mike Braun (R-IN), and Tom Cotton (R-AR). The bill would amend the liability shield in 47 U.S.C. 230 to require large social media platforms like Facebook and Twitter to update their terms of service so that they must operate in “good faith” or face litigation, with possible monetary damages, for violating these new terms of service. Hawley’s bill would add a definition of “good faith” to the statute, which echoes one of the recommendations made by the DOJ. In relevant part, the new terms of service would bar so-called “edge providers” from “intentional[] selective enforcement of the terms of service of the interactive computer service, including the intentionally selective enforcement of policies of the provider relating to restricting access to or availability of material.” If such “selective enforcement” were to occur, then edge providers could be sued, but the plaintiffs would have to show the edge provider actually knew it was breaching the terms of service by selectively enforcing its platform rules.

The focus on such alleged “selective enforcement” arises from allegations that conservative material posted on Twitter and Facebook is being targeted in ways that liberal material is not, including being taken down. This claim has been leveled by many Republican stakeholders, and now they are proposing to provide affected people with the right to sue; however, it is not clear whether these Republicans have changed their minds on allowing private rights of action against technology companies as a means of enforcing laws. To date, many Republicans have opposed private rights of action for data breaches or violations of privacy.

In early July, Senator Brian Schatz (D-HI) and Senate Majority Whip John Thune (R-SD) introduced the “Platform Accountability and Consumer Transparency (PACT) Act” (S.4066) that would reform Section 230. Schatz and Thune are offering their bill as an alternative to the EARN IT Act of 2020. Schatz and Thune serve as the Ranking Member and Chair of the Communications, Technology, Innovation and the Internet Subcommittee of the Senate Commerce, Science, and Transportation Committee and are thus key stakeholders on any legislation changing Section 230.

Under the PACT Act, so-called “interactive computer services” (the term of art used in Section 230) would need to draft and publish “acceptable use polic[ies]” that inform users of what content may be posted, break down the process by which the online platform reviews content for compliance with policy, and spell out the process people may use to report potentially policy-violating content, illegal content, and illegal activity. The PACT Act defines each of the three terms:

  • “illegal activity” means activity conducted by an information content provider that has been determined by a Federal or State court to violate Federal criminal or civil law.
  • “illegal content” means information provided by an information content provider that has been determined by a Federal or State court to violate—
    • Federal criminal or civil law; or
    • State defamation law.
  • “potentially policy-violating content” means content that may violate the acceptable use policy of the provider of an interactive computer service.

The first two definitions will pose problems in practice, for if one state court determines content is illegal but another does not, how must an online platform respond to comply with the reformed Section 230? The same would be true of illegal activity. Consequently, online platforms may be forced to monitor content state by state, hardly a practical system and one that would favor existing market entrants while raising a barrier to entry for new ones. And then, based on differing state or federal court rulings, are online platforms to allow or take down content depending on where the person posting it lives?

In any event, after receiving notice, online platforms would have 24 hours to remove illegal content or activity and two weeks to review notices of potentially policy-violating content and determine whether the content actually violates the platform’s policies. In the latter case, the platform would be required to notify the person who posted the content and allow an appeal if, based on a user complaint, the platform decides to take down the content for violating its policies. There would be a different standard for small business providers, requiring them to act on the three categories of information within a reasonable period of time after receiving notice. And telecommunications and cloud networks and other entities would be exempted from this reform to Section 230 altogether.

However, Section 230’s liability shield would be narrowed with respect to illegal content and activity. If a provider knows of the illegal content or activity but does not remove it within 24 hours, it would lose the shield from lawsuits. So, if Facebook fails to take down a posting urging someone to assassinate the President, a federal crime, within 24 hours of being notified it was posted, it could be sued. Facebook and similar companies would not have an affirmative duty to locate and remove illegal content and activity, however, and could continue to enjoy Section 230 protection for either type of content on their platforms so long as no notice is provided. And yet, Section 230 would be narrowed overall, as the bill would place all federal criminal and civil laws and regulations outside the liability protection; currently, this carve-out pertains only to federal criminal statutes. And state attorneys general would be able to enforce federal civil laws if the lawsuit could also be brought on the basis of a civil law in the attorney general’s state.

Interactive computer services must publish a quarterly transparency report including the total number of instances in which illegal content, illegal activity, or potentially policy-violating content was flagged and the number of times action was taken, among other data. Additionally, they would need to identify the number of times they demonetized or deprioritized content. These reports would be publicly available.

The FTC would be explicitly empowered to act under the bill. Any violations of the process by which an online platform reviews notices of potentially policy-violating content, handles appeals, or issues transparency reports would be treated as violations of an FTC regulation defining an unfair or deceptive act or practice, allowing the agency to seek civil fines for first violations. But this authority is circumscribed by a provision barring the FTC from reviewing “any action or decision by a provider of an interactive computer service related to the application of the acceptable use policy of the provider.” This limitation would seem to allow an online platform to remove content on its own initiative for violating the platform’s policies without the FTC being able to review such decisions, providing ample incentive for Facebook, Twitter, Reddit, and others to police their platforms to avoid FTC action. The FTC’s jurisdiction would also be widened to include non-profits, subjecting how they manage removing content based on user complaints to the agency’s scrutiny in the same way as for-profit entities.

The National Institute of Standards and Technology (NIST) would need to develop “a voluntary framework, with input from relevant experts, that consists of non-binding standards, guidelines, and best practices to manage risk and shared challenges related to, for the purposes of this Act, good faith moderation practices by interactive computer service providers.”

This week, Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS), Graham, and Blackburn introduced the latest Section 230 bill, the “Online Freedom and Viewpoint Diversity Act” (S.4534) that would essentially remove liability protection for social media platforms and others that choose to correct, label, or remove material, mainly political material. A platform’s discretion would be severely limited as to when and under what circumstances it could take down content. This bill would seem tailored to conservatives who believe Twitter, Facebook, etc. are biased against them and their viewpoints.

In May, after Twitter factchecked two of his Tweets making false claims about mail voting in California in response to the COVID-19 pandemic, President Donald Trump signed a long-rumored executive order (EO) seen by many as a means of cowing social media platforms: the “Executive Order on Preventing Online Censorship.” This EO directed federal agencies to act, and one has, by asking the Federal Communications Commission (FCC) to start a rulemaking, which has been initiated. However, there is at least one lawsuit pending to enjoin action on the EO that could conceivably block implementation.

In the EO, the President claimed

Section 230 was not intended to allow a handful of companies to grow into titans controlling vital avenues for our national discourse under the guise of promoting open forums for debate, and then to provide those behemoths blanket immunity when they use their power to censor content and silence viewpoints that they dislike.  When an interactive computer service provider removes or restricts access to content and its actions do not meet the criteria of subparagraph (c)(2)(A), it is engaged in editorial conduct.  It is the policy of the United States that such a provider should properly lose the limited liability shield of subparagraph (c)(2)(A) and be exposed to liability like any traditional editor and publisher that is not an online provider.

Consequently, the EO directs that “all executive departments and agencies should ensure that their application of section 230(c) properly reflects the narrow purpose of the section and take all appropriate actions in this regard.”

With respect to specific actions, the Department of Commerce’s National Telecommunications and Information Administration (NTIA) was directed to file a petition for rulemaking with the FCC to clarify the interplay between clauses of Section 230, notably whether the liability shield that protects companies like Twitter and Facebook for content posted on an online platform also extends to so-called “editorial decisions,” presumably actions like Twitter’s factchecking of Trump regarding mail balloting. The NTIA was also to ask the FCC to better define the conditions under which an online platform’s good-faith takedowns of content are “deceptive, pretextual, or inconsistent with a provider’s terms of service; or taken after failing to provide adequate notice, reasoned explanation, or a meaningful opportunity to be heard.” The NTIA was also directed to ask the FCC to promulgate any other regulations necessary to effectuate the EO.

The FTC must consider whether online platforms are violating Section 5 of the FTC Act barring unfair or deceptive practices, which “may include practices by entities covered by section 230 that restrict speech in ways that do not align with those entities’ public representations about those practices.” As of yet, the FTC has not done so, and in remarks before Congress, FTC Chair Joseph Simons has opined that doing so is outside the scope of the agency’s mission. Consequently, there has been talk in Washington that the Trump Administration is looking for a new FTC Chair.

Following the directive in the EO, on 27 July, the NTIA filed a petition with the FCC, asking the agency to start a rulemaking to clarify alleged ambiguities in 47 USC 230 regarding the limits of the liability shield for the content others post online versus the liability protection for “good faith” moderation by the platform itself.

The NTIA asserted “[t]he FCC should use its authorities to clarify ambiguities in section 230 so as to make its interpretation appropriate to the current internet marketplace and provide clearer guidance to courts, platforms, and users…[and] urges the FCC to promulgate rules addressing the following points:

  1. Clarify the relationship between subsections (c)(1) and (c)(2), lest they be read and applied in a manner that renders (c)(2) superfluous as some courts appear to be doing.
  2. Specify that Section 230(c)(1) has no application to any interactive computer service’s decision, agreement, or action to restrict access to or availability of material provided by another information content provider or to bar any information content provider from using an interactive computer service.
  3. Provide clearer guidance to courts, platforms, and users, on what content falls within (c)(2) immunity, particularly section 230(c)(2)’s “otherwise objectionable” language and its requirement that all removals be done in “good faith.”
  4. Specify that “responsible, in whole or in part, for the creation or development of information” in the definition of “information content provider,” 47 U.S.C. § 230(f)(3), includes editorial decisions that modify or alter content, including but not limited to substantively contributing to, commenting upon, editorializing about, or presenting with a discernible viewpoint content provided by another information content provider.
  5. Mandate disclosure for internet transparency similar to that required of other internet companies, such as broadband service providers.

NTIA argued that

  • Section 230(c)(1) has a specific focus: it prohibits “treating” “interactive computer services,” i.e., internet platforms, such as Twitter or Facebook, as “publishers.” But this provision only concerns “information” provided by third parties, i.e., “another internet content provider,” and does not cover a platform’s own content or editorial decisions.
  • Section (c)(2) also has a specific focus: it eliminates liability for interactive computer services that act in good faith “to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”

In early August, the FCC asked for comments on the NTIA petition, and comments were due by 2 September. Over 2,500 comments have been filed, and a cursory search turned up numerous form-letter comments drafted by a conservative organization and then submitted by its members and followers.

Finally, the House’s “FY 2021 Financial Services and General Government Appropriations Act” (H.R. 7668) has a provision that would bar either the FTC or FCC from taking certain actions related to Executive Order 13925, “Preventing Online Censorship.” It is very unlikely Senate Republicans, some of whom have publicly supported this Executive Order, will allow this language into the final bill funding the agencies.

There has been other executive branch action on Section 230. In mid-June, the DOJ released “a set of reform proposals to update the outdated immunity for online platforms under Section 230” according to a department press release. While these proposals came two weeks after President Donald Trump’s “Executive Order on Preventing Online Censorship” signed after Twitter fact checked two tweets that were not true (see here for more detail and analysis), the DOJ launched its review of 47 U.S.C. 230 in February 2020.

The DOJ explained “[t]he Section 230 reforms that the Department of Justice identified generally fall into four categories:

1) Incentivizing Online Platforms to Address Illicit Content. The first category of potential reforms is aimed at incentivizing platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation.

  1. Bad Samaritan Carve-Out. First, the Department proposes denying Section 230 immunity to truly bad actors. The title of Section 230’s immunity provision—“Protection for ‘Good Samaritan’ Blocking and Screening of Offensive Material”—makes clear that Section 230 immunity is meant to incentivize and protect responsible online platforms. It therefore makes little sense to immunize from civil liability an online platform that purposefully facilitates or solicits third-party content or activity that would violate federal criminal law.
  2. Carve-Outs for Child Abuse, Terrorism, and Cyber-Stalking. Second, the Department proposes exempting from immunity specific categories of claims that address particularly egregious content, including (1) child exploitation and sexual abuse, (2) terrorism, and (3) cyber-stalking. These targeted carve-outs would halt the over-expansion of Section 230 immunity and enable victims to seek civil redress in causes of action far afield from the original purpose of the statute.
  3. Case-Specific Carve-Outs for Actual Knowledge or Court Judgments. Third, the Department supports reforms to make clear that Section 230 immunity does not apply in a specific case where a platform had actual knowledge or notice that the third party content at issue violated federal criminal law or where the platform was provided with a court judgment that content is unlawful in any respect.

2) Clarifying Federal Government Civil Enforcement Capabilities. A second category of reform would increase the ability of the government to protect citizens from illicit online conduct and activity by making clear that the immunity provided by Section 230 does not apply to civil enforcement by the federal government, which is an important complement to criminal prosecution.

3) Promoting Competition. A third reform proposal is to clarify that federal antitrust claims are not covered by Section 230 immunity. Over time, the avenues for engaging in both online commerce and speech have concentrated in the hands of a few key players. It makes little sense to enable large online platforms (particularly dominant ones) to invoke Section 230 immunity in antitrust cases, where liability is based on harm to competition, not on third-party speech.

4) Promoting Open Discourse and Greater Transparency. A fourth category of potential reforms is intended to clarify the text and original purpose of the statute in order to promote free and open discourse online and encourage greater transparency between platforms and users.

  1. Replace Vague Terminology in (c)(2). First, the Department supports replacing the vague catch-all “otherwise objectionable” language in Section 230(c)(2) with “unlawful” and “promotes terrorism.” This reform would focus the broad blanket immunity for content moderation decisions on the core objective of Section 230—to reduce online content harmful to children—while limiting a platform’s ability to remove content arbitrarily or in ways inconsistent with its terms of service simply by deeming it “objectionable.”
  2. Provide Definition of Good Faith. Second, the Department proposes adding a statutory definition of “good faith,” which would limit immunity for content moderation decisions to those done in accordance with plain and particular terms of service and accompanied by a reasonable explanation, unless such notice would impede law enforcement or risk imminent harm to others. Clarifying the meaning of “good faith” should encourage platforms to be more transparent and accountable to their users, rather than hide behind blanket Section 230 protections.
  3. Continue to Overrule Stratton Oakmont to Avoid the Moderator’s Dilemma. Third, the Department proposes clarifying that a platform’s removal of content pursuant to Section 230 (c)(2) or consistent with its terms of service does not, on its own, render the platform a publisher or speaker for all other content on its service.

While the DOJ did not release legislative language to effect these changes, it is possible to suss out the DOJ’s purposes in making these recommendations. The Department clearly believes that the Section 230 liability shield deprives companies like Facebook of a number of legal and financial incentives to locate, take down, or block material such as child pornography. The New York Times published articles last year (see here and here) about the shortcomings critics have found in a number of online platforms’ efforts to find and remove this material. If the companies faced civil liability for not taking down the images, the rationale seems to go, then they would devote much greater resources to doing so. Likewise, with respect to terrorist activities and cyber-bullying, the DOJ seems to think this policy change would have the same effect.

Some of the DOJ’s other recommendations seem aimed at solving an issue often alleged by Republicans and conservatives: that their speech is more heavily policed and censored than others on the political spectrum. The recommendations call for removing the word “objectionable” from the types of material a provider may remove or restrict in good faith and adding “unlawful” and “promotes terrorism.” The recommendations would also call for a statutory definition of “good faith,” which dovetails with an action in the EO for an Administration agency to petition the Federal Communications Commission (FCC) to conduct a rulemaking to better define this term.

Some consider the Department’s focus on Section 230 liability a proxy for its interest in having technology companies drop default end-to-end encryption and in securing their assistance in accessing communications on such platforms. If this is true, the calculation seems to be that technology companies would rather be shielded from financial liability than ensure users’ communications and transactions are secured via encryption.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Gerd Altmann from Pixabay

Further Reading, Other Developments, and Coming Events (8 September)

Here is today’s Further Reading, Other Developments, and Coming Events.

Coming Events

  • The United States-China Economic and Security Review Commission will hold a hearing on 9 September on “U.S.-China Relations in 2020: Enduring Problems and Emerging Challenges” to “evaluate key developments in China’s economy, military capabilities, and foreign relations, during 2020.”
  • On 10 September, the General Services Administration (GSA) will have a webinar to discuss implementation of Section 889 of the “John S. McCain National Defense Authorization Act (NDAA) for FY 2019” (P.L. 115-232), which bars the federal government and its contractors from buying equipment and services from Huawei, ZTE, and other companies from the People’s Republic of China.
  • The Federal Communications Commission (FCC) will hold a forum on 5G Open Radio Access Networks on 14 September. The FCC asserted
    • Chairman [Ajit] Pai will host experts at the forefront of the development and deployment of open, interoperable, standards-based, virtualized radio access networks to discuss this innovative new approach to 5G network architecture. Open Radio Access Networks offer an alternative to traditional cellular network architecture and could enable a diversity in suppliers, better network security, and lower costs.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.”
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The National Institute of Standards and Technology (NIST) announced a 15 and 16 September webinar to discuss its Draft Outline of Cybersecurity Profile for the Responsible Use of Positioning, Navigation, and Timing (PNT) Services. NIST stated it “seeks insight and feedback on this Annotated Outline to improve the PNT cybersecurity profile, which is scheduled for publication in February 2021…[and] [a]reas needing more input include feedback on the description of systems that use PNT services and the set of standards, guidelines, and practices addressing systems that use PNT services.” NIST explained that “[t]hrough the Profile development process, NIST will engage the public and private sectors on multiple occasions to include a request for information, participation in workshops, solicitation of feedback on this annotated outline, and public review and comment on the draft Profile.” The agency added “[t]he Profile development process is iterative and, in the end state, will identify and promote the responsible use of PNT services from a cybersecurity point of view.”
    • In June, NIST released a request for information (RFI) “about public and private sector use of positioning, navigation, and timing (PNT) services, and standards, practices, and technologies used to manage cybersecurity risks, to systems, networks, and assets dependent on PNT services.” This RFI is being undertaken per direction in a February executive order (EO) to serve as the foundation for the Trump Administration’s efforts to lessen the reliance of United States’ (U.S.) critical infrastructure on current PNT systems and services. Specifically, the EO seeks to build U.S. capacity to meet and overcome potential disruption or manipulation of the PNT systems and services used by virtually every key sector of the public and private sectors of the U.S.
    • NIST explained “Executive Order 13905, Strengthening National Resilience Through Responsible Use of Positioning, Navigation, and Timing Services, was issued on February 12, 2020 and seeks to protect the national and economic security of the United States from disruptions to PNT services that are vital to the functioning of technology and infrastructure, including the electrical power grid, communications infrastructure and mobile devices, all modes of transportation, precision agriculture, weather forecasting, and emergency response.” The EO directed NIST “to develop and make available, to at least the appropriate agencies and private sector users, PNT profiles.” NIST said “[r]esponses to this RFI will inform NIST’s development of a PNT profile, using the NIST Framework for Improving Critical Infrastructure Cybersecurity (NIST Cybersecurity Framework), that will enable the public and private sectors to identify systems, networks, and assets dependent on PNT services; identify appropriate PNT services; detect the disruption and manipulation of PNT services; and manage the associated cybersecurity risks to the systems, networks, and assets dependent on PNT services.”
    • The EO defines the crucial term this RFI uses: “PNT profile” means a description of the responsible use of PNT services—aligned to standards, guidelines, and sector-specific requirements—selected for a particular system to address the potential disruption or manipulation of PNT services.
    • In April, the Department of Homeland Security (DHS) released a Congressionally required report, “Report on Positioning, Navigation, and Timing (PNT) Backup and Complementary Capabilities to the Global Positioning System (GPS),” mandated by Section 1618 of the FY 2017 National Defense Authorization Act (NDAA) (P.L. 114–328) and originally due in December 2017. DHS offered “recommendations to address the nation’s PNT requirements and backup or complementary capability gaps.”
  • Switzerland’s Federal Data Protection and Information Commissioner (FDPIC) has reversed itself and decided that the Swiss-U.S. Privacy Shield does not provide adequate protection for Swiss citizens whose data is transferred for processing into the United States (U.S.). However, it does not appear that there will be any practical effect as of yet. The FDPIC determined that the agreement “does not provide an adequate level of protection for data transfer from Switzerland to the US pursuant to the Federal Act on Data Protection (FADP).” This decision comes two months after the Court of Justice of the European Union (CJEU) struck down the European Union-U.S. Privacy Shield. The FDPIC noted this determination followed “his annual assessment of the Swiss-US Privacy Shield regime and recent rulings on data protection by the CJEU.” The FDPIC also issued a policy paper explaining the determination. The FDPIC added
    • As a result of this assessment, which is based on Swiss law, the FDPIC has deleted the reference to ‘adequate data protection under certain conditions’ for the US in the FDPIC’s list of countries. Since the FDPIC’s assessment has no influence on the continued existence of the Privacy Shield regime, and those concerned can invoke the regime as long as it is not revoked by the US, the comments on the Privacy Shield in the list of countries will be retained in an adapted form.
  • The United States Department of Defense (DOD) released its statutorily required annual report on the People’s Republic of China (PRC) that documented the rising power of the nation, especially with respect to cybersecurity and information warfare. The Pentagon noted
    • 2020 marks an important year for the People’s Liberation Army (PLA) as it works to achieve important modernization milestones ahead of the Chinese Communist Party’s (CCP) broader goal to transform China into a “moderately prosperous society” by the CCP’s centenary in 2021. As the United States continues to respond to the growing strategic challenges posed by the PRC, 2020 offers a unique opportunity to assess both the continuity and changes that have taken place in the PRC’s strategy and armed forces over the past two decades.
    • Regarding Cyberwarfare, the DOD asserted
      • The development of cyberwarfare capabilities is consistent with PLA writings, which identify Information Operations (IO) – comprising cyber, electronic, and psychological warfare – as integral to achieving information superiority and as an effective means for countering a stronger foe. China has publicly identified cyberspace as a critical domain for national security and declared its intent to expedite the development of its cyber forces.
      • The PRC presents a significant, persistent cyber espionage and attack threat to military and critical infrastructure systems. China seeks to create disruptive and destructive effects—from denial-of-service attacks to physical disruptions of critical infrastructure—to shape decision-making and disrupt military operations in the initial stages of a conflict by targeting and exploiting perceived weaknesses of militarily superior adversaries. China is improving its cyberattack capabilities and has the ability to launch cyberattacks—such as disruption of a natural gas pipeline for days to weeks—in the United States.
      • PLA writings note the effectiveness of IO and cyberwarfare in recent conflicts and advocate targeting C2 and logistics networks to affect an adversary’s ability to operate during the early stages of conflict. Authoritative PLA sources call for the coordinated employment of space, cyber, and EW as strategic weapons to “paralyze the enemy’s operational system of systems” and “sabotage the enemy’s war command system of systems” early in a conflict. Increasingly, the PLA considers cyber capabilities a critical component in its overall integrated strategic deterrence posture, alongside space and nuclear deterrence. PLA studies discuss using warning or demonstration strikes—strikes against select military, political, and economic targets with clear “awing effects”—as part of deterrence. Accordingly, the PLA probably seeks to use its cyberwarfare capabilities to collect data for intelligence and cyberattack purposes; to constrain an adversary’s actions by targeting network-based logistics, C2, communications, commercial activities, and civilian and defense critical infrastructure; or, to serve as a force-multiplier when coupled with kinetic attacks during armed conflict.
      • The PLA’s ongoing structural reforms may further change how the PLA organizes and commands IO, particularly as the Strategic Support Force (SSF) evolves over time. By consolidating cyber and other IO-related elements, the SSF likely is generating synergies by combining national-level cyber reconnaissance, attack, and defense capabilities in its organization.
    • The DOD also noted the PLA’s emphasis on intelligentized warfare:
      • The PLA sees emerging technologies as driving a shift to “intelligentized” warfare from today’s “informatized” way of war. PLA strategists broadly describe intelligentized warfare as the operationalization of artificial intelligence (AI) and its enabling technologies, such as cloud computing, big data analytics, quantum information, and unmanned systems, for military applications. These technologies, according to PRC leaders—including Chairman Xi Jinping—represent a “Revolution in Military Affairs” for which China must undertake a whole-of-government approach to secure critical economic and military advantages against advanced militaries.
  • The United States’ (U.S.) Citizenship and Immigration Services (USCIS) of the Department of Homeland Security (DHS) is proposing a rule “to amend DHS regulations concerning the use and collection of biometrics in the enforcement and administration of immigration laws by USCIS, U.S. Customs and Border Protection (CBP), and U.S. Immigration and Customs Enforcement (ICE).”
    • USCIS further explained:
    • First, DHS proposes that any applicant, petitioner, sponsor, beneficiary, or individual filing or associated with an immigration benefit or request, including United States citizens, must appear for biometrics collection without regard to age unless DHS waives or exempts the biometrics requirement.
    • Second, DHS proposes to authorize biometric collection, without regard to age, upon arrest of an alien for purposes of processing, care, custody, and initiation of removal proceedings.
    • Third, DHS proposes to define the term biometrics.
    • Fourth, this rule proposes to increase the biometric modalities that DHS collects, to include iris image, palm print, and voice print.
    • Fifth, this rule proposes that DHS may require, request, or accept DNA test results, which include a partial DNA profile, to prove the existence of a claimed genetic relationship and that DHS may use and store DNA test results for the relevant adjudications or to perform any other functions necessary for administering and enforcing immigration and naturalization laws.
    • Sixth, this rule would modify how VAWA and T nonimmigrant petitioners demonstrate good moral character, as well as remove the presumption of good moral character for those under the age of 14. 
    • Lastly, DHS proposes to further clarify the purposes for which biometrics are collected from individuals filing immigration applications or petitions, to include criminal history and national security background checks; identity enrollment, verification, and management; secure document production, and to administer and enforce immigration and naturalization laws.

Further Reading

  • “State aid helps China tech leaders shrug off US sanctions” By Kenji Kawase – Nikkei Asian Review. A number of companies placed on the United States’ no-trade list have received generous subsidies from their government in Beijing. The People’s Republic of China (PRC) sees the health of a number of these companies as vital to its long term development and is willing to prop them up. Some companies have received multiples of their net profit to keep them afloat.
  • “Facebook Says Trump’s Misleading Post About Mail-In Voting Is OK. Employees Say It’s Not.” By Craig Silverman and Ryan Mac – BuzzFeed News. There is more internal dissension at Facebook even after the company’s announcement it would not accept political advertising the last week of the election and correct misinformation about voting. Within hours of this policy change, President Donald Trump encouraged voters to possibly vote twice, which many Facebook employees saw as a violation of the new policy. The company disagreed and appended a claim from a bipartisan think tank study finding that mail-in voting is largely fraud free.
  • “Why Facebook’s Blocking of New Political Ads May Fall Short” By Davey Alba and Sheera Frenkel – The New York Times. This piece explains in detail why Facebook’s new policy to combat political misinformation is likely to fall quite short of addressing the problem.
  • “Student arrested for cyberattack against Miami schools used ‘easy to prevent’ program” By Colleen Wright and David Ovalle – Miami Herald. The United States’ fourth largest school district fell victim to a distributed denial of service attack launched by a 16-year-old student using tools more than a decade old downloaded from the internet. This unnamed hacker foiled the Miami-Dade school district’s first three days of online classes, raising questions about the cybersecurity of the school system if such an old attack succeeded so easily, and about how safe the personal information of students is in this school system and others around the country.
  • “Trump and allies ratchet up disinformation efforts in late stage of campaign” By Ashley Parker – The Washington Post. It has been apparent for some time that President Donald Trump and a number of his Republican allies are intentionally or recklessly spreading false information to try to help his campaign gain ground against frontrunner former Vice President Joe Biden. The goal is to so muddy the waters that the average person will neither be able to discern the truth of a claim nor be concerned about doing so. This is the very approach Russia’s leader Vladimir Putin has successfully executed in pushing his country into a post-truth world. Experts warn that a continuation of this trend in the United States (U.S.) could wreak potentially irreparable harm.


Image by wal_172619 from Pixabay

Pending Legislation In U.S. Congress, Part II

Appropriations will, of course, be enacted, but when is the question. And along with bills to fund the U.S. government come policy direction.

As Congress returns from an eventful summer recess, it is possible technology-focused and related legislation will pass or advance toward passage before the body leaves Washington in late September. Yesterday, we examined the FY 2021 National Defense Authorization Act (NDAA) and the lapsed provisions in the Foreign Intelligence Surveillance Act (FISA). Today we will look at appropriations.

Passage of regular appropriations during federal election years is almost always delayed until after the election, and Congress and the President usually agree to extend the current year’s funding levels for agencies through late November or early December (i.e., a continuing resolution). This year, negotiations over another potential pandemic package might complicate passage of a continuing resolution (CR) this month, but at present the two issues are being handled separately, with Speaker of the House Nancy Pelosi (D-CA) and Secretary of the Treasury Steven Mnuchin having reached an agreement in principle on a CR. It remains to be seen whether this agreement will hold through passage of legislation to keep the U.S. government funded, as carefully negotiated deals have unraveled at the last minute when President Donald Trump found reason to object.

Also, there have been only four fiscal years since the enactment of the Congressional Budget Act of 1974 in which all the appropriations bills were enacted by the beginning of the coming fiscal year. Therefore, the current fractured political environment in Washington will almost certainly result in a continuing resolution for the first few months of FY 2021, and quite possibly well into calendar year 2021 should the Democrats take control of the White House and Senate.

Moreover, the Trump Administration has again proposed steep cuts to many civilian agencies, which Congress will probably ignore based on the previous three fiscal years’ appropriations. Nonetheless, in a footnote to a summary table in its FY 2021 budget request, the Administration explained it is “propos[ing] to fund base defense programs for 2021 at the existing [Budget Control Act] cap and fund base [non-defense] programs at a level that is five percent below the 2020 [non-defense] cap.” The Administration asked that Congress “extend the [Budget Control Act] caps through 2025 at the levels included in the 2021 Budget…[which] would provide an increase in defense funding of about two percent each year, and decrease funding for [non-defense] programs by two percent (or “2-penny”) each year.”

However, the House Appropriations Committee has again rejected these deep cuts to non-defense funding and moved forward, passing 10 of the 12 annual bills in July. By contrast, the Senate Appropriations Committee has not considered any of its bills in committee, reportedly because of a desire to shield vulnerable Republicans running for reelection from taking tough votes on politically divisive issues. Nevertheless, the Senate Appropriations Committee almost certainly has bills it has worked on that are ready to go when the time comes for the inevitable bundling of bills, either into one omnibus or smaller packages, to enact FY 2021 funding.

In any event, the annual appropriations bills provide top-line funding numbers for a number of agencies with jurisdiction over United States’ technology programs and policies. Policy directives can be written into these bills, usually in the form of denying the use of funds for certain purposes or tying the use of funds to an agency addressing an issue of importance to a committee or subcommittee. However, the more directive policy changes are usually written in the Committee Reports that accompany the bills.

FY 2021 Homeland Security Appropriations Act

The Homeland Security Subcommittee marked up and reported out to the full committee its “FY 2021 Homeland Security Appropriations Act,” which would provide the Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) with $1.844 billion for operations and support, $396 million for procurement, construction, and improvement, and $14.4 million for research & development. For FY 2020, CISA was appropriated $1.566 billion for operations and support, $434 million for procurement, construction, and improvement, and $14.4 million for research & development. For the next fiscal year, the Trump Administration requested $1.438 billion, $313 million, and $6.4 million, respectively, for the same categories of programs. Moreover, the Committee made available its Committee Report. However, this bill has not come to the House floor, and likely will not, in order to shield Democrats seeking reelection in moderate or right-leaning districts from votes on issues like immigration.

The package includes $2.6 million for a Joint Cybersecurity Coordination Group (JCCG) inside DHS to “serve as a coordinating entity that will help the Department identify strategic priorities and synchronize cyber-related activities across the operational components.” This new entity comes about because the Trump Administration requested its creation as part of its FY 2021 budget request. The Committee expressed disappointment with “the lack of quality and detail provided in CISA’s fiscal year 2021 budget justification documents, to include several errors and unjustified adjustments that appear to be attributable to CISA’s premature proposal for a new Program, Project, or Activity (PPA) structure and raise questions about whether the budget could be executed as requested.” Consequently, the Committee directed that CISA “submit the fiscal year 2022 budget request at the same level of PPA detail as provided in the table at the end of this report with no further adjustments to the PPA structure.”

Among other programmatic and funding highlights, the Committee

  • “[E]ncourage[d] CISA to continue to use commercial, human-led threat behavioral analysis and technology, and to employ private sector, industry-specific, threat intelligence and best practices to better characterize potential consequences to critical infrastructure sectors during a systemic cyber event.”
  • Urged “CISA and the Election Infrastructure Information Sharing and Analysis Center (EI–ISAC) to expand outreach to the most vulnerable jurisdictions” with respect to election security assistance.
  • Directed “CISA to continue providing the semiannual briefing on the National Cybersecurity Protection System (NCPS) program and the Continuous Diagnostics and Mitigation (CDM)”
  • Pointed to $5.8 million to set up a “central Federal information security incident center,” a requirement mandated by the Federal Information Security Modernization Act (FISMA) (P.L. 113-283), and $9.3 million “to establish a formal program office to coordinate supply chain risk management efforts for federal civilian agencies; act as the executive agent for the Federal Acquisition Security Council (FASC), as authorized by the SECURE Technology Act, 2018 (Public Law 115–390); and fund various supply chain related efforts and services.”
  • Emphasized its increase of $6 million as compared to FY 2020 “to grow CISA’s threat hunting capabilities” “[i]n the face of cyber threats from nation-state adversaries such as Russia, China, Iran, and North Korea.”
  • “[P]rovide[d] an increase of $11,568,000 above the request to establish a Joint Cyber Center (JCC) for National Cyber Defense to bring together federal and State, Local, Tribal, and Territorial (SLTT) governments, industry, and international partners to strategically and operationally counter nation-state cyber threats.”
  • Bestowed “an increase of $10,022,000 above the request for the underlying infrastructure that enables better identification, analysis, and publication of known vulnerabilities and common attack patterns, including through the National Vulnerability Database, and to expand the coordinated responsible disclosure of vulnerabilities.”
  • Noted “[t]hrough the Shared Cybersecurity Services Office (SCSO), CISA serves as the Quality Services Management Office for federal cybersecurity” and explained “[t]o help improve efforts to make strategic cybersecurity services available to federal agencies, the Committee includes $5,064,000 above the request to sustain prior year investments and an additional $5,000,000 to continue to expand the office.”
  • Expressed its concern “about cyber vulnerabilities within supply chains, which pose unacceptable risks to the nation’s physical and cyber infrastructure and, therefore, to national security” and provided “an increase of $18,005,000 above the request to continue the development of capabilities to address these risks through the ICT Supply Chain Risk Management Task Force and other stakeholders, such as the FASC.”

FY 2021 Financial Services and General Government Appropriations Act

The FY 2021 Financial Services and General Government Appropriations Act has a provision that would bar either the Federal Trade Commission (FTC) or the Federal Communications Commission (FCC) from taking certain actions related to Executive Order 13925, “Preventing Online Censorship,” which the White House issued in May after Twitter fact-checked a pair of President Donald Trump’s Tweets that contained untruthful claims about voting by mail. It is very unlikely Senate Republicans, some of whom have publicly supported this Executive Order, will allow this language into the final bill funding the agencies.

Under the Executive Order, the National Telecommunications and Information Administration (NTIA) is to file a petition for rulemaking with the FCC to clarify the interplay between clauses of 47 USC 230, notably whether the liability shield that protects companies like Twitter and Facebook for content posted on their platforms also extends to so-called “editorial decisions,” presumably actions like Twitter’s fact checking of Trump regarding mail balloting. The NTIA would also ask the FCC to better define the conditions under which an online platform’s removal of content is not undertaken in good faith, namely takedowns that are “deceptive, pretextual, or inconsistent with a provider’s terms of service; or taken after failing to provide adequate notice, reasoned explanation, or a meaningful opportunity to be heard.” The NTIA is also to ask the FCC to promulgate any other regulations necessary to effectuate the EO. The FTC was directed to consider whether online platforms are violating Section 5 of the FTC Act barring unfair or deceptive practices, which “may include practices by entities covered by section 230 that restrict speech in ways that do not align with those entities’ public representations about those practices.”

In the Committee Report for the FY 2021 Financial Services and General Government Appropriations Act, the House Appropriations Committee explained that it provided $341 million for the FTC, “a $10,000,000 increase over fiscal year 2020” that “will increase the FTC’s capabilities both to monitor mergers and acquisitions that could reduce competition or lead to higher prices, and to take enforcement action against companies that fail to take reasonable steps to secure their customer data or that engage in other problematic trade practices.”

The Committee detailed the following program and funding provisions related to the FTC, including combatting fraudulent calls to seniors, robocalls, fraudulent health care calls, and the following:

  • Cryptocurrency.—The Committee encourages the FTC to work with the Securities and Exchange Commission, other financial regulators, consumer groups, law enforcement, and other public and private stakeholders to identify and investigate fraud related to the cryptocurrencies market and discuss methods to empower and protect consumers.
  • Consumer Repair Rights.—The Committee is aware of the FTC’s ongoing review of how manufacturers—in particular mobile phone and car manufacturers—may limit repairs by consumers and repair shops, and how those limitations may increase costs, limit choice, and impact consumers’ rights under the Magnuson-Moss Warranty Act. Not later than 120 days after the enactment of this Act, the FTC is directed to provide to the Committee, and to publish online, a report on anticompetitive practices related to repair markets. The report shall provide recommendations on how to best address these problems.
  • Antitrust Actions.—The Committee directs the GAO to study FTC and DOJ antitrust actions over the past 25 years. The study shall examine the following questions:
    • How many instances have FTC and DOJ been on opposing sides of the same matter? In how many of these instances was the split created by (a) the FTC intervening in DOJ’s case; and (b) the DOJ intervening in FTC’s case? In these instances, how (if at all) did the split affect the final outcome (e.g., did the judicial opinion cite the split or explain how it affected the court’s decision)?
    • In how many instances has an FTC action appeared before the Supreme Court? Of these instances, in how many cases did the FTC represent itself (rather than be represented by the Solicitor General)?
    • In how many instances has the DOJ or FTC reneged on a clearance agreement with the other agency? In how many of these instances was the disruption created by (a) the FTC’s decision to renege on the agreement; and (b) the DOJ’s decision to renege on the agreement?
    • How many amicus briefs did each agency file in each year? How many of the total amicus briefs filed by DOJ were done so at the invitation of the court? How many of the total amicus briefs filed by FTC were done so at the invitation of the court?

With respect to the FCC, the package provides $376 million and requires a host of programmatic responses, including:

  • Broadband Maps.—The Committee provides significant funding for upfront costs associated with implementation of the Broadband DATA Act. The Committee anticipates funding related to the Broadband DATA Act will decline considerably in future years and expects the FCC to repurpose a significant amount of staff currently working on economic, wireline, and wireless issues to focus on broadband mapping.
  • Broadband Access.—The Committee believes that deployment of broadband in rural and economically disadvantaged areas is a driver of economic development, jobs, and new educational opportunities. The Committee supports FCC efforts to judiciously allocate Universal Service Fund (USF) funds for these areas.
  • Rural Digital Opportunity Fund.—The Committee appreciates the significant investment the FCC is planning to make to deploy broadband services to unserved areas. The Committee recognizes the need for government programs to minimize instances in which two different providers receive support from two different programs to serve the same location. However, the Committee is concerned that current program rules may have the unintended consequence of discouraging other funding sources from participating in broadband deployment, particularly State-based programs. The Committee directs the FCC to adjust program rules to ensure applicants, and the States in which those applicants would deploy broadband, are not put at a disadvantage when applying for the Rural Digital Opportunity Fund based on the State’s proactive, independent investment in broadband.
  • Lifeline Service.—The Committee is concerned that changes to the Lifeline minimum service standards and support levels will adversely impact low-income Americans, including many suffering from economic hardships due to the coronavirus. The Committee directs the FCC to pause implementation of any changes to the currently applicable minimum service standards for Lifeline-supported mobile broadband service and any changes in the current levels of Lifeline support for voice services until the FCC has completed the State of the Lifeline Marketplace Report required by the 2016 Lifeline Order…
  • Mid-Band Spectrum.—The Committee believes that Fifth-Generation (5G) mobile technology is critical to U.S. national and economic security. A key component of the U.S. strategy for 5G is ensuring that U.S. wireless providers have enough mid-band spectrum (frequencies between 3 GHz and 24 GHz), which provides fast data connections while also traveling longer distances. The Committee is concerned that the U.S. is falling behind other countries in the allocation of such spectrum. The Committee urges the Administration and the FCC to work expeditiously to identify and make available more mid-band spectrum for 5G so that the U.S. does not fall further in the race to deploy 5G networks and services.
  • 5G Supply Chain.—The Committee understands the importance of a secure 5G technology supply chain. The Committee encourages the FCC to investigate options for increasing supply chain diversity, competition, and network security via interoperable technologies and open standard-based interfaces.

The Committee had a range of mandates for the Office of Management and Budget (OMB):

  • Federal and Critical Infrastructure Cybersecurity.—The Committee is aware that Federal agencies and the nation’s critical infrastructure face unique cybersecurity threats. Executive Order 13800, issued on May 11, 2017, directs agency heads to implement several risk management and cybersecurity measures, including the National Institute of Standards and Technology Framework for Improving Critical Infrastructure Cybersecurity. OMB is directed to report, within 90 days of enactment of this Act, on the status of compliance with Executive Order 13800 by each applicable agency. The report shall identify risk management and cybersecurity compliance gaps and outline the steps each agency needs to take to manage such risks. OMB shall prioritize working with the applicable agency heads to address remaining gaps and inconsistencies.
  • Federal Information Technology Workforce.—OMB is directed to consult with the Office of Personnel Management and the General Services Administration and report to the Committee, no later than September 30, 2021, on gaps in Federal information technology workforce skills, disciplines, and experience required to enable the Federal government to modernize its ability to use technology and develop effective citizen-facing digital services to carry out its mission.

The Committee noted its $500 million in additional funding to the Election Assistance Commission (EAC) for Election Security Grants:

  • [T]he Coronavirus Aid, Relief, and Economic Security Act (CARES Act) (P.L. 116–136) included $400,000,000 for grants to States to prevent, prepare for, and respond to coronavirus. The Committee is gravely concerned by persistent threats from Russia and other foreign actors attempting to influence the U.S. democratic process, and vulnerabilities that continue to exist throughout the Nation’s election system.
  • Since fiscal year 2018, Congress has provided $805,000,000 in grants to States to improve the security of elections for Federal office.
  • However, that funding has been inconsistent, unpredictable, and insufficient to meet the vast need across all the States and territories.
  • Congress must provide a consistent, steady source of Federal funds to support State and local election officials on the frontlines of protecting U.S. elections. The bill requires States to use payments to replace direct-recording electronic (DRE) voting machines with voting systems that require the use of an individual, durable, voter-verified paper ballot, marked by the voter by hand or through the use of a non-tabulating ballot marking device or system, and made available for inspection and verification by the voter before the vote is cast and counted.
  • Funds shall only be available to a State or local election jurisdiction for further election security improvements after a State has submitted a certification to the EAC that all DRE voting machines have been or are in the process of being replaced. Funds shall be available to States for the following activities to improve the security of elections for Federal office:
    • implementing a post-election, risk-limiting audit system that provides a high level of confidence in the accuracy of the final vote tally;
    • maintaining or upgrading election-related computer systems, including voter registration systems, to address cyber vulnerabilities identified through DHS scans or similar assessments of existing election systems;
    • facilitating cyber and risk mitigation training for State and local election officials;
    • implementing established cybersecurity best practices for election systems; and
    • other priority activities and investments identified by the EAC, in consultation with DHS, to improve election security.
  • The EAC shall define in the Notice of Grant Award the eligible investments and activities for which grant funds may be used by the States. The EAC shall review all proposed investments to ensure funds are used for the purposes set forth in the Notice of Grant Award.
  • The bill also requires that not less than 50 percent of the payment made to a State be allocated in cash or in kind to local government entities responsible for the administration of elections for Federal office.
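
The post-election “risk-limiting audit” listed among the eligible activities above is, at bottom, a statistical stopping rule: keep hand-examining randomly sampled ballots until the evidence that the reported winner actually won exceeds a preset risk limit. A minimal sketch of one common variant, a BRAVO-style ballot-polling audit, might look like the following (the function, the 5% default risk limit, and the two-candidate simplification are illustrative, not drawn from the bill):

```python
import math

def bravo_audit(sample, reported_winner_share, risk_limit=0.05):
    """Sequentially test sampled ballots (True = vote for reported winner).

    Returns "confirm" once the evidence supports the reported outcome at
    the given risk limit, or "escalate" if the sample is exhausted first.
    Illustrative sketch only; assumes a two-candidate contest.
    """
    p = reported_winner_share            # reported share, must exceed 0.5
    threshold = math.log(1.0 / risk_limit)
    log_t = 0.0                          # log of the sequential test statistic
    for ballot_for_winner in sample:
        if ballot_for_winner:
            log_t += math.log(p / 0.5)   # evidence for the reported winner
        else:
            log_t += math.log((1.0 - p) / 0.5)
        if log_t >= threshold:
            return "confirm"
    return "escalate"                    # draw more ballots or hand count
```

A sample that runs out before the threshold is met does not mean the outcome was wrong, only that the audit must escalate, typically by drawing more ballots or proceeding to a full hand count.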

Regarding the General Services Administration (GSA), the Committee directed the following:

  • Interagency Task Force on Health and Human Services Information Technology (IT).—The Committee urges the Chief Information Officer and Chief Technology Officer (CTO) of HHS, in collaboration with the White House CTO and U.S. Department of Agriculture (USDA), as well as the Office of the National Coordinator for Health Information Technology (ONC) within HHS, 18F within the GSA, and the Cybersecurity and Infrastructure Security Agency (CISA) within the U.S. Department of Homeland Security, to establish an interagency task force that will examine existing IT infrastructure in Federal health and human service programs nationwide and identify the limitations to successfully integrating and modernizing health and human services IT, and the network security necessary for health and human services IT interoperability. The task force shall submit to the Committee within 180 days of enactment of this Act a report on its progress and on recommendations for further Congressional action, which should include estimated costs for agencies to make progress on interoperability initiatives.
  • Category Management.—The Committee is interested in understanding the effects of GSA’s category management policy on contracts with small businesses. Category management refers to the business practice of buying common goods and services as an enterprise to eliminate redundancies, increase efficiency, and deliver more value and savings from the Federal government’s acquisition programs. Within 180 days of the enactment of this Act, the Committee directs GSA, in cooperation with SBA, to submit a report to the Committee on the number of contracts that could have been awarded under sections 8(a), 8(m), 15(a), 15(j), 31, or 36 of the Small Business Act, but were exempted by category management since its implementation.

The Committee made the following recommendations generally:

  • Cyberspace Solarium Commission Recommendations.—The Committee recognizes and supports the priorities and recommendations laid out in the Cyberspace Solarium Commission’s report and urges Federal departments and agencies to align cybersecurity budgetary priorities with those laid out by the Commission. In particular, the Committee calls attention to recommendation 3.2, Develop and Maintain Continuity of the Economy Planning; recommendation 4.6.3, Strengthen the Capacity of the Committee on Foreign Investment in the United States, particularly with respect to the need to train Federal bankruptcy judges; recommendation 3.4, Improve and Enhance the Funding of the Election Assistance Commission; and recommendation 3.1, Strengthen Sector-specific Agencies’ Ability to Manage Critical Infrastructure Risk, particularly with respect to the Department of the Treasury’s Office of Cybersecurity and Critical Infrastructure Protection.
  • Zero Trust Model.—The Committee is aware that the most effective cybersecurity systems are based on the zero trust model, which is designed not only to prevent cyber intrusions but to prevent cyberthieves from accessing or removing protected information. To ensure that Federal agencies achieve the highest level of security against cyberattacks in the shortest amount of time, the Committee encourages all agencies to acquire and deploy zero trust cybersecurity software that is compatible with all existing operating systems and hardware platforms used by Federal agencies. The Committee also encourages Federal agencies to acquire and utilize software compatible with all existing operating systems and hardware platforms that will enable agencies to measure or quantify their risk of a cybersecurity attack in the months ahead and the types of cyberattack the agency is most likely to experience. Upon learning the risk and type of cyberattack the agency is most likely to face, the agency shall immediately take remedial action to minimize such risk. Agencies shall include information in their fiscal year 2022 Congressional Justification to Congress on their progress in complying with this directive.
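
The zero trust model the Committee describes can be reduced to a simple discipline: authenticate and authorize every request on its own merits, and grant nothing on the basis of network location. A toy sketch of such a per-request policy check follows (the field names and the three-tier sensitivity scale are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_authenticated: bool      # e.g., identity verified via MFA
    device_compliant: bool        # device posture check passed
    resource_sensitivity: str     # "public", "internal", or "protected"
    user_clearance: str           # highest sensitivity the user may access

# Illustrative three-tier scale, lowest to highest sensitivity
SENSITIVITY_ORDER = ["public", "internal", "protected"]

def authorize(req: Request) -> bool:
    """Evaluate every request on its own merits; network location is
    deliberately absent -- being 'inside' the perimeter grants nothing."""
    if not (req.user_authenticated and req.device_compliant):
        return False
    return (SENSITIVITY_ORDER.index(req.user_clearance)
            >= SENSITIVITY_ORDER.index(req.resource_sensitivity))
```

Production zero trust architectures layer in continuous device posture assessment, short-lived credentials, and micro-segmentation; NIST SP 800-207 describes the model in full.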

FY 2021 Department of Defense Appropriations Act

On 14 July, the House Appropriations Committee marked up and reported out the “FY 2021 Department of Defense Appropriations Act,” which would provide $695 billion for the Department of Defense (DOD), “an increase of $1,294,992,000 above the fiscal year 2020 enacted level and a decrease of $3,695,880,000 below the budget request.” The House subsequently passed this bill.

The Committee Report contained these technology-related provisions:

  • ZERO TRUST ARCHITECTURE. The Committee encourages the Secretary of Defense to implement a Zero Trust Architecture to increase its cybersecurity posture and enhance the Department’s ability to protect its systems and data.
  • DISTRIBUTED LEDGER TECHNOLOGY RESEARCH AND DEVELOPMENT. The Committee is aware that distributed ledger technologies, such as blockchain, may have potentially useful applications for the Department of Defense, which include but are not limited to distributed computing, cybersecurity, logistics, and auditing. Therefore, the Committee encourages the Under Secretary of Defense (Research and Engineering) to consider research and development to explore the use of distributed ledger technologies for defense applications.
  • ARTIFICIAL INTELLIGENCE PARTNERSHIPS. The Committee is aware of the United States-Singapore partnership focusing on applying artificial intelligence in support of humanitarian assistance and disaster relief operations, which will help first responders better serve those in disaster zones. The Committee encourages the Secretary of Defense to pursue similar partnerships with additional partners in different regions, including the Middle East.
  • CYBER EDUCATION COLLABORATIVES. The Committee remains concerned by widespread shortages in cybersecurity talent across both the public and private sector. In accordance with the recommendations of the Cyberspace Solarium Commission, the Committee encourages the Under Secretary of Defense (Research and Engineering) to direct cyber-oriented units to collaborate with local colleges and universities on research, fellowships, internships, and cooperative work experiences to expand cyber-oriented education opportunities and grow the cybersecurity workforce. The Committee also appreciates that veterans and transitioning servicemembers could serve as a valuable recruiting pool to fill gaps in the cybersecurity workforce. Accordingly, the Committee encourages the Under Secretary to prioritize collaboration with colleges and universities near military installations as well as the veteran population.
  • 5G TELECOMMUNICATIONS TECHNOLOGY. The Committee is concerned about reports that foreign manufacturers are significantly ahead of United States companies in the development and deployment of 5G telecommunications technologies, which poses a national security risk to the United States and its allies. Without a robust domestic 5G supply chain, the United States will be vulnerable to 5G systems that facilitate cyber intrusion from hostile actors. In order to secure a reliable 5G system and a domestic supply chain that meets the national security needs of the United States and its allies, the Committee encourages the Secretary of Defense to accelerate engagement with domestic industry partners that are developing 5G systems. Additionally, the Committee is aware of the significant investments being made in 5G efforts but is concerned with the level of detail provided for congressional oversight. The Committee directs the Under Secretary of Defense (Research and Engineering) to conduct quarterly execution briefings with the House and Senate Appropriations Committees beginning not later than 90 days after the enactment of this Act.
  • MILITARY INFORMATION SUPPORT OPERATIONS. Over the past decade, the bulk of activities under Military Information Support Operations (MISO) focused on countering violent extremist organizations (VEO). While VEOs remain an ongoing threat and require continued vigilance, peer and near-peer adversaries like China and Russia are using social media and other vectors to weaken domestic and international institutions and undermine United States interests. This new information environment and the difficulty of discriminating between real and fake information heightens the importance of enhancing and coordinating United States government information-related capabilities as a tool of diplomatic and military strategy.
  • The Committee recognizes the efforts and accomplishments of the United States Special Operations Command and other agencies within the executive branch to operate in the digital domain. However, it is difficult to view individual agency activities as a coordinated whole of government effort. Over the past several years, the classified annex accompanying annual Department of Defense Appropriations Acts included direction focusing on the individual activities of geographic combatant commands. However, information messaging strategies to counter Chinese and Russian malign influences cuts across these geographic boundaries and requires coordination between multiple government agencies using different authorities.
  • Therefore, in order to better understand how MISO activities support a whole of government messaging strategy, the Committee directs the Assistant Secretary of Defense (Special Operations/Low Intensity Conflict) to submit a report for MISO activities for the individual geographic combatant commands justified by the main pillars of the National Defense Strategy to the House and Senate Appropriations Committees not later than 15 days after submission of the fiscal year 2022 budget request and annually thereafter. The report shall include spend plans identifying the requested and enacted funding levels for both voice and internet activities and how those activities are coordinated with the Intelligence Community and the Department of State. The enacted levels will serve as the baseline for reprogramming in accordance with section 8007 of this Act. Furthermore, the Committee directs the Assistant Secretary of Defense (Special Operations/Low Intensity Conflict) to submit to the congressional defense committees, not later than 90 days after the end of the fiscal year, an annual report that provides details on each combatant commands’ MISO activities by activity name, description, goal or objective, target audience, dissemination means, executed funds, and assessments of their effectiveness. Additional details for the report are included in the classified annex accompanying this Act.
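
The auditing and logistics appeal of the distributed ledger technologies flagged in the research and development bullet above rests on hash-chaining: each record cryptographically commits to its predecessor, so altering any entry invalidates everything after it. A minimal single-node sketch (the block layout here is illustrative, not any particular DOD system):

```python
import hashlib
import json
import time

def make_block(data, prev_hash):
    """Create a block whose hash commits to its contents and predecessor."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain):
    """Tampering with any block breaks every hash link after it."""
    for i, block in enumerate(chain):
        expected_prev = "0" * 64 if i == 0 else chain[i - 1]["hash"]
        if block["prev_hash"] != expected_prev:
            return False
        payload = {k: v for k, v in block.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()).hexdigest()
        if block["hash"] != digest:
            return False
    return True
```

A true distributed ledger adds replication and a consensus protocol on top of this structure, so no single party can rewrite history even if it controls one copy.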

FY 2021 Commerce, Justice, Science Appropriations Act

In July, the “FY 2021 Commerce, Justice, Science Appropriations Act” was also marked up and reported out, and the House passed the bill. The Committee Report contains these provisions:

  • Cybersecurity Threats.—The Committee remains concerned that as the Census Bureau looks to modernize data collection methods, the Census Bureau could potentially be exploited by nefarious actors who seek to undermine the integrity of census data, which is vital to democratic institutions, and gain access to sensitive information otherwise protected by law. These threats include both hacking into the Census Bureau IT infrastructure and efforts to use supercomputing to unmask the privacy of census respondents. The Committee directs the Census Bureau to prioritize cyber protections and high standards of data differential privacy, while also maintaining the accuracy of the data, and expects the Census Bureau to update the Committee regularly on these efforts.
  • Cybersecurity and Privacy.—The proliferation of data generation, storage, and usage associated with the digital economy is making it increasingly important to protect that data with effective cryptography and privacy standards. The Committee is concerned that individual, corporate, and public-sector data privacy is continuously at risk from attacks by individual actors, criminal organizations, and nation-states. The Committee urges NIST to address the rapidly emerging threats in this field by furthering the development of new and needed cryptographic standards and technologies.
  • National Initiative for Cybersecurity Education.—The Committee notes with concern the shortage of cybersecurity professionals across the government and private sector, from entry level applicants to experienced professionals. The Committee therefore supports the National Initiative for Cybersecurity Education (NICE) and directs NIST to provide resources commensurate with the prior fiscal year for this effort.
  • Cybersecurity Conformity Assessment Programs.—The Committee instructs NIST, in collaboration with other relevant organizations, to report to the Committee no later than 270 days after the enactment of this Act on challenges and approaches to establishing and managing voluntary cybersecurity conformity assessment programs for information and communication technologies including federal cloud technologies.
  • Cybersecurity Training.—Within the increase to Manufacturing Extension Partnership (MEP), the Committee directs NIST to maintain the core services of the MEP and encourages NIST to utilize existing expertise within its Information Technology Laboratory to increase cybersecurity technical training to small manufacturers to strengthen their cybersecurity capabilities given the troubling threats from state and non-state actors and other emerging threats.
  • Cybersecurity threat information sharing.—The Committee supports sharing by DOJ of cybersecurity threat warnings and intelligence with private companies who may benefit from actionable information to deter, prevent, or mitigate threats. The Committee asks DOJ to provide a briefing on this topic not later than 90 days after enactment of this Act.
  • Chinese-government affiliated companies.—The Committee is concerned with companies operating within the United States that are known to have substantial ties to the Chinese government, including full or partial ownership by the Chinese government, and that are required by Chinese law to assist in espionage activities, including collection of personally identifiable information of American citizens. Such companies may pose cybersecurity risks, such as vulnerabilities in their equipment, and some are the subject of ongoing Congressional and Executive Branch investigations involving their business practices. The Committee directs DOJ to enforce applicable laws and prevent the operation of known foreign entities who participate in the theft of American intellectual property, the harvesting of personal identifiable information on behalf of a foreign government, and the unlawful surveillance of American citizens by adversarial state-owned enterprises.
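
The “differential privacy” standard invoked in the Census bullet above is typically achieved by adding calibrated random noise to published statistics, so that no single respondent’s presence or absence measurably changes the output. The classic Laplace mechanism for a simple count can be sketched as follows (the helper names and the choice of ε are illustrative; the Census Bureau’s actual TopDown algorithm is far more elaborate):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise by inverting its CDF."""
    u = random.random() - 0.5            # uniform on (-0.5, 0.5)
    sign = -1.0 if u < 0 else 1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float) -> float:
    """Release a count under epsilon-differential privacy.

    A counting query changes by at most 1 when one person is added or
    removed (sensitivity 1), so noise scaled to 1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller values of ε mean more noise and stronger privacy, which is exactly the accuracy-versus-privacy tension the Committee directs the Census Bureau to manage.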

The National Institute of Standards and Technology (NIST) would be given $1.044 billion via the “FY 2021 Commerce-Justice-Science Appropriations Act.” NIST received a total of $1.034 billion for FY 2020, and the agency requested $737 million for the next fiscal year. The bill includes annual language barring any agency receiving funds under it from buying “a high-impact or moderate-impact information system” unless all risks associated with the procurement of such a system have been mitigated, most especially supply chain risks that may originate in the People’s Republic of China, Iran, North Korea, or Russia.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Francine Sreca from Pixabay

Further Reading, Other Developments, and Coming Events (21 August)

Here are Further Reading, Other Developments, and Coming Events.

Coming Events

  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” Until 21 August, the FTC sought comment “on a range of issues including:
    • How are companies currently implementing data portability? What are the different contexts in which data portability has been implemented?
    • What have been the benefits and costs of data portability? What are the benefits and costs of achieving data portability through regulation?
    • To what extent has data portability increased or decreased competition?
    • Are there research studies, surveys, or other information on the impact of data portability on consumer autonomy and trust?
    • Does data portability work better in some contexts than others (e.g., banking, health, social media)? Does it work better for particular types of information over others (e.g., information the consumer provides to the business vs. all information the business has about the consumer, information about the consumer alone vs. information that implicates others such as photos of multiple people, comment threads)?
    • Who should be responsible for the security of personal data in transit between businesses? Should there be data security standards for transmitting personal data between businesses? Who should develop these standards?
    • How do companies verify the identity of the requesting consumer before transmitting their information to another company?
    • How can interoperability among services best be achieved? What are the costs of interoperability? Who should be responsible for achieving interoperability?
    • What lessons and best practices can be learned from the implementation of the data portability requirements in the GDPR and CCPA? Has the implementation of these requirements affected competition and, if so, in what ways?”
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The National Institute of Standards and Technology (NIST) published for public comment Four Principles of Explainable Artificial Intelligence (Draft NISTIR 8312), in which the authors stated:
    • We introduce four principles for explainable artificial intelligence (AI) that comprise the fundamental properties for explainable AI systems. They were developed to encompass the multidisciplinary nature of explainable AI, including the fields of computer science, engineering, and psychology. Because one size fits all explanations do not exist, different users will require different types of explanations. We present five categories of explanation and summarize theories of explainable AI. We give an overview of the algorithms in the field that cover the major classes of explainable algorithms. As a baseline comparison, we assess how well explanations provided by people follow our four principles. This assessment provides insights to the challenges of designing explainable AI systems.
    • NIST said “our four principles of explainable AI are:
      • Explanation: Systems deliver accompanying evidence or reason(s) for all outputs.
      • Meaningful: Systems provide explanations that are understandable to individual users.
      • Explanation Accuracy: The explanation correctly reflects the system’s process for generating the output.
      • Knowledge Limits: The system only operates under conditions for which it was designed or when the system reaches a sufficient confidence in its output.
    • A year ago, NIST published “U.S. LEADERSHIP IN AI: A Plan for Federal Engagement in Developing Technical Standards and Related Tools” as required by Executive Order (EO) 13859, “Maintaining American Leadership in Artificial Intelligence,” which set an August 10, 2019 due date.
      • NIST explained that “[t]here are a number of cross-sector (horizontal) and sector-specific (vertical) AI standards available now and many others are being developed by numerous standards developing organizations (SDOs)…[and] [s]ome areas, such as communications, have well-established and regularly maintained standards in widespread use, often originally developed for other technologies. Other aspects, such as trustworthiness, are only now being considered.” NIST explained that its AI plan “identifies the following nine areas of focus for AI standards: 
        • Concepts and terminology
        • Data and knowledge 
        • Human interactions 
        • Metrics
        • Networking
        • Performance testing and reporting methodology
        • Safety
        • Risk management
        • Trustworthiness
      • NIST asserted that “[i]n deciding which standards efforts merit strong Federal government involvement, U.S. government agencies should prioritize AI standards efforts that are:
        • Consensus-based, where decision-making is based upon clearly established terms or agreements that are understood by all involved parties, and decisions are reached on general agreement.
        • Inclusive and accessible, to encourage input reflecting diverse and balanced communities of users, developers, vendors, and experts. Stakeholders should include representatives from diverse technical disciplines as well as experts and practitioners from non-traditional disciplines of special importance to AI such as ethicists, economists, legal professionals, and policy makers: essentially, accommodating all desiring a “seat at the table.”
        • Multi-path, developed through traditional and novel standards-setting approaches and organizations that best meet the needs of developers and users in the marketplace as well as society at large.
        • Open and transparent, operating in a manner that: provides opportunity for participation by all directly- and materially- affected; has well-established and readily accessible operating rules, procedures, and policies that provide certainty about decision making processes; allows timely feedback for further consideration of the standard; and ensures prompt availability of the standard upon adoption.
        • Result in globally relevant and non-discriminatory standards, where standards avoid becoming non-tariff trade barriers or locking in particular technologies or products.
  • Consumer Watchdog has sued Zoom Video Communications “for making false and deceptive representations to consumers about its data security practices in violation of the District of Columbia Consumer Protection Procedures Act (CPPA).” The advocacy organization asserted:
    • To distinguish itself from competitors and attract new customers, Zoom began advertising and touting its use of a strong security feature called “end-to-end encryption” to protect communications on its platform, meaning that the only people who can access the communicated data are the sender and the intended recipient. Using end-to-end encryption prevents unwanted third parties—including the company that owns the platform (in this case, Zoom)—from accessing communications, messages, and data transmitted by users.
    • Unfortunately, Zoom’s claims that communications on its platform were end-to-end encrypted were false. Zoom only used the phrase “end-to-end encryption” as a marketing device to lull consumers and businesses into a false sense of security.
    • The reality is that Zoom is, and has always been, capable of intercepting and accessing any and all of the data that users transmit on its platform—the very opposite of end-to-end encryption. Nonetheless, Zoom relied on its end-to-end encryption claim to attract customers and to build itself into a publicly traded company with a valuation of more than $70 billion.
    • Consumer Watchdog is seeking the greater of treble damages or $1,500 per violation along with other relief.
    • Zoom is being sued in a number of other cases, including two class action suits in United States courts in Northern California (#1 and #2).
  • The United States (U.S.) Government Accountability Office (GAO) decided the Trump Administration violated the order of succession at the U.S. Department of Homeland Security by naming Customs and Border Protection (CBP) Commissioner Kevin McAleenan the acting Secretary after former Secretary Kirstjen Nielsen resigned early in 2019. The agency’s existing order of succession made clear that Cybersecurity and Infrastructure Security Agency (CISA) Director Christopher Krebs was next in line to lead DHS. The GAO added “[a]s such, the subsequent appointments of Under Secretary for Strategy, Policy, and Plans, Chad Wolf and Principal Deputy Director of U.S. Citizenship and Immigration Services (USCIS) Ken Cuccinelli were also improper because they relied on an amended designation made by Mr. McAleenan.”
    • However, GAO is punting the question of what the implications of its findings are:
      • In this decision we do not review the consequences of Mr. McAleenan’s service as Acting Secretary, other than the consequences of the November delegation, nor do we review the consequences of Messrs. Wolf and Cuccinelli’s service as Acting Secretary and Senior Official Performing the Duties of Deputy Secretary respectively.
      • We are referring the question as to who should be serving as the Acting Secretary and the Senior Official Performing the Duties of Deputy Secretary to the DHS Office of Inspector General for its review.
      • We also refer to the Inspector General the question of consequences of actions taken by these officials, including consideration of whether actions taken by these officials may be ratified by the Acting Secretary and Senior Official Performing the Duties of Deputy Secretary as designated in the April Delegation.
    • The GAO also denied DHS’s request to rescind this opinion because “DHS has not shown that our decision contains either material errors of fact or law, nor has DHS provided information not previously considered that warrants reversal or modification of the decision.”
    • The chairs of the House Homeland Security and Oversight and Reform Committees had requested the GAO legal opinion and claimed in their press release the opinion “conclud[es] that President Donald Trump’s appointments to senior leadership positions at the Department of Homeland Security were illegal and circumvented both the Federal Vacancies Reform Act and the Homeland Security Act.”
  • Top Democrats on the House Energy and Commerce Committee wrote the members of the Facebook Oversight Board expressing their concern the body “does not have the power it needs to change Facebook’s harmful policies.” Chair Frank Pallone, Jr. (D-NJ), Communications and Technology Subcommittee Chair Mike Doyle (D-PA) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) “encouraged the newly appointed members to exert pressure on Facebook to listen to and act upon their policy recommendations, something that is not currently included in the Board Members’ overall responsibilities.” They asserted:
    • The Committee leaders believe Facebook is intentionally amplifying divisive and conspiratorial content because such content attracts more customer usage and, with it, advertising revenue. Pallone, Doyle and Schakowsky were also troubled by recent reports that Facebook had an opportunity to retune its systems responsible for the amplification of this content, but chose not to. 
    • The three Committee leaders wrote that the public interest should be the Oversight Board’s priority and that it should not be influenced by the profit motives of Facebook executives. Pallone, Doyle and Schakowsky also requested the board members answer a series of questions in the coming weeks.
  • The United States (U.S.) Government Accountability Office (GAO) examined how well the United States Department of Homeland Security and selected federal agencies are implementing a cybersecurity program designed to give the government better oversight and control of their networks. In auditing the Continuous Diagnostics and Mitigation (CDM) program, the GAO found limited success and ongoing, systemic roadblocks preventing increased levels of security. DHS has estimated the program will cost $10.9 billion over ten years.
    • The GAO concluded
      • Selected agencies reported that the CDM program had helped improve their awareness of hardware on their networks. However, although the program has been in existence for several years, these agencies had only implemented the foundational capability for managing hardware to a limited extent, including not associating hardware devices with FISMA systems. In addition, while most agencies implemented requirements for managing software, all of them inconsistently implemented requirements for managing configuration settings. Moreover, poor data quality resulting from these implementation shortcomings diminished the usefulness of agency dashboards to support security-related decision making. Until agencies fully and effectively implement CDM program capabilities, including the foundational capability of managing hardware on their networks, agency and federal dashboards will not accurately reflect agencies’ security posture. Part of the reason that agencies have not fully implemented key CDM requirements is that DHS had not ensured integrators had addressed shortcomings with integrators’ CDM solutions for managing hardware and vulnerabilities. Although DHS has taken various actions to address challenges identified by agencies, without further assistance from DHS in helping agencies overcome implementation shortcomings, the program—costing billions of dollars— will likely not fully achieve expected benefits.
    • The chairs and ranking members of the Senate Homeland Security & Governmental Affairs and House Homeland Security Committees, the chair of the House Oversight and Reform Committee, and other Members requested that the GAO study and report on this issue.
  • Google and the Australian Competition and Consumer Commission (ACCC) have exchanged public letters, fighting over the latter’s proposal to ensure that media companies are compensated for articles and content the former uses.
    • In an Open Letter to Australians, Google claimed:
      • A proposed law, the News Media Bargaining Code, would force us to provide you with a dramatically worse Google Search and YouTube, could lead to your data being handed over to big news businesses, and would put the free services you use at risk in Australia.
      • You’ve always relied on Google Search and YouTube to show you what’s most relevant and helpful to you. We could no longer guarantee that under this law. The law would force us to give an unfair advantage to one group of businesses – news media businesses – over everyone else who has a website, YouTube channel or small business. News media businesses alone would be given information that would help them artificially inflate their ranking over everyone else, even when someone else provides a better result. We’ve always treated all website owners fairly when it comes to information we share about ranking. The proposed changes are not fair and they mean that Google Search results and YouTube will be worse for you.
      • You trust us with your data and our job is to keep it safe. Under this law, Google has to tell news media businesses “how they can gain access” to data about your use of our products. There’s no way of knowing if any data handed over would be protected, or how it might be used by news media businesses.
      • We deeply believe in the importance of news to society. We partner closely with Australian news media businesses — we already pay them millions of dollars and send them billions of free clicks every year. We’ve offered to pay more to license content. But rather than encouraging these types of partnerships, the law is set up to give big media companies special treatment and to encourage them to make enormous and unreasonable demands that would put our free services at risk.
    • In its response, the ACCC asserted:
      • The open letter published by Google today contains misinformation about the draft news media bargaining code which the ACCC would like to address. 
      • Google will not be required to charge Australians for the use of its free services such as Google Search and YouTube, unless it chooses to do so.
      • Google will not be required to share any additional user data with Australian news businesses unless it chooses to do so.
      • The draft code will allow Australian news businesses to negotiate for fair payment for their journalists’ work that is included on Google services.
      • This will address a significant bargaining power imbalance between Australian news media businesses and Google and Facebook.
    • Late last month, the ACCC released for public consultation a draft of “a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms, specifically Google and Facebook.” The government in Canberra had asked the ACCC to draft this code earlier this year after talks broke down between the Australian Treasury and the companies.
    • The ACCC explained
      • The code would commence following the introduction and passage of relevant legislation in the Australian Parliament. The ACCC released an exposure draft of this legislation on 31 July 2020, with consultation on the draft due to conclude on 28 August 2020. Final legislation is expected to be introduced to Parliament shortly after conclusion of this consultation process.
    • This is not the ACCC’s first interaction with the companies. Late last year, the ACCC announced a legal action against Google “alleging they engaged in misleading conduct and made false or misleading representations to consumers about the personal location data Google collects, keeps and uses” according to the agency’s press release. In its initial filing, the ACCC claims that Google misled and deceived the public in contravention of the Australian Consumer Law and that Android users were harmed: those who switched off Location Services were unaware their location information was still being collected and used by Google because it was not readily apparent that Web & App Activity also needed to be switched off.
    • A year ago, the ACCC released its final report in its “Digital Platforms Inquiry” that “proposes specific recommendations aimed at addressing some of the actual and potential negative impacts of digital platforms in the media and advertising markets, and also more broadly on consumers.”
  • The United States Coast Guard is asking for information on “the introduction and development of automated and autonomous commercial vessels and vessel technologies subject to U.S. jurisdiction, on U.S. flagged commercial vessels, and in U.S. port facilities.” The Coast Guard is particularly interested in the “barriers to the development of autonomous vessels.” The agency stated
    • On February 11, 2019, the President issued Executive Order (E.O.) 13859, “Maintaining American Leadership in Artificial Intelligence.” The executive order announced the policy of the United States Government to sustain and enhance the scientific, technological, and economic leadership position of the United States in artificial intelligence (AI) research and development and deployment through a coordinated Federal Government strategy. Automation is a broad category that may or may not incorporate many forms of technology, one of which is AI. This request for information (RFI) will support the Coast Guard’s efforts to accomplish its mission consistent with the policies and strategies articulated in E.O. 13859. Input received from this RFI will allow the Coast Guard to better understand, among other things, the intersection between AI and automated or autonomous technologies aboard commercial vessels, and to better fulfill its mission of ensuring our Nation’s maritime safety, security, and stewardship.

Further Reading

  • “‘Boring and awkward’: students voice concern as colleges plan to reopen – through Minecraft” By Kari Paul – The Guardian. A handful of universities in the United States (U.S.) are offering students access to customized versions of Minecraft, an online game that allows players to build worlds. The aim seems to be to allow students to socialize online in replicas of their campuses. The students interviewed for this story seemed underwhelmed by the effort, however.
  • “When regulators fail to rein in Big Tech, some turn to antitrust litigation” – By Reed Albergotti and Jay Greene – The Washington Post. This article places Epic Games’ suits against Apple and Google into the larger context of companies availing themselves of the right to sue under antitrust laws in the United States. However, for a number of reasons, these suits have not often succeeded, and one legal commentator opined that judges tend to see these actions as sour grapes. Still, revelations turned up during discovery can lead antitrust regulators to jump into proceedings, giving the suits additional heft.
  • “What Can America Learn from Europe About Regulating Big Tech?” By Nick Romeo – The New Yorker. A former Member of the European Parliament, Marietje Schaake, from the Netherlands is now a professor at Stanford and is trying to offer a new path on regulating big tech that would rein in its excesses and externalities while allowing new technologies and competition to flourish. The question is whether there is a wide enough appetite for her vision in the European Union, let alone the United States.
  • “Facebook employees internally question policy after India content controversy – sources, memos” By Aditya Kalra and Munsif Vengattil – Reuters. The tech giant is also facing an employee revolt in the world’s largest democracy. Much like in the United States and elsewhere, employees are pressing leadership to explain why they are seemingly not applying the platform’s rules on false and harmful material to hateful speech by leaders. In this case, it was posts by a member of the ruling Bharatiya Janata Party (BJP) calling Indian Muslims traitors. And, in much the same way accusations have been leveled at a top Facebook lobbyist in Washington who has allegedly interceded on behalf of Republicans and far right interests on questionable material, a lobbyist in New Delhi has done the same for the BJP.
  • “List of 2020 election meddlers includes Cuba, Saudi Arabia and North Korea, US intelligence official says” By Shannon Vavra – cyberscoop. At a virtual event this week, National Counterintelligence and Security Center (NCSC) Director William Evanina claimed that even more nations are trying to disrupt the United States election this fall, including Cuba, Saudi Arabia, and North Korea. Evanina cautioned, however, that the capabilities of these nations do not rise to the level of the Russian Federation, People’s Republic of China (PRC), and Iran. Earlier this month, Evanina issued an update to his late July statement “100 Days Until Election 2020” through “sharing additional information with the public on the intentions and activities of our adversaries with respect to the 2020 election…[that] is being released for the purpose of better informing Americans so they can play a critical role in safeguarding our election.” Evanina offered more in the way of detail on the three nations identified as those most active in and capable of interfering in the November election: the Russian Federation, the PRC, and Iran. This additional detail may well have been provided given the pressure from Democrats in Congress to do just this. Members like Speaker of the House Nancy Pelosi (D-CA) argued that Evanina was not giving an accurate picture of the actions by foreign nations to influence the outcome and perception of the 2020 election. Republicans in Congress pushed back, claiming Democrats were seeking to politicize the classified briefings given by the Intelligence Community (IC).


Image by Silentpilot from Pixabay

Further Reading, Other Developments, and Coming Events (19 August)

Coming Events

  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” Through 21 August, the FTC “is seeking comment on a range of issues including:
    • How are companies currently implementing data portability? What are the different contexts in which data portability has been implemented?
    • What have been the benefits and costs of data portability? What are the benefits and costs of achieving data portability through regulation?
    • To what extent has data portability increased or decreased competition?
    • Are there research studies, surveys, or other information on the impact of data portability on consumer autonomy and trust?
    • Does data portability work better in some contexts than others (e.g., banking, health, social media)? Does it work better for particular types of information over others (e.g., information the consumer provides to the business vs. all information the business has about the consumer, information about the consumer alone vs. information that implicates others such as photos of multiple people, comment threads)?
    • Who should be responsible for the security of personal data in transit between businesses? Should there be data security standards for transmitting personal data between businesses? Who should develop these standards?
    • How do companies verify the identity of the requesting consumer before transmitting their information to another company?
    • How can interoperability among services best be achieved? What are the costs of interoperability? Who should be responsible for achieving interoperability?
    • What lessons and best practices can be learned from the implementation of the data portability requirements in the GDPR and CCPA? Has the implementation of these requirements affected competition and, if so, in what ways?”
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The United States (U.S.) Department of Commerce tightened its chokehold on Huawei’s access to United States’ semiconductors and chipsets vital to its equipment and services. This rule follows a May rule that significantly closed off Huawei’s access, to the point that many analysts are projecting the People’s Republic of China (PRC) company will run out of these crucial technologies sometime next year without a suitable substitute, meaning the company may not be able to sell its smartphones and other leading products. In its press release, the department asserted the new rule “further restricts Huawei from obtaining foreign made chips developed or produced from U.S. software or technology to the same degree as comparable U.S. chips.”
    • Secretary of Commerce Wilbur Ross argued “Huawei and its foreign affiliates have extended their efforts to obtain advanced semiconductors developed or produced from U.S. software and technology in order to fulfill the policy objectives of the Chinese Communist Party.” He contended “[a]s we have restricted its access to U.S. technology, Huawei and its affiliates have worked through third parties to harness U.S. technology in a manner that undermines U.S. national security and foreign policy interests…[and] [t]his multi-pronged action demonstrates our continuing commitment to impede Huawei’s ability to do so.”
    • The Department of Commerce’s Bureau of Industry and Security (BIS) stated in the final rule that it is “making three sets of changes to controls for Huawei and its listed non-U.S. affiliates under the Export Administration Regulations (EAR):
      • First, BIS is adding additional non-U.S. affiliates of Huawei to the Entity List because they also pose a significant risk of involvement in activities contrary to the national security or foreign policy interests of the United States.
      • Second, this rule removes a temporary general license for Huawei and its non-U.S. affiliates and replaces those provisions with a more limited authorization that will better protect U.S. national security and foreign policy interests.
      • Third, in response to public comments, this final rule amends General Prohibition Three, also known as the foreign-produced direct product rule, to revise the control over certain foreign-produced items recently implemented by BIS.”
    • BIS claimed “[t]hese revisions promote U.S. national security by limiting access to, and use of, U.S. technology to design and produce items outside the United States by entities that pose a significant risk of involvement in activities contrary to the national security or foreign policy interests of the United States.”
    • One technology analyst claimed “[t]he U.S. moves represent a significant tightening of restrictions over Huawei’s ability to procure semiconductors…[and] [t]hat puts into significant jeopardy its ability to continue manufacturing smartphones and base stations, which are its core products.”
  • The Office of Management and Budget (OMB) and the Office of Science and Technology Policy (OSTP) have released their annual guidance to United States departments and agencies to direct their budget requests for FY 2022 with respect to research and development (R&D). OMB explained:
  • For FY2022, the five R&D budgetary priorities in this memorandum ensure that America remains at the global forefront of science and technology (S&T) discovery and innovation. The Industries of the Future (IotF) -artificial intelligence (AI), quantum information sciences (QIS), advanced communication networks/5G, advanced manufacturing, and biotechnology-remain the Administration’s top R&D priority. This includes fulfilling President Trump’s commitment to double non-defense AI and QIS funding by FY2022:
    • American Public Health Security and Innovation
    • American Leadership in the Industries of the Future and Related Technologies
    • American Security
    • American Energy and Environmental Leadership
    • American Space Leadership
  • In light of the significant health and economic disruption caused by the COVID-19 pandemic, the FY2022 memorandum includes a new R&D priority aimed at American Public Health Security and Innovation. This priority brings under a single, comprehensive umbrella biomedical and biotechnology R&D aimed at responding to the pandemic and ensuring the U.S. S&T enterprise is maximally prepared for any health-related threats.
  • Lastly, this memorandum also describes four high-priority crosscutting actions. These actions include research and related strategies that underpin the five R&D priorities and ensure departments and agencies deliver maximum return on investment to the American people:
    • Build the S&T Workforce of the Future
    • Optimize Research Environments and Results
    • Facilitate Multisector Partnerships and Technology Transfer
    • Leverage the Power of Data
  • Despite the Trump Administration touting its R&D priorities and achievements, the non-partisan Congressional Research Service noted:
    • President Trump’s budget request for FY2021 includes approximately $142.2 billion for research and development (R&D), $13.8 billion (8.8%) below the FY2020 enacted level of $156.0 billion. In constant FY2020 dollars, the President’s FY2021 R&D request would result in a decrease of $16.6 billion (10.6%) from the FY2020 level.
  • Two key chairs of subcommittees of the Senate Commerce, Science, and Transportation Committee are pressing the Federal Trade Commission (FTC) to investigate TikTok’s data collection and processing practices. This Committee has primary jurisdiction over the FTC in the Senate and is a key stakeholder on data and privacy issues.
    • In their letter, Consumer Protection Subcommittee Chair Jerry Moran (R-KS) and Communications, Technology, Innovation Chair John Thune (R-SD) explained they “are seeking specific answers from the FTC related to allegations from a Wall Street Journal article that described TikTok’s undisclosed collection and transmission of unique persistent identifiers from millions of U.S. consumers until November 2019…[that] also described questionable activity by the company as it relates to the transparency of these data collection activities, and the letter seeks clarity on these practices.”
    • Moran and Thune asserted “there are allegations that TikTok discreetly collected media access control (MAC) addresses, commonly used for advertisement targeting purposes, through Google Android’s operating system under an “unusual layer of encryption” through November 2019.” They said “[g]iven these reports and their potential relevancy to the “Executive Order on Addressing the Threat Posed by TikTok,” we urge the Federal Trade Commission (FTC) to investigate the company’s consumer data collection and processing practices as they relate to these accusations and other possible harmful activities posed to consumers.”
    • If the FTC were to investigate, find wrongdoing, and seek civil fines against TikTok, the next owner may be left to pay, as the sale the White House has ordered ByteDance to complete within three months will almost certainly be consummated before any FTC action is completed.
  • Massachusetts Attorney General Maura Healey (D) has established a “Data Privacy and Security Division within her office to protect consumers from the surge of threats to the privacy and security of their data in an ever-changing digital economy.” Healey has been one of the United States’ more active attorneys general on data privacy and technology issues, including her suit and settlement with Equifax for its massive data breach.
    • Her office explained:
      • The Data Privacy and Security Division investigates online threats and the unfair or deceptive collection, use, and disclosure of consumers’ personal data through digital technologies. The Division aims to empower consumers in the digital economy, ensure that companies are protecting consumers’ personal data from breach, protect equal and open access to the internet, and protect consumers from data-driven technologies that unlawfully deny them fair access to socioeconomic opportunities. The Division embodies AG Healey’s commitment to continue and grow on this critical work and ensure that data-driven technologies operate lawfully for the benefit of all consumers.
  • A California appeals court ruled that Amazon can be held liable for defective products third parties sell on its website. The appellate court reversed the trial court, which had held Amazon could not be liable.
    • The appeals court recited the facts of the case:
      • Plaintiff Angela Bolger bought a replacement laptop computer battery on Amazon, the popular online shopping website operated by defendant Amazon.com, LLC. The Amazon listing for the battery identified the seller as “E-Life,” a fictitious name used on Amazon by Lenoge Technology (HK) Ltd. (Lenoge). Amazon charged Bolger for the purchase, retrieved the laptop battery from its location in an Amazon warehouse, prepared the battery for shipment in Amazon-branded packaging, and sent it to Bolger. Bolger alleges the battery exploded several months later, and she suffered severe burns as a result.
      • Bolger sued Amazon and several other defendants, including Lenoge. She alleged causes of action for strict products liability, negligent products liability, breach of implied warranty, breach of express warranty, and “negligence/negligent undertaking.”
    • The appeals court continued:
      • Amazon moved for summary judgment. It primarily argued that the doctrine of strict products liability, as well as any similar tort theory, did not apply to it because it did not distribute, manufacture, or sell the product in question. It claimed its website was an “online marketplace” and E-Life (Lenoge) was the product seller, not Amazon. The trial court agreed, granted Amazon’s motion, and entered judgment accordingly.
      • Bolger appeals. She argues that Amazon is strictly liable for defective products offered on its website by third-party sellers like Lenoge. In the circumstances of this case, we agree.
  • The National Institute of Standards and Technology (NIST) issued Special Publication 800-207, “Zero Trust Architecture,” which posits a different conceptual model for an organization’s cybersecurity than perimeter security. NIST claimed:
    • Zero trust security models assume that an attacker is present in the environment and that an enterprise-owned environment is no different—or no more trustworthy—than any nonenterprise-owned environment. In this new paradigm, an enterprise must assume no implicit trust and continually analyze and evaluate the risks to its assets and business functions and then enact protections to mitigate these risks. In zero trust, these protections usually involve minimizing access to resources (such as data and compute resources and applications/services) to only those subjects and assets identified as needing access as well as continually authenticating and authorizing the identity and security posture of each access request.
    • A zero trust architecture (ZTA) is an enterprise cybersecurity architecture that is based on zero trust principles and designed to prevent data breaches and limit internal lateral movement. This publication discusses ZTA, its logical components, possible deployment scenarios, and threats. It also presents a general road map for organizations wishing to migrate to a zero trust design approach and discusses relevant federal policies that may impact or influence a zero trust architecture.
    • ZT is not a single architecture but a set of guiding principles for workflow, system design and operations that can be used to improve the security posture of any classification or sensitivity level [FIPS199]. Transitioning to ZTA is a journey concerning how an organization evaluates risk in its mission and cannot simply be accomplished with a wholesale replacement of technology. That said, many organizations already have elements of a ZTA in their enterprise infrastructure today. Organizations should seek to incrementally implement zero trust principles, process changes, and technology solutions that protect their data assets and business functions by use case. Most enterprise infrastructures will operate in a hybrid zero trust/perimeter-based mode while continuing to invest in IT modernization initiatives and improve organization business processes.
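NIST’s per-request model described above can be illustrated with a minimal sketch. This is illustrative only; the `AccessRequest` and `is_access_granted` names are hypothetical and not drawn from SP 800-207. The point is that every access request is checked on identity, device posture, and least-privilege resource need, with no implicit trust based on network location.

```python
# A minimal, hypothetical sketch of zero trust's per-request evaluation.
# Nothing is trusted implicitly: each request must pass every check,
# regardless of where on the network it originates.
from dataclasses import dataclass


@dataclass
class AccessRequest:
    subject_authenticated: bool   # identity verified for *this* request
    device_compliant: bool        # security posture of the requesting asset
    resource: str                 # the data or application being requested
    allowed_resources: frozenset  # resources this subject actually needs


def is_access_granted(req: AccessRequest) -> bool:
    """Grant access only when identity, device posture, and
    least-privilege checks all pass for this specific request."""
    return (
        req.subject_authenticated
        and req.device_compliant
        and req.resource in req.allowed_resources  # least privilege
    )


req = AccessRequest(True, True, "payroll-db", frozenset({"payroll-db"}))
print(is_access_granted(req))  # True: all three checks pass
```

A perimeter model would instead grant the request simply because it originated inside the network; here, a non-compliant device or an out-of-scope resource fails the request even from "inside."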
  • The United Kingdom’s Government Communications Headquarters’ (GCHQ) National Cyber Security Centre (NCSC) released “Cyber insurance guidance” “for organisations of all sizes who are considering purchasing cyber insurance…not intended to be a comprehensive cyber insurance buyers guide, but instead focuses on the cyber security aspects of cyber insurance.” The NCSC stated “[i]f you are considering cyber insurance, these questions can be used to frame your discussions…[and] [t]his guidance focuses on standalone cyber insurance policies, but many of these questions may be relevant to cyber insurance where it is included in other policies.”

Further Reading

  • “I downloaded Covidwise, America’s first Bluetooth exposure-notification app. You should, too.” By Geoffrey Fowler – The Washington Post. The paper’s technology columnist blesses the Apple/Google Bluetooth exposure app and claims it protects privacy. One person on Twitter pointed out the Android version will not work unless location services are turned on, which is contrary to the claims made by Google and Apple, an issue the New York Times investigated last month. A number of European nations have pressed Google to remove this feature, and a Google spokesperson claimed the Android Bluetooth tracing capability did not use location services, raising the question of why the prompt appears. Moreover, one of the apps Fowler names has had its own privacy issues as detailed by The Washington Post in May. As it turns out Care19, a contact tracing app developed when the governor of North Dakota asked a friend who had designed an app for football fans to meet up, is violating its own privacy policy according to Jumbo, the maker of privacy software. Apparently, Care19 shares location and personal data with FourSquare when used on iPhones. Both Apple and state officials are at a loss to explain how this went unnoticed when the app was scrubbed for technical and privacy problems before being rolled out.
  • “Truss leads China hawks trying to derail TikTok’s London HQ plan” By Dan Sabbagh – The Guardian. ByteDance’s plan to establish a headquarters in London is now under attack by members of the ruling Conservative party for the company’s alleged role in persecuting the Uighur minority in Xinjiang. ByteDance has been eager to move to London and also eager to avoid the treatment that another tech company from the People’s Republic of China has gotten in the United Kingdom (UK): Huawei. Nonetheless, this decision may turn political as the government’s reversal on Huawei and 5G did. Incidentally, if Microsoft does buy part of TikTok, it would be buying operations in four of the five Five Eyes nations but not the UK.
  • “Human Rights Commission warns government over ‘dangerous’ use of AI” By Fergus Hunter – The Sydney Morning Herald. A cautionary tale regarding the use of artificial intelligence and algorithms in government decision-making. While this article nominally pertains to Australia’s Human Rights Commission advice to the country’s government, it is based, in large part, on a scandal in which an automated process illegally collected $721 million AUD from welfare beneficiaries. In the view of the Human Rights Commission, decision-making by humans is still preferable and more accurate than automated means.
  • “The Attack That Broke Twitter Is Hitting Dozens of Companies” By Andy Greenberg – WIRED. In the never-ending permutations of hacking, the past has become the present because the Twitter hackers used phone calls to talk their way into gaining access to a number of high-profile accounts (aka phone spear phishing). Other companies are suffering the same onslaught, proving the axiom that people may be the weakest link in cybersecurity. However, the phone calls are based on exacting research and preparation as hackers scour the internet for information on their targets and the companies themselves. A similar hack was reportedly executed by the Democratic People’s Republic of Korea (DPRK) against Israeli defense firms.
  • “Miami Police Used Facial Recognition Technology in Protester’s Arrest” By Connie Fossi and Phil Prazan – NBC Miami. The Miami Police Department used Clearview AI to identify a protester who allegedly injured an officer but did not divulge this fact to the accused or her attorney. The department’s policy on facial recognition technology bars officers from making arrests solely on the basis of identification through such a system. Given the error rates many facial recognition systems have experienced with identifying minorities and the use of masks during the pandemic, which further decreases accuracy, it is quite likely people will be wrongfully accused and convicted using this technology.
  • “Big Tech’s Domination of Business Reaches New Heights” By Peter Eavis and Steve Lohr – The New York Times. Big tech has gotten larger, more powerful, and more indispensable in the United States (U.S.) during the pandemic, and one needs to go back to the railroads in the late 19th Century to find comparable companies. It is an open question whether their size and influence will change much no matter who is president of the U.S. next year.
  • “License plate tracking for police set to go nationwide” By Alfred Ng – c/net. A de facto national license plate reader may soon be activated in the United States (U.S.). Flock Safety unveiled the “Total Analytics Law Officers Network” (TALON), which will link its systems of cameras in more than 700 cities, allowing police departments to track cars across multiple jurisdictions. As the U.S. has no national laws regulating the use of this and other similar technologies, private companies may set policy for the country in the short term.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Further Reading, Other Developments, and Coming Events (17 August)

Here are Coming Events, Other Developments, and Further Reading.

Coming Events

  • On 18 August, the National Institute of Standards and Technology (NIST) will host the “Bias in AI Workshop, a virtual event to develop a shared understanding of bias in AI, what it is, and how to measure it.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” The FTC is accepting comments until 21 August and “is seeking comment on a range of issues including:
    • How are companies currently implementing data portability? What are the different contexts in which data portability has been implemented?
    • What have been the benefits and costs of data portability? What are the benefits and costs of achieving data portability through regulation?
    • To what extent has data portability increased or decreased competition?
    • Are there research studies, surveys, or other information on the impact of data portability on consumer autonomy and trust?
    • Does data portability work better in some contexts than others (e.g., banking, health, social media)? Does it work better for particular types of information over others (e.g., information the consumer provides to the business vs. all information the business has about the consumer, information about the consumer alone vs. information that implicates others such as photos of multiple people, comment threads)?
    • Who should be responsible for the security of personal data in transit between businesses? Should there be data security standards for transmitting personal data between businesses? Who should develop these standards?
    • How do companies verify the identity of the requesting consumer before transmitting their information to another company?
    • How can interoperability among services best be achieved? What are the costs of interoperability? Who should be responsible for achieving interoperability?
    • What lessons and best practices can be learned from the implementation of the data portability requirements in the GDPR and CCPA? Has the implementation of these requirements affected competition and, if so, in what ways?”
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • On 14 August, the California Office of Administrative Law (OAL) approved the Attorney General’s proposed final regulations to implement the California Consumer Privacy Act (CCPA) (A.B.375) and they took effect that day. The Office of the Attorney General (OAG) had requested expedited review so that the regulations could become effective on 1 July as required by the CCPA. With respect to the substance, the final regulations are very similar to the third round of regulations circulated for comment in March, which responded, in part, to legislation passed and signed into law last fall that modified the CCPA.
    • The OAL released an Addendum to the Final Statement of Reasons and explained
      • In addition to withdrawing certain provisions for additional consideration, the OAG has made the following non-substantive changes for accuracy, consistency, and clarity. Changes to the original text of a regulation are non-substantive if they clarify without materially altering the requirements, rights, responsibilities, conditions, or prescriptions contained in the original text.
    • For further reading on the third round of proposed CCPA regulations, see this issue of the Technology Policy Update, for the second round, see here, and for the first round, see here. Additionally, to read more on the legislation signed into law last fall, modifying the CCPA, see this issue.
    • Additionally, Californians for Consumer Privacy have succeeded in placing the “California Privacy Rights Act” (CPRA) on the November 2020 ballot. This follow-on statute to the CCPA could again force the legislature into making a deal that would revamp privacy laws in California as happened when the CCPA was added to the ballot in 2018. It is also possible this statute remains on the ballot and is added to California’s laws. In either case, much of the CCPA and its regulations may be moot or in effect for only the few years it takes for a new privacy regulatory structure to be established as laid out in the CPRA. See here for more detail.
  • In a proposed rule issued for comment, the Federal Communications Commission (FCC) explained it is taking “further steps to protect the nation’s communications networks from potential security threats as the [FCC] integrates provisions of the recently enacted Secure and Trusted Communications Networks Act of 2019 (Secure Networks Act) (P.L. 116-124) into its existing supply chain rulemaking proceeding….[and] seeks comment on proposals to implement further Congressional direction in the Secure Networks Act.” Comments are due by 31 August.
    • The FCC explained
      • The concurrently adopted Declaratory Ruling finds that the 2019 Supply Chain Order, 85 FR 230, January 3, 2020, satisfies the Secure Networks Act’s requirement that the Commission prohibit the use of funds for covered equipment and services. The Commission now seeks comment on sections 2, 3, 5, and 7 of the Secure Networks Act, including on how these provisions interact with our ongoing efforts to secure the communications supply chain. As required by section 2, the Commission proposes several processes by which to publish a list of covered communications equipment and services. Consistent with sections 3, 5, and 7 of the Secure Networks Act, the Commission proposes to (1) ban the use of federal subsidies for any equipment or services on the new list of covered communications equipment and services; (2) require that all providers of advanced communications service report whether they use any covered communications equipment and services; and (3) establish regulations to prevent waste, fraud, and abuse in the proposed reimbursement program to remove, replace, and dispose of insecure equipment.
    • The agency added
      • The Commission also initially designated Huawei Technologies Company (Huawei) and ZTE Corporation (ZTE) as covered companies for purposes of this rule, and it established a process for designating additional covered companies in the future. Additionally, last month, the Commission’s Public Safety and Homeland Security Bureau issued final designations of Huawei and ZTE as covered companies, thereby prohibiting the use of USF funds on equipment or services produced or provided by these two suppliers.
      • The Commission takes further steps to protect the nation’s communications networks from potential security threats as it integrates provisions of the recently enacted Secure Networks Act into the Commission’s existing supply chain rulemaking proceeding. The Commission seeks comment on proposals to implement further Congressional direction in the Secure Networks Act.
  • The White House’s Office of Science & Technology Policy (OSTP) released a request for information (RFI) “[o]n behalf of the National Science and Technology Council’s (NSTC) Subcommittee on Resilience Science and Technology (SRST), OSTP requests input from all interested parties on the development of a National Research and Development Plan for Positioning, Navigation, and Timing (PNT) Resilience.” OSTP stated “[t]he plan will focus on the research and development (R&D) and pilot testing needed to develop additional PNT systems and services that are resilient to interference and manipulation and that are not dependent upon global navigation satellite systems (GNSS)…[and] will also include approaches to integrate and use multiple PNT services for enhancing resilience. The input received on these topics will assist the Subcommittee in developing recommendations for prioritization of R&D activities.”
    • Executive Order 13905, Strengthening National Resilience Through Responsible Use of Positioning, Navigation, and Timing Services, was issued on February 12, 2020, and President Donald Trump explained the policy basis for the initiative:
      • It is the policy of the United States to ensure that disruption or manipulation of PNT services does not undermine the reliable and efficient functioning of its critical infrastructure. The Federal Government must increase the Nation’s awareness of the extent to which critical infrastructure depends on, or is enhanced by, PNT services, and it must ensure critical infrastructure can withstand disruption or manipulation of PNT services. To this end, the Federal Government shall engage the public and private sectors to identify and promote the responsible use of PNT services.
    • In terms of future steps under the EO, the President directed the following:
      • The Departments of Defense, Transportation, and Homeland Security must use the PNT profiles in updates to the Federal Radionavigation Plan.
      • The Department of Homeland Security must “develop a plan to test the vulnerabilities of critical infrastructure systems, networks, and assets in the event of disruption and manipulation of PNT services. The results of the tests carried out under that plan shall be used to inform updates to the PNT profiles…”
      • The heads of Sector-Specific Agencies (SSAs) and the heads of other executive departments and agencies (agencies) coordinating with the Department of Homeland Security, must “develop contractual language for inclusion of the relevant information from the PNT profiles in the requirements for Federal contracts for products, systems, and services that integrate or utilize PNT services, with the goal of encouraging the private sector to use additional PNT services and develop new robust and secure PNT services. The heads of SSAs and the heads of other agencies, as appropriate, shall update the requirements as necessary.”
      • The Federal Acquisition Regulatory Council, in consultation with the heads of SSAs and the heads of other agencies, as appropriate, shall incorporate the [contractual language] into Federal contracts for products, systems, and services that integrate or use PNT services.
      • The Office of Science and Technology Policy (OSTP) must “coordinate the development of a national plan, which shall be informed by existing initiatives, for the R&D and pilot testing of additional, robust, and secure PNT services that are not dependent on global navigation satellite systems (GNSS).”
  • An ideologically diverse bipartisan group of Senators wrote the official at the United States Department of Justice in charge of the antitrust division and the chair of the Federal Trade Commission (FTC) “regarding allegations of potentially anticompetitive practices and conduct by online platforms toward content creators and emerging competitors….[that] stemmed from a recent Wall Street Journal report that Alphabet Inc., the parent company of Google and YouTube, has designed Google Search to specifically give preference to YouTube and other Google-owned video service providers.”
    • The Members asserted
      • There is no public insight into how Google designs its algorithms, which seem to deliver up preferential search results for YouTube and other Google video products ahead of other competitive services. While a company favoring its own products, in and of itself, may not always constitute illegal anticompetitive conduct, the Journal further reports that a significant motivation behind this action was to “give YouTube more leverage in business deals with content providers seeking traffic for their videos….” This exact conduct was the topic of a Senate Antitrust Subcommittee hearing led by Senators Lee and Klobuchar in March this year.
    • Senators Thom Tillis (R-NC), Mike Lee (R-UT), Amy Klobuchar (D-MN), Richard Blumenthal (D-CT), Marsha Blackburn (R-TN), Josh Hawley (R-MO), Elizabeth Warren (D-MA), Mazie Hirono (D-HI), Cory Booker (D-NJ) and Ted Cruz (R-TX) signed the letter.
  • The National Security Agency (NSA) and the Federal Bureau of Investigation (FBI) released a “Cybersecurity Advisory [and a fact sheet and FAQ] about previously undisclosed Russian malware” “called Drovorub, designed for Linux systems as part of its cyber espionage operations.” The NSA and FBI asserted “[t]he Russian General Staff Main Intelligence Directorate (GRU) 85th Main Special Service Center (GTsSS) military unit 26165” developed and deployed the malware. The NSA and FBI stated the GRU and GTsSS are “sometimes publicly associated with APT28, Fancy Bear, Strontium, and a variety of other identities as tracked by the private sector.”
    • The agencies contended
      • Drovorub represents a threat to National Security Systems, Department of Defense, and Defense Industrial Base customers that use Linux systems. Network defenders and system administrators can find detection strategies, mitigation techniques, and configuration recommendations in the advisory to reduce the risk of compromise.
  • The United States Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) published Cybersecurity Best Practices for Operating Commercial Unmanned Aircraft Systems (UAS), “a companion piece to CISA’s Foreign Manufactured UASs Industry Alert,…[to] assist in standing up a new UAS program or securing an existing UAS program, and is intended for information technology managers and personnel involved in UAS operations.” CISA cautioned that “[s]imilar to other cybersecurity guidelines and best practices, the identified best practices can aid critical infrastructure operators to lower the cybersecurity risks associated with the use of UAS, but do not eliminate all risk.”
  • The United States Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) released the “Identity, Credential, and Access Management (ICAM) Value Proposition Suite of documents in collaboration with SAFECOM and the National Council of Statewide Interoperability Coordinators (NCSWIC), Office of the Director of National Intelligence (ODNI), and Georgia Tech Research Institute (GTRI)…[that] introduce[] ICAM concepts, explores federated ICAM use-cases, and highlights the potential benefits for the public safety community:”
    • ICAM Value Proposition Overview
      • This document provides a high-level summary of federated ICAM benefits and introduces domain-specific scenarios covered by other documents in the suite.
    • ICAM Value Proposition Scenario: Drug Response
      • This document outlines federated ICAM use cases and information sharing benefits for large-scale drug overdose epidemic (e.g., opioid, methamphetamine, and cocaine) prevention and response.

Further Reading

  • “Trump’s Labor Chief Accused of Intervening in Oracle Pay Bias Case” By Noam Scheiber, David McCabe and Maggie Haberman – The New York Times. In the sort of conduct that is apparently the norm across the Trump Administration, there are allegations that the Secretary of Labor intervened in departmental litigation to help a large technology firm aligned with President Donald Trump. Starting in the Obama Administration and continuing into the Trump Administration, software and database giant Oracle was investigated, accused, and sued for paying non-white, non-male employees significantly less in violation of federal and state law. Estimates of Oracle’s liability ranged between $300-800 million, and litigators in the Department of Labor were seeking $400 million and had taken the case to trial. Secretary Eugene Scalia purportedly stepped in and lowered the dollar amount to $40 million, and the head litigator is being offered a transfer from Los Angeles to Chicago to a division in which she has no experience. Oracle’s CEO Safra Catz and Chair Larry Ellison have both supported the President more enthusiastically than most other technology company heads, and did so earlier.
  • Pentagon wins brief waiver from government’s Huawei ban” By Joe Gould – Defense News. A Washington D.C. trade publication is reporting the Trump Administration is using flexibility granted by Congress to delay the ban on contractors using Huawei, ZTE, and other People’s Republic of China (PRC) technology for the Department of Defense. Director of National Intelligence John Ratcliffe granted the waiver at the request of Under Secretary of Defense for Acquisition and Sustainment Ellen Lord, claiming:
    • You stated that DOD’s statutory requirement to provide for the military forces needed to deter war and protect the security of our country is critically important to national security. Therefore, the procurement of goods and services in support of DOD’s statutory mission is also in the national security interests of the United States.
    • Section 889 of the “John S. McCain National Defense Authorization Act (NDAA) for FY 2019” (P.L. 115-232) requires agencies to remove this equipment and these systems and also not to contract with private sector entities that use such equipment and services. It is the second part of the ban from which the DOD and its contractors are getting a reprieve; an interim rule putting that ban in place was issued last month.
  • “DOD’s IT supply chain has dozens of suppliers from China, report finds” By Jackson Barnett – fedscoop. A data analytics firm, Govini, analyzed a sample of prime contracts at the Department of Defense (DOD) and found a surge in the presence of firms from the People’s Republic of China (PRC) in the supply chains in the software and information technology (IT) sectors. This study has obvious relevance to the previous article on banning PRC equipment and services in DOD supply chains.
  • “Facebook algorithm found to ‘actively promote’ Holocaust denial” by Mark Townsend – The Guardian. A British counter-hate organization, the Institute for Strategic Dialogue (ISD), found that Facebook’s algorithms lead people searching for the Holocaust to denial sites and posts. The organization found the same problem on Reddit, Twitter, and YouTube, too. ISD claimed:
    • Our findings show that the actions taken by platforms can effectively reduce the volume and visibility of this type of antisemitic content. These companies therefore need to ask themselves what type of platform they would like to be: one that earns money by allowing Holocaust denial to flourish, or one that takes a principled stand against it.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Foundry Co from Pixabay

Further Reading, Other Developments, and Coming Events (15 August)

Here are Further Reading, Other Developments, and Coming Events.

Coming Events

  • On 18 August, the National Institute of Standards and Technology (NIST) will host the “Bias in AI Workshop, a virtual event to develop a shared understanding of bias in AI, what it is, and how to measure it.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” Until 21 August, the FTC “is seeking comment on a range of issues including:
    • How are companies currently implementing data portability? What are the different contexts in which data portability has been implemented?
    • What have been the benefits and costs of data portability? What are the benefits and costs of achieving data portability through regulation?
    • To what extent has data portability increased or decreased competition?
    • Are there research studies, surveys, or other information on the impact of data portability on consumer autonomy and trust?
    • Does data portability work better in some contexts than others (e.g., banking, health, social media)? Does it work better for particular types of information over others (e.g., information the consumer provides to the business vs. all information the business has about the consumer, information about the consumer alone vs. information that implicates others such as photos of multiple people, comment threads)?
    • Who should be responsible for the security of personal data in transit between businesses? Should there be data security standards for transmitting personal data between businesses? Who should develop these standards?
    • How do companies verify the identity of the requesting consumer before transmitting their information to another company?
    • How can interoperability among services best be achieved? What are the costs of interoperability? Who should be responsible for achieving interoperability?
    • What lessons and best practices can be learned from the implementation of the data portability requirements in the GDPR and CCPA? Has the implementation of these requirements affected competition and, if so, in what ways?”
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The Global Engagement Center (GEC) at the U.S. Department of State published the “GEC Special Report: Pillars of Russia’s Disinformation and Propaganda Ecosystem.” The GEC drew “on publicly available reporting to provide an overview of Russia’s disinformation and propaganda ecosystem.” The GEC identified the five pillars of Russia’s Disinformation and Propaganda Ecosystem:
    • official government communications;
    • state-funded global messaging;
    • cultivation of proxy sources;
    • weaponization of social media; and
    • cyber-enabled disinformation.
    • The GEC stated
      • This report provides a visual representation of the ecosystem described above, as well as an example of the media multiplier effect it enables. This serves to demonstrate how the different pillars of the ecosystem play distinct roles and feed off of and bolster each other. The report also includes brief profiles of select proxy sites and organizations that occupy an intermediate role between the pillars of the ecosystem with clear links to Russia and those that are meant to be fully deniable. The emphasis on these proxy sites is meant to highlight the important role they play, which can be overlooked given the attention paid to official Russian voices on one end of the spectrum, and the social media manipulation and cyber-enabled threats on the other.
  • The United States (U.S.) Department of Veterans Affairs (VA) has restarted its process for rolling out its new electronic health record (EHR) and announced it has “revised its previous schedule to convert facilities to its new EHR capabilities with updated timelines for deployments in August in Columbus, Ohio, and October in Spokane, Washington.” The VA opted to replace its Veterans Health Information Systems and Technology Architecture (VistA) with a commercial off-the-shelf system the U.S. Department of Defense has chosen, Cerner Millennium. However, this $16 billion acquisition has encountered numerous difficulties and delays, which has caught the continued attention of Congress.
    • The VA claimed “The new timeline will preserve the 10-year implementation schedule and the overall cost estimates of VA’s EHR modernization program…[and] [a]fter the conversion at these sites, VA will bring other select facilities forward in the timeline.”
    • In June 2020, the U.S. Government Accountability Office (GAO) found:
      • VA met its schedule for making the needed system configuration decisions that would enable the department to implement its new EHR system at the first VA medical facility, which was planned for July 2020. In addition, VA has formulated a schedule for making the remaining EHR system configuration decisions before implementing the system at additional facilities planned for fall 2020.
      • VA’s Electronic Health Record Modernization (EHRM) program was generally effective in establishing decision-making procedures that were consistent with applicable federal standards for internal control. However, VA did not always ensure the involvement of relevant stakeholders, including medical facility clinicians and staff, in the system configuration decisions. Specifically, VA did not always clarify terminology and include adequate detail in descriptions of local workshop sessions to medical facility clinicians and staff to ensure relevant representation at local workshop meetings. Participation of such stakeholders is critical to ensuring that the EHR system is configured to meet the needs of clinicians and support the delivery of clinical care.
  • The United States (U.S.) Government Accountability Office (GAO) studied and reported on privacy and accuracy issues related to the use of facial recognition technology at the request of the chairs of the House Judiciary and Oversight and Reform Committees. This report updates a 2015 report on the same issues and renews the agency’s call first made in 2013 that Congress “strengthen[] the current consumer privacy framework to reflect the effects of changes in technology and the marketplace—particularly in relation to consumer data used for marketing purposes—while also ensuring that any limitations on data collection and sharing do not unduly inhibit the economic and other benefits to industry and consumers that data sharing can accord.”
    • In the new report, the GAO explained that “[s]takeholders we interviewed identified additional activities that companies could [undertake to] improve the use of facial recognition technology. These activities include
      • defining the purpose for the technology’s use and clearly notifying consumers how companies are using the technology—such as surveillance or marketing;
      • identifying risks and limitations associated with using the technology and prohibiting certain uses (e.g., those with discriminatory purposes); and
      • providing guidance or training related to these issues.
    • The GAO asserted
      • However, these voluntary privacy frameworks and suggested activities that could help address privacy concerns or improve the use of facial recognition technology are not mandatory. Furthermore, as discussed earlier, in most contexts facial recognition technology is not currently covered by federal privacy law. Accordingly, we reiterate our 2013 suggestion that Congress strengthen the current consumer privacy framework to reflect the effects of changes in technology and the marketplace.
  • The United States Department of Justice (DOJ) “announced the dismantling of three terrorist financing cyber-enabled campaigns, involving the al-Qassam Brigades, Hamas’s military wing, al-Qaeda, and Islamic State of Iraq and the Levant (ISIS)…the government’s largest-ever seizure of cryptocurrency in the terrorism context.”
    • The DOJ claimed
      • These three terror finance campaigns all relied on sophisticated cyber-tools, including the solicitation of cryptocurrency donations from around the world.  The action demonstrates how different terrorist groups have similarly adapted their terror finance activities to the cyber age.  Each group used cryptocurrency and social media to garner attention and raise funds for their terror campaigns.  Pursuant to judicially-authorized warrants, U.S. authorities seized millions of dollars, over 300 cryptocurrency accounts, four websites, and four Facebook pages all related to the criminal enterprise.
  • The United States (U.S.) National Counterintelligence and Security Center (NCSC) revealed it “has been providing classified briefings and other assistance to federal procurement executives, chief information officers and chief information security officers from across the U.S. Government on supply chain threats and risks stemming from contracting with five Chinese companies.” The NCSC explained the “supply chain security briefings are designed to assist federal agencies implement” Section 889 of the “John S. McCain National Defense Authorization Act (NDAA) for FY 2019” (P.L. 115-232).
    • The NCSC stated:
      • One provision of the NDAA prohibits the U.S. Government from directly using goods and services from five specified Chinese companies — Huawei, ZTE Corporation, Hytera Communications, Hanghzou Hikvision and Dahua Technology Company.
      • Another, broader, provision of Section 889 prohibits federal agencies from contracting with any company that uses goods and services from these five Chinese firms. This particular prohibition takes effect on August 13, 2020, unless a federal agency authorizes a waiver for a specific company, which can only be granted by the agency head after receiving NCSC supply chain security guidance.
  • The Federal Communications Commission (FCC) denied two petitions to stay an April 2020 rulemaking that would make the 6Ghz band of spectrum available to users other than the incumbents. The FCC noted “[t]wo parties—Edison Electric Institute (EEI) and Association of Public-Safety Communications Officials-International, Inc. (APCO)—petitioned to stay the Order:”
    • EEI, a trade association representing investor-owned electric utilities, seeks only to stay the effectiveness of the rules that apply to low-power indoor devices. 
    • APCO, a non-profit association of persons who manage and operate public-safety communications systems, seeks to stay the rules for both standard-power and low-power indoor operations.
    • In the rule and order, the FCC explained
      • We authorize two different types of unlicensed operations—standard-power and indoor low-power operations. We authorize standard-power access points using an automated frequency coordination (AFC) system. These access points can be deployed anywhere as part of hotspot networks, rural broadband deployments, or network capacity upgrades where needed. We also authorize indoor low-power access points across the entire 6 GHz band. These access points will be ideal for connecting devices in homes and businesses such [as] smartphones, tablet devices, laptops, and Internet-of-things (IoT) devices to the Internet. As has occurred with Wi-Fi in the 2.4 GHz and 5 GHz bands, we expect that 6 GHz unlicensed devices will become a part of most peoples’ everyday lives. The rules we are adopting will also play a role in the growth of the IoT; connecting appliances, machines, meters, wearables, and other consumer electronics as well as industrial sensors for manufacturing.
  • In a speech, the Australian Competition and Consumer Commission (ACCC) Chair Rod Sims laid out the status of his agency’s actions against Google, Facebook, and other large technology platforms flowing from its final report in its “Digital Platforms Inquiry” that “proposes specific recommendations aimed at addressing some of the actual and potential negative impacts of digital platforms in the media and advertising markets, and also more broadly on consumers,” including:
    • The ACCC recently launched an action against Google regarding misleading representations it made to consumers to obtain their consent to expand the scope of personal information it collected and used about its users’ online activities.
    • In another case, which we brought against Google last year, we allege that Google misled consumers into sharing location data with Google. We contend Google did not clearly inform consumers using Android mobile devices that a particular account setting allowed Google to collect location data. We assert that many consumers may have unknowingly provided more of their personal location data to Google than they intended. Google then used consumers’ location data to enhance the value of its advertising services to prospective advertisers. This case is currently in Court with a hearing scheduled in late November.
    • Currently the ACCC is considering the acquisition by Google and Facebook of Fitbit and Giphy, respectively. We are considering questions such as whether they have the ability to give themselves advantages by favouring their own products, or whether these acquisitions are raising barriers to entry for other competitors.
    • In April 2020 the Federal Government directed the ACCC to develop a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms. We recently published the draft legislation for the code.
  • A British appeals court overturned a decision that had found a police force’s use of facial recognition technology in a pilot program utilizing live footage to be legal. The appeals court found the use of this technology by the South Wales Police Force a violation of “the right to respect for private life under Article 8 of the European Convention on Human Rights, data protection legislation, and the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010.”

Further Reading

  • “North Korean Hacking Group Attacks Israeli Defense Industry” by Ronen Bergman and Nicole Perlroth – The New York Times. Israel is denying the claims of a cybersecurity firm that hackers from the Democratic People’s Republic of Korea (DPRK) deeply penetrated its defense industry. Through the use of sophisticated phishing, including fake LinkedIn accounts and fluent English speakers, employees at Israeli defense companies were tricked into installing spyware on their personal computers, and then the hackers allegedly eventually accessed classified Israeli networks. The attacks show growing sophistication from DPRK hackers and that those looking to penetrate networks will always seek out weak spots.
  • “Pentagon Requests More Time to Review JEDI Cloud Contract Bids” by Frank Konkel – Nextgov. The United States Department of Defense (DOD) has asked for yet more time to resolve who will win the second round of the Joint Enterprise Defense Infrastructure (JEDI) cloud contract that may prove worth more than $10 billion to the winner. The Pentagon had told the court it was on schedule to make an award on the rebid of the contract that Microsoft had won over Amazon. The latter claimed political interference from the White House violated federal contract law, among other claims, resulting in this lawsuit.
  • “Google rival’s study urges letting mobile users pick search defaults” by Ashley Gold – Axios. DuckDuckGo, a search engine, claims in newly released research that permitting Android users to choose their search engine would decrease Google’s market share by 20%. This could be relevant to the United States (U.S.) Department of Justice’s (DOJ) antitrust investigation. As a point of reference, in the U.S., the United Kingdom, and Australia, Google’s share of the mobile search engine market is 95%, 98% and 98%, respectively. DOJ may seriously look at this remedy as the European Commission (EC) imposed this as part of its antitrust case against Google, resulting in a record €4.34 billion fine.
  • “Facial Recognition Start-Up Mounts a First Amendment Defense” By Kashmir Hill – The New York Times. Clearview AI has retained legendary First Amendment lawyer Floyd Abrams to make the argument that its collection, use, and dissemination of publicly available photos scraped from the internet is protected as free speech. Abrams is quoted as saying that while privacy is, of course, an important right, the First Amendment to the United States Constitution would trump any such rights. It is expected that this argument will be employed in the myriad suits against the facial recognition technology firm.
  • “An advanced group specializing in corporate espionage is on a hacking spree” By Jeff Stone – cyberscoop. A new hacking group, RedCurl, has gone on a worldwide hacking campaign that broke into businesses in the United Kingdom, Canada, and other places. The hackers phished a number of businesses successfully by impersonating someone from human resources in the organization.


Further Reading, Other Developments, and Coming Events (13 August)

Here are Further Reading, Other Developments, and Coming Events:

Coming Events

  • On 18 August, the National Institute of Standards and Technology (NIST) will host the “Bias in AI Workshop, a virtual event to develop a shared understanding of bias in AI, what it is, and how to measure it.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.

Other Developments

  • Senate Intelligence Committee Acting Chair Marco Rubio (R-FL) and Vice Chairman Mark Warner (D-VA) released a statement indicating the committee had voted to adopt the fifth and final volume of its investigation of the Russian Federation’s interference in the 2016 election. The committee had submitted the report to the Intelligence Community for vetting and has received the report back with edits and redactions. The report could be released sometime over the next few weeks. Rubio and Warner stated “the Senate Intelligence Committee voted to adopt the classified version of the final volume of the Committee’s bipartisan Russia investigation. In the coming days, the Committee will work to incorporate any additional views, as well as work with the Intelligence Community to formalize a properly redacted, declassified, publicly releasable version of the Volume 5 report.” The Senate Intelligence Committee has released four previous reports:
  • The National Institute of Standards and Technology (NIST) is accepting comments until 11 September on draft Special Publication 800-53B, “Control Baselines for Information Systems and Organizations,” a guidance document that will serve a key role in the United States government’s efforts to secure and protect the networks and systems it operates and those run by federal contractors. NIST explained:
    • This publication establishes security and privacy control baselines for federal information systems and organizations and provides tailoring guidance for those baselines. The use of the security control baselines is mandatory, in accordance with OMB Circular A-130 [OMB A-130] and the provisions of the Federal Information Security Modernization Act [FISMA], which requires the implementation of a set of minimum controls to protect federal information and information systems. Whereas use of the privacy control baseline is not mandated by law or [OMB A-130], SP 800-53B, along with other supporting NIST publications, is designed to help organizations identify the security and privacy controls needed to manage risk and satisfy the security and privacy requirements in FISMA, the Privacy Act of 1974 [PRIVACT], selected OMB policies (e.g., [OMB A-130]), and designated Federal Information Processing Standards (FIPS), among others.
  • The United States Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) released an “Election Vulnerability Reporting Guide” to provide “election administrators with a step-by-step guide, list of resources, and a template for establishing a successful vulnerability disclosure program to address possible vulnerabilities in their state and local election systems…[and] [t]he six steps include:
    • Step 1: Identify Systems Where You Would Accept Security Testing, and those Off-Limits
    • Step 2: Draft an Easy-to-Read Vulnerability Disclosure Policy (See Appendix III)
    • Step 3: Establish a Way to Receive Reports/Conduct Follow-On Communication
    • Step 4: Assign Someone to Thank and Communicate with Researchers
    • Step 5: Assign Someone to Vet and Fix the Vulnerabilities
    • Step 6: Consider Sharing Information with Other Affected Parties
  • The United Kingdom’s Information Commissioner’s Office (ICO) has issued “Guidance on AI and data protection” that “clarifies how you can assess the risks to rights and freedoms that AI can pose from a data protection perspective; and the appropriate measures you can implement to mitigate them.” The ICO explained “[w]hile data protection and ‘AI ethics’ overlap, this guidance does not provide generic ethical or design principles for your use of AI.” The ICO stated “[i]t corresponds to data protection principles, and is structured as follows:
    • part one addresses accountability and governance in AI, including data protection impact assessments (DPIAs);
    • part two covers fair, lawful and transparent processing, including lawful bases, assessing and improving AI system performance, and mitigating potential discrimination;
    • part three addresses data minimisation and security; and
    • part four covers compliance with individual rights, including rights related to automated decision-making.
  • 20 state attorneys general wrote Facebook Chief Executive Officer Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg “to request that you take additional steps to prevent Facebook from being used to spread disinformation and hate and to facilitate discrimination.” They also asked “that you take more steps to provide redress for users who fall victim to intimidation and harassment, including violence and digital abuse.” The attorneys general said that “[b]ased on our collective experience, we believe that Facebook should take additional actions including the following steps—many of which are highlighted in Facebook’s recent Civil Rights Audit—to strengthen its commitment to civil rights and fighting disinformation and discrimination:
    • Aggressively enforce Facebook policies against hate speech and organized hate organizations: Although Facebook has developed policies against hate speech and organizations that peddle it, we remain concerned that Facebook’s policies on Dangerous Individuals and Organizations, including but not limited to its policies on white nationalist and white supremacist content, are not enforced quickly and comprehensively enough. Content that violates Facebook’s own policies too often escapes removal just because it comes as coded language, rather than specific magic words. And even where Facebook takes steps to address a particular violation, it often fails to proactively address the follow-on actions by replacement or splinter groups that quickly emerge.
    • Allow public, third-party audits of hate content and enforcement: To gauge the ongoing progress of Facebook’s enforcement efforts, independent experts should be permitted access to the data necessary to conduct regular, transparent third-party audits of hate and hate-related misinformation on the platform, including any information made available to the Global Oversight Board. As part of this effort, Facebook should capture data on the prevalence of different forms of hate content on the platform, whether or not covered by Facebook’s own community standards, thus allowing the public to determine whether enforcement of anti-hate policies differs based on the type of hate content at issue.
    • Commit to an ongoing, independent analysis of Facebook’s content population scheme and the prompt development of best practices guidance: By funneling users toward particular types of content, Facebook’s content population scheme, including its algorithms, can push users into extremist online communities that feature divisive and inflammatory messages, often directed at particular groups. Although Facebook has conducted research and considered programs to reduce this risk, there is still no mandatory guidance for coders and other teams involved in content population. Facebook should commit to an ongoing, independent analysis of its content population scheme, including its algorithms, and also continuously implement mandatory protocols as best practices are identified to curb bias and prevent recommendations of hate content and groups.
    • Expand policies limiting inflammatory advertisements that vilify minority groups: Although Facebook currently prohibits ads that claim that certain people, because of their membership in a protected group, pose a threat to the physical safety of communities or the nation, its policies still allow attacks that characterize such groups as threats to national culture or values. The current prohibition should be expanded to include such ads.
  • New Zealand’s Ministry of Statistics “launched the Algorithm Charter for Aotearoa New Zealand” that “signals that [the nation’s agencies] are committed to being consistent, transparent and accountable in their use of algorithms.”
    • The Ministry explained “[t]he Algorithm Charter is part of a wider ecosystem and works together with existing tools, networks and research, including:
      • Principles for the Safe and Effective Use of Data and Analytics (Privacy Commissioner and Government Chief Data Steward, 2018)
      • Government Use of Artificial Intelligence in New Zealand (New Zealand Law Foundation and Otago University, 2019)
      • Trustworthy AI in Aotearoa – AI Principles (AI Forum New Zealand, 2020)
      • Open Government Partnership, an international agreement to increase transparency.
      • Data Protection and Use Policy (Social Wellbeing Agency, 2020)
      • Privacy, Human Rights and Ethics Framework (Ministry of Social Development).
  • The European Union (EU) imposed its first cyber sanctions under its Framework for a Joint EU Diplomatic Response to Malicious Cyber Activities (aka the cyber diplomacy toolbox) against six individuals and three entities from the Russian Federation, the People’s Republic of China (PRC), and the Democratic People’s Republic of Korea for the attack against the Organisation for the Prohibition of Chemical Weapons (OPCW) in the Netherlands, the malware attacks known as NotPetya and WannaCry, and Operation Cloud Hopper. The EU’s cyber sanctions follow sanctions the United States has placed on a number of people and entities from the same nations and also indictments the U.S. Department of Justice has announced over the years. The sanctions are part of the effort to levy costs on nations and actors that conduct cyber attacks. The EU explained:
    • The attempted cyber-attack was aimed at hacking into the Wi-Fi network of the OPCW, which, if successful, would have compromised the security of the network and the OPCW’s ongoing investigatory work. The Netherlands Defence Intelligence and Security Service (DISS) (Militaire Inlichtingen- en Veiligheidsdienst – MIVD) disrupted the attempted cyber-attack, thereby preventing serious damage to the OPCW.
    • “WannaCry” disrupted information systems around the world by targeting information systems with ransomware and blocking access to data. It affected information systems of companies in the Union, including information systems relating to services necessary for the maintenance of essential services and economic activities within Member States.
    • “NotPetya” or “EternalPetya” rendered data inaccessible in a number of companies in the Union, wider Europe and worldwide, by targeting computers with ransomware and blocking access to data, resulting amongst others in significant economic loss. The cyber-attack on a Ukrainian power grid resulted in parts of it being switched off during winter.
    • “Operation Cloud Hopper” has targeted information systems of multinational companies in six continents, including companies located in the Union, and gained unauthorised access to commercially sensitive data, resulting in significant economic loss.
  • The United States’ Federal Communications Commission (FCC) is asking for comments on the Department of Commerce’s National Telecommunications and Information Administration’s (NTIA) petition asking the agency to start a rulemaking to clarify alleged ambiguities in 47 USC 230 regarding the limits of the liability shield for the content others post online versus the liability protection for “good faith” moderation by the platform itself. The NTIA was acting per direction in an executive order allegedly aiming to correct online censorship. Executive Order 13925, “Preventing Online Censorship,” was issued in late May after Twitter fact-checked two of President Donald Trump’s Tweets regarding false claims made about mail voting in California in response to the COVID-19 pandemic. Comments are due by 2 September.
  • The Australian Competition & Consumer Commission (ACCC) released for public consultation a draft of “a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms, specifically Google and Facebook.” The government in Canberra had asked the ACCC to draft this code earlier this year after talks on a voluntary code between the Australian Treasury and the platforms broke down.
    • The ACCC explained
      • The code would commence following the introduction and passage of relevant legislation in the Australian Parliament. The ACCC released an exposure draft of this legislation on 31 July 2020, with consultation on the draft due to conclude on 28 August 2020. Final legislation is expected to be introduced to Parliament shortly after conclusion of this consultation process.
    • This is not the ACCC’s first interaction with the companies. Late last year, the ACCC announced a legal action against Google “alleging they engaged in misleading conduct and made false or misleading representations to consumers about the personal location data Google collects, keeps and uses” according to the agency’s press release. In its initial filing, the ACCC claims that Google misled and deceived the public in contravention of the Australian Consumer Law and that Android users were harmed because those who switched off Location Services were unaware that their location information was still being collected and used by Google, for it was not readily apparent that Web & App Activity also needed to be switched off.
    • A year ago, the ACCC released its final report in its “Digital Platforms Inquiry” that “proposes specific recommendations aimed at addressing some of the actual and potential negative impacts of digital platforms in the media and advertising markets, and also more broadly on consumers.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) “released core guidance documentation for the Trusted Internet Connections (TIC) program, developed to assist agencies in protecting modern information technology architectures and services.” CISA explained “In accordance with the Office of Management and Budget (OMB) Memorandum (M) 19-26: Update to the TIC Initiative, TIC 3.0 expands on the original initiative to drive security standards and leverage advances in technology to secure a wide spectrum of agency network architectures.” Specifically, CISA released three core guidance documents:
    • Program Guidebook (Volume 1) – Outlines the modernized TIC program and includes its historical context
    • Reference Architecture (Volume 2) – Defines the concepts of the program to guide and constrain the diverse implementations of the security capabilities
    • Security Capabilities Catalog (Volume 3) – Indexes security capabilities relevant to the TIC program
  • Senators Ron Wyden (D-OR), Bill Cassidy (R-LA) and ten other Members wrote the Federal Trade Commission (FTC) urging the agency “to investigate widespread privacy violations by companies in the advertising technology (adtech) industry that are selling private data about millions of Americans, collected without their knowledge or consent from their phones, computers, and smart TVs.” They asked the FTC “to use its authority to conduct broad industry probes under Section 6(b) of the FTC Act to determine whether adtech companies and their data broker partners have violated federal laws prohibiting unfair and deceptive business practices.” They argued “[t]he FTC should not proceed with its review of the Children’s Online Privacy Protection Act (COPPA) Rule before it has completed this investigation.”
  •  “100 U.S. women lawmakers and current and former legislators from around the world,” including Speaker of the House Nancy Pelosi (D-CA), sent a letter to Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg urging the company “to take decisive action to protect women from rampant and increasing online attacks on their platform that have caused many women to avoid or abandon careers in politics and public service.” They noted “[j]ust a few days ago, a manipulated and widely shared video that depicted Speaker Pelosi slurring her speech was once again circulating on major social media platforms, gaining countless views before TikTok, Twitter, and YouTube all removed the footage…[and] [t]he video remains on Facebook and is labeled “partly false,” continuing to gain millions of views.” The current and former legislators “called on Facebook to enforce existing rules, including:
    • Quick removal of posts that threaten candidates with physical violence, sexual violence or death, and that glorify, incite or praise violence against women; disable the relevant accounts, and refer offenders to law enforcement.
    • Eliminate malicious hate speech targeting women, including violent, objectifying or dehumanizing speech, statements of inferiority, and derogatory sexual terms;
    • Remove accounts that repeatedly violate terms of service by threatening, harassing or doxing or that use false identities to attack women leaders and candidates; and
    • Remove manipulated images or videos misrepresenting women public figures.
  • The United States’ Departments of Commerce and Homeland Security released an update “highlighting more than 50 activities led by industry and government that demonstrate progress in the drive to counter botnet threats.” In May 2018, the agencies submitted “A Report to the President on Enhancing the Resilience of the Internet and Communications Ecosystem Against Botnets and Other Automated, Distributed Threats” that identified a number of steps and prompted a follow-on report, “A Road Map Toward Resilience Against Botnets,” released in November 2018.
  • United States (U.S.) Secretary of Commerce Wilbur Ross and European Commissioner for Justice Didier Reynders released a joint statement explaining that “[t]he U.S. Department of Commerce and the European Commission have initiated discussions to evaluate the potential for an enhanced EU-U.S. Privacy Shield framework to comply with the July 16 judgment of the Court of Justice of the European Union in the Schrems II case.”
    • Maximilian Schrems filed a complaint against Facebook with Ireland’s Data Protection Commission (DPC) in 2013, alleging that the company’s transfer of his personal data violated his rights under European Union law because of the mass U.S. surveillance revealed by former National Security Agency (NSA) contractor Edward Snowden. Ultimately, this case resulted in a 2015 Court of Justice of the European Union (CJEU) ruling that invalidated the Safe Harbor agreement under which the personal data of EU residents was transferred to the US by commercial concerns. The EU and US executed a follow-on agreement, the EU-U.S. Privacy Shield, which was designed to address some of the problems the CJEU turned up, and the U.S. passed a law, the “Judicial Redress Act of 2015” (P.L. 114-126), to provide EU citizens a way to exercise their EU rights in US courts via the “Privacy Act of 1974.”
    • However, Schrems continued and soon sought to challenge the legality of the European Commission’s signing off on the Privacy Shield agreement, the adequacy decision issued in 2016, and also the use of standard contractual clauses (SCC) by companies for the transfer of personal data to the US. The CJEU struck down the adequacy decision, throwing into doubt many entities’ transfers out of the EU into the U.S. but upheld SCCs in a way that suggested EU data protection authorities (DPA) may need to review all such agreements to ensure they comply with EU law.
  • The European Commission (EC) announced an “in-depth investigation to assess the proposed acquisition of Fitbit by Google under the EU Merger Regulation.” The EC voiced its concern “that the proposed transaction would further entrench Google’s market position in the online advertising markets by increasing the already vast amount of data that Google could use for personalisation of the ads it serves and displays.” The EC detailed its “preliminary competition concerns:
    • Following its first phase investigation, the Commission has concerns about the impact of the transaction on the supply of online search and display advertising services (the sale of advertising space on, respectively, the result page of an internet search engine or other internet pages), as well as on the supply of “ad tech” services (analytics and digital tools used to facilitate the programmatic sale and purchase of digital advertising). By acquiring Fitbit, Google would acquire (i) the database maintained by Fitbit about its users’ health and fitness; and (ii) the technology to develop a database similar to Fitbit’s one.
    • The data collected via wrist-worn wearable devices appears, at this stage of the Commission’s review of the transaction, to be an important advantage in the online advertising markets. By increasing the data advantage of Google in the personalisation of the ads it serves via its search engine and displays on other internet pages, it would be more difficult for rivals to match Google’s online advertising services. Thus, the transaction would raise barriers to entry and expansion for Google’s competitors for these services, to the ultimate detriment of advertisers and publishers that would face higher prices and have less choice.
    • At this stage of the investigation, the Commission considers that Google:
      • is dominant in the supply of online search advertising services in the EEA countries (with the exception of Portugal for which market shares are not available);
      • holds a strong market position in the supply of online display advertising services at least in Austria, Belgium, Bulgaria, Croatia, Denmark, France, Germany, Greece, Hungary, Ireland, Italy, Netherlands, Norway, Poland, Romania, Slovakia, Slovenia, Spain, Sweden and the United Kingdom, in particular in relation to off-social networks display ads;
      • holds a strong market position in the supply of ad tech services in the EEA.
    • The Commission will now carry out an in-depth investigation into the effects of the transaction to determine whether its initial competition concerns regarding the online advertising markets are confirmed.
    • In addition, the Commission will also further examine:
      • the effects of the combination of Fitbit’s and Google’s databases and capabilities in the digital healthcare sector, which is still at a nascent stage in Europe; and
      • whether Google would have the ability and incentive to degrade the interoperability of rivals’ wearables with Google’s Android operating system for smartphones once it owns Fitbit.
    • In February, after the deal had been announced, the European Data Protection Board (EDPB) made clear its position that Google and Fitbit will need to scrupulously observe the General Data Protection Regulation’s privacy and data security requirements if the body is to sign off on the proposed $2.1 billion acquisition. Moreover, at that time Google had not informed European Union (EU) regulators of the proposed deal. The deal comes at a time when both EU and U.S. regulators are already investigating Google for alleged antitrust and anticompetitive practices, and the EDPB’s opinion could carry weight in this process.
  • The United States’ (U.S.) Department of Homeland Security released a Privacy Impact Assessment for the U.S. Border Patrol (USBP) Digital Forensics Programs that details how it may conduct searches of electronic devices at the U.S. border and ports of entry. DHS explained
    • As part of USBP’s law enforcement duties, USBP may search and extract information from electronic devices, including: laptop computers; thumb drives; compact disks; digital versatile disks (DVDs); mobile phones; subscriber identity module (SIM) cards; digital cameras; vehicles; and other devices capable of storing electronic information.
    • Last year, a U.S. District Court held that U.S. Customs and Border Protection (CBP) and U.S. Immigration and Customs Enforcement’s (ICE) current practices for searches of smartphones and computers at the U.S. border are unconstitutional and the agencies must have reasonable suspicion before conducting such a search. However, the Court declined the plaintiffs’ request that the information taken off of their devices be expunged by the agencies. This ruling follows a Department of Homeland Security Office of the Inspector General (OIG) report that found CBP “did not always conduct searches of electronic devices at U.S. ports of entry according to its Standard Operating Procedures” and asserted that “[t]hese deficiencies in supervision, guidance, and equipment management, combined with a lack of performance measures, limit [CBP’s] ability to detect and deter illegal activities related to terrorism; national security; human, drug, and bulk cash smuggling; and child pornography.”
    • In terms of the legal backdrop, the United States Supreme Court has found that searches and seizures of electronic devices at borders and airports are subject to lesser legal standards than those conducted elsewhere in the U.S. under most circumstances. Generally, the government’s interest in securing the border against the flow of contraband and inadmissible people allows considerable leeway from the warrant requirements that govern many other types of searches. However, in recent years two federal appeals courts (the Fourth and Ninth Circuits) have held that searches of electronic devices require suspicion on the part of government agents, while another appeals court (the Eleventh Circuit) held differently. Consequently, there is no uniform legal standard for these searches.
  • The Inter-American Development Bank (IDB) and the Organization of American States (OAS) released their second assessment of cybersecurity across Latin America and the Caribbean that used the Cybersecurity Capacity Maturity Model for Nations (CMM) developed at the University of Oxford’s Global Cyber Security Capacity Centre (GSCC). The IDB and OAS explained:
    • When the first edition of the report “Cybersecurity: Are We Ready in Latin America and the Caribbean?” was released in March 2016, the IDB and the OAS aimed to provide the countries of Latin America and the Caribbean (LAC) not only with a picture of the state of cybersecurity but also guidance about the next steps that should be pursued to strengthen national cybersecurity capacities. This was the first study of its kind, presenting the state of cybersecurity with a comprehensive vision and covering all LAC countries.
    • The great challenges of cybersecurity, like those of the internet itself, are of a global nature. Therefore, it is undeniable that the countries of LAC must continue to foster greater cooperation among themselves, while involving all relevant actors, as well as establishing a mechanism for monitoring, analysis, and impact assessment related to cybersecurity both nationally and regionally. More data in relation to cybersecurity would allow for the introduction of a culture of cyberrisk management that needs to be extended both in the public and private sectors. Countries must be prepared to adapt quickly to the dynamic environment around us and make decisions based on a constantly changing threat landscape. Our member states may manage these risks by understanding the impact on and the likelihood of cyberthreats to their citizens, organizations, and national critical infrastructure. Moving to the next level of maturity will require a comprehensive and sustainable cybersecurity policy, supported by the country’s political agenda, with allocation of financial resources and qualified human capital to carry it out.
    • The COVID-19 pandemic will pass, but events that will require intensive use of digital technologies so that the world can carry on will continue happening. The challenge of protecting our digital space will, therefore, continue to grow. It is the hope of the IDB and the OAS that this edition of the report will help LAC countries to have a better understanding of their current state of cybersecurity capacity and be useful in the design of the policy initiatives that will lead them to increase their level of cyberresilience.
  • The European Data Protection Supervisor (EDPS) issued an opinion on “the European Commission’s action plan for a comprehensive Union policy on preventing money laundering and terrorism financing (C(2020)2800 final), published on 7 May 2020.” The EDPS asserted:
    • While the EDPS acknowledges the importance of the fight against money laundering and terrorism financing as an objective of general interest, we call for the legislation to strike a balance between the interference with the fundamental rights of privacy and personal data protection and the measures that are necessary to effectively achieve the general interest goals on anti-money laundering and countering the financing of terrorism (AML/CFT) (the principle of proportionality).
    • The EDPS recommends that the Commission monitors the effective implementation of the existing AML/CFT framework while ensuring that the GDPR and the data protection framework are respected and complied with. This is particularly relevant for the works on the interconnection of central bank account mechanisms and beneficial ownership registers that should be largely inspired by the principles of data minimisation, accuracy and privacy-by-design and by default.

Further Reading

  • “China already has your data. Trump’s TikTok and WeChat bans can’t stop that.” By Aynne Kokas – The Washington Post. This article persuasively makes the case that even if a ban on TikTok and WeChat were to work, and there are substantive questions as to how a ban would work given how widely the former has been downloaded, the People’s Republic of China (PRC) is almost certainly acquiring massive reams of data on Americans through a variety of apps, platforms, and games. For example, Tencent, owner of WeChat, has a 40% stake in Epic Games, maker of Fortnite, a massively popular multiplayer game (if you have never heard of it, ask one of the children in your family). Moreover, a recent change to PRC law mandates that companies operating in the PRC share their databases for cybersecurity reviews, which may offer an opportunity, aside from hacking and exfiltrating data from United States entities, to access data. In summation, if the Trump Administration is serious about stopping the flow of data from the U.S. to the PRC, these executive orders will do very little.
  • “Big Tech Makes Inroads With the Biden Campaign” by David McCabe and Kenneth P. Vogel – The New York Times. Most likely long before former Vice President Joe Biden clinched the Democratic nomination, advisers volunteered to help plot out his policy positions, a process that intensified this year. Of course, this includes technology policy, and many of those volunteering for the campaign’s Innovation Policy Committee have worked or are working for large technology companies directly or as consultants or lobbyists. This piece details some of these people and their relationships and how the Biden campaign is managing possible conflicts of interest. Naturally, those on the left wing of the Democratic Party calling for tighter antitrust, competition, and privacy regulation are concerned that Biden might be pulled away from these positions despite his public statements arguing that the United States government needs to get tougher with some practices.
  • “A Bible Burning, a Russian News Agency and a Story Too Good to Check Out” By Matthew Rosenberg and Julian E. Barnes – The New York Times. The Russian Federation seems to be using a new tactic, with some success, for sowing discord in the United States that is the information equivalent of throwing fuel onto a fire. In this case, a Russian outlet manufactured a fake story amplifying an actual event, and the story went viral after some prominent Republicans seized on it, in part because it fit their preferred world view of protestors. We will likely see more of this, and it is not confined to fake stories intended to appeal to the right; the same is happening with content meant for the left wing in the United States.
  • “Facebook cracks down on political content disguised as local news” by Sara Fischer – Axios. As part of its continuing effort to crack down on violations of its policies, Facebook will no longer allow groups with a political viewpoint to masquerade as news. The company and outside experts have identified a range of instances where groups propagating a viewpoint, as opposed to reporting, have used a Facebook exemption by pretending to be local news outlets.
  • “QAnon groups have millions of members on Facebook, documents show” By Ari Sen and Brandy Zadrozny – NBC News. It appears as if some Facebook employees are leaking the results of an internal investigation that identified more than 1 million users who are part of QAnon groups. Most likely these employees want the company to take a stronger stance on the conspiracy group QAnon as it has with COVID-19 lies and misinformation.
  • And, since Senator Kamala Harris (D-CA) was named former Vice President Joe Biden’s (D-DE) vice presidential pick, this article has become even more relevant than when I highlighted it in late July: “New Emails Reveal Warm Relationship Between Kamala Harris And Big Tech” – HuffPost. Obtained via a Freedom of Information request, new emails from Senator Kamala Harris’ (D-CA) tenure as her state’s attorney general suggest she was willing to overlook the role Facebook, Google, and others played and still play in one of her signature issues: revenge porn. This article makes the case that Harris came down hard on a scammer running a revenge porn site but did not press the tech giants with any vigor to take down such material from their platforms. Consequently, the case is made that her selection as Biden’s running mate signals a go-easy approach to large companies even though many Democrats have been calling to break up these companies and vigorously enforce antitrust laws. Harris has largely not engaged on tech issues during her tenure in the Senate. To be fair, many of these companies are headquartered in California and pump billions of dollars into the state’s economy annually, putting Harris in a tricky position politically. Of course, such pieces should be taken with a grain of salt since this one may have been suggested or planted by one of Harris’ rivals for the vice presidential nomination or someone looking to settle a score.
  • “Unwanted Truths: Inside Trump’s Battles With U.S. Intelligence Agencies” by Robert Draper – The New York Times. A deeply sourced article on the outright antipathy between President Donald Trump and Intelligence Community officials, particularly over the issue of how deeply Russia interfered in the election in 2016. A number of former officials have been fired or forced out because they refused to knuckle under to the White House’s desire to soften or massage conclusions of Russia’s past and current actions to undermine the 2020 election in order to favor Trump.
  • “Huawei says it’s running out of chips for its smartphones because of US sanctions” By Kim Lyons – The Verge and “Huawei: Smartphone chips running out under US sanctions” by Joe McDonald – The Associated Press. United States (U.S.) sanctions have started biting the Chinese technology company Huawei, which announced it will likely run out of processor chips for its smartphones. U.S. sanctions bar any company from selling high technology items like processors to Huawei, and this capability is not independently available in the People’s Republic of China (PRC) at present.
  • “Targeting WeChat, Trump Takes Aim at China’s Bridge to the World” By Paul Mozur and Raymond Zhong – The New York Times. This piece explains WeChat, the app the Trump Administration is trying to ban in the United States (U.S.) without any warning. It is like a combination of Facebook, WhatsApp, a news app, and a payment platform, and it is used by more than 1.2 billion people.
  • “This Tool Could Protect Your Photos From Facial Recognition” By Kashmir Hill – The New York Times. Researchers at the University of Chicago have found a method of subtly altering photos of people that appears to foil most facial recognition technologies. However, a number of experts interviewed said it is too late to stop companies like Clearview AI.
  • “I Tried to Live Without the Tech Giants. It Was Impossible.” By Kashmir Hill – The New York Times. This New York Times reporter tried living without the products of large technology companies, which involved some fairly obvious challenges and some that were not so obvious. Of course, it was hard for her to skip Facebook, Instagram, and the like, but cutting out Google and Amazon proved hardest and basically impossible because of the latter’s cloud presence and the former’s web presence. The fact that some of the companies cannot be avoided if one wants to be online likely lends weight to those making the case these companies are anti-competitive.
  • “To Head Off Regulators, Google Makes Certain Words Taboo” by Adrianne Jeffries – The Markup. Apparently, in what is a standard practice at large companies, employees at Google were coached to avoid using certain terms or phrases that antitrust regulators would take notice of, such as “market,” “barriers to entry,” and “network effects.” The Markup obtained a 16 August 2019 document titled “Five Rules of Thumb For Written Communications” that starts by asserting “[w]ords matter…[e]specially in antitrust laws” and goes on to advise Google’s employees:
    • We’re out to help users, not hurt competitors.
    • Our users should always be free to switch, and we don’t lock anyone in.
    • We’ve got lots of competitors, so don’t assume we control or dominate any market.
    • Don’t try and define a market or estimate our market share.
    • Assume every document you generate, including email, will be seen by regulators.
  • “Facebook Fired An Employee Who Collected Evidence Of Right-Wing Pages Getting Preferential Treatment” By Craig Silverman and Ryan Mac – BuzzFeed News. A Facebook engineer was fired after presenting evidence in an internal communications system that the social media platform is more willing to change false and negative ratings on claims made by conservative outlets and personalities than on any other viewpoint. If true, this would run counter to the narrative spun by the Trump Administration and many Republicans in Congress. Moreover, Facebook’s incentives would seem to align with giving conservatives preferential treatment: many of these websites advertise on Facebook; the company probably does not want to get crosswise with the Administration; sensational posts and content drive engagement, which increases user numbers and allows for higher ad rates; and it wants to appear fair and impartial.
  • “How Pro-Trump Forces Work the Refs in Silicon Valley” By Ben Smith – The New York Times. This piece traces the nearly four-decade-old effort of Republicans to sway mainstream media, and now Silicon Valley, to their viewpoint.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo credit: Gerd Altmann on Pixabay