Preview of Senate Democratic Chairs

It’s not clear who will end up where, but the new Senate chairs will change the focus and agenda of committees, and the terms of debate, over the next two years.

With the victories of Senators-elect Raphael Warnock (D-GA) and Jon Ossoff (D-GA), control of the United States Senate will tip to the Democrats once Vice President-elect Kamala Harris (D) is sworn in and can break the 50-50 tie in the chamber in favor of the Democrats. With the shift in control, new chairs will take over committees key to setting the agenda over the next two years in the Senate. However, given the filibuster, and the fact that Senate Republicans will exert maximum leverage through its continued use, Democrats will be hamstrung and forced to work with Republicans on matters such as federal privacy legislation, artificial intelligence (AI), the Internet of Things (IoT), cybersecurity, data flows, surveillance, etc., just as Republicans have had to work with Democrats over the six years they controlled the chamber. Having said that, Democrats will be in a stronger position than they had been and will have the power to set the agenda in committee hearings, being empowered to call the lion’s share of witnesses and to control the floor agenda. What’s more, Democrats will be poised to confirm President-elect Joe Biden’s nominees at agencies like the Federal Communications Commission (FCC), Federal Trade Commission (FTC), the Department of Justice (DOJ), and others, giving the Biden Administration a free hand in many areas of technology policy.

All of that being said, this is not meant to be an exhaustive look at all the committees of jurisdiction and possible chairs. Rather, it seeks to survey likely chairs on selected committees and some of their priorities for the next two years. Subcommittee chairs will also be important, but until the cards get shuffled among the chairs, it will not be possible to see where they land at the subcommittee level.

When considering the possible Democratic chairs of committees, one must keep in mind it is often a matter of musical chairs, with the most senior members getting first choice. And so, with Senator Patrick Leahy (D-VT) being the senior-most Democratic Senator, he may well choose to leave the Appropriations Committee and move back to assume the gavel of the Judiciary Committee. Leahy has long been a stakeholder on antitrust, data security, privacy, and surveillance legislation and would be in a position to shape bills on those and other matters before the Senate. If Leahy does not take the Judiciary gavel, he may still be entitled to chair a subcommittee and exert influence.

If Leahy stays put, then current Senate Minority Whip Dick Durbin (D-IL) would be poised to leapfrog Senator Dianne Feinstein (D-CA) to chair Judiciary after Feinstein was persuaded to step aside on account of her lackluster performance in a number of high-profile hearings in 2020. Durbin has also been active on privacy, data security, and surveillance issues. The Judiciary Committee will be central to a number of technology policies, including Foreign Intelligence Surveillance Act (FISA) reauthorization, privacy legislation, Section 230 reform, antitrust, and others. On the Republican side of the dais, Senator Lindsey Graham (R-SC) is leaving the top post because of term limit restrictions imposed by Republicans, and Senator Charles Grassley (R-IA) is set to replace him. How this changes the 47 USC 230 (Section 230) debate is not immediately clear. And yet, Grassley and three colleagues recently urged the Trump Administration in a letter to omit language in a trade agreement with the United Kingdom (UK) that mirrors the liability protection of Section 230. Senators Rob Portman (R-OH), Mark R. Warner (D-VA), Richard Blumenthal (D-CT), and Grassley argued to U.S. Trade Representative Ambassador Robert Lighthizer that a “safe harbor” like the one provided to technology companies for hosting or moderating third party content is outdated, not needed in a free trade agreement, contrary to the will of both the Congress and the UK Parliament, and likely to be changed legislatively in the near future. It is likely, however, Grassley will fall in with other Republicans propagating the narrative that social media is unfairly biased against conservatives, particularly in light of the recent purge of President Donald Trump for his many, repeated violations of platform policies.

The Senate Judiciary Committee will be central in any policy discussions of antitrust and anticompetition in the technology realm. But it bears noting that the filibuster (and the very low chances Senate Democrats would “go nuclear” and remove all vestiges of the functional supermajority requirement to pass legislation) will give Republicans leverage to block some of the more ambitious reforms Democrats might like to enact (e.g. the House Judiciary Committee’s October 2020 final report that calls for nothing less than a complete remaking of United States (U.S.) antitrust policy and law; see here for more analysis.)

It seems Senator Sherrod Brown (D-OH) will be the next chair of the Senate Banking, Housing, and Urban Affairs Committee, which has jurisdiction over cybersecurity, data security, privacy, and other issues in the financial services sector, making it a player on any legislation designed to encompass the whole of the United States economy. Having said that, it may again be the case that sponsors of, say, privacy legislation decide to cut the Gordian knot of jurisdictional turf battles by cutting out certain committees. For example, many of the privacy bills had provisions making clear they would deem financial services entities in compliance with the Financial Services Modernization Act of 1999 (P.L. 106-102) (aka Gramm-Leach-Bliley) to be in compliance with the new privacy regime. I suppose these provisions may have been included on the basis of the very high privacy and data security standards Gramm-Leach-Bliley has brought about (e.g. the Experian hack), or sponsors of federal privacy legislation made the strategic calculation to circumvent the Senate Banking Committee as much as they can. Nonetheless, this committee sought to insert itself into the policymaking process on privacy when Brown and outgoing Chair Mike Crapo (R-ID) requested “feedback” in February 2019 “from interested stakeholders on the collection, use and protection of sensitive information by financial regulators and private companies.” Additionally, Brown released what may be the most expansive privacy bill from the perspective of privacy and civil liberties advocates, the “Data Accountability and Transparency Act of 2020,” in June 2020 (see here for my analysis.) Therefore, Brown may continue to push for a role in federal privacy legislation with a gavel in his hands.

In a similar vein, Senator Patty Murray (D-WA) will likely take over the Senate Health, Education, Labor, and Pensions (HELP) Committee, which has jurisdiction over health information privacy and data security through the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH Act). Again, as with the Senate Banking Committee and Gramm-Leach-Bliley, most of the privacy bills exempt HIPAA-compliant entities. And yet, even if her committee is cut out of a direct role in privacy legislation, Murray will still likely exert influence through oversight of, and possible legislation changing, HIPAA regulations and the Department of Health and Human Services’ (HHS) enforcement and rewriting of these standards for most of the healthcare industry. For example, HHS is rushing a rewrite of the HIPAA regulations at the tail end of the Trump Administration, and Murray could be in a position to inform how the Biden Administration and Secretary of Health and Human Services-designate Xavier Becerra handle this rulemaking. Additionally, Murray may push the Office for Civil Rights (OCR), the arm of HHS that writes and enforces these regulations, to prioritize matters differently.

Senator Maria Cantwell (D-WA) appears to be the next chair of the Senate Commerce, Science, and Transportation Committee, which holds arguably the largest technology portfolio in the Senate. It is the primary committee of jurisdiction for the FCC, FTC, National Telecommunications and Information Administration (NTIA), the National Institute of Standards and Technology (NIST), and the Department of Commerce. Cantwell may exert influence on which people are nominated to head and staff those agencies and others. Her committee is also the primary committee of jurisdiction for domestic and international privacy and data protection matters. And so, federal privacy legislation will likely be drafted by this committee, and legislative changes so the U.S. can enter into a new personal data sharing agreement with the European Union (EU) would also likely involve her and her committee.

Cantwell and likely next Ranking Member Roger Wicker (R-MS) agree on many elements of federal privacy law but were at odds last year on federal preemption and whether people could sue companies for privacy violations. Between them, they circulated three privacy bills. In September 2020, Wicker and three Republican colleagues introduced the “Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act” (S.4626) (see here for more analysis). Wicker had put out for comment a discussion draft, the “Consumer Data Privacy Act of 2019” (CDPA) (see here for analysis), in November 2019, shortly after Cantwell, then the committee’s Ranking Member, and other Democrats had introduced their privacy bill, the “Consumer Online Privacy Rights Act“ (COPRA) (S.2968) (see here for more analysis).

Cantwell could also take a leading role on Section 230, but her focus, of late, seems to be on how technology companies are wreaking havoc on traditional media. She released a report, which she mentioned during her opening statement at the 23 September hearing aimed at trying to revive data privacy legislation, in which she and her staff investigated the decline and financial troubles of local media outlets, which are facing a cumulative loss in advertising revenue of up to 70% since 2000. And since advertising revenue has long been the lifeblood of print journalism, this has devastated local media, with many outlets shutting their doors or radically cutting their staff. This trend has been exacerbated by consolidation in the industry, often in concert with private equity or hedge funds looking to wring the last dollars of value from bargain basement priced newspapers. Cantwell also claimed that the overwhelming online advertising dominance of Google and Facebook has further diminished advertising revenue and other possible sources of funding through a variety of means. She intimates that much of this conduct may be illegal under U.S. law, and the FTC may well be able to use its Section 5 powers against unfair and deceptive acts and its antitrust authority to take action (see here for more analysis and context.) In this vein, Cantwell will want her committee to play a role in any antitrust policy changes, likely knowing massive changes in U.S. law are not possible in a split Senate with entrenched party positions and discipline.

Senator Jack Reed (D-RI) will take over the Senate Armed Services Committee and its portfolio over national security technology policy, which includes the cybersecurity, data protection, and supply chains of national security agencies and their contractors, AI, offensive and defensive U.S. cyber operations, and other realms. Many of the changes Reed and his committee will seek to make will come through the annual National Defense Authorization Act (NDAA) (see here and here for the many technology provisions in the FY 2021 NDAA.) Reed may also prod the Department of Defense (DOD) to implement or enforce the Cybersecurity Maturity Model Certification (CMMC) Framework differently than envisioned and designed by the Trump Administration. In December 2020, a new rule took effect designed to drive better cybersecurity among U.S. defense contractors. This rule brings together two different lines of effort to require the Defense Industrial Base (DIB) to employ better cybersecurity given the risks it faces by holding and using classified information, Federal Contract Information (FCI), and Controlled Unclassified Information (CUI). The Executive Branch has long wrestled with how best to push contractors to secure their systems, and Congress and the White House have opted to use federal contract requirements under which contractors must certify compliance. However, the most recent initiative, the CMMC Framework, will require contractors to be certified by third party assessors. And yet, it is not clear the DOD has wrestled with the often-misaligned incentives present in third party certification schemes.

Reed’s committee will undoubtedly delve deep into the recent SolarWinds hack and implement policy changes to avoid a recurrence. Doing so may lead the Senate Armed Services Committee back to reconsidering the Cyberspace Solarium Commission’s (CSC) March 2020 final report and follow-up white papers, especially the views embodied in “Building a Trusted ICT Supply Chain.”

Senator Mark Warner (D-VA) will likely take over the Senate Intelligence Committee. Warner has long been a stakeholder on a number of technology issues and would be able to exert influence on the national security components of such issues. He and his committee will almost certainly play a role in the Congressional oversight of and response to the SolarWinds hack. Likewise, his committee shares jurisdiction over FISA with the Senate Judiciary Committee and over national security technology policy with the Armed Services Committee.

Senator Amy Klobuchar (D-MN) would be the Senate Democratic point person on election security from her perch at the Senate Rules and Administration Committee, which may enable her to more forcefully push for the legislative changes she has long advocated for. In May 2019, Klobuchar and other Senate Democrats introduced the “Election Security Act” (S. 1540), the Senate version of the stand-alone measure introduced in the House that was taken from the larger package, the “For the People Act” (H.R. 1) passed by the House.

In August 2018, the Senate Rules and Administration Committee postponed indefinitely a markup on a compromise bill to provide states additional assistance in securing elections from interference, the “Secure Elections Act” (S.2593). Reportedly, there was concern among state officials that a provision requiring audits of election results would be, in effect, an unfunded mandate even though this provision was softened at the insistence of Senate Republican leadership. A Trump White House spokesperson also indicated in a statement that the Administration opposed the bill, which may have posed an additional obstacle to Committee action. However, even if the Senate had passed its bill, it was unlikely that the Republican-controlled House would have considered companion legislation (H.R. 6663).

Senator Gary Peters (D-MI) may be the next chair of the Senate Homeland Security and Governmental Affairs Committee, and if so, he will continue to face the rock on which many a bark of cybersecurity legislation has been dashed: Senator Ron Johnson (R-WI). So significant has Johnson’s opposition been to bipartisan cybersecurity legislation from the House that some House Republican stakeholders have said as much in media accounts without bothering to hide behind anonymity. And so, whatever Peters’ ambitions may be to shore up the cybersecurity of the federal government, and even though his committee will play a role in investigating and responding to the Russian hack of SolarWinds and many federal agencies, he will be limited by whatever Johnson and other Republicans will allow to move through the committee and through the Senate. Of course, Peters’ purview would include the Department of Homeland Security (DHS) and the Cybersecurity and Infrastructure Security Agency (CISA) and its remit to police the cybersecurity practices of the federal government. Peters would also have in his portfolio the information technology (IT) practices of the federal government, some $90 billion annually across all agencies.

Finally, whether it is Leahy or Durbin who chairs the Senate Appropriations Committee, that post allows for immense influence over funding and programmatic changes in all federal programs through Congress’s power of the purse.

Further Reading, Other Developments, and Coming Events (4 January 2021)

Further Reading

  • “Microsoft Says Russian Hackers Viewed Some of Its Source Code” By Nicole Perlroth — The New York Times. The Sluzhba vneshney razvedki Rossiyskoy Federatsii’s (SVR) hack keeps growing and growing, with Microsoft admitting its source code was viewed through an employee account. It may be that authorized Microsoft resellers were one of the vectors by which the SVR accessed SolarWinds, FireEye, and ultimately a number of United States (U.S.) government agencies. Expect more revelations to come about the scope and breadth of entities and systems the SVR compromised.
  • “In 2020, we reached peak Internet. Here’s what worked — and what flopped.” By Geoffrey Fowler — The Washington Post. The newspaper’s tech columnist reviews the technology used during the pandemic and what is likely to stay with us when life returns to some semblance of normal.
  • “Facebook Says It’s Standing Up Against Apple For Small Businesses. Some Of Its Employees Don’t Believe It.” By Craig Silverman and Ryan Mac — BuzzFeed News. Again, two of the best-sourced journalists when it comes to Facebook have exposed employee dissent within the social media and advertising giant, this time over the company’s advertising blitz positioning it as the champion of small businesses that allegedly stand to be hurt when Apple rolls out iOS 14, which will allow users to block the type of tracking across apps and the internet Facebook thrives on. The company’s PR campaign stands in contrast to the anecdotal stories about errors that harmed and impeded small companies in using Facebook to advertise and sell products and services to customers.
  • “SolarWinds hack spotlights a thorny legal problem: Who to blame for espionage?” By Tim Starks — cyberscoop. This piece previews the possible and likely inevitable litigation to follow from the SolarWinds hack, including possible securities actions on the basis of fishy dumps of stock by executives, breach of contract, and negligence for failing to patch and address vulnerabilities in a timely fashion. Federal and state regulators will probably get on the field, too. But this will probably take years to play out; Home Depot, for example, only settled claims arising from its 2014 breach with state attorneys general in November 2020.
  • “The Tech Policies the Trump Administration Leaves Behind” By Aaron Boyd — Nextgov. A look back at the good, the bad, and the ugly of the Trump Administration’s technology policies, some of which will live on in the Biden Administration.

Other Developments

  • In response to the SolarWinds hack, the Federal Bureau of Investigation (FBI), the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), and the Office of the Director of National Intelligence (ODNI) issued a joint statement indicating that the process established pursuant to Presidential Policy Directive (PPD) 41, an Obama Administration policy, has been activated and a Cyber Unified Coordination Group (UCG) has been formed “to coordinate a whole-of-government response to this significant cyber incident.” The agencies explained “[t]he UCG is intended to unify the individual efforts of these agencies as they focus on their separate responsibilities.”
    • In PPD-41 it is explained that a UCG “shall serve as the primary method for coordinating between and among Federal agencies in response to a significant cyber incident as well as for integrating private sector partners into incident response efforts, as appropriate.” Moreover, “[t]he Cyber UCG is intended to result in unity of effort and not to alter agency authorities or leadership, oversight, or command responsibilities.”
  • Following the completion of its “in-depth” investigation, the European Commission (EC) cleared Google’s acquisition of Fitbit with certain conditions, removing a significant hurdle for the American multinational in buying the wearable fitness tracker company. In its press release, the EC explained that after its investigation, “the Commission had concerns that the transaction, as initially notified, would have harmed competition in several markets.” To address and allay concerns, Google bound itself for ten years to a set of commitments that can be unilaterally extended by the EC and will be enforced, in part, by the appointment of a trustee to oversee compliance.
    • The EC was particularly concerned about:
      • Advertising: By acquiring Fitbit, Google would acquire (i) the database maintained by Fitbit about its users’ health and fitness; and (ii) the technology to develop a database similar to that of Fitbit. By increasing the already vast amount of data that Google could use for the personalisation of ads, it would be more difficult for rivals to match Google’s services in the markets for online search advertising, online display advertising, and the entire “ad tech” ecosystem. The transaction would therefore raise barriers to entry and expansion for Google’s competitors for these services to the detriment of advertisers, who would ultimately face higher prices and have less choice.
      • Access to Web Application Programming Interface (‘API’) in the market for digital healthcare: A number of players in this market currently access health and fitness data provided by Fitbit through a Web API, in order to provide services to Fitbit users and obtain their data in return. The Commission found that following the transaction, Google might restrict competitors’ access to the Fitbit Web API. Such a strategy would come especially at the detriment of start-ups in the nascent European digital healthcare space.
      • Wrist-worn wearable devices: The Commission is concerned that following the transaction, Google could put competing manufacturers of wrist-worn wearable devices at a disadvantage by degrading their interoperability with Android smartphones.
    • As noted, Google made a number of commitments to address competition concerns:
      • Ads Commitment:
        • Google will not use for Google Ads the health and wellness data collected from wrist-worn wearable devices and other Fitbit devices of users in the EEA, including search advertising, display advertising, and advertising intermediation products. This refers also to data collected via sensors (including GPS) as well as manually inserted data.
        • Google will maintain a technical separation of the relevant Fitbit’s user data. The data will be stored in a “data silo” which will be separate from any other Google data that is used for advertising.
        • Google will ensure that European Economic Area (‘EEA’) users will have an effective choice to grant or deny the use of health and wellness data stored in their Google Account or Fitbit Account by other Google services (such as Google Search, Google Maps, Google Assistant, and YouTube).
      • Web API Access Commitment:
        • Google will maintain access to users’ health and fitness data to software applications through the Fitbit Web API, without charging for access and subject to user consent.
      • Android APIs Commitment:
        • Google will continue to license for free to Android original equipment manufacturers (OEMs) those public APIs covering all current core functionalities that wrist-worn devices need to interoperate with an Android smartphone. Such core functionalities include but are not limited to, connecting via Bluetooth to an Android smartphone, accessing the smartphone’s camera or its GPS. To ensure that this commitment is future-proof, any improvements of those functionalities and relevant updates are also covered.
        • It is not possible for Google to circumvent the Android API commitment by duplicating the core interoperability APIs outside the Android Open Source Project (AOSP). This is because, according to the commitments, Google has to keep the functionalities afforded by the core interoperability APIs, including any improvements related to the functionalities, in open-source code in the future. Any improvements to the functionalities of these core interoperability APIs (including if ever they were made available to Fitbit via a private API) also need to be developed in AOSP and offered in open-source code to Fitbit’s competitors.
        • To ensure that wearable device OEMs have also access to future functionalities, Google will grant these OEMs access to all Android APIs that it will make available to Android smartphone app developers including those APIs that are part of Google Mobile Services (GMS), a collection of proprietary Google apps that is not a part of the Android Open Source Project.
        • Google also will not circumvent the Android API commitment by degrading users’ experience with third party wrist-worn devices through the display of warnings, error messages or permission requests in a discriminatory way or by imposing on wrist-worn device OEMs discriminatory conditions on the access of their companion app to the Google Play Store.
  • The United States (U.S.) Department of Health and Human Services’ (HHS) Office for Civil Rights (OCR) has proposed a major rewrite of the regulations governing medical privacy in the U.S. As the U.S. lacks a unified privacy regime, the proposed changes would affect only those entities in the medical sector subject to the regime, which is admittedly many such entities. Nevertheless, it is almost certain the Biden Administration will pause this rulemaking and quite possibly withdraw it should it prove crosswise with the new White House’s policy goals.
    • HHS issued a notice of proposed rulemaking “to modify the Standards for the Privacy of Individually Identifiable Health Information (Privacy Rule) under the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH Act).”
      • HHS continued:
        • The Privacy Rule is one of several rules, collectively known as the HIPAA Rules, that protect the privacy and security of individuals’ medical records and other protected health information (PHI), i.e., individually identifiable health information maintained or transmitted by or on behalf of HIPAA covered entities (i.e., health care providers who conduct covered health care transactions electronically, health plans, and health care clearinghouses).
        • The proposals in this NPRM support the Department’s Regulatory Sprint to Coordinated Care (Regulatory Sprint), described in detail below. Specifically, the proposals in this NPRM would amend provisions of the Privacy Rule that could present barriers to coordinated care and case management –or impose other regulatory burdens without sufficiently compensating for, or offsetting, such burdens through privacy protections. These regulatory barriers may impede the transformation of the health care system from a system that pays for procedures and services to a system of value-based health care that pays for quality care.
    • In a press release, OCR asserted:
      • The proposed changes to the HIPAA Privacy Rule include strengthening individuals’ rights to access their own health information, including electronic information; improving information sharing for care coordination and case management for individuals; facilitating greater family and caregiver involvement in the care of individuals experiencing emergencies or health crises; enhancing flexibilities for disclosures in emergency or threatening circumstances, such as the Opioid and COVID-19 public health emergencies; and reducing administrative burdens on HIPAA covered health care providers and health plans, while continuing to protect individuals’ health information privacy interests.
  • The Federal Trade Commission (FTC) has used its powers to compel selected regulated entities to provide requested information, asking that “nine social media and video streaming companies…provide data on how they collect, use, and present personal information, their advertising and user engagement practices, and how their practices affect children and teens.” The FTC is using its Section 6(b) authority to compel the information from Amazon.com, Inc., ByteDance Ltd., which operates the short video service TikTok, Discord Inc., Facebook, Inc., Reddit, Inc., Snap Inc., Twitter, Inc., WhatsApp Inc., and YouTube LLC. Failure to respond can result in the FTC fining a non-compliant entity.
    • The FTC claimed in its press release it “is seeking information specifically related to:
      • how social media and video streaming services collect, use, track, estimate, or derive personal and demographic information;
      • how they determine which ads and other content are shown to consumers;
      • whether they apply algorithms or data analytics to personal information;
      • how they measure, promote, and research user engagement; and
      • how their practices affect children and teens.
    • The FTC explained in its sample order:
      • The Commission is seeking information concerning the privacy policies, procedures, and practices of Social Media and Video Streaming Service providers, Including the method and manner in which they collect, use, store, and disclose Personal Information about consumers and their devices. The Special Report will assist the Commission in conducting a study of such policies, practices, and procedures.
  • The United States (U.S.) Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) supplemented its Emergency Directive 21-01 to federal civilian agencies in response to the Sluzhba vneshney razvedki Rossiyskoy Federatsii’s (SVR) hack via SolarWinds. In an 18 December update, CISA explained:
    • This section provides additional guidance on the implementation of CISA Emergency Directive (ED) 21-01, to include an update on affected versions, guidance for agencies using third-party service providers, and additional clarity on required actions.
    • In a 30 December update, CISA stated:
      • Specifically, all federal agencies operating versions of the SolarWinds Orion platform other than those identified as “affected versions” below are required to use at least SolarWinds Orion Platform version 2020.2.1HF2. The National Security Agency (NSA) has examined this version and verified that it eliminates the previously identified malicious code. Given the number and nature of disclosed and undisclosed vulnerabilities in SolarWinds Orion, all instances that remain connected to federal networks must be updated to 2020.2.1 HF2 by COB December 31, 2020. CISA will follow up with additional supplemental guidance, to include further clarifications and hardening requirements.
  • Australia’s Attorney-General’s Department published an unclassified version of the four volumes of the “Report of the Comprehensive Review of the Legal Framework of the National Intelligence Community,” an “examination of the legislative framework underpinning the National Intelligence Community (NIC)…the first and largest since the Hope Royal Commissions considered the Australian Intelligence Community (AIC) in the 1970s and 1980s.” Ultimately, the authors of the report concluded:
    • We do not consider the introduction of a common legislative framework, in the form of a single Act governing all or some NIC agencies, to be a practical, pragmatic or proportionate reform. It would be unlikely that the intended benefits of streamlining and simplifying NIC legislation could be achieved due to the diversity of NIC agency functions—from intelligence to law enforcement, regulatory and policy—and the need to maintain differences in powers, immunities and authorising frameworks. The Review estimates that reform of this scale would cost over $200 million and take up to 10 years to complete. This would be an impractical and disproportionate undertaking for no substantial gain. In our view, the significant costs and risks of moving to a single, consolidated Act clearly outweigh the limited potential benefits.
    • While not recommending a common legislative framework for the entire NIC, some areas of NIC legislation would benefit from simplification and modernisation. We recommend the repeal of the TIA Act, Surveillance Devices Act 2004 (SD Act) and parts of the Australian Security Intelligence Organisation Act 1979 (ASIO Act), and their replacement with a single new Act governing the use of electronic surveillance powers—telecommunications interception, covert access to stored communications, computers and telecommunications data, and the use of optical, listening and tracking devices—under Commonwealth law.
  • The National Institute of Standards and Technology (NIST) released additional materials to supplement a major rewrite of a foundational security guidance document. NIST explained “[n]ew supplemental materials for NIST Special Publication (SP) 800-53 Revision 5, Security and Privacy Controls for Information Systems and Organizations, are available for download to support the December 10, 2020 errata release of SP 800-53 and SP 800-53B, Control Baselines for Information Systems and Organizations.” These supplemental materials include:
    • A comparison of the NIST SP 800-53 Revision 5 controls and control enhancements to Revision 4. The spreadsheet describes the changes to each control and control enhancement, provides a brief summary of the changes, and includes an assessment of the significance of the changes.  Note that this comparison was authored by The MITRE Corporation for the Director of National Intelligence (DNI) and is being shared with permission by DNI.
    • Mapping of the Appendix J Privacy Controls (Revision 4) to Revision 5. The spreadsheet supports organizations using the privacy controls in Appendix J of SP 800-53 Revision 4 that are transitioning to the integrated control catalog in Revision 5.
    • Mappings between NIST SP 800-53 and other frameworks and standards. The mappings provide organizations a general indication of SP 800-53 control coverage with respect to other frameworks and standards. When leveraging the mappings, it is important to consider the intended scope of each publication and how each publication is used; organizations should not assume equivalency based solely on the mapping tables because mappings are not always one-to-one and there is a degree of subjectivity in the mapping analysis.
  • Via a final rule, the Department of Defense (DOD) codified “the National Industrial Security Program Operating Manual (NISPOM) in regulation…[that] establishes requirements for the protection of classified information disclosed to or developed by contractors, licensees, grantees, or certificate holders (hereinafter referred to as contractors) to prevent unauthorized disclosure.” The DOD stated “[i]n addition to adding the NISPOM to the Code of Federal Regulations (CFR), this rule incorporates the requirements of Security Executive Agent Directive (SEAD) 3, “Reporting Requirements for Personnel with Access to Classified Information or Who Hold a Sensitive Position.” The DOD stated “SEAD 3 requires reporting by all contractor cleared personnel who have been granted eligibility for access to classified information.”
    • The DOD added “[t]his NISPOM rule provides for a single nation-wide implementation plan which will, with this rule, include SEAD 3 reporting by all contractor cleared personnel to report specific activities that may adversely impact their continued national security eligibility, such as reporting of foreign travel and foreign contacts.”
    • The DOD explained “NISP Cognizant Security Agencies (CSAs) shall conduct an analysis of such reported activities to determine whether they pose a potential threat to national security and take appropriate action.”
    • The DOD added that “the rule also implements the provisions of Section 842 of Public Law 115-232, which removes the requirement for a covered National Technology and Industrial Base (NTIB) entity operating under a special security agreement pursuant to the NISP to obtain a national interest determination as a condition for access to proscribed information.”
  • An advisory committee housed at the United States (U.S.) Department of Homeland Security (DHS) is calling for the White House to quickly “operationalize intelligence in a classified space with senior executives and cyber experts from most critical entities in the energy, financial services, and communications sectors working directly with intelligence analysts and other government staff.” In their report, the President’s National Infrastructure Advisory Council (NIAC) proposed the creation of a Critical Infrastructure Command Center (CICC) to “provid[e] real-time collaboration between government and industry…[and] take direct action and provide tactical solutions to mitigate, remediate,  and deter threats.” NIAC urged the President to “direct relevant federal agencies to support the private sector in executing the concept, including identifying the required government staff…[and] work with Congress to ensure the appropriate authorities are established to allow the CICC to fully realize its operational functionality.” NIAC recommended “near-term actions to implement the CICC concept:
    • 1. The President should direct the relevant federal agencies to support the private sector in rapidly standing up the CICC concept with the energy, financial services, and communications sectors:
      • a. Within 90 days the private sector will identify the executives who will lead execution of the CICC concept and establish governing criteria (including membership, staffing and rotation, and other logistics).
      • b. Within 120 days the CICC sector executives will identify and assign the necessary CICC staff from the private sector.
      • c. Within 90 days an appropriate venue to house the operational component will be identified and the necessary agreements put in place.
    • 2. The President should direct the Intelligence Community and other relevant government agencies to identify and co-locate the required government staff counterparts to enable the direct coordination required by the CICC. This staff should be pulled from the IC, SSAs, and law enforcement.
    • 3. The President, working with Congress, should establish the appropriate authorities and mission for federal agencies to directly share intelligence with critical infrastructure companies, along with any other authorities required for the CICC concept to be fully successful (identified in Appendix A).
    • 4. Once the CICC concept is fully operational (within 180 days), the responsible executives should deliver a report to the NSC and the NIAC demonstrating how the distinct capabilities of the CICC have been achieved and the impact of the capabilities to date. The report should identify remaining gaps in resources, direction, or authorities.

Coming Events

  • On 13 January, the Federal Communications Commission (FCC) will hold its monthly open meeting, and the agency has placed the following items on its tentative agenda: “Bureau, Office, and Task Force leaders will summarize the work their teams have done over the last four years in a series of presentations:
    • Panel One. The Commission will hear presentations from the Wireless Telecommunications Bureau, International Bureau, Office of Engineering and Technology, and Office of Economics and Analytics.
    • Panel Two. The Commission will hear presentations from the Wireline Competition Bureau and the Rural Broadband Auctions Task Force.
    • Panel Three. The Commission will hear presentations from the Media Bureau and the Incentive Auction Task Force.
    • Panel Four. The Commission will hear presentations from the Consumer and Governmental Affairs Bureau, Enforcement Bureau, and Public Safety and Homeland Security Bureau.
    • Panel Five. The Commission will hear presentations from the Office of Communications Business Opportunities, Office of Managing Director, and Office of General Counsel.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by opsa from Pixabay

FY 2021 Omnibus and COVID Stimulus Become Law

The end-of-the-year funding package for FY 2021 is stuffed with technology policy changes.

At the tail end of the calendar year 2020, Congress and the White House finally agreed on FY 2021 appropriations and further COVID-19 relief funding and policies, much of which implicated or involved technology policy. As is often the practice, Congressional stakeholders used the opportunity of must-pass legislation as the vehicle for other legislation that perhaps could not get through a chamber of Congress or surmount the now customary filibuster in the Senate.

Congress cleared the “Consolidated Appropriations Act, 2021” (H.R.133) on 21 December 2020, but President Donald Trump equivocated on whether to sign the package, in part, because it did not provide for $2,000 in aid to every American, a new demand at odds with the one his negotiators worked out with House Democrats and Senate Republicans. Given this disparity, it seems more likely Trump made an issue of the $2,000 assistance to draw attention from a spate of controversial pardons issued to Trump allies and friends. Nonetheless, Trump ultimately signed the package on 27 December.

As one of the only bills, or sets of bills, to pass Congress annually, appropriations acts are often the means by which policy and programmatic changes are made at federal agencies through the legislative branch’s ability to condition the use of the funds it provides. This year’s package is different only in that it contains much more in the way of ride-along legislation than the average omnibus. In fact, there are hundreds, perhaps even more than 1,000 pages, of non-appropriations legislation, some of which pertains to technology policy. Moreover, an additional supplemental bill attached to the FY 2021 omnibus also carries significant technology funding and programming.

First, we will review FY 2021 funding and policy for key U.S. agencies, then discuss COVID-19 related legislation, and then finally all the additional legislation Congress packed into the omnibus.

The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) would receive $2.025 billion, a bare $9 million increase above FY 2020 with significant reordering of how the agency may spend its funds:

  • The agreement includes a net increase of $224,178,000 above the budget request. This includes $226,256,000 above the request to maintain current services, and $54,516,000 in enhancements that are described in more detail below. Assumed in the current services level of funding are several rejections of proposed reductions to prior year initiatives and the inclusion of necessary annualizations to sustain them, such as: $35,606,000 for threat analysis and response; $5,507,000 for soft targets and crowded places security, including school safety and best practices; $6,852,000 for bombing prevention activities, including the train-the-trainer programs; and $67,371,000 to fully fund the Chemical Facility Anti-Terrorism Standards program. The agreement includes the following reductions below the budget request: $6,937,000 for personnel cost adjustments; $2,500,000 of proposed increases to the CyberSentry program; $11,354,000 of proposed increases for the Vulnerability Management program; $2,000,000 of proposed increases to the Cybersecurity Quality Service Management Office (QSMO); $6,500,000 of proposed increases for cybersecurity advisors; and $27,303,000 for the requested increase for protective security advisors. Of the total amount provided for this account, $22,793,000 is available until September 30, 2022, for the National Infrastructure Simulation Analysis Center.

The FY 2021 omnibus requires of CISA the following:

  • Financial Transparency and Accountability.-The Cybersecurity and Infrastructure Security Agency (CISA) is directed to submit the fiscal year 2022 budget request at the same level of PPA detail provided in the table at the end of this report with no further adjustments to the PPA structure. Further, CISA shall brief the Committees not later than 45 days after the date of enactment of this Act and quarterly thereafter on: a spend plan; detailed hiring plans with a delineation of each mission critical occupation (MCO); procurement plans for all major investments to include projected spending and program schedules and milestones; and an execution strategy for each major initiative. The hiring plan shall include an update on CISA’s hiring strategy efforts and shall include the following for each MCO: the number of funded positions and FTE within each PPA; the projected and obligated funding; the number of actual onboard personnel as of the date of the plan; and the hiring and attrition projections for the fiscal year.
  • Cyber Defense Education and Training (CDET).-The agreement includes $29,457,000 for CISA’s CDET programs, an increase of $20,607,000 above the request that is described in further detail below. Efforts are underway to address the shortage of qualified national cybersecurity professionals in the current and future cybersecurity workforce. In order to move forward with a comprehensive plan for a cybersecurity workforce development effort, the agreement includes $10,000,000 above the request to enhance cybersecurity education and training and programs to address the national shortfall of cybersecurity professionals, including activities funded through the use of grants or cooperative agreements as needed in order to fully comply with congressional intent. CISA should consider building a higher education consortium of colleges and universities, led by at least one academic institution with an extensive history of education, research, policy, and outreach in computer science and engineering disciplines; existing designations as a land-grant institution with an extension role; a center of academic excellence in cyber security operations; a proven track record in hosting cyber corps programs; a record of distinction in research cybersecurity; and extensive experience in offering distance education programs and outreach with K-12 programs. The agreement also includes $4,300,000 above the request for the Cybersecurity Education and Training Assistance Program (CETAP), which was proposed for elimination, and $2,500,000 above the request to further expand and initiate cybersecurity education programs, including CETAP, which improve education delivery methods for K-12 students, teachers, counselors and post-secondary institutions and encourage students to pursue cybersecurity careers.
  • Further, the agreement includes $2,500,000 above the request to support CISA’s role with the National Institute of Standards and Technology, National Initiative for Cybersecurity Education Challenge project or for similar efforts to address shortages in the cybersecurity workforce through the development of content and curriculum for colleges, universities, and other higher education institutions.
  • Lastly, the agreement includes $800,000 above the request for a review of CISA’s program to build a national cybersecurity workforce. CISA is directed to enter into a contract for this review with the National Academy of Public Administration, or a similar non-profit organization, within 45 days of the date of enactment of this Act. The review shall assess: whether the partnership models under development by CISA are positioned to be effective and scalable to address current and anticipated needs for a highly capable cybersecurity workforce; whether other existing partnership models, including those used by other agencies and private industry, could usefully augment CISA’s strategy; and the extent to which CISA’s strategy has made progress on workforce development objectives, including excellence, scale, and diversity. A report with the findings of the review shall be provided to the Committees not later than 270 days after the date of enactment of this Act.
  • Cyber QSMO.-To help improve efforts to make strategic cybersecurity services available to federal agencies, the agreement provides $1,514,000 above the request to sustain and enhance prior year investments. As directed in the House report and within the funds provided, CISA is directed to work with the Management Directorate to conduct a crowd-sourced security testing program that uses technology platforms and ethical security researchers to test for vulnerabilities on departmental systems. In addition, not later than 90 days after the date of enactment of this Act, CISA is directed to brief the Committees on opportunities for state and local governments to leverage shared services provided through the Cyber QSMO or a similar capability and to explore the feasibility of executing a pilot program focused on this goal.
  • Cyber Threats to Critical Election Infrastructure.-The briefing required in House Report 116–458 regarding CISA’s efforts related to the 2020 elections shall be delivered not later than 60 days after the date of enactment of this Act. CISA is directed to continue working with SLTT stakeholders to implement election security measures.
  • Cybersecurity Workforce.-By not later than September 30, 2021, CISA shall provide a joint briefing, in conjunction with the Department of Commerce and other appropriate federal departments and agencies, on progress made to date on each recommendation put forth in Executive Order 13800 and the subsequent “Supporting the Growth and Sustainment of the Nation’s Cybersecurity Workforce” report.
  • Hunt and Incident Response Teams.-The agreement includes an increase of $3,000,000 above fiscal year 2020 funding levels to expand CISA’s threat hunting capabilities.
  • Joint Cyber Planning Office (JCPO).-The agreement provides an increase of $10,568,000 above the request to establish a JCPO to bring together federal and SLTT governments, industry, and international partners to strategically and operationally counter nation-state cyber threats. CISA is directed to brief the Committees not later than 60 days after the date of enactment of this Act on a plan for establishing the JCPO, including a budget and hiring plan; a description of how JCPO will complement and leverage other CISA capabilities; and a strategy for partnering with the aforementioned stakeholders.
  • Multi-State Information Sharing and Analysis Center (MS-ISAC).-The agreement provides $5,148,000 above the request for the MS-ISAC to continue enhancements to SLTT election security support, and furthers ransomware detection and response capabilities, including endpoint detection and response, threat intelligence platform integration, and malicious domain activity blocking.
  • Software Assurance Tools.-Not later than 90 days after the date of enactment of this Act, CISA, in conjunction with the Science and Technology Directorate, is directed to brief the Committees on their collaborative efforts to transition cyber-related research and development initiatives into operational tools that can be used to provide continuous software assurance. The briefing should include an explanation for any completed projects and activities that were not considered viable for practice or were considered operationally self-sufficient. Such briefing shall include software assurance projects, such as the Software Assurance Marketplace.
  • Updated Lifecycle Cost Estimates.-CISA is directed to provide a briefing, not later than 60 days after the date of enactment of this Act, regarding the Continuous Diagnostics and Mitigation (CDM) and National Cybersecurity Protection System (NCPS) program lifecycles. The briefing shall clearly describe the projected evolution of both programs by detailing the assumptions that have changed since the last approved program cost and schedule baseline, and by describing the plans to address such changes. In addition, the briefing shall include an analysis of alternatives for aligning vulnerability management, incident response, and NCPS capabilities. Finally, CISA is directed to provide a report not later than 120 days after the date of enactment of this Act with updated five-year program costs and schedules which is congruent with projected capability gaps across federal civilian systems and networks.
  • Vulnerability Management.-The agreement provides $9,452,000 above fiscal year 2020 levels to continue reducing the 12-month backlog in vulnerability assessments. The agreement also provides an increase of $8,000,000 above the request to address the increasing number of identified and reported vulnerabilities in the software and hardware that operates critical infrastructure. This investment will improve capabilities to identify, analyze, and share information about known vulnerabilities and common attack patterns, including through the National Vulnerability Database, and to expand the coordinated responsible disclosure of vulnerabilities.

There are a pair of provisions aimed at the People’s Republic of China (PRC) in Division B (i.e. the FY 2021 Commerce-Justice-Science Appropriations Act):

  • Section 514 prohibits funds for acquisition of certain information systems unless the acquiring department or agency has reviewed and assessed certain risks. Any acquisition of such an information system is contingent upon the development of a risk mitigation strategy and a determination that the acquisition is in the national interest. Each department or agency covered under section 514 shall submit a quarterly report to the Committees on Appropriations describing reviews and assessments of risk made pursuant to this section and any associated findings or determinations.
  • Section 526 prohibits the use of funds by National Aeronautics and Space Administration (NASA), Office of Science and Technology Policy (OSTP), or the National Space Council (NSC) to engage in bilateral activities with China or a Chinese-owned company or effectuate the hosting of official Chinese visitors at certain facilities unless the activities are authorized by subsequent legislation or NASA, OSTP, or NSC have made a certification…

The National Institute of Standards and Technology (NIST) is tasked with a number of duties, most of which relate to current or ongoing efforts in artificial intelligence (AI), cybersecurity, and the Internet of Things:

  • Artificial Intelligence (AI).-The agreement includes no less than $6,500,000 above the fiscal year 2020 level to continue NIST’s research efforts related to AI and adopts House language on Data Characterization Standards in AI. House language on Framework for Managing AI Risks is modified to direct NIST to establish a multi-stakeholder process for the development of an AI Risk Management Framework regarding the reliability, robustness, and trustworthiness of AI systems. Further, within 180 days of enactment of this Act, NIST shall establish the process by which it will engage with stakeholders throughout the multi-year framework development process.
  • Cybersecurity.-The agreement includes no less than the fiscal year 2020 enacted level for cybersecurity research, outreach, industry partnerships, and other activities at NIST, including the National Cybersecurity Center of Excellence (NCCoE) and the National Initiative for Cybersecurity Education (NICE). Within the funds provided, the agreement encourages NIST to establish additional NICE cooperative agreements with regional alliances and multi-stakeholder partnerships for cybersecurity workforce and education.
  • Cybersecurity of Genomic Data.-The agreement includes no less than $1,250,000 for NIST and NCCoE to initiate a use case, in collaboration with industry and academia, to research the cybersecurity of personally identifiable genomic data, with a particular focus on better securing deoxyribonucleic acid sequencing techniques, including clustered regularly interspaced short palindromic repeat (CRISPR) technologies, and genomic data storage architectures from cyber threats. NIST and NCCoE should look to partner with entities who have existing capability to research and develop state-of-the-art cybersecurity technologies for the unique needs of genomic and biomedical-based systems.
  • Industrial Internet of Things (IIoT).-The agreement includes no less than the fiscal year 2020 enacted amount for the continued development of an IIoT cybersecurity research initiative and to partner, as appropriate, with academic entities and industry to improve the sustainable security of IIoT devices in industrial settings.

NIST would receive a modest funding increase of roughly $500,000, from $1.034 billion in FY 2020 to $1.0345 billion in FY 2021.

The National Telecommunications and Information Administration (NTIA) would be provided $45.5 million and “the agreement provides (1) up to $7,500,000 for broadband mapping in coordination with the Federal Communications Commission (FCC); (2) no less than the fiscal year 2020 enacted amount for Broadband Programs; (3) $308,000 for Public Safety Communications; and (4) no less than $3,000,000 above the fiscal year 2020 enacted level for Advanced Communications Research.” The agency's FY 2021 funding is higher than the roughly $40 million it received in the last fiscal year but far less than the Trump Administration's request of more than $70 million.

Regarding NTIA programmatic language, the bill provides:

  • Further, the agreement directs the additional funds for Advanced Communications Research be used to procure and maintain cutting-edge equipment for research and testing of the next generation of communications technologies, including 5G, as well as to hire staff as needed. The agreement further encourages NTIA to improve the deployment of 5G and spectrum sharing through academic partnerships to accelerate the development of low-cost sensors. For fiscal year 2021, NTIA is directed to follow prior year report language, included in Senate Report 116-127 and adopted in Public Law 116-93, on the following topics: Federal Spectrum Management, Spectrum Management for Science, and the Internet Corporation for Assigned Names and Numbers (ICANN).
  • Spectrum Management System.-The agreement encourages NTIA and the Department to consider alternative proposals to fully fund the needed upgrades to its spectrum management system, including options outside of direct appropriations, and directs NTIA to brief the Committees regarding possible alternative options no later than 90 days after enactment of this Act.
  • Next Generation Broadband in Rural Areas.-NTIA is encouraged to ensure that deployment of last-mile broadband infrastructure is targeted to areas that are currently unserved or underserved, and to utilize public-private partnerships and projects where Federal funding will not exceed 50 percent of a project’s total cost where practicable.
  • National Broadband Map Augmentation.-NTIA is directed to engage with rural and Tribal communities to further enhance the accuracy of the national broadband availability map. NTIA should include in its fiscal year 2022 budget request an update on rural- and Tribal-related broadband availability and access trends, challenges, and Federal actions to achieve equitable access to broadband services in currently underserved communities throughout the Nation. Furthermore, NTIA is encouraged, in coordination with the FCC, to develop and promulgate a standardized process for collecting data from State and local partners.
  • Domain Name Registration.-NTIA is directed, through its position within the Governmental Advisory Committee, to work with ICANN to expedite the establishment of a global access model that provides law enforcement, intellectual property rights holders, and third parties with timely access to accurate domain name registration information for legitimate purposes. NTIA is encouraged, as appropriate, to require registrars and registries based in the United States to collect and make public accurate domain name registration information.

The Federal Trade Commission (FTC) would receive $351 million, an increase of $20 million over FY 2020. The final bill includes this policy provision for the FTC to heed:

  • Resources for Data Privacy and Security. -The agreement urges the FTC to conduct a comprehensive internal assessment measuring the agency’s current efforts related to data privacy and security while separately identifying all resource-based needs of the FTC to improve in these areas. The agreement also urges the FTC to provide a report describing the assessment’s findings to the Committees within 180 days of enactment of this Act.

The Federal Communications Commission (FCC) would see a larger increase in funding for agency operations than the FTC, going from $339 million in FY 2020 to $374 million in FY 2021. However, $33 million of the increase is earmarked for implementing the “Broadband DATA Act” (P.L.116-130) along with the $65 million in COVID-19 supplemental funding for the same purpose. The FY 2021 omnibus directs the FCC on a range of policy issues:

  • Broadband Maps.-In addition to adopting the House report language on Broadband Maps, the agreement provides substantial dedicated resources for the FCC to implement the Broadband DATA Act. The FCC is directed to submit a report to the Committees on Appropriations within 90 days of enactment of this Act providing a detailed spending plan for these resources. In addition, the FCC, in coordination with the NTIA, shall outline the specific roles and responsibilities of each agency as it relates to the National Broadband Map and implementation of the Broadband DATA Act. The FCC is directed to report in writing to the Committees every 30 days on the date, amount, and purpose of any new obligation made for broadband mapping and any updates to the broadband mapping spending plan.
  • Lifeline Service.-In lieu of the House report language on Lifeline Service, the agreement notes recent action by the FCC to partially waive its rules updating the Lifeline program's minimum service standard for mobile broadband usage in light of the large increase to the standard that would have gone into effect on December 1, 2020, and the increased reliance by Americans on mobile broadband as a result of the pandemic. The FCC is urged to continue to balance the Lifeline program's goals of accessibility and affordability.
  • 5G Fund and Rural America.-The agreement remains concerned about the feasible deployment of 5G in rural America. Rural locations will likely run into geographic barriers and infrastructure issues preventing the robust deployment of 5G technology, just as they have with 4G. The FCC's proposed 5G Fund fails to provide adequate details or a targeted spend plan on creating seamless coverage in the most rural parts of the Nation. Given these concerns, the FCC is directed to report in writing on: (1) its current and future plans for prioritizing deployment of 4G coverage in rural areas, (2) its plans for 5G deployment in rural areas, and (3) its plan for improving the mapping and long-term tracking of coverage in rural areas.
  • 6 Gigahertz.-As the FCC has authorized unlicensed use of the 6 gigahertz band, the agreement expects the Commission to ensure its plan does not result in harmful interference to incumbent users or impact critical infrastructure communications systems. The agreement is particularly concerned about the potential effects on the reliability of the electric transmission and distribution system. The agreement expects the FCC to ensure any mitigation technologies are rigorously tested and found to be effective in order to protect the electric transmission system. The FCC is directed to provide a report to the Committees within 90 days of enactment of this Act on its progress in ensuring rigorous testing related to unlicensed use of the 6 gigahertz band.
  • Rural Broadband.-The agreement remains concerned that far too many Americans living in rural and economically disadvantaged areas lack access to broadband at speeds necessary to fully participate in the Internet age. The agreement encourages the agency to prioritize projects in underserved areas, where the infrastructure to be installed provides access at download and upload speeds comparable to those available to Americans in urban areas. The agreement encourages the FCC to avoid efforts that could duplicate existing networks and to support deployment of last-mile broadband infrastructure to underserved areas. Further, the agreement encourages the agency to prioritize projects financed through public-private partnerships.
  • Contraband Cell Phones. -The agreement notes continued concern regarding the exploitation of contraband cell phones in prisons and jails nationwide. The agreement urges the FCC to act on the March 24, 2017 Further Notice of Proposed Rulemaking regarding combating contraband wireless devices. The FCC should consider all legally permissible options, including the creation, or use, of “quiet or no service zones,” geolocation-based denial, and beacon technologies to geographically appropriate correctional facilities. In addition, the agreement encourages the FCC to adopt a rules-based approach to cellphone disabling that would require immediate disabling by a wireless carrier upon proper identification of a contraband device. The agreement recommends that the FCC move forward with its suggestion in the Fiscal Year 2019 report to this Committee, noting that “additional field testing of jamming technology will provide a better understanding of the challenges and costs associated with the proper deployment of jamming system.” The agreement urges the FCC to use available funds to coordinate rigorous Federal testing of jamming technology and coordinate with all relevant stakeholders to effectively address this urgent problem.
  • Next-Generation Broadband Networks for Rural America.-Deployment of broadband and telecommunications services in rural areas is imperative to support economic growth and public safety. However, due to geographical challenges facing mobile connectivity and fiber providers, connectivity in certain areas remains difficult. Next generation satellite-based technology is being developed to deliver direct satellite-to-cellular capability. The FCC is encouraged to address potential regulatory hurdles, to promote private sector development and implementation of innovative, next generation networks such as this, and to accelerate broadband and telecommunications access to all Americans.

The omnibus provides $635 million for a Department of Agriculture rural development pilot program, and the Secretary will need to explain how he or she will use authority provided in the last farm bill to expand broadband:

  • The agreement provides $635,000,000 to support the ReConnect pilot program to increase access to broadband connectivity in unserved rural communities and directs the Department to target grants and loans to areas of the country with the largest broadband coverage gaps. These projects should utilize technology that will maximize coverage of broadband with the most benefit to taxpayers and the rural communities served. The agreement notes stakeholder concerns that the ReConnect pilot does not effectively recognize the unique challenges and opportunities that different technologies, including satellite, provide to delivering broadband in noncontiguous States or mountainous terrain and is concerned that providing preference to 100 Mbps symmetrical service unfairly disadvantages these communities by limiting the deployment of other technologies capable of providing service to these areas.
  • The Agriculture Improvement Act of 2018 (Public Law 115-334) included new authorities for rural broadband programs that garnered broad stakeholder support as well as bipartisan, bicameral agreement in Congress. Therefore, the Secretary is directed to provide a report on how the Department plans to utilize these authorities to deploy broadband connectivity to rural communities.

In Division M of the package, the “Coronavirus Response and Relief Supplemental Appropriations Act, 2021,” there are provisions related to broadband policy and funding. The bill creates a $3.2 billion program to help low-income Americans pay for internet service and buy devices for telework or distance education. The “Emergency Broadband Benefit Program” is established at the FCC, “under which eligible households may receive a discount of up to $50, or up to $75 on Tribal lands, off the cost of internet service and a subsidy for low-cost devices such as computers and tablets” according to a House Appropriations Committee summary. This funding is far short of what House Democrats wanted, but the program aims to help those on the wrong side of the digital divide during the pandemic.

Moreover, this legislation also establishes two grant programs at the NTIA designed to help provide broadband on Tribal lands and in rural areas. $1 billion is provided for the former and $300 million for the latter, with the funds going to Tribal, state, and local governments to obtain services from private sector providers. The $1 billion for Tribal lands allows for greater flexibility in how the funds are ultimately spent, while the $300 million for underserved rural areas is restricted to broadband deployment. Again, these funds are aimed at bridging the disparity in broadband service exposed and exacerbated during the pandemic.

Congress also provided funds for the FCC to reimburse smaller telecommunications providers in removing and replacing risky telecommunications equipment from the People's Republic of China (PRC). Following the enactment of the “Secure and Trusted Communications Networks Act of 2019” (P.L.116-124), which codified and added to an FCC regulatory effort to address the risks posed by Huawei and ZTE equipment in United States (U.S.) telecommunications networks, there was pressure in Congress to provide the funds necessary to help carriers meet the requirements of the program. The FY 2021 omnibus appropriates $1.9 billion for this program. In a separate, largely unrelated tranche of funding, the aforementioned $65 million was provided to the FCC to implement the “Broadband DATA Act.”

Division Q contains text similar to the “Cybersecurity and Financial System Resilience Act of 2019” (H.R.4458) that would require “the Board of Governors of the Federal Reserve System, Office of the Comptroller of the Currency, Federal Deposit Insurance Corporation, and National Credit Union Administration to annually report on efforts to strengthen cybersecurity by the agencies, financial institutions they regulate, and third-party service providers.”

Division U contains two bills pertaining to technology policy:

  • Title I. The AI in Government Act of 2020. This title codifies the AI Center of Excellence within the General Services Administration to advise and promote the efforts of the federal government in developing innovative uses of artificial intelligence (AI) and competency in the use of AI in the federal government. The section also requires that the Office of Personnel Management identify key skills and competencies needed for federal positions related to AI and establish an occupational series for positions related to AI.
  • Title IX. The DOTGOV Act. This title transfers the authority to manage the .gov internet domain from the General Services Administration to the Cybersecurity and Infrastructure Security Agency (CISA) of the Department of Homeland Security. The .gov internet domain shall be available to any Federal, State, local, or territorial government entity, or other publicly controlled entity, subject to registration requirements established by the Director of CISA and approved by the Director of the Office of Management and Budget.

Division W is the FY 2021 Intelligence Authorization Act with the following salient provisions:

  • Section 323. Report on signals intelligence priorities and requirements. Section 323 requires the Director of National Intelligence (DNI) to submit a report detailing signals intelligence priorities and requirements subject to Presidential Policy Directive-28 (PPD-28) that stipulates “why, whether, when, and how the United States conducts signals intelligence activities.” PPD-28 reformed how the National Security Agency (NSA) and other Intelligence Community (IC) agencies conducted signals intelligence, specifically collection of cellphone and internet data, after former NSA contractor Edward Snowden exposed the scope of the agency’s programs.
  • Section 501. Requirements and authorities to improve education in science, technology, engineering, arts, and mathematics. Section 501 ensures that the Director of the Central Intelligence Agency (CIA) has the legal authorities required to improve the skills in science, technology, engineering, arts, and mathematics (known as STEAM) necessary to meet long-term national security needs.
  • Section 502. Seedling investment in next-generation microelectronics in support of artificial intelligence. Section 502 requires the DNI, acting through the Director of the Intelligence Advanced Research Projects Activity, to award contracts or grants, or enter into other transactions, to encourage microelectronics research.
  • Section 601. Report on attempts by foreign adversaries to build telecommunications and cybersecurity equipment and services for, or to provide them to, certain U.S. Section 601 requires the CIA, NSA, and DIA to submit a joint report that describes the United States intelligence sharing and military posture in Five Eyes countries that currently have or intend to use adversary telecommunications or cybersecurity equipment, especially as provided by China or Russia, with a description of potential vulnerabilities of that information and assessment of mitigation options.
  • Section 602. Report on foreign use of cyber intrusion and surveillance technology. Section 602 requires the DNI to submit a report on the threats posed by foreign governments and foreign entities using and appropriating commercially available cyber intrusion and other surveillance technology.
  • Section 603. Reports on recommendations of the Cyberspace Solarium Commission. Section 603 requires the ODNI and representatives of other agencies to report to Congress their assessment of the recommendations submitted by the Cyberspace Solarium Commission pursuant to Section 1652(j) of the John S. McCain National Defense Authorization Act (NDAA) for Fiscal Year 2019, and to describe actions that each agency expects to take to implement these recommendations.
  • Section 604. Assessment of critical technology trends relating to artificial intelligence, microchips, and semiconductors and related matters. Section 604 requires the DNI to complete an assessment of export controls related to artificial intelligence (AI), microchips, advanced manufacturing equipment, and other AI-enabled technologies, including the identification of opportunities for further cooperation with international partners.
  • Section 605. Combating Chinese influence operations in the United States and strengthening civil liberties protections. Section 605 provides additional requirements to annual reports on Influence Operations and Campaigns in the United States by the Chinese Communist Party (CCP) by mandating an identification of influence operations by the CCP against the science and technology sector in the United States. Section 605 also requires the Federal Bureau of Investigation (FBI) to create a plan to increase public awareness of influence activities by the CCP. Finally, Section 605 requires the FBI, in consultation with the Assistant Attorney General for Civil Rights and the Chief Privacy and Civil Liberties Officer of the Department of Justice, to develop recommendations to strengthen relationships with communities targeted by the CCP and to build trust with such communities through local and regional grassroots outreach.
  • Section 606. Annual report on corrupt activities of senior officials of the CCP. Section 606 requires the CIA, in coordination with the Department of Treasury’s Office of Intelligence and Analysis and the FBI, to submit to designated congressional committees annually through 2025 a report that describes and assesses the wealth and corruption of senior officials of the CCP, as well as targeted financial measures, including potential targets for sanctions designation. Section 606 further expresses the Sense of Congress that the United States should undertake every effort and pursue every opportunity to expose the corruption and illicit practices of senior officials of the CCP, including President Xi Jinping.
  • Section 607. Report on corrupt activities of Russian and other Eastern European oligarchs. Section 607 requires the CIA, in coordination with the Department of the Treasury's Office of Intelligence and Analysis and the FBI, to submit to designated congressional committees and the Under Secretary of State for Public Diplomacy a report that describes the corruption and corrupt or illegal activities among Russian and other Eastern European oligarchs who support the Russian government and Russian President Vladimir Putin, and the impact of those activities on the economy and citizens of Russia. Section 607 further requires the CIA, in coordination with the Department of the Treasury's Office of Intelligence and Analysis, to describe potential sanctions that could be imposed for such activities.
  • Section 608. Report on biosecurity risk and disinformation by the CCP and the PRC. Section 608 requires the DNI to submit to the designated congressional committees a report identifying whether and how CCP officials and the Government of the People's Republic of China may have sought to suppress or exploit for national advantage information regarding the novel coronavirus pandemic, including specific related assessments. Section 608 further provides that the report shall be submitted in unclassified form, but may have a classified annex.
  • Section 612. Research partnership on activities of People's Republic of China. Section 612 requires the Director of the National Geospatial-Intelligence Agency (NGA) to seek to enter into a partnership with an academic or non-profit research institution to carry out joint unclassified geospatial intelligence analyses of the activities of the People's Republic of China that pose national security risks to the United States, and to make publicly available unclassified products relating to such analyses.

Division Z would tweak a data center energy efficiency and energy savings program overseen by the Secretary of Energy and the Administrator of the Environmental Protection Agency that could impact the Office of Management and Budget’s (OMB) government-wide program. Specifically, “Section 1003 requires the development of a metric for data center energy efficiency, and requires the Secretary of Energy, Administrator of the Environmental Protection Agency (EPA), and Director of the Office of Management and Budget (OMB) to maintain a data center energy practitioner program and open data initiative for federally owned and operated data center energy usage.” There is also language that would require the U.S. government to buy and use more energy-efficient information technology (IT): “each Federal agency shall coordinate with the Director [of OMB], the Secretary, and the Administrator of the Environmental Protection Agency to develop an implementation strategy (including best-practices and measurement and verification techniques) for the maintenance, purchase, and use by the Federal agency of energy-efficient and energy-saving information technologies at or for facilities owned and operated by the Federal agency, taking into consideration the performance goals.”
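
For context, Division Z does not prescribe which "metric for data center energy efficiency" the agencies must develop. The metric most widely used in industry today is Power Usage Effectiveness (PUE): the ratio of a facility's total energy consumption to the energy consumed by the IT equipment alone, with 1.0 as the theoretical ideal. The short sketch below only illustrates how such a metric is typically computed; it is not drawn from the bill, and the function name and annual figures are hypothetical.

```python
# Illustrative sketch: Power Usage Effectiveness (PUE), a common data center
# energy-efficiency metric. The omnibus does not specify this metric; the
# figures below are hypothetical.

def power_usage_effectiveness(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (1.0 is the ideal)."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual figures for a single federally operated data center.
total_kwh = 4_800_000   # everything on the meter: IT load, cooling, lighting, losses
it_kwh = 3_200_000      # servers, storage, and network gear only
print(f"PUE: {power_usage_effectiveness(total_kwh, it_kwh):.2f}")  # prints "PUE: 1.50"
```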

Division FF contains telecommunications provisions:

  • Section 902. Don't Break Up the T-Band Act of 2020. Section 902 repeals the requirement for the FCC to reallocate and auction the 470 to 512 megahertz band, commonly referred to as the T-band. In certain urban areas, the T-band is utilized by public-safety entities. It also directs the FCC to implement rules to clarify acceptable expenditures on which 9-1-1 fees can be spent, and creates a strike force to consider how the Federal Government can end 9-1-1 fee diversion.
  • Section 903. Advancing Critical Connectivity Expands Service, Small Business Resources, Opportunities, Access, and Data Based on Assessed Need and Demand (ACCESS BROADBAND) Act. Section 903 establishes the Office of Internet Connectivity and Growth (Office) at the NTIA. This Office would be tasked with performing certain responsibilities related to broadband access, adoption, and deployment, such as performing public outreach to promote access and adoption of high-speed broadband service, and streamlining and standardizing the process for applying for Federal broadband support. The Office would also track Federal broadband support funds, and coordinate Federal broadband support programs within the Executive Branch and with the FCC to ensure unserved Americans have access to connectivity and to prevent duplication of broadband deployment programs.
  • Section 904. Broadband Interagency Coordination Act. Section 904 requires the Federal Communications Commission (FCC), the National Telecommunications and Information Administration (NTIA), and the Department of Agriculture to enter into an interagency agreement to coordinate the distribution of federal funds for broadband programs, to prevent duplication of support and ensure stewardship of taxpayer dollars. The agreement must cover, among other things, the exchange of information about project areas funded under the programs and the confidentiality of such information. The FCC is required to publish and collect public comments about the agreement, including regarding its efficacy and suggested modifications.
  • Section 905. Beat CHINA for 5G Act of 2020. Section 905 directs the President, acting through the Assistant Secretary of Commerce for Communications and Information, to withdraw or modify federal spectrum assignments in the 3450 to 3550 megahertz band, and directs the FCC to begin a system of competitive bidding to permit non-Federal, flexible-use services in a portion or all of such band no later than December 31, 2021.

Section 905 would countermand the White House's efforts to auction off an ideal part of spectrum for 5G (see here for analysis of the August 2020 announcement). Stakeholders in Congress and a number of Trump Administration officials were alarmed by what they saw as a push to bestow a windfall on a private sector company in the rollout of 5G.

Title XIV of Division FF would allow the FTC to seek civil fines of more than $43,000 per violation during the duration of the public health emergency arising from the pandemic “for unfair and deceptive practices associated with the treatment, cure, prevention, mitigation, or diagnosis of COVID–19 or a government benefit related to COVID-19.”

Finally, Division FF is the vehicle for the “American COMPETE Act” that:

directs the Department of Commerce and the FTC to conduct studies and submit reports on technologies including artificial intelligence, the Internet of Things, quantum computing, blockchain, advanced materials, unmanned delivery services, and 3-D printing. The studies include requirements to survey each industry and report recommendations to help grow the economy and safely implement the technology.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by forcal35 from Pixabay

Further Reading, Other Developments, and Coming Events (8 December)

Further Reading

  • “Facebook failed to put fact-check labels on 60% of the most viral posts containing Georgia election misinformation that its own fact-checkers had debunked, a new report says” By Tyler Sonnemaker — Business Insider. Despite its vows to improve its handling of untrue and false content, the platform is not consistently taking down such material related to the runoffs for the Georgia Senate seats. The group behind this finding argues it is because Facebook does not want to. What is left unsaid is that engagement drives revenue, and so, Facebook's incentives are not to police all violations. Rather, it would be to take down enough to be able to say they're doing something.
  • “Federal Labor Agency Says Google Wrongly Fired 2 Employees” By Kate Conger and Noam Scheiber — The New York Times. The National Labor Relations Board (NLRB) has reportedly sided with two employees Google fired for activities that are traditionally considered labor organizing. The two engineers had been dismissed for allegedly violating the company's data security practices when they researched the company's retention of a union-busting firm and sought to alert others about organizing. Even though Google is vowing to fight the action, which has not been finalized, it may well settle given the view of Big Tech in Washington these days. This action could also foretell how a Biden Administration NLRB may look at the labor practices of these companies.
  • “U.S. states plan to sue Facebook next week: sources” By Diane Bartz — Reuters. We could see state and federal antitrust suits against Facebook this week. One investigation led by New York Attorney General Tish James could include 40 states although the grounds for alleged violations have not been leaked at this point. It may be Facebook's acquisition of potential rivals Instagram and WhatsApp that have allowed it to dominate the social messaging market. The Federal Trade Commission (FTC) may also file suit, and, again, the grounds are unknown. The European Commission (EC) is also investigating Facebook for possible violations of European Union (EU) antitrust law over the company's use of the personal data it holds and uses and about its operation of its online marketplace.
  • “The Children of Pornhub” By Nicholas Kristof — The New York Times. This column comprehensively traces the reprehensible recent history of a Canadian conglomerate Mindgeek that owns Pornhub where one can find reams of child and non-consensual pornography. Why Ottawa has not cracked down on this firm is a mystery. The passage and implementation of the “Allow States and Victims to Fight Online Sex Trafficking Act of 2017” (P.L. 115-164) that narrowed the liability shield under 47 USC 230 has forced the company to remove content, a significant change from its indifference before the statutory change in law. Kristof suggests some easy, common sense changes Mindgeek could implement to combat the presence of this illegal material, but it seems like the company will do enough to say it is acting without seriously reforming its platform. Why would it? There is too much money to be made. Additionally, those fighting against this sort of material have been pressuring payment platforms to stop doing business with Mindgeek. PayPal has foresworn any interaction, and due to pressure Visa and Mastercard are “reviewing” their relationship with Mindgeek and Pornhub. In a statement to a different news outlet, Pornhub claimed it is “unequivocally committed to combating child sexual abuse material (CSAM), and has instituted a comprehensive, industry-leading trust and safety policy to identify and eradicate illegal material from our community.” The company further claimed “[a]ny assertion that we allow CSAM is irresponsible and flagrantly untrue….[w]e have zero tolerance for CSAM.”
  • “Amazon and Apple Are Powering a Shift Away From Intel's Chips” By Don Clark — The New York Times. Two tech giants have chosen new, faster, cheaper chips, signaling a possible industry shift away from Intel, the firm that has been a significant player for decades. Intel will not go quietly, of course, and a key variable is whether must-have software and applications are rewritten to accommodate the new chips from a British firm, Arm.

Other Developments

  • The Government Accountability Office (GAO) and the National Academy of Medicine (NAM) have released a joint report on artificial intelligence in healthcare, consisting of GAO's Technology Assessment: Artificial Intelligence in Health Care: Benefits and Challenges of Technologies to Augment Patient Care and NAM's Special Publication: Advancing Artificial Intelligence in Health Settings Outside the Hospital and Clinic. GAO's report “discusses three topics: (1) current and emerging AI tools available for augmenting patient care and their potential benefits, (2) challenges to the development and adoption of these tools, and (3) policy options to maximize benefits and mitigate challenges to the use of AI tools to augment patient care.” NAM's “paper aims to provide an analysis of: 1) current technologies and future applications of AI in HSOHC, 2) the logistical steps and challenges involved in integrating AI-HSOHC applications into existing provider workflows, and 3) the ethical and legal considerations of such AI tools, followed by a brief proposal of potential key initiatives to guide the development and adoption of AI in health settings outside the hospital and clinic (HSOHC).”
    • The GAO “identified five categories of clinical applications where AI tools have shown promise to augment patient care: predicting health trajectories, recommending treatments, guiding surgical care, monitoring patients, and supporting population health management.” The GAO “also identified three categories of administrative applications where AI tools have shown promise to reduce provider burden and increase the efficiency of patient care: recording digital clinical notes, optimizing operational processes, and automating laborious tasks.” The GAO stated:
      • This technology assessment also identifies challenges that hinder the adoption and impact of AI tools to augment patient care, according to stakeholders, experts, and the literature. Difficulties accessing sufficient high-quality data may hamper innovation in this space. Further, some available data may be biased, which can reduce the effectiveness and accuracy of the tools for some people. Addressing bias can be difficult because the electronic health data do not currently represent the general population. It can also be challenging to scale tools up to multiple locations and integrate them into new settings because of differences in institutions and the patient populations they serve. The limited transparency of AI tools used in health care can make it difficult for providers, regulators, and others to determine whether an AI tool is safe and effective. A greater dispersion of data across providers and institutions can make securing patient data difficult. Finally, one expert described how existing case law does not specifically address AI tools, which can make providers and patients reticent to adopt them. Some of these challenges are similar to those identified previously by GAO in its first publication in this series, such as the lack of high-quality, structured data, and others are more specific to patient care, such as liability concerns.
    • The GAO “described six policy options:”
      • Collaboration. Policymakers could encourage interdisciplinary collaboration between developers and health care providers. This could result in AI tools that are easier to implement and use within an existing workflow.
      • Data Access. Policymakers could develop or expand high-quality data access mechanisms. This could help developers address bias concerns by ensuring data are representative, transparent, and equitable.
      • Best Practices. Policymakers could encourage relevant stakeholders and experts to establish best practices (such as standards) for development, implementation, and use of AI technologies. This could help with deployment and scalability of AI tools by providing guidance on data, interoperability, bias, and formatting issues.
      • Interdisciplinary Education. Policymakers could create opportunities for more workers to develop interdisciplinary skills. This could allow providers to use AI tools more effectively, and could be accomplished through a variety of methods, including changing medical curricula or grants.
      • Oversight Clarity. Policymakers could collaborate with relevant stakeholders to clarify appropriate oversight mechanisms. Predictable oversight could help ensure that AI tools remain safe and effective after deployment and throughout their lifecycle.
      • Status Quo. Policymakers could allow current efforts to proceed without intervention.
    • NAM claimed:
      • Numerous AI-powered health applications designed for personal use have been shown to improve patient outcomes, building predictions based on large volumes of granular, real-time, and individualized behavioral and medical data. For instance, some forms of telehealth, a technology that has been critical during the COVID-19 pandemic, benefit considerably from AI software focused on natural language processing, which enables efficient triaging of patients based on urgency and type of illness. Beyond patient-provider communication, AI algorithms relevant to diabetic and cardiac care have demonstrated remarkable efficacy in helping patients manage their blood glucose levels in their day-to-day lives and in detecting cases of atrial fibrillation. AI tools that monitor and synthesize longitudinal patient behaviors are also particularly useful in psychiatric care, where the exact timing of interventions is often critical. For example, smartphone-embedded sensors that track location and proximity of individuals can alert clinicians of possible substance use, prompting immediate intervention. On the population health level, these individual indicators of activity and health can be combined with environmental- and system-level data to generate predictive insight into local and global health trends. The most salient example of this may be the earliest warnings of the COVID-19 outbreak, issued in December 2019 by two private AI technology firms.
      • Successful implementation and widespread adoption of AI applications in HSOHC requires careful consideration of several key issues related to personal data, algorithm development, and health care insurance and payment. Chief among them are data interoperability, standardization, privacy, ameliorating systemic biases in algorithms, reimbursement of AI-assisted services, quality improvement, and integration of AI tools into provider workflows. Overcoming these challenges and optimizing the impact of AI tools on clinical outcomes will involve engaging diverse stakeholders, deliberately designing AI tools and interfaces, rigorously evaluating clinical and economic utility, and diffusing and scaling algorithms across different health settings. In addition to these potential logistical and technical hurdles, it is imperative to consider the legal and ethical issues surrounding AI, particularly as it relates to the fair and humanistic deployment of AI applications in HSOHC. Important legal considerations include the appropriate designation of accountability and liability of medical errors resulting from AI-assisted decisions for ensuring the safety of patients and consumers. Key ethical challenges include upholding the privacy of patients and their data—particularly with regard to non-HIPAA covered entities involved in the development of algorithms—building unbiased AI algorithms based on high-quality data from representative populations, and ensuring equitable access to AI technologies across diverse communities.
  • The National Institute of Standards and Technology (NIST) published a “new study of face recognition technology created after the onset of the COVID-19 pandemic [that] shows that some software developers have made demonstrable progress at recognizing masked faces.” In Ongoing Face Recognition Vendor Test (FRVT) Part 6B: Face Recognition Accuracy with Face Masks Using Post-COVID-19 Algorithms (NISTIR 8331), NIST stated the “report augments its predecessor with results for more recent algorithms provided to NIST after mid-March 2020.” NIST said that “[w]hile we do not have information on whether or not a particular algorithm was designed with face coverings in mind, the results show evidence that a number of developers have adapted their algorithms to support face recognition on subjects potentially wearing face masks.” NIST stated that
    • The following results represent observations on algorithms provided to NIST both before and after the COVID-19 pandemic to date. We do not have information on whether or not a particular algorithm was designed with face coverings in mind. The results documented capture a snapshot of algorithms submitted to the FRVT 1:1 in face recognition on subjects potentially wearing face masks.
      • False rejection performance: All algorithms submitted after the pandemic continue to give increased false non-match rates (FNMR) when the probes are masked. While a few pre-pandemic algorithms still remain within the most accurate on masked photos, some developers have submitted algorithms after the pandemic showing significantly improved accuracy and are now among the most accurate in our test.
      • Evolution of algorithms on face masks: We observe that a number of algorithms submitted since mid-March 2020 show notable reductions in error rates with face masks over their pre-pandemic predecessors. When comparing error rates for unmasked versus masked faces, the median FNMR across algorithms submitted since mid-March 2020 has been reduced by around 25% from the median pre-pandemic results. The figure below presents examples of developer evolution on both masked and unmasked datasets. For some developers, false rejection rates in their algorithms submitted since mid-March 2020 decreased by as much as a factor of 10 over their pre-pandemic algorithms, which is evidence that some providers are adapting their algorithms to handle facemasks. However, in the best cases, when comparing results for unmasked images to masked images, false rejection rates have increased from 0.3%-0.5% (unmasked) to 2.4%-5% (masked).
      • False acceptance performance: As most systems are configured with a fixed threshold, it is necessary to report both false negative and false positive rates for each group at that threshold. When comparing a masked probe to an unmasked enrollment photo, in most cases, false match rates (FMR) are reduced by masks. The effect is generally modest with reductions in FMR usually being smaller than a factor of two. This property is valuable in that masked probes do not impart adverse false match security consequences for verification.
      • Mask-agnostic face recognition: All 1:1 verification algorithms submitted to the FRVT test since the start of the pandemic are evaluated on both masked and unmasked datasets. The test is designed this way to mimic operational reality: some images will have masks, some will not (especially enrollment samples from a database or ID card). And to the extent that the use of protective masks will exist for some time, our test will continue to evaluate algorithmic capability on verifying all combinations of masked and unmasked faces.
  • The government in London has issued a progress report on its current cybersecurity strategy that has another year to run. The Paymaster General assessed how well the United Kingdom (UK) has implemented the National Cyber Security Strategy 2016 to 2021 and pointed to goals yet to be achieved. This assessment comes in the shadow of the pending exit of the UK from the European Union (EU) and Prime Minister Boris Johnson’s plans to increase the UK’s role in select defense issues, including cyber operations. The Paymaster General stated:
    • The global landscape has changed significantly since the publication of the National Cyber Security Strategy Progress Report in May 2019. We have seen unprecedented levels of disruption to our way of life that few would have predicted. The COVID-19 pandemic has increased our reliance on digital technologies – for our personal communications with friends and family and our ability to work remotely, as well as for businesses and government to continue to operate effectively, including in support of the national response.
    • These new ways of living and working highlight the importance of cyber security, which is also underlined by wider trends. An ever greater reliance on digital networks and systems, more rapid advances in new technologies, a wider range of threats, and increasing international competition on underlying technologies and standards in cyberspace, emphasise the need for good cyber security practices for individuals, businesses and government.
    • Although the scale and international nature of these changes present challenges, there are also opportunities. With the UK’s departure from the European Union in January 2020, we can define and strengthen Britain’s place in the world as a global leader in cyber security, as an independent, sovereign nation.
    • The sustained, strategic investment and whole of society approach delivered so far through the National Cyber Security Strategy has ensured we are well placed to respond to this changing environment and seize new opportunities.
    • The Paymaster General asserted:
      • [The] report has highlighted growing risks, some accelerated by the COVID-19 pandemic, and longer-term trends that will shape the environment over the next decade:
      • Ever greater reliance on digital networks and systems as daily life moves online, bringing huge benefits but also creating new systemic and individual risks.
      • Rapid technological change and greater global competition, challenging our ability to shape the technologies that will underpin our future security and prosperity.
      • A wider range of adversaries as criminals gain easier access to commoditised attack capabilities and cyber techniques form a growing part of states’ toolsets.
      • Competing visions for the future of the internet and the risk of fragmentation, making consensus on norms and ethics in cyberspace harder to achieve.
      • In February 2020 the Prime Minister announced the Integrated Review of Security, Defence, Development and Foreign Policy. This will define the government’s ambition for the UK’s role in the world and the long-term strategic aims of our national security and foreign policy. It will set out the way in which the UK will be a problem-solving and burden-sharing nation, and a strong direction for recovery from COVID-19, at home and overseas.
      • This will help to shape our national approach and priorities on cyber security beyond 2021. Cyber security is a key element of our international, defence and security posture, as well as a driving force for our economic prosperity.
  • The University of Toronto's Citizen Lab published a report on an Israeli surveillance firm that uses “[o]ne of the widest-used—but least appreciated” means of surveilling people (i.e., “leveraging of weaknesses in the global mobile telecommunications infrastructure to monitor and intercept phone calls and traffic”). Citizen Lab explained that an affiliate of the NSO Group, “Circles is known for selling systems to exploit Signaling System 7 (SS7) vulnerabilities, and claims to sell this technology exclusively to nation-states.” Citizen Lab noted that “[u]nlike NSO Group's Pegasus spyware, the SS7 mechanism by which Circles' product reportedly operates does not have an obvious signature on a target's phone, such as the telltale targeting SMS bearing a malicious link that is sometimes present on a phone targeted with Pegasus.” Citizen Lab found that
    • Circles is a surveillance firm that reportedly exploits weaknesses in the global mobile phone system to snoop on calls, texts, and the location of phones around the globe. Circles is affiliated with NSO Group, which develops the oft-abused Pegasus spyware.
    • Circles, whose products work without hacking the phone itself, says they sell only to nation-states. According to leaked documents, Circles customers can purchase a system that they connect to their local telecommunications companies’ infrastructure, or can use a separate system called the “Circles Cloud,” which interconnects with telecommunications companies around the world.
    • According to the U.S. Department of Homeland Security, all U.S. wireless networks are vulnerable to the types of weaknesses reportedly exploited by Circles. A majority of networks around the globe are similarly vulnerable.
    • Using Internet scanning, we found a unique signature associated with the hostnames of Check Point firewalls used in Circles deployments. This scanning enabled us to identify Circles deployments in at least 25 countries.
    • We determine that the governments of the following countries are likely Circles customers: Australia, Belgium, Botswana, Chile, Denmark, Ecuador, El Salvador, Estonia, Equatorial Guinea, Guatemala, Honduras, Indonesia, Israel, Kenya, Malaysia, Mexico, Morocco, Nigeria, Peru, Serbia, Thailand, the United Arab Emirates (UAE), Vietnam, Zambia, and Zimbabwe.
    • Some of the specific government branches we identify with varying degrees of confidence as being Circles customers have a history of leveraging digital technology for human rights abuses. In a few specific cases, we were able to attribute the deployment to a particular customer, such as the Internal Security Operations Command (ISOC) of the Royal Thai Army, which has allegedly tortured detainees.
  • Senators Ron Wyden (D-OR), Elizabeth Warren (D-MA), Edward J. Markey (D-MA), and Brian Schatz (D-HI) “announced that the Department of Homeland Security (DHS) will launch an inspector general investigation into Customs and Border Protection's (CBP) warrantless tracking of phones in the United States following an inquiry from the senators earlier this year” per their press release.
    • The Senators added:
      • As revealed by public contracts, CBP has paid a government contractor named Venntel nearly half a million dollars for access to a commercial database containing location data mined from applications on millions of Americans’ mobile phones. CBP officials also confirmed the agency’s warrantless tracking of phones in the United States using Venntel’s product in a September 16, 2020 call with Senate staff.
      • In 2018, the Supreme Court held in Carpenter v. United States that the collection of significant quantities of historical location data from Americans’ cell phones is a search under the Fourth Amendment and therefore requires a warrant.
      • In September 2020, Wyden and Warren successfully pressed for an inspector general investigation into the Internal Revenue Service’s use of Venntel’s commercial location tracking service without a court order.
    • In a letter, the DHS Office of the Inspector General (OIG) explained:
      • We have reviewed your request and plan to initiate an audit that we believe will address your concerns. The objective of our audit is to determine if the Department of Homeland Security (DHS) and it [sic] components have developed, updated, and adhered to policies related to cell-phone surveillance devices. In addition, you may be interested in our audit to review DHS’ use and protection of open source intelligence. Open source intelligence, while different from cell phone surveillance, includes the Department’s use of information provided by the public via cellular devices, such as social media status updates, geo-tagged photos, and specific location check-ins.
    • In an October letter, these Senators plus Senator Sherrod Brown (D-OH) argued:
      • CBP is not above the law and it should not be able to buy its way around the Fourth Amendment. Accordingly, we urge you to investigate CBP’s warrantless use of commercial databases containing Americans’ information, including but not limited to Venntel’s location database. We urge you to examine what legal analysis, if any, CBP’s lawyers performed before the agency started to use this surveillance tool. We also request that you determine how CBP was able to begin operational use of Venntel’s location database without the Department of Homeland Security Privacy Office first publishing a Privacy Impact Assessment.
  • The American Civil Liberties Union (ACLU) has filed a lawsuit in a federal court in New York City, seeking an order to compel the United States (U.S.) Department of Homeland Security (DHS), U.S. Customs and Border Protection (CBP), and U.S. Immigration and Customs Enforcement (ICE) “to release records about their purchases of cell phone location data for immigration enforcement and other purposes.” The ACLU made these information requests after numerous media accounts showing that these and other U.S. agencies were buying location data and other sensitive information in ways intended to evade the bar in the Fourth Amendment against unreasonable searches.
    • In its press release, the ACLU asserted:
      • In February, The Wall Street Journal reported that this sensitive location data isn’t just for sale to commercial entities, but is also being purchased by U.S. government agencies, including by U.S. Immigrations and Customs Enforcement to locate and arrest immigrants. The Journal identified one company, Venntel, that was selling access to a massive database to the U.S. Department of Homeland Security, U.S. Customs and Border Protection, and ICE. Subsequent reporting has identified other companies selling access to similar databases to DHS and other agencies, including the U.S. military.
      • These practices raise serious concerns that federal immigration authorities are evading Fourth Amendment protections for cell phone location information by paying for access instead of obtaining a warrant. There’s even more reason for alarm when those agencies evade requests for information — including from U.S. senators — about such practices. That’s why today we asked a federal court to intervene and order DHS, CBP, and ICE to release information about their purchase and use of precise cell phone location information. Transparency is the first step to accountability.
    • The ACLU explained in the suit:
      • Multiple news sources have confirmed these agencies’ purchase of access to databases containing precise location information for millions of people—information gathered by applications (apps) running on their smartphones. The agencies’ purchases raise serious concerns that they are evading Fourth Amendment protections for cell phone location information by paying for access instead of obtaining a warrant. Yet, more than nine months after the ACLU submitted its FOIA request (“the Request”), these agencies have produced no responsive records. The information sought is of immense public significance, not only to shine a light on the government’s use of powerful location-tracking data in the immigration context, but also to assess whether the government’s purchase of this sensitive data complies with constitutional and legal limitations and is subject to appropriate oversight and control.
  • Facebook's new Oversight Board announced “the first cases it will be deliberating and the opening of the public comment process” and “the appointment of five new trustees.” The cases were almost all referred by Facebook users, and the new board is asking for comments on the right way to manage what may be objectionable content. The Oversight Board explained it is “prioritizing cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook's policies.”
    • The new trustees are:
      • Kristina Arriaga is a globally recognized advocate for freedom of expression, with a focus on freedom of religion and belief. Kristina is president of the advisory firm Intrinsic.
      • Cherine Chalaby is an expert on internet governance, international finance and technology, with extensive board experience. As Chairman of ICANN, he led development of the organization’s five-year strategic plan for 2021 to 2025.
      • Wanda Felton has over 30 years of experience in the financial services industry, including serving as Vice Chair of the Board and First Vice President of the Export-Import Bank of the United States.
      • Kate O’Regan is a former judge of the Constitutional Court of South Africa and commissioner of the Khayelitsha Commission. She is the inaugural director of the Bonavero Institute of Human Rights at the University of Oxford.
      • Robert Post is an American legal scholar and Professor of Law at Yale Law School, where he formerly served as Dean. He is a leading scholar of the First Amendment and freedom of speech.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • The Senate Judiciary Committee will hold an executive session at which the “Online Content Policy Modernization Act” (S.4632), a bill to narrow the liability shield in 47 USC 230, may be marked up.
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Gerd Altmann from Pixabay

Final NDAA Agreement, Part II

There are AI, 5G, and supply chain provisions in the national security policy bill the Armed Services Committees have agreed upon.

So, it appears I failed to include all the technology goodies to be found in the final FY 2021 National Defense Authorization Act (NDAA). I will therefore cover the provisions I missed in yesterday’s post on the conference report to accompany the “William M. “Mac” Thornberry National Defense Authorization Act for Fiscal Year 2021” (H.R.6395). For example, there are artificial intelligence (AI), 5G, and supply chain provisions.

Notably, the final bill includes the House Science, Space, and Technology Committee’s “National Artificial Intelligence Initiative Act of 2020” (H.R.6216). In the Joint Explanatory Statement, the conferees asserted:

The conferees believe that artificial intelligence systems have the potential to transform every sector of the United States economy, boosting productivity, enhancing scientific research, and increasing U.S. competitiveness and that the United States government should use this Initiative to enable the benefits of trustworthy artificial intelligence while preventing the creation and use of artificial intelligence systems that behave in ways that cause harm. The conferees further believe that such harmful artificial intelligence systems may include high-risk systems that lack sufficient robustness to prevent adversarial attacks; high-risk systems that harm the privacy or security of users or the general public; artificial general intelligence systems that become self-aware or uncontrollable; and artificial intelligence systems that unlawfully discriminate against protected classes of persons, including on the basis of sex, race, age, disability, color, creed, national origin, or religion. Finally, the conferees believe that the United States must take a whole of government approach to leadership in trustworthy artificial intelligence, including through coordination between the Department of Defense, the Intelligence Community, and the civilian agencies.

H.R.6216 directs the President to establish the National Artificial Intelligence Initiative that would:

  • Ensure the U.S. continues to lead in AI research and development (R&D)
  • Lead efforts throughout the world to develop and use “trustworthy AI systems” in both the public and private sectors
  • Prepare and assist U.S. workers for the coming integration and use of AI throughout the U.S., and
  • Coordinate AI research, development, and demonstration activities across the federal government, including national security agencies.

The President would have a variety of means at his or her discretion in effectuating those goals, including existing authority to ask Congress for funding and to use Executive Office agencies to manage the authority and funding Congress provides.

Big picture, H.R. 6216 would require better coordination of federal AI initiatives, research, and funding, and more involvement in the development of voluntary, consensus-based standards for AI. Much of this would happen through the standing up of a new “National Artificial Intelligence Initiative Office” by the Office of Science and Technology Policy (OSTP) in the White House. This new entity would be the locus of AI activities and programs in the United States’ (U.S.) government with the ultimate goal of ensuring the nation is the world’s foremost developer and user of the new technology.

Moreover, OSTP would “acting through the National Science and Technology Council…establish or designate an Interagency Committee to coordinate Federal programs and activities in support of the Initiative.” This body would “provide for interagency coordination of Federal artificial intelligence research, development, and demonstration activities, development of voluntary consensus standards and guidelines for research, development, testing, and adoption of ethically developed, safe, and trustworthy artificial intelligence systems, and education and training activities and programs of Federal departments and agencies undertaken pursuant to the Initiative.” The committee would need to “develop a strategic plan for AI” within two years and update it every three years thereafter. Moreover, the committee would need to “propose an annually coordinated interagency budget for the Initiative to the Office of Management and Budget (OMB) that is intended to ensure that the balance of funding across the Initiative is sufficient to meet the goals and priorities established for the Initiative.” However, OMB would be under no obligation to take notice of this proposal save for pressure from AI stakeholders in Congress or AI champions in any given Administration. The Secretary of Commerce would create a ‘‘National Artificial Intelligence Advisory Committee” to advise the President and National Artificial Intelligence Initiative Office on a range of AI policy matters. In the bill as added to the House’s FY 2021 NDAA, it was to have been the Secretary of Energy.

Federal agencies would be permitted to award funds to new Artificial Intelligence Research Institutes to pioneer research in any number of AI fields or considerations. The bill does not authorize any set amount of money for this program and instead kicks the decision on funding over to the Appropriations Committees. The National Institute of Standards and Technology (NIST) must “support measurement research and development of best practices and voluntary standards for trustworthy artificial intelligence systems,” among other duties. NIST must also “work to develop, and periodically update, in collaboration with other public and private sector organizations, including the National Science Foundation and the Department of Energy, a voluntary risk management framework for the trustworthiness of artificial intelligence systems.” NIST would also “develop guidance to facilitate the creation of voluntary data sharing arrangements between industry, federally funded research centers, and Federal agencies for the purpose of advancing artificial intelligence research and technologies.”

The National Science Foundation (NSF) would need to “fund research and education activities in artificial intelligence systems and related fields, including competitive awards or grants to institutions of higher education or eligible non-profit organizations (or consortia thereof).” The Department of Energy must “carry out a cross-cutting research and development program to advance artificial intelligence tools, systems, capabilities, and workforce needs and to improve the reliability of artificial intelligence methods and solutions relevant to the mission of the Department.” This department would also be tasked with advancing “expertise in artificial intelligence and high-performance computing in order to improve health outcomes for veteran populations.”

According to a fact sheet issued by the House Science, Space, and Technology Committee, [t]he legislation will:

  • Formalize interagency coordination and strategic planning efforts in AI research, development, standards, and education through an Interagency Coordination Committee and a coordination office managed by the Office of Science and Technology Policy (OSTP).
  • Create an advisory committee to better inform the Coordination Committee’s strategic plan, track the state of the science around artificial intelligence, and ensure the Initiative is meeting its goals.
  • Create a network of AI institutes, coordinated through the National Science Foundation, that any Federal department or agency could fund to create partnerships between academia and the public and private sectors to accelerate AI research focused on an economic sector, social sector, or on a cross-cutting AI challenge.
  • Support basic AI measurement research and standards development at the National Institute of Standards and Technology (NIST) and require NIST to create a framework for managing risks associated with AI systems and best practices for sharing data to advance trustworthy AI systems.
  • Support research at the National Science Foundation (NSF) across a wide variety of AI related research areas to both improve AI systems and use those systems to advance other areas of science. This section requires NSF to include an obligation for an ethics statement for all research proposals to ensure researchers are considering, and as appropriate, mitigating potential societal risks in carrying out their research.
  • Support education and workforce development in AI and related fields, including through scholarships and traineeships at NSF.
  • Support AI research and development efforts at the Department of Energy (DOE), utilize DOE computing infrastructure for AI challenges, promote technology transfer, data sharing, and coordination with other Federal agencies, and require an ethics statement for DOE funded research as required at NSF.
  • Require studies to better understand workforce impacts and opportunities created by AI, and identify the computing resources necessary to ensure the United States remains competitive in AI.

A provision would expand the scope of the biannual reports the DOD must submit to Congress on the Joint Artificial Intelligence Center (JAIC) to include the Pentagon’s efforts to develop or contribute to efforts to institute AI standards and more detailed information on uniformed DOD members who serve at the JAIC. Other language would revamp how the Under Secretary of Defense for Research and Engineering shall manage efforts and procurements between the DOD and the private sector on AI and other technology with cutting edge national security applications. The new emphasis of the program would be to buy mature AI to support DOD missions, allowing DOD components to directly use AI and machine learning to address operational problems, speeding up the development, testing, and deployment of AI technology and capabilities, and overseeing and managing any friction between DOD agencies and components over AI development and use. This section also spells out which DOD officials should be involved with this program and how the JAIC fits into the picture. This language and other provisions suggest the DOD may have trouble in coordinating AI activities and managing infighting, at least in the eyes of the Armed Services Committees.

Moreover, the JAIC would be given a new Board of Advisors to advise the Secretary of Defense and JAIC Director on a range of AI issues. However, as the Secretary shall appoint the members of the board, all of whom must be from outside the Pentagon, this organ would seem to be a means of the Office of the Secretary asserting greater control over the JAIC.

And yet, the Secretary is also directed to delegate acquisition authority to the JAIC, permitting it to operate with the same independence as a DOD agency. The JAIC Director will need to appoint an acquisition executive to manage acquisition and policy inside and outside the DOD. $75 million would be authorized a year for these activities, and the Secretary needs to draft and submit an implementation plan to Congress and conduct a demonstration before proceeding.

The DOD must identify five use cases in which AI-enabled systems have improved the functioning of the Department in handling management functions in implementing the National Defense Strategy and then create prototypes and technology pilots to utilize commercially available AI capabilities to bolster those use cases.

Within six months of enactment, the DOD must determine whether it currently has the resources, capability, and know how to ensure that any AI bought has been ethically and responsibly developed. Additionally, the DOD must assess how it can install ethical AI standards in acquisitions and supply chains.

The Secretary is provided the authority to convene a steering committee on emerging technology and national security threats comprised of senior DOD officials to decide how the Department can best adapt to and buy new technology to ensure U.S. military superiority. This body would also investigate the new technology used by adversaries and how to address and counter any threats. For this steering committee, emerging technology is defined as:

Technology determined to be in an emerging phase of development by the Secretary, including quantum information science and technology, data analytics, artificial intelligence, autonomous technology, advanced materials, software, high performance computing, robotics, directed energy, hypersonics, biotechnology, medical technologies, and such other technology as may be identified by the Secretary.

Not surprisingly, the FY 2021 NDAA has provisions on 5G. Most notably, the Secretary of Defense must assess and mitigate any risks presented by “at-risk” 5G or 6G systems in other nations before a major weapons system or a battalion, squadron, or naval combatant can be based there. The Secretary must take into account any steps the nation is taking to address risk, those steps the U.S. is taking, any agreements in place to mitigate risks, and other steps. This provision names Huawei and ZTE as “at-risk vendors.” This language may be another means by which the U.S. can persuade other nations not to buy and install technology from these People’s Republic of China (PRC) companies.

The Under Secretary of Defense for Research and Engineering and a cross-functional team would need to develop a plan to transition the DOD to 5G throughout the Department and its components. Each military department inside the DOD would get to manage its own 5G acquisition with the caveat that the Secretary would need to establish a telecommunications security program to address 5G security risks in the DOD. The Secretary would also be tasked with conducting a demonstration project to “evaluate the maturity, performance, and cost of covered technologies to provide additional options for providers of fifth-generation wireless network services” for Open RAN (aka oRAN) and “one or more massive multiple-input, multiple-output radio arrays, provided by one or more companies based in the United States, that have the potential to compete favorably with radios produced by foreign companies in terms of cost, performance, and efficiency.”

The service departments would need to submit reports to the Secretary on how they are assessing, mitigating, and reporting to the DOD the following risks to acquisition programs:

  • Technical risks in engineering, software, manufacturing and testing.
  • Integration and interoperability risks, including complications related to systems working across multiple domains while using machine learning and artificial intelligence capabilities to continuously change and optimize system performance.
  • Operations and sustainment risks, including as mitigated by appropriate sustainment planning earlier in the lifecycle of a program, access to technical data, and intellectual property rights.
  • Workforce and training risks, including consideration of the role of contractors as part of the total workforce.
  • Supply chain risks, including cybersecurity, foreign control and ownership of key elements of supply chains, and the consequences that a fragile and weakening defense industrial base, combined with barriers to industrial cooperation with allies and partners, pose for delivering systems and technologies in a trusted and assured manner.

Moreover, “[t]he Under Secretary of Defense for Acquisition and Sustainment, in coordination with the Chief Information Officer of the Department of Defense, shall develop requirements for appropriate software security criteria to be included in solicitations for commercial and developmental solutions and the evaluation of bids submitted in response to such solicitations, including a delineation of what processes were or will be used for a secure software development life cycle.”

The Armed Services Committees are directing the Secretary to follow up on a report submitted to the President per Executive Order 13806 on strengthening Defense Industrial Base (DIB) manufacturing and supply chain resiliency. The DOD must submit “additional recommendations regarding United States industrial policies….[that] shall consist of specific executive actions, programmatic changes, regulatory changes, and legislative proposals and changes, as appropriate.”

The DOD would also need to submit an annex to an annual report to Congress on “strategic and critical materials, including the gaps and vulnerabilities in supply chains of such materials.”

There is language that would change how the DOD manages the production of microelectronics and related supply chain risk. The Pentagon would also need to investigate how to commercialize its intellectual property for microelectronic R&D. The Department of Commerce would need to “assess the capabilities of the United States industrial base to support the national defense in light of the global nature of the supply chain and significant interdependencies between the United States industrial base and the industrial bases of foreign countries with respect to the manufacture, design, and end use of microelectronics.”

There is a revision of the Secretary of Energy’s authority over supply chain risk administered by the National Nuclear Security Administration (NNSA) that would provide for a “special exclusion action” that would bar the procurement of risky technology for up to two years.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Further Reading, Other Developments, and Coming Events (7 December)

Further Reading

  • “Facebook steps up campaign to ban false information about coronavirus vaccines” By Elizabeth Dwoskin — The Washington Post. In its latest step to find and remove lies, misinformation, and disinformation, the social media giant is now committing to removing and blocking untrue material about COVID-19 vaccines, especially from the anti-vaccine community. Will the next step be to take on anti-vaccination proponents generally?
  • “Comcast’s 1.2 TB data cap seems like a ton of data—until you factor in remote work” By Rob Pegoraro — Fast Company. Despite many people and children working and learning from home, Comcast is reimposing a 1.2 terabyte monthly limit on data for homes. That sounds like quite a lot until you factor in video meetings, streaming, etc. (a rough worked example appears after this list). So far, other providers have not set a cap.
  • “Google’s star AI ethics researcher, one of a few Black women in the field, says she was fired for a critical email” By Drew Harwell and Nitasha Tiku — The Washington Post. Timnit Gebru, a top flight artificial intelligence (AI) computer scientist, was fired for questioning Google’s review of a paper she wanted to present at an AI conference, a paper likely critical of the company’s AI projects. Google claims she resigned, but Gebru says she was fired. She has long been an advocate for women and minorities in tech and AI, and her ouster will likely only increase scrutiny of and questions about Google’s commitment to diversity and an ethical approach to the development and deployment of AI. It will also probably deepen employee disenchantment with the company, following earlier protests over Google’s involvement with the United States Department of Defense’s Project Maven and its hiring of former United States Department of Homeland Security chief of staff Miles Taylor, who was involved with the policies that resulted in caging children and separating families on the southern border of the United States.
  • “Humans Can Help Clean Up Facebook and Twitter” By Greg Bensinger — The New York Times. In this opinion piece, the argument is made that if social media platforms redeployed their human monitors to the accounts that violate terms of service most frequently (e.g., President Donald Trump) and more aggressively labeled and removed untrue or inflammatory content, they would have a greater impact on lies, misinformation, and disinformation.
  • “Showdown looms over digital services tax” By Ashley Gold — Axios. Because the Organization for Economic Cooperation and Development (OECD) has not reached a deal on digital services taxes, a number of the United States’ (U.S.) allies could move forward with taxes on U.S. multinationals like Amazon, Google, and Apple. The Trump Administration has variously taken an adversarial position, threatening to retaliate against countries like France that have enacted a tax that has not been collected during the OECD negotiations, and has also withdrawn from the talks. It is probable the Biden Administration will be more willing to work in a multilateral fashion and may strike a deal on an issue that is not going away, as the United Kingdom, Italy, and Canada also have plans for a digital tax.
  • “Trump’s threat to veto defense bill over social-media protections is heading to a showdown with Congress” By Karoun Demirjian and Tony Romm — The Washington Post. I suppose I should mention the President’s demand that the FY 2021 National Defense Authorization Act (NDAA) contain a repeal of 47 U.S.C. 230 (Section 230 of the Communications Act), which came at the eleventh hour and fifty-ninth minute of negotiations on a final version of the bill. Via Twitter, Donald Trump threatened to veto the bill, which has been passed annually for decades. Republicans were not having it, however, even those who agree with Trump’s desire to remove liability protection for technology companies. And yet, if Trump continues to insist on a repeal, Republicans may find themselves in a bind, and the bill could conceivably get pulled until President-elect Joe Biden is sworn in. On the other hand, Trump has not renewed his veto threats over renaming military bases currently bearing the names of Confederate figures, even though the final version of the bill contains language instituting a process to do just that.
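To put the Comcast item above in context, here is a rough back-of-the-envelope sketch of how quickly a remote-work household could approach a 1.2 TB monthly cap. The per-hour data rates and daily hours below are illustrative assumptions, not figures from the article or from Comcast.

```swift
// Back-of-the-envelope estimate of monthly data use against a 1.2 TB cap.
// Every rate and usage figure below is an assumption for illustration only.
let capGB = 1_200.0               // the cap, expressed in gigabytes
let streamingGBPerHour = 7.0      // assumed 4K video streaming rate
let videoCallGBPerHour = 1.5      // assumed HD video conferencing rate

let streamingHoursPerDay = 4.0    // assumed household streaming time
let callHoursPerDay = 6.0         // assumed remote work and school calls

let dailyGB = streamingHoursPerDay * streamingGBPerHour + callHoursPerDay * videoCallGBPerHour
let monthlyGB = dailyGB * 30.0
print("Estimated use: \(monthlyGB) GB of a \(capGB) GB cap")   // roughly 1,110 GB
```

Under those assumptions a household lands around 1,110 GB a month before counting game downloads, cloud backups, or software updates, which is the article’s point: the cap is tighter than it sounds.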

Other Developments

  • The Senate Judiciary Committee held over its most recent bill to narrow 47 U.S.C. 230 (Section 230 of the Communications Act), which provides liability protection for technology companies for third-party material posted on their platforms and any decisions to edit, alter, or remove such content. The committee opted to hold the “Online Content Policy Modernization Act” (S.4632), which may mean the bill’s chances of making it to the Senate floor are low. What’s more, even if the Senate passes Section 230 legislation, it is not clear there will be sufficient agreement with Democrats in the House to get a final bill to the President before the end of this Congress. On 1 October, the committee had also decided to hold over the bill to try to reconcile the fifteen amendments submitted for consideration. The Committee could soon meet again to formally mark up and report out this legislation.
    • At the earlier hearing, Chair Lindsey Graham (R-SC) submitted an amendment revising the bill’s reforms to Section 230 that incorporates some of the below amendments but includes new language. For example, the bill includes a definition of “good faith,” a term not currently defined in Section 230. This term would be construed as a platform taking down or restricting content only according to its publicly available terms of service, not as a pretext, and equally to all similarly situated content. Moreover, good faith would require alerting the user and giving him or her an opportunity to respond, subject to certain exceptions. The amendment also makes clear that certain existing means of suing are still available to users (e.g., suing for breach of contract).
    • Senator Mike Lee (R-UT) offered a host of amendments:
      • EHF20913 would remove “user[s]” from the reduced liability shield that online platforms would receive under the bill. Consequently, users would still not be legally liable for the content posted by another user.
      • EHF20914 would revise the language regarding the type of content platforms could take down with legal protection to make clear it would not just be “unlawful” content but rather content “in violation of a duly enacted law of the United States,” possibly meaning federal laws and not state laws. Or, more likely, the intent would be to foreclose the possibility that a platform would say it is acting in concert with a foreign law and still assert immunity.
      • EHF20920 would add language making clear that taking down material that violates terms of service or use according to an objectively reasonable belief would be shielded from liability.
      • OLL20928 would expand legal protection to platforms for removing or restricting spam.
      • OLL20929 would bar the Federal Communications Commission (FCC) from a rulemaking on Section 230.
      • OLL20930 adds language making clear if part of the revised Section 230 is found unconstitutional, the rest of the law would still be applicable.
      • OLL20938 revises the definition of an “information content provider,” the Section 230 term of art for an entity responsible for creating or developing content, to expand the circumstances in which platforms may be deemed responsible for the creation or development of information and consequently exposed to lawsuits.
    • Senator Josh Hawley (R-MO) offered an amendment that would create a new right of action for people to sue large platforms for taking down their content if not done in “good faith.” The amendment limits this right to “edge providers,” meaning platforms with more than 30 million users in the U.S., 300 million users worldwide, and revenues of more than $1.5 billion. This would likely exclude all platforms except for Twitter, Facebook, Instagram, TikTok, Snapchat, and a select group of a few others.
    • Senator John Kennedy (R-LA) offered an amendment that removes all Section 230 legal immunity from platforms that collect personal data and then use an “automated function” to deliver targeted or tailored content to a user unless the user “knowingly and intentionally elect[s]” to receive such content.
  • The Massachusetts Institute of Technology’s (MIT) Work of the Future Task Force issued its final report and drew the following conclusions:
    • Technological change is simultaneously replacing existing work and creating new work. It is not eliminating work altogether.
    • Momentous impacts of technological change are unfolding gradually.
    • Rising labor productivity has not translated into broad increases in incomes because labor market institutions and policies have fallen into disrepair.
    • Improving the quality of jobs requires innovation in labor market institutions.
    • Fostering opportunity and economic mobility necessitates cultivating and refreshing worker skills.
    • Investing in innovation will drive new job creation, speed growth, and meet rising competitive challenges.
    • The Task Force stated:
      • In the two-and-a-half years since the Task Force set to work, autonomous vehicles, robotics, and AI have advanced remarkably. But the world has not been turned on its head by automation, nor has the labor market. Despite massive private investment, technology deadlines have been pushed back, part of a normal evolution as breathless promises turn into pilot trials, business plans, and early deployments — the diligent, if prosaic, work of making real technologies work in real settings to meet the demands of hard-nosed customers and managers.
      • Yet, if our research did not confirm the dystopian vision of robots ushering workers off of factory floors or artificial intelligence rendering superfluous human expertise and judgment, it did uncover something equally pernicious: Amidst a technological ecosystem delivering rising productivity, and an economy generating plenty of jobs (at least until the COVID-19 crisis), we found a labor market in which the fruits are so unequally distributed, so skewed towards the top, that the majority of workers have tasted only a tiny morsel of a vast harvest.
      • As this report documents, the labor market impacts of technologies like AI and robotics are taking years to unfold. But we have no time to spare in preparing for them. If those technologies deploy into the labor institutions of today, which were designed for the last century, we will see similar effects to recent decades: downward pressure on wages, skills, and benefits, and an increasingly bifurcated labor market. This report, and the MIT Work of the Future Task Force, suggest a better alternative: building a future for work that harvests the dividends of rapidly advancing automation and ever-more powerful computers to deliver opportunity and economic security for workers. To channel the rising productivity stemming from technological innovations into broadly shared gains, we must foster institutional innovations that complement technological change.
  • The European Data Protection Supervisor (EDPS) Wojciech Wiewiorówski published his “preliminary opinion on the European Commission’s (EC) Communication on “A European strategy for data” and the creation of a common space in the area of health, namely the European Health Data Space (EHDS).” The EDPS lauded the goal of the EHDS, “the prevention, detection and cure of diseases, as well as for evidence-based decisions in order to enhance effectiveness, accessibility and sustainability of the healthcare systems.” However, Wiewiorówski articulated his concern that the EC needs to think through the applicability of the General Data Protection Regulation (GDPR), among other European Union (EU) laws, before the EHDS can legally move forward. The EDPS stated:
    • The EDPS calls for the establishment of a thought-through legal basis for the processing operations under the EHDS in line with Article 6(1) GDPR and also recalls that such processing must comply with Article 9 GDPR for the processing of special categories of data.
    • Moreover, the EDPS highlights that due to the sensitivity of the data to be processed within the EHDS, the boundaries of what constitutes a lawful processing and a compatible further processing of the data must be crystal-clear for all the stakeholders involved. Therefore, the transparency and the public availability of the information relating to the processing on the EHDS will be key to enhance public trust in the EHDS.
    • The EDPS also calls on the Commission to clarify the roles and responsibilities of the parties involved and to clearly identify the precise categories of data to be made available to the EHDS. Additionally, he calls on the Member States to establish mechanisms to assess the validity and quality of the sources of the data.
    • The EDPS underlines the importance of vesting the EHDS with a comprehensive security infrastructure, including both organisational and state-of-the-art technical security measures to protect the data fed into the EHDS. In this context, he recalls that Data Protection Impact Assessments may be a very useful tool to determine the risks of the processing operations and the mitigation measures that should be adopted.
    • The EDPS recommends paying special attention to the ethical use of data within the EHDS framework, for which he suggests taking into account existing ethics committees and their role in the context of national legislation.
    • The EDPS is convinced that the success of the EHDS will depend on the establishment of a strong data governance mechanism that provides for sufficient assurances of a lawful, responsible, ethical management anchored in EU values, including respect for fundamental rights. The governance mechanism should regulate, at least, the entities that will be allowed to make data available to the EHDS, the EHDS users, the Member States’ national contact points/ permit authorities, and the role of DPAs within this context.
    • The EDPS is interested in policy initiatives to achieve ‘digital sovereignty’ and has a preference for data being processed by entities sharing European values, including privacy and data protection. Moreover, the EDPS calls on the Commission to ensure that the stakeholders taking part in the EHDS, and in particular, the controllers, do not transfer personal data unless data subjects whose personal data are transferred to a third country are afforded a level of protection essentially equivalent to that guaranteed within the European Union.
    • The EDPS calls on Member States to guarantee the effective implementation of the right to data portability specifically in the EHDS, together with the development of the necessary technical requirements. In this regard, he considers that a gap analysis might be required regarding the need to integrate the GDPR safeguards with other regulatory safeguards, provided e.g. by competition law or ethical guidelines.
  • The Office of Management and Budget (OMB) extended a guidance memorandum directing agencies to consolidate data centers after Congress pushed back the sunset date for the program. OMB extended OMB Memorandum M-19-19, Update to Data Center Optimization Initiative (DCOI) through 30 September 2022, which applies “to the 24 Federal agencies covered by the Chief Financial Officers (CFO) Act of 1990, which includes the Department of Defense.” The DCOI was codified in the “Federal Information Technology Acquisition Reform” (FITARA) (P.L. 113-291) and extended in 2018 until October 1, 2020. And this sunset was pushed back another two years in the FY 2020 National Defense Authorization Act (NDAA) (P.L. 116-92).
    • In March 2020, the Government Accountability Office (GAO) issued another of its periodic assessments of the DCOI, started in 2012 by the Obama Administration to shrink the federal government’s footprint of data centers, increase efficiency and security, save money, and reduce energy usage.
    • The GAO found that 23 of the 24 agencies participating in the DCOI met or planned to meet their FY 2019 goals to close 286 of the 2,727 data centers considered part of the DCOI. This latter figure deserves some discussion, for the Trump Administration changed the definition of what is a data center to exclude smaller ones (so-called non-tiered data centers). GAO asserted that “recent OMB DCOI policy changes will reduce the number of data centers covered by the policy and both OMB and agencies may lose important visibility over the security risks posed by these facilities.” Nonetheless, these agencies are projecting savings of $241.5 million when all the 286 data centers planned for closure in FY 2019 actually close. It bears note that the GAO admitted in a footnote it “did not independently validate agencies’ reported cost savings figures,” so these numbers may not be reliable.
    • In terms of how to improve the DCOI, the GAO stated that “[i]n addition to reiterating our prior open recommendations to the agencies in our review regarding their need to meet DCOI’s closure and savings goals and optimization metrics, we are making a total of eight new recommendations—four to OMB and four to three of the 24 agencies. Specifically:
      • The Director of the Office of Management and Budget should (1) require that agencies explicitly document annual data center closure goals in their DCOI strategic plans and (2) track those goals on the IT Dashboard. (Recommendation 1)
      • The Director of the Office of Management and Budget should require agencies to report in their quarterly inventory submissions those facilities previously reported as data centers, even if those facilities are not subject to the closure and optimization requirements of DCOI. (Recommendation 2)
      • The Director of the Office of Management and Budget should document OMB’s decisions on whether to approve individual data centers when designated by agencies as either a mission critical facility or as a facility not subject to DCOI. (Recommendation 3)
      • The Director of the Office of Management and Budget should take action to address the key performance measurement characteristics missing from the DCOI optimization metrics, as identified in this report. (Recommendation 4)
  • Australia’s Inspector-General of Intelligence and Security (IGIS) released its first report on how well the nation’s security services did in observing the law with respect to COVID app data. The IGIS “is satisfied that the relevant agencies have policies and procedures in place and are taking reasonable steps to avoid intentional collection of COVID app data.” The IGIS revealed that “[i]ncidental collection in the course of the lawful collection of other data has occurred (and is permitted by the Privacy Act); however, there is no evidence that any agency within IGIS jurisdiction has decrypted, accessed or used any COVID app data.” The IGIS is also “satisfied that the intelligence agencies within IGIS jurisdiction which have the capability to incidentally collect at least some types of COVID app data:
    • Are aware of their responsibilities under Part VIIIA of the Privacy Act and are taking active steps to minimise the risk that they may collect COVID app data.
    • Have appropriate policies and procedures in place to respond to any incidental collection of COVID app data that they become aware of.
    • Are taking steps to ensure any COVID app data is not accessed, used or disclosed.
    • Are taking steps to ensure any COVID app data is deleted as soon as practicable.
    • Have not decrypted any COVID app data.
    • Are applying the usual security measures in place in intelligence agencies such that a ‘spill’ of any data, including COVID app data, is unlikely.
  • New Zealand’s Government Communications Security Bureau’s National Cyber Security Centre (NCSC) has released its annual Cyber Threat Report that found that “nationally significant organisations continue to be frequently targeted by malicious cyber actors of all types…[and] state-sponsored and non-state actors targeted public and private sector organisations to steal information, generate revenue, or disrupt networks and services.” The NCSC added:
    • Malicious cyber actors have shown their willingness to target New Zealand organisations in all sectors using a range of increasingly advanced tools and techniques. Newly disclosed vulnerabilities in products and services, alongside the adoption of new services and working arrangements, are rapidly exploited by state-sponsored actors and cyber criminals alike. A common theme this year, which emerged prior to the COVID-19 pandemic, was the exploitation of known vulnerabilities in internet-facing applications, including corporate security products, remote desktop services and virtual private network applications.
  • The former Director of the United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) wrote an opinion piece disputing President Donald Trump’s claims that the 2020 Presidential Election was fraudulent. Christopher Krebs asserted:
    • While I no longer regularly speak to election officials, my understanding is that in the 2020 results no significant discrepancies attributed to manipulation have been discovered in the post-election canvassing, audit and recount processes.
    • This point cannot be emphasized enough: The secretaries of state in Georgia, Michigan, Arizona, Nevada and Pennsylvania, as well as officials in Wisconsin, all worked overtime to ensure there was a paper trail that could be audited or recounted by hand, independent of any allegedly hacked software or hardware.
    • That’s why Americans’ confidence in the security of the 2020 election is entirely justified. Paper ballots and post-election checks ensured the accuracy of the count. Consider Georgia: The state conducted a full hand recount of the presidential election, a first of its kind, and the outcome of the manual count was consistent with the computer-based count. Clearly, the Georgia count was not manipulated, resoundingly debunking claims by the president and his allies about the involvement of CIA supercomputers, malicious software programs or corporate rigging aided by long-gone foreign dictators.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Daniel Schludi on Unsplash

Further Reading, Other Developments, and Coming Events (4 December)

Further Reading

  • “How Misinformation ‘Superspreaders’ Seed False Election Theories” By Sheera Frenkel — The New York Times. A significant percentage of the lies, misinformation, and disinformation about the legitimacy of the election has been disseminated by a small number of right-wing figures, whose claims are then repeated, reposted, and retweeted. The Times relies on research into how much engagement people like President Donald Trump and Dan Bongino get on Facebook after posting untrue claims about the election, and it turns out that such trends and rumors do not start spontaneously.
  • “Facebook Said It Would Ban Holocaust Deniers. Instead, Its Algorithm Provided a Network for Them” By Aaron Sankin — The Markup. This news organization still found Holocaust denial material promoted by Facebook’s algorithm even though the platform recently said it was taking down such material. This result may point to the difficulty of policing objectionable material that uses coded language and/or the social media platform’s lack of sufficient resources to weed out this sort of content.
  • “What Facebook Fed the Baby Boomers” By Charlie Warzel — The New York Times. A dispiriting trip inside two people’s Facebook feeds. This article makes the very good point that comments are not moderated, and these tend to be significant sources of vitriol and disinformation.
  • “How to ‘disappear’ on Happiness Avenue in Beijing” By Vincent Ni and Yitsing Wang — BBC. By next year, the People’s Republic of China (PRC) may have as many as 560 million security cameras, and one artist ran an experiment of sorts to see if a group of people could walk down a major street in the capital without being seen by a camera or without their faces being captured at places with lots of cameras.
  • “Patients of a Vermont Hospital Are Left ‘in the Dark’ After a Cyberattack” By Ellen Barry and Nicole Perlroth — The New York Times. A Russian hacking outfit may have struck back after the Department of Defense’s (DOD) Cyber Command and Microsoft struck them. A number of hospitals were hacked, and care was significantly disrupted. This dynamic may lend itself to arguments that the United States (U.S.) may be wise to curtail its offensive operations.
  • “EU seeks anti-China alliance on tech with Biden” By Jakob Hanke Vela and David M. Herszenhorn — Politico. The European Union (EU) is hoping the United States (U.S.) will be more amenable to working together on future technology policy, especially against the People’s Republic of China (PRC), which has made a concerted effort to drive the adoption of standards that favor its companies (e.g., the PRC pushed for and obtained 5G standards that will favor Huawei). Diplomatically speaking, this is considered low-hanging fruit, and a Biden Administration will undoubtedly be more multilateral than the Trump Administration.
  • “Can We Make Our Robots Less Biased Than We Are?” By David Berreby — The New York Times. The bias present in facial recognition technology and artificial intelligence is making its way into robotics, posing the question of how to change this. Many African American and other minority scientists are calling for the inclusion of people of color in designing such systems as a countermeasure to the usual bias toward white men.

Other Developments

  • Senator Gary Peters (D-MI), the top Democrat on the Senate Homeland Security and Governmental Affairs Committee, wrote President Donald Trump and “slammed the Trump Administration for their lack of action against foreign adversaries, including Russia, China, and North Korea, that have sponsored cyber-attacks against American hospitals and research institutions in an effort to steal information related to development of Coronavirus vaccines.” Peters used language that was unusually strong, as Members of Congress typically tone down the rhetoric and deploy coded language to signal their level of displeasure about administration action or inaction. Peters may well feel strongly about what he perceives to be Trump Administration indifference to the cyber threats facing institutions researching and developing COVID-19 vaccines, but this is also an issue on which he may be trying to split Republicans, placing them in the difficult position of lining up behind a president disinclined to prioritize some cyber issues or breaking ranks with him.
    • Peters stated:
      • I urge you, again, to send a strong message to any foreign government attempting to hack into our medical institutions that this behavior is unacceptable. The Administration should use the tools at its disposal, including the threat of sanctions, to deter future attacks against research institutions. In the event that any foreign government directly threatens the lives of Americans through attacks on medical facilities, other Department of Defense capabilities should be considered to make it clear that there will be consequences for these actions.
  • A United States federal court has ruled against a Trump Administration appointee Michael Pack and the United States Agency for Global Media (USAGM) and their attempts to interfere illegally with the independence of government-funded news organizations such as the Voice of America (VOA). The District Court for the District of Columbia enjoined Pack and the USAGM from a list of actions VOA and USAGM officials claim are contrary to the First Amendment and the organization’s mission.
  • The Federal Trade Commission (FTC) is asking a United States federal court to compel former Trump White House advisor Steve Bannon to appear for questioning per a Civil Investigative Demand (CID) as part of its ongoing probe of Cambridge Analytica’s role in misusing personal data of Facebook users in the 2016 Presidential Election. The FTC noted it “issued the CID to determine, among other things, whether Bannon may be held individually liable for the deceptive conduct of Cambridge Analytica, LLC—the subject of an administrative law enforcement action brought by the Commission.” There had been an interview scheduled in September but the day before it was to take place, Bannon’s lawyers informed the FTC he would not be attending.
    • In 2019, the FTC settled with former Cambridge Analytica CEO Alexander Nix and app developer Aleksandr Kogan in “administrative orders restricting how they conduct any business in the future, and requiring them to delete or destroy any personal information they collected.” The FTC did not, however, settle with the company itself. The agency alleged “that Cambridge Analytica, Nix, and Kogan deceived consumers by falsely claiming they did not collect any personally identifiable information from Facebook users who were asked to answer survey questions and share some of their Facebook profile data.” Facebook settled with the FTC for a record $5 billion for its role in the Cambridge Analytica scandal and for how it violated its 2012 consent order with the agency.
  • Apple responded to a group of human rights and civil liberties organizations about its plans to deploy technology on its operating system that allows users greater control of their privacy. Apple confirmed that its App Tracking Transparency (ATT) framework would be made part of iOS early next year and would present users of Apple products with a prompt explaining how their information may be used by the app developer. ATT would stop app developers from tracking users’ activity across other apps on the device unless users opt in. (A minimal sketch of how an app would invoke the ATT prompt appears after this item.) Companies like Facebook have objected, claiming that the change is a direct shot at them and their revenue. Apple does not reap a significant revenue stream from collecting, combining, and processing user data whereas Facebook does. Facebook also tracks users across devices and apps on a device through a variety of means.
    • Apple stated:
      • We delayed the release of ATT to early next year to give developers the time they indicated they needed to properly update their systems and data practices, but we remain fully committed to ATT and to our expansive approach to privacy protections. We developed ATT for a single reason: because we share your concerns about users being tracked without their consent and the bundling and reselling of data by advertising networks and data brokers.
      • ATT doesn’t ban the reasonable collection of user data for app functionality or even for advertising. Just as with the other data-access permissions we have added over many software releases, developers will be able to explain why they want to track users both before the ATT prompt is shown and in the prompt itself. At that point, users will have the freedom to make their own choice about whether to proceed. This privacy innovation empowers consumers — not Apple — by simply making it clear what their options are, and giving them the information and power to choose.
    • As mentioned, a number of groups wrote Apple in October “to express our disappointment that Apple is delaying the full implementation of iOS 14’s anti-tracking features until early 2021.” They argued:
      • These features will constitute a vital policy improvement with the potential to strengthen respect for privacy across the industry. Apple should implement these features as expeditiously as possible.
      • We were heartened by Apple’s announcement that starting with the iOS 14 update, all app developers will be required to provide information that will help users understand the privacy implications of an app before they install it, within the App Store interface.
      • We were also pleased that iOS 14 users would be required to affirmatively opt in to app tracking, on an app-by-app basis. Along with these changes, we urge Apple to verify the accuracy of app policies, and to publish transparency reports showing the number of apps that are rejected and/or removed from the App Store due to inadequate or inaccurate policies.
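For readers curious what the ATT change looks like from the developer’s side, below is a minimal sketch, assuming an iOS 14+ app whose Info.plist carries an NSUserTrackingUsageDescription string explaining why tracking is requested. The view controller name and the fallback behavior are illustrative; ATTrackingManager, its authorization statuses, and the Info.plist key are the pieces of the framework Apple has described.

```swift
import AppTrackingTransparency
import AdSupport
import UIKit

// Hypothetical view controller that asks for tracking permission once the UI is visible.
final class LaunchViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // Only after an affirmative opt-in may the app read the
                // advertising identifier (IDFA) and use it for cross-app tracking.
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("Tracking allowed, IDFA: \(idfa)")
            case .denied, .restricted, .notDetermined:
                // No opt-in: the app must fall back to measurement that does not
                // track the user across other companies' apps and websites.
                print("Tracking not authorized: \(status)")
            @unknown default:
                break
            }
        }
    }
}
```

The design point that worries ad-supported platforms is that the identifier is effectively unusable unless the user affirmatively taps “Allow” in the system prompt, which ATT requires on an app-by-app basis.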
  • The United States (U.S.) Government Accountability Office (GAO) sent its assessment of the privacy notices and practices of U.S. banks and credit unions to the chair of the Senate committee that oversees this issue. Senate Banking, Housing, and Urban Affairs Committee Chair Mike Crapo (R-ID) had asked the GAO “to examine the types of personal information that financial institutions collect, use, and share; how they make consumers aware of their information-sharing practices; and federal regulatory oversight of these activities.” The GAO found that a ten-year-old model privacy disclosure form used across these industries may comply with the prevailing federal requirements but no longer encompasses the breadth and scope of how the personal information of people is collected, processed, and used. The GAO called on the Consumer Financial Protection Bureau (CFPB) to update this form. The GAO explained:
    • Banks and credit unions collect, use, and share consumers’ personal information—such as income level and credit card transactions—to conduct everyday business and market products and services. They share this information with a variety of third parties, such as service providers and retailers.
    • The Gramm-Leach-Bliley Act (GLBA) requires financial institutions to provide consumers with a privacy notice describing their information-sharing practices. Many banks and credit unions elect to use a model form—issued by regulators in 2009—which provides a safe harbor for complying with the law (see figure). GAO found the form gives a limited view of what information is collected and with whom it is shared. Consumer and privacy groups GAO interviewed cited similar limitations. The model form was issued over 10 years ago. The proliferation of data-sharing since then suggests a reassessment of the form is warranted. Federal guidance states that notices about information collection and usage are central to providing privacy protections and transparency.
    • Since Congress transferred authority to the CFPB for implementing GLBA privacy provisions, the agency has not reassessed if the form meets consumer expectations for disclosures of information-sharing. CFPB officials said they had not considered a reevaluation because they had not heard concerns from industry or consumer groups about privacy notices. Improvements to the model form could help ensure that consumers are better informed about all the ways banks and credit unions collect and share personal information.
    • The increasing amounts of and changing ways in which industry collects and shares consumer personal information—including from online activities—highlights the importance of clearly disclosing practices for collection, sharing, and use. However, our work shows that banks and credit unions generally used the model form, which was created more than 10 years ago, to make disclosures required under GLBA. As a result, the disclosures often provided a limited view of how banks and credit unions collect, use, and share personal information.
    • We recognize that the model form is required to be succinct, comprehensible to consumers, and allow for comparability across institutions. But, as information practices continue to change or expand, consumer insights into those practices may become even more limited. Improvements and updates to the model privacy form could help ensure that consumers are better informed about all the ways that banks and credit unions collect, use, and share personal information. For instance, in online versions of privacy notices, there may be opportunities for readers to access additional details—such as through hyperlinks—in a manner consistent with statutory requirements.
  • The Australian Competition & Consumer Commission (ACCC) is asking for feedback on Google’s proposed $2.1 billion acquisition of Fitbit. In a rather pointed statement, the chair of the ACCC, Rod Sims, made clear “[o]ur decision to begin consultation should not be interpreted as a signal that the ACCC will ultimately accept the undertaking and approve the transaction.” The buyout is also under scrutiny in the European Union (EU) and may be affected by the suit the United States Department of Justice (DOJ) and some states have brought against the company for anti-competitive behavior. The ACCC released a Statement of Issues in June about the proposed deal.
    • The ACCC explained “[t]he proposed undertaking would require Google to:
      • not use certain user data collected through Fitbit and Google wearables for Google’s advertising purposes for 10 years, with an option for the ACCC to extend this obligation by up to a further 10 years;
      • maintain access for third parties, such as health and fitness apps, to certain user data collected through Fitbit and Google wearable devices for 10 years; and
      • maintain levels of interoperability between third party wearables and Android smartphones for 10 years.
    • In August, the EU “opened an in-depth investigation to assess the proposed acquisition of Fitbit by Google under the EU Merger Regulation.” The European Commission (EC) expressed its concerns “that the proposed transaction would further entrench Google’s market position in the online advertising markets by increasing the already vast amount of data that Google could use for personalisation of the ads it serves and displays.” The EC stated “[a]t this stage of the investigation, the Commission considers that Google:
      • is dominant in the supply of online search advertising services in the EEA countries (with the exception of Portugal for which market shares are not available);
      • holds a strong market position in the supply of online display advertising services at least in Austria, Belgium, Bulgaria, Croatia, Denmark, France, Germany, Greece, Hungary, Ireland, Italy, Netherlands, Norway, Poland, Romania, Slovakia, Slovenia, Spain, Sweden and the United Kingdom, in particular in relation to off-social networks display ads;
      • holds a strong market position in the supply of ad tech services in the EEA.
    • The EC explained that it “will now carry out an in-depth investigation into the effects of the transaction to determine whether its initial competition concerns regarding the online advertising markets are confirmed…[and] will also further examine:
      • the effects of the combination of Fitbit’s and Google’s databases and capabilities in the digital healthcare sector, which is still at a nascent stage in Europe; and
      • whether Google would have the ability and incentive to degrade the interoperability of rivals’ wearables with Google’s Android operating system for smartphones once it owns Fitbit.
    • Amnesty International sent EC Executive Vice-President Margrethe Vestager a letter, arguing “[t]he merger risks further extending the dominance of Google and its surveillance-based business model, the nature and scale of which already represent a systemic threat to human rights.” Amnesty asserted “[t]he deal is particularly troubling given the sensitive nature of the health data that Fitbit holds that would be acquired by Google.” The organization argued “[t]he Commission must ensure that the merger does not proceed unless the two business enterprises can demonstrate that they have taken adequate account of the human rights risks and implemented strong and meaningful safeguards that prevent and mitigate these risks in the future.”
  • Europol, the United Nations Interregional Crime and Justice Research Institute (UNICRI), and Trend Micro have cooperated on a report that looks “into current and predicted criminal uses of artificial intelligence (AI).”
    • The organizations argued “AI could be used to support:
      • convincing social engineering attacks at scale;
      • document-scraping malware to make attacks more efficient;
      • evasion of image recognition and voice biometrics;
      • ransomware attacks, through intelligent targeting and evasion;
      • data pollution, by identifying blind spots in detection rules.
    • The organizations concluded:
      • Based on available insights, research, and a structured open-source analysis, this report covered the present state of malicious uses and abuses of AI, including AI malware, AI-supported password guessing, and AI-aided encryption and social engineering attacks. It also described concrete future scenarios ranging from automated content generation and parsing, AI-aided reconnaissance, smart and connected technologies such as drones and autonomous cars, to AI-enabled stock market manipulation, as well as methods for AI-based detection and defense systems.
      • Using one of the most visible malicious uses of AI — the phenomenon of so-called deepfakes — the report further detailed a case study on the use of AI techniques to manipulate or generate visual and audio content that would be difficult for humans or even technological solutions to immediately distinguish from authentic ones.
      • As speculated on in this paper, criminals are likely to make use of AI to facilitate and improve their attacks by maximizing opportunities for profit within a shorter period, exploiting more victims, and creating new, innovative criminal business models — all the while reducing their chances of being caught. Consequently, as “AI-as-a-Service” becomes more widespread, it will also lower the barrier to entry by reducing the skills and technical expertise required to facilitate attacks. In short, this further exacerbates the potential for AI to be abused by criminals and for it to become a driver of future crimes.
      • Although the attacks detailed here are mostly theoretical, crafted as proofs of concept at this stage, and although the use of AI to improve the effectiveness of malware is still in its infancy, it is plausible that malware developers are already using AI in more obfuscated ways without being detected by researchers and analysts. For instance, malware developers could already be relying on AI-based methods to bypass spam filters, escape the detection features of antivirus software, and frustrate the analysis of malware. In fact, DeepLocker, a tool recently introduced by IBM and discussed in this paper, already demonstrates these attack abilities that would be difficult for a defender to stop.
      • To add, AI could also enhance traditional hacking techniques by introducing new ways of performing attacks that would be difficult for humans to predict. These could include fully automated penetration testing, improved password-guessing methods, tools to break CAPTCHA security systems, or improved social engineering attacks. With respect to open-source tools providing such functionalities, the paper discussed some that have already been introduced, such as DeepHack, DeepExploit, and XEvil.
      • The widespread use of AI assistants, meanwhile, also creates opportunities for criminals who could exploit the presence of these assistants in households. For instance, criminals could break into a smart home by hijacking an automation system through exposed audio devices.
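For readers who want to see what the ATT change discussed above looks like from a developer's side, here is a minimal sketch in Swift. It assumes Apple's AppTrackingTransparency and AdSupport frameworks and the NSUserTrackingUsageDescription Info.plist key that supplies the developer's explanation shown in the prompt; the function name and the handling of each outcome are illustrative only, not Apple's or any particular developer's code.

```swift
import AppTrackingTransparency
import AdSupport

/// Minimal sketch: ask the user for permission before using the IDFA for tracking.
/// The system prompt displays the explanation the developer supplies in the
/// app's Info.plist under NSUserTrackingUsageDescription.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user opted in; the IDFA may be read and used across apps.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa.uuidString)")
        case .denied, .restricted:
            // The user declined, or tracking is restricted on this device.
            print("Tracking not permitted")
        case .notDetermined:
            // The prompt has not yet been answered.
            print("Tracking authorization not determined")
        @unknown default:
            print("Unhandled authorization status")
        }
    }
}
```

As Apple's response notes, developers may explain their reasons for tracking before triggering this prompt, but the final choice rests with the user.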

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Further Reading, Other Developments, and Coming Events (18 November)

Further Reading

  • “Trump fires top DHS official who refuted his claims that the election was rigged” By Ellen Nakashima and Nick Miroff — The Washington Post. As rumored, President Donald Trump has decapitated the United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA). Director Christopher Krebs was fired via Twitter after he had endorsed a letter by 59 experts on election security who said there was no fraud in the election. Trump tweeted: “The recent statement by Chris Krebs on the security of the 2020 Election was highly inaccurate, in that there were massive improprieties and fraud — including dead people voting, Poll Watchers not allowed into polling locations, ‘glitches’ in the voting machines which changed votes from Trump to Biden, late voting, and many more. Therefore, effective immediately, Chris Krebs has been terminated as Director of the Cybersecurity and Infrastructure Security Agency.” Of course, the statement CISA cosigned and issued last week asserting there was no evidence of fraud or wrongdoing in the election probably did not help his prospects. Additionally, CISA Deputy Director Matthew Travis was essentially forced out when he was informed the normal succession plan would be ignored and he would not become the acting head of CISA. A CISA senior civil servant, Brandon Wales, will helm the agency on an acting basis. Last week, CISA’s Assistant Director for Cybersecurity Bryan Ware was forced out.
  • “NSA Spied On Denmark As It Chose Its Future Fighter Aircraft: Report” By Thomas Newdick — The Drive. A Danish media outlet is claiming the United States (U.S.) National Security Agency (NSA) spied on Denmark’s Ministry of Finance, the Ministry of Foreign Affairs, and the defense firm Terma in order to help Lockheed Martin’s bid to sell F-35 Joint Strike Fighters to Denmark. Eurofighter GmbH and Saab were offering their Typhoon and Gripen fighters to replace Denmark’s F-16s. Reportedly, the NSA used an existing arrangement with Denmark to obtain information from a program allowing the NSA access to fiber optic cables in the country. It is likely Denmark did not have such surveillance in mind when it struck this agreement with the U.S. Two whistleblower reports have been filed with the Forsvarets Efterretningstjeneste (FE), Denmark’s Defense Intelligence Service, and there are allegations that the U.S. surveillance was illegal. However, the surveillance appears not to have influenced the Danish government, which opted for the F-35. Earlier this year, there were allegations the FE was improperly sharing Danish cables containing information on Danish citizens.
  • “Facebook Knows That Adding Labels To Trump’s False Claims Does Little To Stop Their Spread” By Craig Silverman and Ryan Mac — BuzzFeed News. These reporters must know half of Facebook’s staff because they always see what is going on internally with the company. In this latest scoop, they say they have seen internal numbers showing that labeling President Donald Trump’s false posts has done little to slow their spread. In fact, labeling may slow their spread by only 8%. This outcome is contrary to a practice Facebook employed in 2017 under which fact checkers would label untrue posts as false, which reduced their virality by 80%.
  • “Apple Halves Its App Store Fee for the Smaller Companies” By Jack Nicas — The New York Times. The holiday spirit must already be afoot in Cupertino, California, for small app developers will now only pay Apple 15% of in-app purchases for the privilege of being in the App Store. Of course, this decision has nothing to do with the antitrust pressure the company is facing in the European Union and United States (U.S.), and it will have very little impact on Apple’s bottom line since app developers with less than $1 million in revenue (i.e., those entitled to the reduction) account for 2% of App Store revenue. It does give Apple’s leadership and executives some great talking points when pressed by antitrust investigators, legislators, and the media.
  • “Inside the behind-the-scenes fight to convince Joe Biden about Silicon Valley” By Theodore Schleifer — recode. The jockeying among factions in the Democratic party and other stakeholders is fierce and will only grow fiercer when it comes to who will serve where in a Biden Administration. Silicon Valley and those who would reform tech are fighting to get people amenable to their policy goals placed in the new Administration. President-elect Joe Biden and his campaign were ambiguous on many tech policy issues and retain flexibility, which has been helped by appointing people respected in both camps, such as new White House Chief of Staff Ron Klain.
  • “Group of 165 Google critics calls for swift EU antitrust action – letter” By Foo Yun Chee — Reuters. A wide-ranging group of companies and industry associations are urging the European Union to investigate and punish what they see as Google’s anti-competitive dominance of online search engines, especially the One Box that now appears at the top of search results that points people to Google sites and products.

Other Developments

  • The European Union (EU) announced a revision of its export control process for allowing the export of dual-use items, including cyber-surveillance tools. The European Commission (EC) asserted “[t]hanks to the new Regulation, the EU can now effectively protect its interests and values and, in particular, address the risk of violations of human rights associated with trade in cyber-surveillance technologies without prior agreement at multilateral level…[and] also enhances the EU’s capacity to control trade flows in sensitive new and emerging technologies.” The EC explained “[t]he new Regulation includes many of the Commission proposals for a comprehensive “system upgrade”, and will make the existing EU Export control system more effective by:
    • introducing a novel ‘human security’ dimension so the EU can respond to the challenges posed by emerging dual-use technologies – especially cyber-surveillance technologies – that pose a risk to national and international security, including protecting human rights;
    • updating key notions and definitions (e.g. definition of an “exporter” to apply to natural persons and researchers involved in dual-use technology transfers);
    • simplifying and harmonising licensing procedures and allowing the Commission to amend – by ‘simplified’ procedure, i.e. delegated act – the list of items or destinations subject to specific forms of control, thereby making the export control system more agile and able to evolve and adjust to circumstances;
    • enhancing information-exchange between licensing authorities and the Commission with a view to increasing transparency of licensing decisions;
    • coordination of, and support for, robust enforcement of controls, including enhancing secure electronic information-exchange between licensing and enforcement agencies;
    • developing an EU capacity-building and training programme for Member States’ licensing and enforcement authorities;
    • outreach to industry and transparency with stakeholders, developing a structured relationship with the private sector through specific consultations of stakeholders by the relevant Commission group of Member-State experts, and;
    • setting up a dialogue with third countries and seeking a level playing field at global level.
    • The European Parliament contended:
      • The reviewed rules, agreed by Parliament and Council negotiators, govern the export of so-called dual use goods, software and technology – for example, high-performance computers, drones and certain chemicals – with civilian applications that might be repurposed to be used in ways which violate human rights.
      • The current update, made necessary by technological developments and growing security risks, includes new criteria to grant or reject export licenses for certain items.
      • The Parliament added its negotiators:
        • got agreement on setting up an EU-wide regime to control cyber-surveillance items that are not listed as dual-use items in international regimes, in the interest of protecting human rights and political freedoms;
        • strengthened member states’ public reporting obligations on export controls, so far patchy, to make the cyber-surveillance sector in particular more transparent;
        • increased the importance of human rights as licensing criterion; and
        • agreed on rules to swiftly include emerging technologies in the regulation.
  • The United States House of Representatives passed three technology bills by voice vote yesterday. Two of these bills would address in different ways the United States’ (U.S.) efforts to make up ground on the People’s Republic of China in the race to roll out 5G networks. Whether the Senate will take up these bills before year’s end and send them to the White House is not yet clear, although their discrete scope improves the odds. The House Energy and Commerce Committee provided these summaries:
    • The “Utilizing Strategic Allied (USA) Telecommunications Act of 2020” (H.R.6624) creates a new grant program through the National Telecommunications and Information Administration (NTIA) to promote technology that enhances supply chain security and market competitiveness in wireless communications networks.
      • One of the bill’s sponsors, House Energy and Commerce Committee Chair Frank Pallone Jr (D-NJ) stated:
        • Earlier this year, the House passed, and the President signed, my Secure and Trusted Communications Networks Act to create a program to fund the replacement of suspect network equipment. Suspect equipment, including that produced by Huawei and ZTE, could allow foreign adversaries to surveil Americans at home or, worse, disrupt our communications systems.
        • While we are still pushing for Congress to appropriate funds to that end, it is important to recognize that my legislation was only half the battle, even when it is funded. We also need to create and foster competition for trusted network equipment that uses open interfaces so that the United States is not beholden to a market for network equipment that is becoming less competitive. This bill before us today, the Utilizing Strategic Allied Telecommunications Act, or the USA Telecommunications Act, does just that.
        • The bipartisan legislation creates a grant program and authorizes $750 million in funding for the National Telecommunications and Information Administration to help promote and deploy Open Radio Access Network technologies that can spur that type of competition. We must support alternatives to companies like Huawei and ZTE…
    • The “Spectrum IT Modernization Act of 2020” (H.R.7310) requires NTIA – in consultation with the Policy and Plans Steering Group – to submit to Congress a report on its plans to modernize agency information technology systems relating to managing the use of federal spectrum. 
      • A sponsor of the bill, House Energy and Commerce Committee Ranking Member Greg Walden (R-OR) explained:
        • H.R. 7310 would require NTIA to establish a process to upgrade their spectrum management infrastructure for the 21st century. The bill would direct the policy coordination arm of NTIA to submit a plan to Congress as to how they will standardize the data collection across agencies and then directs agencies with Federal spectrum assignments from NTIA to issue an implementation plan to interoperate with NTIA’s plan.
        • This is a good-government bill–it really is–and with continued support and oversight from Congress, we can continue the United States’ leadership in making Federal spectrum available for flexible use by the private sector.
    • The “Reliable Emergency Alert Distribution Improvement (READI) Act of 2020” (H.R.6096) amends the Warning, Alert, and Response Network Act to classify emergency alerts from the Federal Emergency Management Agency as a type of alert that commercial mobile service providers may not allow subscribers to block from their devices. The bill also directs the Federal Communications Commission (FCC) to adopt regulations to facilitate coordination with State Emergency Communications Committees in developing and modernizing State Emergency Alert System plans. Finally, the READI Act directs the FCC to examine the feasibility of modernizing the Emergency Alert System by expanding alert distribution to the internet and streaming services.  
  • The same privacy activists that brought the suits resulting in the striking down of the Safe Harbor and Privacy Shield agreements have filed complaints in Spain and Germany alleging that Apple has violated the European Union’s (EU) e-Privacy Directive and laws in each nation through its use of the IDFA (Apple’s Identifier for Advertisers). Because the General Data Protection Regulation (GDPR) is not the grounds for the complaints, each nation could act without needing to consult other EU nations. Moreover, a similar system used by Google is also being investigated for possible violations. (A short sketch of how apps read the IDFA appears after these developments.) The group none of your business (noyb) asserted:
    • IDFA – the cookie in every iPhone user’s pocket. Each iPhone runs on Apple’s iOS operating system. By default, iOS automatically generates a unique “IDFA” (short for Identifier for Advertisers) for each iPhone. Just like a license plate this unique string of numbers and characters allows Apple and other third parties to identify users across applications and even connect online and mobile behaviour (“cross device tracking”).
    • Tracking without user consent. Apple’s operating system creates the IDFA without user’s knowledge or consent. After its creation, Apple and third parties (e.g. applications providers and advertisers) can access the IDFA to track users’ behaviour, elaborate consumption preferences and provide personalised advertising. Such tracking is strictly regulated by the EU “Cookie Law” (Article 5(3) of the e-Privacy Directive) and requires the users’ informed and unambiguous consent.
    • Insufficient “improvement” on third-party access. Recently Apple announced plans for future changes to the IDFA system. These changes seem to restrict the use of the IDFA for third parties (but not for Apple itself). Just like when an app requests access to the camera or microphone, the plans foresee a new dialog that asks the user if an app should be able to access the IDFA. However, the initial storage of the IDFA and Apple’s use of it will still be done without the users’ consent and therefore in breach of EU law. It is unclear when and if these changes will be implemented by the company.
    • No need for EU cooperation. As the complaint is based on Article 5(3) of the e-Privacy Directive and not the GDPR, the Spanish and German authorities can directly fine Apple, without the need for cooperation among EU Data Protection Authorities as under GDPR.
  • The Federal Trade Commission (FTC) Chair made remarks at an antitrust conference on how antitrust law should view “an acquisition of a nascent competitive threat by a monopolist when there is reason to think that the state of competition today may not tell the whole story.” Chair Joseph Simons’ views are timely for a number of reasons, particularly the extent to which large technology firms have sought and bought smaller, newer companies. Obviously, Facebook’s acquisitions of WhatsApp and Instagram and Google’s purchases of YouTube and AdSense come to mind as the sorts of acquisitions United States (U.S.) regulators approved, possibly without much thought given to what a future market may look like for competition if the larger, dominant company is allowed to proceed. Simons suggested regulators and courts would be wise to give this aspect of antitrust much more thought, which could theoretically inform the approach the Biden Department of Justice and FTC take. Simons stated:
    • And if firms are looking to the future, then antitrust enforcers should too. We must be willing and able to recognize that harm to competition might not be obvious from looking at the marketplace as it stands. If we confine ourselves to examining a static picture of the market at the moment we investigate a practice or transaction, without regard to the dynamic business realities at work, then we risk forfeiting the benefits of competition that could arise in the future to challenge the dominant firm, even when this future competition is to some extent uncertain.
    • Simons asserted:
      • A merger or acquisition can of course constitute anticompetitive conduct for purposes of Section 2 [of the Sherman Act].
      • From a competition perspective, a monopolist can “squash” a nascent competitor by buying it, not just by targeting it with anticompetitive actions as Microsoft did. In fact, from the monopolist’s perspective, it may be easier and more effective to buy the nascent threat (even if only to keep it out of the hands of others) than to target it with other types of anticompetitive conduct.
      • A central issue in potential competition cases is the nature and strength of evidence that the parties will become actual competitors in the future. Some cases have applied Section 7 [of the Clayton Act] narrowly in this context: too narrowly, I think, given that the purpose of Section 7 is to prohibit acquisitions that “may” substantially lessen competition or “tend” to create a monopoly.
    • Simons concluded:
      • But uncertainty has always been a feature of the competitive process, even in markets that appear to be simple or traditional, and dealing with uncertainty is all in a day’s work for an antitrust enforcer. I have referred to the Microsoft case repeatedly today, so, in closing, let me remind everyone that there was some uncertainty about the future in Microsoft as well. The court, in holding that the plaintiff does not and should not bear the burden of “reconstruct[ing] a product’s hypothetical development,” observed that the defendant should appropriately be “made to suffer the uncertain consequences of its own undesirable conduct.” The same holds when the monopolist has simply chosen to acquire the threat.
  • The National Institute of Standards and Technology’s (NIST) National Initiative for Cybersecurity Education (NICE) revised the Workforce Framework for Cybersecurity (NICE Framework), which “improves communications about how to identify, recruit, develop, and retain cybersecurity talent – offering a common, consistent lexicon that categorizes and describes cybersecurity work.” NIST explained:
    • The NICE Framework assists organizations with managing cybersecurity risks by providing a way to discuss the work and learners associated with cybersecurity. These cybersecurity risks are an important input into enterprise risk decisions as described in NIST Interagency Report 8286, Integrating Cybersecurity and Enterprise Risk Management (ERM).
    • NIST stated “[r]evisions to the NICE Framework (NIST Special Publication 800-181) provide:
      • A streamlined set of “building blocks” comprised of Task, Knowledge, and Skill Statements;
      • The introduction of Competencies as a mechanism for organizations to assess learners; and
      • A reference to artifacts, such as Work Roles and Knowledge Skills and Abilities statements, that will live outside of the publication to enable a more fluid update process.
  • A center-left think tank published a report on how the United States (U.S.) and likeminded nations can better fight cybercrime. In the report addressed to President-elect Joe Biden and Vice President-elect Kamala Harris, the Third Way presented the results of a “multiyear effort to define concrete steps to improve the government’s ability to tackle the scourge of cybercrime by better identifying unlawful perpetrators and imposing meaningful consequences on them and those behind their actions.” In “A Roadmap to Strengthen US Cyber Enforcement: Where Do We Go From Here?,” the Third Way made a list of detailed recommendations on how the Biden Administration could better fight cybercrime, but in the cover letter to the report, there was a high-level summary of these recommendations:
    • In this roadmap, we identify the challenges the US government faces in investigating and prosecuting these crimes and advancing the level of international cooperation necessary to do so. Cyberattackers take great pains to hide their identity, using sophisticated tools that require technical investigative and forensic expertise to attribute the attacks. The attacks are often done at scale, where perpetrators prey on multiple victims across many jurisdictions and countries, requiring coordination across criminal justice agencies. The skills necessary to investigate these crimes are in high demand in the private sector, making it difficult to retain qualified personnel. A number of diplomatic barriers make cross-border cooperation difficult, a challenge often exacerbated by blurred lines between state and non-state actors in perpetrating these crimes.
    • This roadmap recommends actions that your administration can take to develop a comprehensive strategy to reduce cybercrime and minimize its impact on the American people by identifying the perpetrators and imposing meaningful consequences on them. We propose you make clear at the outset to the American public and global partners that cyber enforcement will be a top priority for your administration. In reinstating a White House cybersecurity position, we have extensive recommendations on how that position should address cybercrime. And, to make policy from an intelligence baseline, we believe you should request a National Intelligence Estimate on the linkages between cybercrime and nation-state cyber actors to understand the scope of the problem.
    • Our law enforcement working group has detailed recommendations to improve and modernize law enforcement’s ability to track and respond to cybercrime. And our global cooperation working group has detailed recommendations on creating a cohesive international cyber engagement strategy; assessing and improving the capacity of foreign partners on cybercrime; and improving the process for cross-border data requests that are critical to solving these crimes. We believe that with these recommendations, you can make substantial strides in bringing cybercriminals to justice and deterring future cybercriminals from victimizing Americans.
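To make the noyb complaint about the IDFA more concrete, here is a short Swift sketch of how an app, or a third-party SDK embedded in it, reads the identifier, and how Apple’s announced ATT change gates that access behind the user’s choice. It assumes the AdSupport and AppTrackingTransparency frameworks; the function and the zero-UUID check reflect the generally documented behavior that the identifier reads as all zeros when tracking is not authorized, and they are illustrative rather than any vendor’s actual code.

```swift
import AdSupport
import AppTrackingTransparency

/// Sketch: obtain the IDFA only when the user has authorized tracking.
/// Before ATT enforcement, this identifier was readable by default,
/// which is the conduct noyb's complaints target.
func currentAdvertisingIdentifier() -> String? {
    guard ATTrackingManager.trackingAuthorizationStatus == .authorized else {
        // No consent (or the user has not been asked): no cross-app identifier.
        return nil
    }
    let idfa = ASIdentifierManager.shared().advertisingIdentifier
    // A zeroed UUID means the identifier is unavailable on this device.
    return idfa.uuidString == "00000000-0000-0000-0000-000000000000" ? nil : idfa.uuidString
}
```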

Coming Events

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Bill To Reform IOT Security in U.S. Passes Congress

A long-awaited bill to revamp how the U.S. government secures its IOT is on its way to the White House.

Last night, the Senate agreed to a House-passed bill that would remake how the United States (U.S.) government buys Internet of Things (IOT) items, with the idea that requiring security standards in government IOT will drive greater security across the U.S. IOT market. Of course, such legislation, if implemented as intended, would also have the salutary effect of strengthening government networks. Incidentally, there is language in the bill that would seem to give the White House additional muscle to drive better information security across the civilian government.

The effort to pass this bill started in the last Congress and continued into this Congress. The bill will require the Office of Management and Budget (OMB) to set standards and practices that private sector contractors will need to meet in selling IOT to federal agencies. The OMB’s work is to be based on a series of IOT guidance documents the National Institute of Standards and Technology (NIST) has issued.

In September, the United States House of Representatives took up and passed a revised version of “Internet of Things Cybersecurity Improvement Act of 2020” (H.R. 1668) by voice vote. As noted, the United States Senate passed the same bill by unanimous consent yesterday, sending the legislation to the White House. While OMB did not issue a Statement of Administration Policy on H.R. 1668 or any of its previous iterations, Senate Republicans, particularly Majority Leader Mitch McConnell (R-KY), have not shown a willingness to even consider any bill the White House has not greenlit. Therefore, it may be reasonable to assume the President will sign this bill into law.

H.R. 1668 requires NIST to publish “standards and guidelines for the Federal Government on the appropriate use and management by agencies of Internet of Things devices owned or controlled by an agency and connected to information systems owned or controlled by an agency, including minimum information security requirements for managing cybersecurity risks associated with such devices.” These standards and guidelines are to be consistent with existing NIST standards and guidance on IOT, and the agency has issued a series of such documents described in some detail later in this article.

Six months after NIST issues such standards and guidelines, OMB must measure current agency IOT standards and practices against NIST’s (excepting “national security systems,” meaning almost all of the Department of Defense and the Intelligence Community). OMB is required to then issue policies and principles necessary to rectify shortcomings in agency IOT security after consulting with the United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA). At least once every five years after the initial policies and procedures are issued, OMB must revisit, assess, and adjust them as needed. Moreover, U.S. acquisition regulations must be amended to implement these standards and guidelines, meaning they would be binding in the purchase and use of IOT by civilian agencies.

NIST must also create and operate a system under which vulnerabilities and fixes in agency owned or operated IOT can be reported. OMB would oversee the establishment of this process, and DHS would administer the guidelines, possibly through its powers to issue Binding Operational Directives to federal civilian agencies.

Now, we come to a curious section of H.R.1668 that may well have implications for government-bought or -used technology beyond just IOT. Within two years of becoming law, OMB, in consultation with DHS, must “develop and oversee the implementation of policies, principles, standards, or guidelines as may be necessary to address security vulnerabilities of information systems (including Internet of Things devices)” (emphasis added). This is a seemingly open-ended grant of authority for OMB to put in place binding policies and procedures for all information systems, a very broad term that encompasses information technology and other resources, across federal agencies. OMB already possesses the power and means to do much of this, raising the question of why such authority was needed. The bill is not clear on this point, and OMB may well use this additional authority in areas not strictly pertaining to IOT.

And now the hammer to drive better IOT security. A civilian agency will not be able to buy or use IOT until its Chief Information Officer (CIO) has certified that such IOT meets the aforementioned standards developed along the dual tracks the bill requires. There are, of course, loopholes to this requirement, since industry and agency stakeholders likely insisted on them. First, any purchase below the simplified acquisition threshold (currently $250,000) would be exempt. Second, the agency could waive the need for the CIO to agree if

  • the waiver is necessary in the interest of national security;
  • procuring, obtaining, or using such device is necessary for research purposes; or
  • such device is secured using alternative and effective methods appropriate to the function of such device.

And so, these three grounds for waivers may be the exceptions that eat the rule. Time will tell.

In June, the Senate and House committees of jurisdiction marked up their versions of the “Internet of Things (IOT) Cybersecurity Improvement Act of 2020” (H.R. 1668/S. 734). The bill text as released in March 2019 for both bills was identical, signaling agreement between the two chambers’ sponsors, but the process of marking up the bills resulted in different versions, requiring negotiation on a final bill. The House Oversight and Reform Committee marked up and reported out H.R. 1668 after adopting an amendment in the nature of a substitute that narrowed the scope of the bill and made it more directive than the bill initially introduced in March. The Senate Homeland Security Committee marked up S. 734 a week later, making its own changes from the March bill. The March version of the legislation unified two similar bills from the 115th Congress: the “Internet of Things (IOT) Cybersecurity Improvement Act of 2017” (S. 1691) and the “Internet of Things (IOT) Federal Cybersecurity Improvement Act of 2018” (H.R. 7283).

Per the Committee Report for S. 734, the purpose of the bill

is to proactively mitigate the risks posed by inadequately-secured Internet of Things (IOT) devices through the establishment of minimum security standards for IOT devices purchased by the Federal Government. The bill codifies the ongoing work of the NIST to develop standards and guidelines, including minimum-security requirements, for the use of IOT devices by Federal agencies. The bill also directs OMB, in consultation with DHS, to issue the necessary policies and principles to implement the NIST standards and guidelines on IOT security and management. Additionally, the bill requires NIST, in consultation with cybersecurity researchers and industry experts, to publish guidelines for the reporting, coordinating, publishing, and receiving of information about Federal agencies’ security vulnerabilities and the coordinate resolutions of the reported vulnerabilities. OMB will provide the policies and principles and DHS will develop and issue the procedures necessary to implement NIST’s guidelines on coordinated vulnerability disclosure for Federal agencies. The bill includes a provision allowing Federal agency heads to waive the IOT use and management requirements issued by OMB for national security, functionality, alternative means, or economic reasons.

According to a staff memorandum, H.R. 1668

would require the NIST to develop guidelines for managing cybersecurity risks of IOT devices by June 30, 2020. The bill would require OMB to issue standards for implementing those guidelines by December 31, 2020. The bill also would require similar guidelines from NIST and standards from OMB on reporting, coordinating, and publishing security vulnerabilities of IOT devices.

As noted earlier, NIST has worked on and published a suite of guidance documents on IOT. In June, NIST published final guidance as part of its follow-up to A Report to the President on Enhancing the Resilience of the Internet and Communications Ecosystem Against Botnets and Other Automated, Distributed Threats and NIST’s Botnet Roadmap. Neither document is binding on federal agencies or private sector entities, but given the respect the agency enjoys, they will likely be referenced extensively by other standards.

NIST explained in a blog post:

In NISTIR 8259A, NIST explained the purpose of the publication as defining an “IOT device cybersecurity capability core baseline, which is a set of device capabilities generally needed to support common cybersecurity controls that protect an organization’s devices as well as device data, systems, and ecosystems.” NIST stated “[t]he purpose of this publication is to provide organizations a starting point to use in identifying the device cybersecurity capabilities for new IOT devices they will manufacture, integrate, or acquire…[and] can be used in conjunction with NISTIR 8259, Foundational Cybersecurity Activities for IOT Device Manufacturers.”

NIST further explained how the core baseline was developed:

  • The IOT device cybersecurity capability core baseline (core baseline) defined in this publication is a set of device capabilities generally needed to support commonly used cybersecurity controls that protect devices as well as device data, systems, and ecosystems.
  • The core baseline has been derived from researching common cybersecurity risk management approaches and commonly used capabilities for addressing cybersecurity risks to IOT devices, which were refined and validated using a collaborative public-private process to incorporate all viewpoints.
  • Regardless of an organization’s role, this baseline is intended to give all organizations a starting point for IOT device cybersecurity risk management, but the implementation of all capabilities is not considered mandatory. The individual capabilities in the baseline may be implemented in full, in part, or not at all. It is left to the implementing organization to understand the unique risk context in which it operates and what is appropriate for its given circumstance.
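To make the idea of a “core baseline” more concrete, here is a small Swift sketch that models it as a checklist of device capabilities. The six capability names reflect a reading of NISTIR 8259A (device identification, device configuration, data protection, logical access to interfaces, software update, and cybersecurity state awareness); the types, the gap calculation, and the example device are illustrative only and not anything NIST publishes as code.

```swift
/// Illustrative model of the NISTIR 8259A core baseline: device cybersecurity
/// capabilities an IOT device is generally expected to be able to support.
enum CoreBaselineCapability: String, CaseIterable {
    case deviceIdentification = "Device Identification"
    case deviceConfiguration  = "Device Configuration"
    case dataProtection       = "Data Protection"
    case logicalAccess        = "Logical Access to Interfaces"
    case softwareUpdate       = "Software Update"
    case stateAwareness       = "Cybersecurity State Awareness"
}

struct IoTDeviceProfile {
    let name: String
    let supported: Set<CoreBaselineCapability>

    /// Baseline capabilities the device does not provide. As NISTIR 8259A notes,
    /// implementing every capability is not mandatory; gaps are weighed against
    /// the organization's own risk context.
    var gaps: [CoreBaselineCapability] {
        CoreBaselineCapability.allCases.filter { !supported.contains($0) }
    }
}

// Hypothetical usage: a simple sensor that cannot be updated in the field.
let sensor = IoTDeviceProfile(
    name: "Building temperature sensor",
    supported: [.deviceIdentification, .deviceConfiguration, .dataProtection]
)
print("Baseline gaps for \(sensor.name): \(sensor.gaps.map(\.rawValue))")
```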

NISTIR 8259 is designed to “give manufacturers recommendations for improving how securable the IOT devices they make are…[and] [t]his means the IOT devices offer device cybersecurity capabilities—cybersecurity features or functions the devices provide through their own technical means (i.e., device hardware and software)—that customers, both organizations and individuals, need to secure the devices when used within their systems and environments.”

NIST stated “[t]his publication describes six recommended foundational cybersecurity activities that manufacturers should consider performing to improve the securability of the new IOT devices they make…[and] [f]our of the six activities primarily impact decisions and actions performed by the manufacturer before a device is sent out for sale (pre-market), and the remaining two activities primarily impact decisions and actions performed by the manufacturer after device sale (post-market).” NIST claimed “[p]erforming all six activities can help manufacturers provide IOT devices that better support the cybersecurity-related efforts needed by IOT device customers, which in turn can reduce the prevalence and severity of IOT device compromises and the attacks performed using compromised IOT devices.” NIST asserted “[t]hese activities are intended to fit within a manufacturer’s existing development process and may already be achieved in whole or part by that existing process.”

In June 2019, NIST issued “Considerations for Managing Internet of Things (IOT) Cybersecurity and Privacy Risks” (NISTIR 8228), which is designed “to help organizations better understand and manage the cybersecurity and privacy risks associated with individual IOT devices throughout the devices’ lifecycles.” The agency claims the publication “provides insights to inform organizations’ risk management processes” and that “[a]fter reading this publication, an organization should be able to improve the quality of its risk assessments for IOT devices and its response to the identified risk through the lens of cybersecurity and privacy.” It bears noting that, from the outset of tackling IOT standards, NIST paired cybersecurity and privacy, unlike its Cybersecurity Framework, which addresses privacy as an important but ancillary concern to cybersecurity.

NIST explained that NIST Interagency or Internal Report 8228: Considerations for Managing Internet of Things (IOT) Cybersecurity and Privacy Risks is aimed at “personnel at federal agencies with responsibilities related to managing cybersecurity and privacy risks for IOT devices, although personnel at other organizations may also find value in the content.” NIST stated that “[t]his publication emphasizes what makes managing these risks different for IOT devices in general, including consumer, enterprise, and industrial IOT devices, than conventional information technology (IT) devices…[and] omits all aspects of risk management that are largely the same for IOT and conventional IT, including all aspects of risk management beyond the IOT devices themselves, because these are already addressed by many other risk management publications.”

NIST explained that “[t]his publication identifies three high-level considerations that may affect the management of cybersecurity and privacy risks for IOT devices as compared to conventional IT devices:

1. Many IOT devices interact with the physical world in ways conventional IT devices usually do not. The potential impact of some IOT devices making changes to physical systems and thus affecting the physical world needs to be explicitly recognized and addressed from cybersecurity and privacy perspectives. Also, operational requirements for performance, reliability, resilience, and safety may be at odds with common cybersecurity and privacy practices for conventional IT devices.

2. Many IOT devices cannot be accessed, managed, or monitored in the same ways conventional IT devices can. This can necessitate doing tasks manually for large numbers of IOT devices, expanding staff knowledge and tools to include a much wider variety of IOT device software, and addressing risks with manufacturers and other third parties having remote access or control over IOT devices.

3. The availability, efficiency, and effectiveness of cybersecurity and privacy capabilities are often different for IOT devices than conventional IT devices. This means organizations may have to select, implement, and manage additional controls, as well as determine how to respond to risk when sufficient controls for mitigating risk are not available.

NIST laid out “[c]ybersecurity and privacy risks for IOT devices can be thought of in terms of three high-level risk mitigation goals:

1. Protect device security. In other words, prevent a device from being used to conduct attacks, including participating in distributed denial of service (DDoS) attacks against other organizations, and eavesdropping on network traffic or compromising other devices on the same network segment. This goal applies to all IOT devices.

2. Protect data security. Protect the confidentiality, integrity, and/or availability of data (including personally identifiable information [PII]) collected by, stored on, processed by, or transmitted to or from the IOT device. This goal applies to each IOT device except those without any data that needs protection.

3. Protect individuals’ privacy. Protect individuals’ privacy impacted by PII processing beyond risks managed through device and data security protection. This goal applies to all IOT devices that process PII or that directly or indirectly impact individuals.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Free Creative Stuff from Pexels

Further Reading, Other Developments, and Coming Events (9 November)

Further Reading

  • “Facebook bans ‘STOP THE STEAL’ group Trump allies were using to organize protests against vote counting” By Tony Romm, Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. A significant portion of the online activity among those on the right wing alleging that the Biden Campaign and Democrats have stolen the election is traceable to right-wing media influencers rather than an organic effort. Moreover, Facebook has apparently had a mixed record in locating and taking down material that seeks to spread lies about the integrity of the election and foment violence.
  • “False News Targeting Latinos Trails the Election” By Patricia Mazzei and Nicole Perlroth — The New York Times. By the metrics used in the article (although it’s not clear exactly where the Times got its data), the disinformation in Spanish on social media in 2020 exceeded the Russian disinformation campaign in 2016. Apparently, Facebook, Twitter, and YouTube were not prepared or were not expecting the flood of lies, misinformation, and disinformation about President-elect Joe Biden or the Democrats generally, especially in South Florida where Republicans did much better than expected. Much of this content tied Biden to the former dictators of Cuba and Venezuela, Fidel Castro and Hugo Chavez.
  • “Trump’s Tweeting Isn’t Crazy. It’s Strategic, Typos and All.” By Emily Dreyfuss — The New York Times. This piece traces the evolution of a campaign to paint the Biden family as engaged in criminal activity both to smear them and to blunt any criticism of the Trump family given the many and serious allegations of lawbreaking and unethical behavior.
  • “TikTok invites UK lawmakers to review algorithm after being probed on China censorship concerns” By Sam Shead — CNBC. In testimony before the United Kingdom’s (UK) Parliament’s Business, Energy and Industrial Strategy Committee, TikTok’s head of policy in the UK said the platform used to censor content but then walked that back in a statement after the hearing. Prior to May 2019, the company hewed to the content wishes of the People’s Republic of China (PRC), and material on Tiananmen Square was not on the platform. However, she did claim that TikTok’s data is stored in the United States with backups in Singapore, none of which goes to the PRC.
  • “The Disinformation Is Coming From Inside the White House” By Matthew Rosenberg, Jim Rutenberg and Nick Corasaniti — The New York Times. Turns out much of the disinformation about alleged but unproven vote fraud is coming directly from the President, his advisers, his allies, and his family. It may come to pass that domestic disinformation, misinformation, and lies will have a larger impact than similar efforts from overseas.

Other Developments

  • Representative Ro Khanna (D-CA) introduced “The 21st Century Jobs Package” (H.R.8693), which would establish a Federal Institute of Technology (FIT) and “allocates $900 billion in research & development (R&D) funding for emerging technologies like Advanced Manufacturing, Synthetic Biology, Artificial intelligence, Biotechnology, and Cybersecurity” according to his press release. In a summary, Khanna explained:
    • At the center of this proposal is the creation of a FIT, with presence in multiple locations around the country. These locations will initially take the form of additional facilities and faculty within or alongside existing universities and complementing ecosystems that are already dynamic. Over time, they will grow to include new stand-alone operations in areas without strong existing university bases. The vision, as in the past, is to marry federal resources and guidance with local initiative.
    • The proposed budget for this entire initiative is $900 billion over ten years. This would raise total public R&D spending to 1% of GDP by the end of the period, returning us to our role as an international leader. Most importantly, it would create as many as three million good new jobs per year. Many of these jobs would be in places that have fallen behind.
  • Australia’s Attorney-General has released an issues paper as a precursor to a possible rewrite of the country’s Privacy Act 1988 “to ensure privacy settings empower consumers, protect their data and best serve the Australian economy…as part of the government’s response to the Australian Competition and Consumer Commission’s Digital Platforms Inquiry” according to its press release. The Attorney-General explained:
    • The review will examine and, if needed, consider options for reform on matters including:
      • The scope and application of the Privacy Act including in relation to:
        • the definition of ‘personal information’
        • current exemptions, and
        • general permitted situations for the collection, use and disclosure of personal information.
      • Whether the Privacy Act effectively protects personal information and provides a practical and proportionate framework for promoting good privacy practices including in relation to:
        • notification requirements
        • consent requirements including default privacy settings
        • overseas data flows, and
        • erasure of personal information.
      • Whether individuals should have direct rights of action to enforce privacy obligations under the Privacy Act.
      • Whether a statutory tort for serious invasions of privacy should be introduced into Australian law.
      • The impact of the notifiable data breach scheme and its effectiveness in meeting its objectives.
      • The effectiveness of enforcement powers and mechanisms under the Privacy Act and the interaction with other Commonwealth regulatory frameworks.
      • The desirability and feasibility of an independent certification scheme to monitor and demonstrate compliance with Australian privacy laws.
  • The National Institute of Standards and Technology (NIST) has released for comment its “Draft Federal Information Processing Standard (FIPS) 201-3, Personal Identity Verification (PIV) of Federal Employees and Contractors (Standard).” NIST explained in the Federal Register notice:
    • This Standard defines common credentials and authentication mechanisms offering varying degrees of security for both logical and physical access applications. The draft revision proposes changes to FIPS 201-2, Standard for Personal Identity Verification of Federal Employees and Contractors to include: Expanding specification on the use of additional PIV credentials known as derived PIV credentials, procedures for supervised remote identity proofing, the use of federation as a means for a relying system to interoperate with PIV credentials issued by other agencies, alignment with the current practice/policy of the Federal Government and specific changes requested by Federal agencies and implementers. Before recommending these proposed changes to the Secretary of Commerce for review and approval, NIST invites comments from all interested parties.
    • In the draft document, NIST stated:
      • Authentication of an individual’s identity is a fundamental component of physical and logical access control. An access control decision must be made when an individual attempts to access security-sensitive buildings, information systems, and applications. An accurate determination of an individual’s identity supports making sound access control decisions.
      • This document establishes a standard for a Personal Identity Verification (PIV) system that meets the control and security objectives of Homeland Security Presidential Directive-12 [HSPD-12]. It is based on secure and reliable forms of identity credentials issued by the Federal Government to its employees and contractors. These credentials are used by mechanisms that authenticate individuals who require access to federally controlled facilities, information systems, and applications. This Standard addresses requirements for initial identity proofing, infrastructure to support interoperability of identity credentials, and accreditation of organizations and processes issuing PIV credentials.
  • The Federal Communications Commission (FCC) announced a $200 million settlement with T-Mobile “to resolve an investigation of its subsidiary Sprint’s compliance with the Commission’s rules regarding waste, fraud, and abuse in the Lifeline program for low-income consumers” according to the agency’s press release. The FCC explained:
    • The payment is the largest fixed-amount settlement the Commission has ever secured to resolve an investigation.  The settlement comes after an Enforcement Bureau investigation into reports that Sprint, prior to its merger with T-Mobile, was claiming monthly subsidies for serving approximately 885,000 Lifeline subscribers even though those subscribers were not using the service, in potential violation of the Commission’s “non-usage” rule.  The matter initially came to light as a result of an investigation by the Oregon Public Utility Commission.  In addition to paying a $200 million civil penalty, Sprint agreed to enter into a compliance plan to help ensure future adherence to the Commission’s rules for the Lifeline program.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s nomination to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Walkerssk from Pixabay