Democrat Proposes Creating Data Protection Authority To Address Privacy

Another Senate Democrat has introduced a privacy and data security bill. Senator Kirsten Gillibrand’s “Data Protection Act of 2020” (S. 3300) would create a federal data protection authority along the lines of the agencies each European Union member nation has. This new agency would be the primary federal regulator of privacy laws, including a number of existing laws that govern the privacy practices of the financial services industry, healthcare industry, and others. It would displace the Federal Trade Commission (FTC) on privacy matters, receiving similar enforcement authority plus the ability to levy fines in the first instance. However, state laws would be preempted only if they are contrary to the new regime, and state attorneys general could enforce the new law. No private right of action would, however, be created under this law.

The bill would establish the Data Protection Agency (DPA), an independent agency headed by a presidentially nominated and Senate-confirmed Director who would normally serve a five-year term, or longer until a successor is nominated and confirmed. Hence, Directors would not serve at the pleasure of the President and would be insulated from the political pressure Cabinet members may feel from the White House. However, the Director may be removed for “inefficiency, neglect of duty, or malfeasance in office.” Generally, the DPA “shall seek to protect individuals’ privacy and limit the collection, disclosure, processing and misuse of individuals’ personal data by covered entities, and is authorized to exercise its authorities under this Act for such purposes.”

Personal data is defined widely as “any information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular individual or device” including a number of different enumerated types of data such as medical information, biometric information, browsing history, geolocation data, political information, photographs and videos not password protected, and others. The bill also creates the term “high-risk data practice” to cover the collection or processing of personal data that is sensitive, novel, or may have adverse, discriminatory real-world effects; such practices would be subject to heightened scrutiny and regulation. For example, new high-risk data practices “or related profiling techniques” may not be used before the DPA conducts “a formal public rulemaking process,” which under administrative law usually means a lengthy process including a public hearing.

Those entities covered by the bill are “any person that collects, processes, or otherwise obtains personal data with the exception of an individual processing personal data in the course of personal or household activity,” an incredibly broad definition that sweeps in virtually any commercial entity collecting or processing personal data. There is no carve-out for businesses below a certain revenue level or number of persons whose data they collect and process. Large covered entities would be subject to extra scrutiny from the DPA and extra responsibility. Entities falling into this category are those with “gross revenues that exceed $25,000,000;” that buy, receive, sell, or disclose for commercial purposes the personal information of 50,000 or more individuals, households, or devices; or that derive “50 percent or more of [their] annual revenues from the sale of personal data.” The DPA “may require reports and conduct examinations on a periodic basis” from large covered entities to ensure compliance with federal privacy laws, examine their practices, compliance processes, and procedures, “detecting and assessing associated risks to individuals and groups of individuals;” and “requiring and overseeing ex-ante impact assessments and ex-post outcome audits of high-risk data practices to advance fair and just data practices.”

Most notably, it appears that the enforcement and rulemaking authority of current privacy statutes would be transferred to the agency, including Title V of the “Financial Services Modernization Act of 1999” (aka Gramm-Leach-Bliley), Subtitle D of the Health Information Technology for Economic and Clinical Health Act (i.e. HIPAA’s privacy provisions), the “Children’s Online Privacy Protection Act,” and the “Fair Credit Reporting Act.” Specifically, the bill provides “[t]he Agency is authorized to exercise its authorities under this Act and Federal privacy law to administer, enforce, and otherwise implement the provisions of this Act and Federal privacy law.” The bill defines “federal privacy law” to include all the aforementioned statutes. Consequently, the agencies currently enforcing the privacy provisions of those statutes and related regulations would turn over enforcement authority to the DPA. This, of course, is not without precedent. Dodd-Frank required the FTC to relinquish some of its jurisdiction to the Consumer Financial Protection Bureau (CFPB) to cite but one recent example. In any event, this approach sets the “Data Protection Act of 2020” apart from a number of the privacy bills, and aside from the policy elegance of housing privacy statutes and regulations at one agency, this would likely cause the current regulators and the committees that oversee them to oppose this provision of the bill.

The DPA would receive authority to punish unfair and deceptive practices (UDAP) regarding the collection, processing, and use of personal data, and, unlike the FTC, would have notice-and-comment rulemaking authority to effectuate this authority as needed. However, like the FTC, before the agency may use its UDAP powers regarding unfairness, it must establish that the act or practice causes or is likely to cause substantial injury that is not reasonably avoidable by the consumer and is not outweighed by countervailing benefits.

The DPA would receive many of the same authorities the FTC currently has to punish UDAP violations, including injunctions, restitution, disgorgement, damages, and other monetary relief, and also the ability to levy civil fines. However, the fine structure is tiered, with reckless and knowing violations subject to much higher liability. The first tier would expose entities to fines of $5,000 per day a violation is occurring or an entity fails to heed a DPA order. The language could use clarification as to whether this means per violation per day or a per-day fine regardless of the number of separate violations. The second tier, for reckless violations, carries fines as high as $25,000, and the third tier, for knowing violations, as high as $1,000,000. However, the DPA must either give entities liable to fines notice and an opportunity for a hearing before levying a fine through its administrative procedures or go to federal court to seek a judgment. The DPA could also enforce the other federal privacy laws under their own terms without bringing this authority to bear.
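The tiered structure above can be sketched in a few lines. This is an illustration only: it models the first tier as a flat per-day fine (the per-day versus per-violation-per-day ambiguity noted above) and treats the second and third tiers as flat maximums, since the bill's text does not clearly state whether those accrue daily.

```python
# Illustrative sketch of S. 3300's tiered civil fines, as described in
# the text. Assumptions: tier one is a flat per-day fine; the reckless
# and knowing tiers are modeled as flat maximums.
def max_dpa_fine(tier: str, days: int = 1) -> int:
    """Upper-bound fine for a violation under the given tier."""
    if tier == "first":
        # $5,000 per day the violation persists or a DPA order is ignored
        return 5_000 * days
    if tier == "reckless":
        return 25_000
    if tier == "knowing":
        return 1_000_000
    raise ValueError(f"unknown tier: {tier}")
```

Under this reading, a first-tier violation running for a month would cap out at $150,000, still well below a single knowing violation.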

There would be no preemption of state laws to the extent such privacy laws are not inconsistent with the “Data Protection Act of 2020,” and states may maintain or institute stronger privacy laws so long as they do not run counter to this statute. This is the structure used under Gramm-Leach-Bliley, and so there is precedent. Hence, it is possible there would be a federal privacy floor that some states like California could regulate above. However, the bill would not change the preemption status quo of the federal privacy laws the DPA will be able to enforce, and those federal statutes that preempt state laws would continue to do so. State attorneys general could bring actions in federal court to enforce this law, but no federal private right of action would be created.

Of course, the only other major privacy and data security bill that would create a new agency to regulate these matters instead of putting the FTC in charge is the “Online Privacy Act of 2019” (H.R. 4978), introduced by Representatives Anna Eshoo (D-CA) and Zoe Lofgren (D-CA), which would create the U.S. Digital Privacy Agency (DPA) to supersede the FTC on many privacy and data security issues. For many sponsors of privacy bills, creating a new agency may be seen as a few bridges too far, and so they have opted to house new privacy regulation at the FTC.

Finally, as can be seen in her press release, Gillibrand’s bill has garnered quite a bit of support from privacy and civil liberties advocates, some of whom generally endorse the idea of a U.S. data protection authority rather than this bill per se. Nonetheless, this is another bill that is on the field, and it remains to be seen how much Gillibrand will engage on the issue. It also bears note that she serves on none of the committees of jurisdiction in the Senate.

Spotlight: A Privacy Bill A Week: “Consumer Data Protection Act”

Last week, we dived into Senator Catherine Cortez Masto’s (D-NV) “Digital Accountability and Transparency to Advance Privacy Act” (DATA Privacy Act) (S. 583). Of course, Cortez Masto served as the attorney general of Nevada for eight years prior to succeeding former Senator Harry Reid (D-NV), and this bill demonstrates her background as her state’s top prosecutor. This week, we will analyze the most stringent, most pro-consumer bill on privacy that I have seen introduced in this or the last Congress.

In November, Senate Finance Committee Ranking Member Ron Wyden (D-OR) released the “Consumer Data Protection Act” discussion draft, section-by-section, and one-pager, legislation not to be confused with Senator Bob Menendez’s (D-NJ) “Consumer Data Protection Act” (S. 2188), a data security and breach notification bill. In short, Wyden’s bill would vastly expand the power of the Federal Trade Commission (FTC) to police both the security and privacy practices of many U.S. and multinational companies. The FTC would receive the authority to levy fines in the first instance, potentially as high as the European Union’s General Data Protection Regulation’s ceiling of 4% of annual gross revenue. Moreover, the operative definition of the “personal information” that must be protected or subject to the privacy wishes of a consumer is very broad. The bill would also sweep into the FTC’s jurisdiction artificial intelligence (AI) and algorithms (i.e. so-called big data).

The “Consumer Data Protection Act” would dramatically expand the types of harms the FTC could punish, explicitly including privacy violations and noneconomic injuries. Currently, the FTC must use its Section 5 powers to punish unfair and deceptive practices, or another statutory basis such as COPPA, to target the privacy practices it considers unacceptable. Wyden’s bill would amend the FTC Act to include “noneconomic impacts and those creating a significant risk of unjustified exposure of personal information” among the “substantial injur[ies]” made illegal. It is worth seeing the proposed language in the context of the section of the FTC’s organic statute (i.e. 15 U.S.C. 45(n)):

(n) Standard of proof; public policy considerations

The Commission shall have no authority…to declare unlawful an act or practice on the grounds that such act or practice is unfair unless the act or practice causes or is likely to cause substantial injury *including those involving noneconomic impacts and those creating a significant risk of unjustified exposure of personal information* to consumers which is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition (emphasis added to differentiate the language the bill would add).

The FTC’s new authority would likely be defined in court actions to test the outer limits of what constitutes “noneconomic impacts” and the types of substantial injuries that create a significant risk of unjustified exposure of personal information. If this language were enacted, undoubtedly industry groups and conservative advocacy organizations would zealously search for test cases to try to circumscribe this authority as narrowly as possible. Finally, it bears note that this sort of language harkens back to the FTC’s construction of its statutory powers in the 1960s and 1970s that was considered so expansive that a Democratic Congress reined in the agency and limited its purview.

The FTC’s authority to levy civil fines through an administrative proceeding would be dramatically expanded along the lines of the EU’s power to levy massive fines under the General Data Protection Regulation. Notably, without securing a court order, the agency could impose civil fines as part of a cease and desist order, which shall be the higher of $50,000 per violation or 4% of the annual gross revenue of the offender in the previous fiscal year. The upper limits of such a fine structure get very high, very quickly. For example, a violation with 100,000 people affected yields an upper boundary of $5 billion assuming one violation per person. The privacy violations associated with Facebook’s conduct with Cambridge Analytica affected 87 million people worldwide, and again assuming one violation per person, the upper boundary of the fine the FTC could levy would be $4,350,000,000,000. However, the FTC would likely not exercise this power to the utmost possible fine but rather dial back the fine to a more reasonable but still punitive amount. Nonetheless, the FTC would have the ability to recover up to $50,000 per violation or 4% of gross annual revenue for any violations of cease and desist orders by filing an action in federal court.
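The back-of-the-envelope math above can be reproduced directly. The only assumption carried over from the text is one violation per affected person; the ceiling is simply the higher of $50,000 per violation or 4% of annual gross revenue.

```python
# The fine ceiling described in the text: the higher of $50,000 per
# violation or 4% of the offender's annual gross revenue in the
# previous fiscal year.
def max_ftc_fine(violations: int, annual_gross_revenue: float = 0) -> float:
    return max(50_000 * violations, 0.04 * annual_gross_revenue)

# Assuming one violation per affected person:
small_breach = max_ftc_fine(100_000)       # 100,000 people affected
facebook_cap = max_ftc_fine(87_000_000)    # Cambridge Analytica's 87 million
```

The per-violation term dominates whenever 4% of revenue is smaller, which is why the theoretical caps run into the billions and trillions so quickly.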

Despite expanding the FTC’s powers dramatically, those entities subject to the agency’s new enforcement powers would not include many medium and small businesses. Covered entities are described as those with “more than $50,000,000 in average annual gross receipts for the 3-taxable-year period preceding the fiscal year” and the “personal information” of more than 1,000,000 consumers or 1,000,000 consumer devices. Additionally, a covered entity may be an affiliate or subsidiary of an entity that meets the aforementioned qualifications. Finally, the term “covered entity” covers all data brokers or commercial entities “that, as a substantial part of their business, collects, assembles, or maintains personal information concerning an individual who is not a customer or an employee of that entity in order to sell or trade the information or provide third-party access to the information.”

Additionally, a subset of these covered entities with more than $1 billion in annual revenues that “stores, shares, or uses personal information on more than 1,000,000 consumers or consumer devices” or those “that stores, shares, or uses personal information on more than 50,000,000 consumers or consumer devices” must submit annual data protection reports to the FTC. Those entities must report “in detail whether, during the reporting period, the covered entity complied with the regulations” the FTC will promulgate to effectuate the “Consumer Data Protection Act” and the extent to which they did not comply by detailing which regulations were violated and the number of consumers affected.

Each report must “be accompanied by a written statement by the chief executive officer, chief privacy officer (or equivalent thereof), and chief information security officer (or equivalent thereof) of the company” that certifies the report fully complies with the requirements of the new statute. Any such person who certifies an annual data protection report knowing it does not meet the requirements of this section, or who does so intentionally, faces jail time and/or a personal fine based on income, depending on the state of knowledge involved in falsely certifying the report. Any CEO, chief privacy officer, or chief information security officer who knowingly certifies a false report faces a fine of the greater of $1 million or 5% of the highest annual compensation for the previous three years and up to ten years in prison. Intentional violations expose these corporate officials to the greater of a $5 million fine or 25% of the highest annual compensation for the previous three years and up to 20 years in prison.

Of course, falsely certifying knowing that a report fails to meet all the requirements exposes a person to less criminal liability than intentionally certifying. However, the substantive difference between knowing certification and intentional certification is not immediately clear. Perhaps the bill intends knowing to be constructive knowledge (i.e. known or should have known) while intentionality in this context means actual knowledge.
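The monetary side of the two certification penalties is a simple greater-of calculation, sketched below from the figures in the text (the prison terms are noted only in comments):

```python
# Officer fines for false certification as described above: the greater
# of a fixed floor or a share of the officer's highest annual
# compensation over the previous three years.
def officer_fine(highest_annual_comp: float, intentional: bool = False) -> float:
    if intentional:
        # intentional: $5M floor or 25% of compensation; up to 20 years
        return max(5_000_000, 0.25 * highest_annual_comp)
    # knowing: $1M floor or 5% of compensation; up to 10 years
    return max(1_000_000, 0.05 * highest_annual_comp)
```

Note that the floors bind for all but very highly compensated officers: a knowing violation only exceeds the $1 million floor once compensation tops $20 million.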

With respect to the information covered entities would need to safeguard, the bill defines “personal information,” which is “any information, regardless of how the information is collected, inferred, or obtained that is reasonably linkable to a specific consumer or consumer device,” which is a very broad definition. Wyden’s bill also defines “use,” “share,” and “store” in the context of personal information:

  • “share’’—
    • means the actions of a person, partnership, or corporation transferring information to another person, partnership, or corporation; and
    • includes actions to knowingly—
      • share, exchange, transfer, sell, lease, rent, provide, disclose, or otherwise permit access to information; or
      • enable or facilitate the collection of personal information by a third party.
  • ‘‘store’’—
    • means the actions of a person, partnership, or corporation to retain information; and
    • includes actions to store, collect, assemble, possess, control, or maintain information.
  • ‘‘use’’ means the actions of a person, partnership, or corporation in using information, including actions to use, process, or access information.

The FTC would be required to promulgate the detailed regulations discussed below within two years of enactment. This timeline may be more realistic than those of many other bills, which task the agency with detailed, extensive rulemakings within a year, a deadline the FTC may have trouble meeting. Nonetheless, the agency could take the first year or even 15 months to draft proposed regulations for comment.

The bill would task the FTC with establishing and running a ‘‘Do Not Track’’ data sharing opt-out website that would stop covered entities from sharing a consumer’s personal information, subject to certain exceptions, including the use of personal information acquired before a consumer opts out when a covered entity needs to share the information to achieve the primary purpose for which the information was initially acquired. Additionally, this bar would be in effect for personal information a covered entity acquires from non-covered entities.

The FTC would also need to determine technological means by which a consumer’s opt-out on its website can be effectuated through web browsers or operating systems. The agency would also need to devise a method by which covered entities could determine which consumers have opted out, possibly through the development of an FTC Application Programming Interface (API). Thereafter, covered entities would have a duty to check the FTC’s opt-out database at regular intervals to ensure they are honoring consumers’ decisions to opt out. Covered entities would not need to respect a consumer’s opt-out for legally required disclosures to the government, such as under warrants or subpoenas. The FTC would also need to “establish standards and procedures, including through an API, for a covered entity to request and obtain consent from a consumer who has opted-out…for the covered entity to not be bound by the opt-out,” including providing a list of third parties with whom personal information might be shared and a description of such information. And, if the covered entity requires consumers to consent to usage of their personal information before its products or services can be used, then the covered entity must “notify the consumer that he or she can obtain a substantially similar product or service in exchange for monetary payment or other compensation rather than by permitting the covered entity to share the consumer’s personal information.”
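The compliance check a covered entity would run against such a registry can be sketched abstractly. To be clear, no FTC opt-out registry or API exists; the sets below stand in for whatever lookup mechanism the agency might devise, and the function names are illustrative only.

```python
# Hypothetical sketch only: the FTC opt-out registry and any API for it
# do not exist. The opted_out and consented sets stand in for whatever
# lookup mechanism (e.g., an API) the FTC might devise under the bill.
def shareable(consumers, opted_out, consented):
    """Consumers whose personal information may be shared: anyone who
    has not opted out, plus opted-out consumers who later gave the
    covered entity consent to not be bound by the opt-out."""
    return [c for c in consumers if c not in opted_out or c in consented]
```

The duty to check "at regular intervals" would amount to re-running this filter against a fresh copy of the registry before each sharing event or on a schedule the FTC sets.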

The FTC must also “establish standards and procedures requiring that when a non-covered entity that is not the consumer shares personal information about that consumer with a covered-entity, the covered entity shall make reasonable efforts to verify the opt-out status of the consumer whose personal information has been shared with the covered entity.” Thereafter covered entities may only use or store this personal information if a consumer has not opted out on the FTC’s website or if the covered entity has received the consumer’s consent for non-covered entities to collect and share their information.

Additionally, the FTC must draft regulations detailing the “standards and procedures” covered entities and non-covered entities must follow “to request and obtain consent from a consumer…that clearly identifies the covered entity that will be storing or using the personal information and provides the consumer” at the time consent is sought. Consumers must be informed “in a form that is understandable to a reasonable consumer” detailing the entity from whom personal information is to be obtained, the type of personal information to be collected, and the purposes for which such information shall be used.

Certain acts would be prohibited. Covered entities could not require consumers to change their opt-out election on the FTC’s website in order to access products and services “unless the consumer is also given an option to pay a fee to use a substantially similar service that is not conditioned upon a requirement that the consumer give the covered entity consent to not be bound by the consumer’s opt-out status.” Moreover, this fee “shall not be greater than the amount of monetary gain the covered entity would have earned had the average consumer not opted-out.”

Wyden’s bill also marries data security requirements with privacy protections for consumers, a position articulated by a number of prominent Democrats. Notably, the FTC would need to promulgate regulations that

  • require each covered entity to establish and implement reasonable cyber security and privacy policies, practices, and procedures to protect personal information used, stored, or shared by the covered entity from improper access, disclosure, exposure, or use;
  • require each covered entity to implement reasonable physical, technical, and organizational measures to ensure that technologies or products used, produced, sold, offered, or leased by the covered entity that the covered entity knows or has reason to believe store, process, or otherwise interact with personal information are built and function consistently with reasonable data protection practices;

The FTC would also need to draft regulations requiring “each covered entity to provide, at no cost, not later than 30 business days after receiving a written request from a verified consumer about whom the covered entity stores personal information” a way to review any personal information stored, including how and when such information was acquired and a process for challenging the accuracy of any stored information. Additionally, these regulations would “require each covered entity to correct the stored personal information of the verified consumer if, after investigating a challenge by a verified consumer…the covered entity determines that the personal information is inaccurate.” Covered entities would also need to furnish a list of the entities with whom the consumer’s personal information was shared and other detailed information, including the personal information of the consumer the covered entity acquired not from the consumer but a third party.

The “Consumer Data Protection Act” would also institute regulations and requirements related to the increasing use of so-called “big data,” algorithms, machine learning, and artificial intelligence. The FTC would need to promulgate regulations mandating that each covered entity must “conduct automated decision system impact assessments of existing high-risk automated decision systems, as frequently as the Commission determines is necessary; and…new high-risk automated decision systems, prior to implementation.” However, it would be helpful to examine the bill’s definitions of ‘‘automated decision system,’’ “automated decision system impact assessment,’’ ‘‘high-risk automated decision system’’ and “high-risk information system:”

  • ‘‘automated decision system’’ means “a computational process, including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that makes a decision or facilitates human decision making, that impacts consumers.”
  • “automated decision system impact assessment’’ means a study evaluating an automated decision system and the automated decision system’s development process, including the design and training data of the automated decision system, for impacts on accuracy, fairness, bias, discrimination, privacy, and security
  • ‘‘high-risk automated decision system’’ means an automated decision system that—
    • taking into account the novelty of the technology used and the nature, scope, context, and purpose of the automated decision system, poses a significant risk—
      • to the privacy or security of personal information of consumers; or
      • of resulting in or contributing to inaccurate, unfair, biased, or discriminatory decisions impacting consumers;
    • makes decisions, or facilitates human decision making, based on systematic and extensive evaluations of consumers, including attempts to analyze or predict sensitive aspects of their lives, such as their work performance, economic situation, health, personal preferences, interests, behavior, location, or movements, that—
      • alter legal rights of consumers; or
      • otherwise significantly impact consumers;
    • involves the personal information of a significant number of consumers regarding race, color, national origin, political opinions, religion, trade union membership, genetic data, biometric data, health, gender, gender identity, sexuality, sexual orientation, criminal convictions, or arrests;
    • systematically monitors a large, publicly accessible physical place; or
    • meets any other criteria established by the Commission in regulations…
  • ‘high-risk information system’’ means an information system that—
    • taking into account the novelty of the technology used and the nature, scope, context, and purpose of the information system, poses a significant risk to the privacy or security of personal information of consumers;
    • involves the personal information of a significant number of consumers regarding race, color, national origin, political opinions, religion, trade union membership, genetic data, biometric data, health, gender, gender identity, sexuality, sexual orientation, criminal convictions, or arrests;
    • systematically monitors a large, publicly accessible physical place; or
    • meets any other criteria established by the Commission in regulations…

Consequently, algorithmic decision-making would be swept into the FTC’s new regime to govern privacy and data security. However, politically, this issue is not close to most Members’ consciousness as being related to privacy and data security. This reality marks the “Consumer Data Protection Act” as among the most forward-looking of the bills introduced over the last year. And yet it is likely that any privacy or data security bill Congress passes will not include such provisions; however, a state like California could decide to wade into this area, which, as with privacy, could force policymakers in Washington to consider an issue percolating up to the federal level from one of the state laboratories of democracy.

In terms of enforcement, the bill explicitly bars the use of any contracts contrary to the rights and requirements in the “Consumer Data Protection Act.” Like virtually all the other bills on privacy, the FTC would be able to ask a federal court for civil fines for a first offense as high as a bit more than $40,000 per violation in addition to all the FTC’s other powers.

This bill is likely the outer bound desired by the most ardent privacy and civil liberties advocates, and therefore is highly unlikely to be enacted in its current form. Other Democratic bills are far more modest in scope, and few of them address both security and privacy. The chances of enactment are very low, but Congressional interest in privacy legislation will continue because of the GDPR and the California Consumer Privacy Act.

Fall Preview For Technology Legislation

With Congress having returned from the August recess, bright-eyed and bushy-tailed, a host of bills are awaiting these eager lawmakers. However, I will focus only on those bills that have been marked up and reported out of committee or have been passed by one chamber, as these bills may be the most likely to be enacted. Of course, there are other issue areas Congress may address with legislation this fall, but as yet, legislation has neither been introduced nor marked up (e.g. privacy, data security, and PATRIOT Act reauthorization).

And, it should be noted that past could be prologue with respect to a PATRIOT Act reauthorization. As you might recall, what became the “Cybersecurity Act of 2015” (P.L. 114-113) was effectively blocked because of fighting over expiring PATRIOT Act provisions that were ultimately reauthorized as modified in the “USA Freedom Act” (P.L. 114-23). Therefore, until Congress reauthorizes these provisions, and I think it highly likely they will, it is possible technology-related legislation will be essentially used as leverage by proponents and opponents to see their preferred policy outcome enacted. Having said that, there are a number of technology-related bills that have been reported out of committee or come to the floor of one chamber or the other.

First, and possibly foremost, since this reauthorization has been enacted annually since the Kennedy Administration, is the FY 2020 National Defense Authorization Act (NDAA) (H.R. 2500/S. 1790). As cybersecurity has grown in prominence nationally and at the Pentagon, provisions dealing with this topic area have proliferated. Consequently, both bills are stuffed with statutory language ranging from supply chain to acquisition to offensive and defensive cyber operations, and other facets of cybersecurity. Likewise, the committee reports are also full of directives, mainly to the Pentagon, regarding actions, programs, briefings, and reports Congress would like the Department of Defense to undertake. Both NDAAs have passed their respective chambers, and the Armed Services Committees have been working on reconciling the bills. Incidentally, the Senate attached its FY 2018, 2019, and 2020 Intelligence Authorization to S. 1790, which is also replete with cyber-related provisions for the Intelligence Community (i.e. the “Damon Paul Nelson and Matthew Young Pollard Intelligence Authorization Act for Fiscal Years 2018, 2019, and 2020” (S. 1589)). On July 17, the House passed the “Damon Paul Nelson and Matthew Young Pollard Intelligence Authorization Act (IAA) for Fiscal Years 2018, 2019, and 2020” (H.R. 3494) by a 397-31 vote. Therefore, it is possible that the NDAA also carries the intelligence reauthorization to enactment.

Speaking of annually enacted vehicles to effect technology policy, all twelve of the FY 2020 appropriations acts have yet to be enacted. A number of the bills contain crucial language on cybersecurity and technology funding, with a handful being most important with respect to funding: the Homeland Security, Department of Defense, Financial Services and General Government, and Commerce-Justice-Science appropriations acts. Despite having struck a deal on top-lines, it is not clear that Congress will enact all of its appropriations bills before the current fiscal year ends on September 30. Therefore, we may be looking at a continuing resolution into the fall, ideally followed by an omnibus or series of bills packaged together to fund FY 2020 programs. For example, the “FY 2020 Homeland Security Appropriations Act” would provide the Cybersecurity and Infrastructure Security Agency (CISA) $2.016 billion for FY 2020, a boost of $334 million above its FY 2019 funding level and $408 million above the Administration’s budget request.

Election security will likely be an area of intense messaging but less legislative action. House Democrats made election security reform a policy priority in large part because of Russian interference and hacking in the 2016 election. The House has sent substantially the same legislation to the Senate in two bills (i.e. the “For The People Act of 2019” (H.R. 1), a package of election reforms, and the “Securing America’s Federal Elections (SAFE) Act of 2019” (H.R. 2722)), where Senate Majority Leader Mitch McConnell (R-KY) has refused to consider them or comparable Senate bills. Broadly speaking, these bills would authorize funding and establish federal standards to help states and localities improve and upgrade their election systems against hacks and attacks. Incidentally, the $600 million in election grants these bills call for was provided in the “Financial Services and General Government Appropriations Act, 2020” (H.R. 3351) the House passed in June.

As noted, at the end of July, after the Senate Intelligence Committee released the first volume of its five-volume report on the 2016 presidential election, Senators Richard Blumenthal (D-CT), Mark Warner (D-VA), Amy Klobuchar (D-MN), and others sought unanimous consent to proceed to a number of election security-related bills but were blocked by Senate Republicans. The bills Senate Democrats tried to bring up for immediate consideration included:

  • The “Duty To Report Act” (S. 1247)
  • The “FIRE Act” (S. 2242)
  • The “Senate Cybersecurity Protection Act” (S. 890)
  • The “Securing America’s Federal Elections Act” (SAFE Act) (H.R. 2722)

The Senate did, however, pass the “Defending the Integrity of Voting Systems Act” (S. 1321) by unanimous consent on July 17. S. 1321 would “make it a federal crime to hack any voting systems used in a federal election,” according to the Senate Judiciary Committee’s website. In June, the Senate also passed the “Defending Elections against Trolls from Enemy Regimes (DETER) Act” (S. 1328), which would make “improper interference in U.S. elections” a violation of U.S. immigration law; violators would be barred from obtaining a visa to enter the United States. The House has yet to act on these bills. Nonetheless, despite action on S. 1321 and S. 1328, Senate Democrats seem intent on continuing to try to force consideration of election security legislation. It is unclear whether McConnell will relent.

Likewise, the House has also begun moving legislation to punish those found guilty of interfering with U.S. elections. In July, the House Foreign Affairs Committee met and marked up a number of bills, including the “Safeguard our Elections and Combat Unlawful Interference in Our Democracy Act” (SECURE Our Democracy Act) (H.R. 3501), which “would impose sanctions on anyone found to interfere illegally in an American election from overseas…[and] is designed to punish Russian interference in the 2016 election and also deter future election interference,” according to the Committee’s press release.

Congress also has pending a number of bills focused on the federal government’s cybersecurity posture and capabilities. In January, the House passed the “Federal CIO Authorization Act of 2019” (H.R. 247) that would codify the positions of Chief Information Officer (CIO) and Chief Information Security Officer (CISO), make the positions presidential appointments, require the CIO to report directly to the Office of Management and Budget (OMB) Director, require each agency to submit reports on all IT expenditures to the CIO, and task the CIO with submitting a plan to Congress “for consolidating information technology across the Federal Government…and increasing the use of shared services, including any recommendations for legislative changes that may be necessary to effect the proposal.” H.R. 247 is identical to a bill, the “Federal CIO Authorization Act of 2018” (H.R. 6901), the House overwhelmingly passed in December, but the Senate never took up the bill.

On July 17, the House Homeland Security Committee held a markup and reported out four such cybersecurity bills:

  • The “Securing the Homeland Security Supply Chain Act of 2019” (H.R. 3320) would “authorize the Secretary of Homeland Security to implement certain requirements for information relating to supply chain risk” with authority similar to those granted to the Department of Defense in the FY 2019 National Defense Authorization Act to exclude contractors with unacceptable supply chain risks.
  • The “DHS Acquisition Reform Act of 2019” (H.R. 3413) would “provide for certain acquisition authorities for the Under Secretary of Management of the Department of Homeland Security.”
  • The Pipeline Security Act (H.R. 3699) would “codify the Transportation Security Administration’s responsibility relating to securing pipelines against cybersecurity threats, acts of terrorism, and other nefarious acts that jeopardize the physical security or cybersecurity of pipelines.”
  • The “Cybersecurity Vulnerability Remediation Act” (H.R. 3710) would permit but not require the Cybersecurity and Infrastructure Security Agency (CISA) to “identify, develop, and disseminate actionable protocols to mitigate cybersecurity vulnerabilities, including in circumstances in which such vulnerabilities exist because software or hardware is no longer supported by a vendor.”

In June, the House took up and passed the “DHS Cyber Incident Response Teams Act of 2019” (H.R. 1158), as amended, by voice vote. H.R. 1158 would require the Cybersecurity and Infrastructure Security Agency’s (CISA) National Cybersecurity and Communications Integration Center (NCCIC) to “maintain cyber hunt and incident response teams” to provide, as appropriate and upon request, assistance “to asset owners and operators in restoring services following a cyber incident,” among other circumstances. NCCIC must “continually assess and evaluate the cyber incident response teams and their operations using robust metrics” and may “include cybersecurity specialists from the private sector on cyber hunt and incident response teams.” A related bill, the “DHS Cyber Hunt and Incident Response Teams Act of 2019” (S. 315), which would charge NCCIC and CISA with substantially the same missions, has been marked up and reported out of the Senate Homeland Security and Governmental Affairs Committee. The Senate Homeland Security Committee also marked up and reported out two other such bills:

  • The “National Cybersecurity Preparedness Consortium Act of 2019” (S. 333) would allow the Department of Homeland Security to “work with a consortium to support efforts to address cybersecurity risks and incidents.” Consortiums are defined to be “a group primarily composed of nonprofit entities, including academic institutions, that develop, update, and deliver cybersecurity training in support of homeland security.”
  • The “Federal Rotational Cyber Workforce Program Act of 2019” (S. 406), which would establish a program under which cybersecurity employees would rotate at federal agencies.

In July, the Senate Homeland Security Committee marked up and reported out the “State and Local Government Cybersecurity Act of 2019” (S. 1846) that would provide the Department of Homeland Security (DHS) the authority “[t]o make grants to and enter into cooperative agreements or contracts with States, local governments, and other non-Federal entities” and direct the National Cybersecurity and Communications Integration Center (NCCIC) to work with “with Federal and non-Federal entities, such as the Multi-State Information Sharing and Analysis Center” on addressing a variety of cybersecurity-related responsibilities.

Congress also has proposed measures targeted at small businesses. On July 15, the House took up and passed a pair of cybersecurity bills from the suspension calendar:

  • The “SBA Cyber Awareness Act” (H.R. 2331) would “require the Small Business Administration (SBA) to issue annual reports assessing its IT and cybersecurity infrastructure and notify Congress and affected parties of cyber incidents when they occur.”
  • The “Small Business Development Center Cyber Training Act of 2019” (H.R. 1649) would “help Small Business Development Centers (SBDCs) become better trained to assist small businesses with their cyber security and cyber strategy needs…[and] would establish a cyber counseling certification program in lead SBDCs to better assist small businesses with planning and implementing cybersecurity measures to defend against cyber attacks.”

Congress has also initiated legislation to better regulate the energy sector’s cybersecurity. On July 17, the House Energy and Commerce Committee marked up a quartet of energy sector cybersecurity bills:

  • The “Enhancing Grid Security through Public-Private Partnerships Act” (H.R. 359) “directs the Secretary of Energy, in consultation with States, other federal agencies, and industry stakeholders, to create and implement a program to enhance the physical and cyber security of electric utilities.”
  • The “Cyber Sense Act of 2019” (H.R. 360) would establish a “voluntary program [that] would identify cyber-secure products that could be used in the bulk-power system.”
  • The “Energy Emergency Leadership Act” (H.R. 362) would “create a new DOE Assistant Secretary position with jurisdiction over all energy emergency and security functions related to energy supply, infrastructure, and cybersecurity.”
  • The “Pipeline and LNG Facility Cybersecurity Preparedness Act” (H.R. 370) “would establish a program at DOE, in coordination with other Federal agencies, States, and the energy sector, to create policies and procedures to improve the physical and cyber security and resiliency of natural gas transmission and distribution pipelines, hazardous liquid pipelines, and liquefied natural gas (LNG) facilities.”

There are two bills regarding the Internet of Things that have been reported out of committee. On July 10, the Senate Commerce, Science, and Transportation Committee held a markup and reported out the “Developing Innovation and Growing the Internet of Things (DIGIT) Act” (S. 1611), sponsored by Senators Deb Fischer (R-NE), Cory Gardner (R-CO), Brian Schatz (D-HI), and Cory Booker (D-NJ). In her press release, Fischer explained the bill “would convene a working group of federal entities and experts from the private and academic sectors tasked with providing recommendations to Congress on how to facilitate the growth of connected Internet of Things (IoT) technologies.” She added that “[t]he group’s recommendations would focus on how to plan for, and encourage, the development and deployment of the IoT in the U.S…[and] directs the Federal Communications Commission (FCC) to complete a report assessing spectrum needs required to support the Internet of Things.” S. 1611 is substantially similar to legislation (S. 88) the Senate passed unanimously in the last Congress but the House never took up. It is not clear whether the same resistance exists in this House; however, unlike in the last Congress, a companion DIGIT Act has not yet been introduced in the House.

Earlier this year, two versions of the same IoT bill were marked up and reported out of committee. The Senate Homeland Security and Governmental Affairs Committee marked up and reported out the “Internet of Things Cybersecurity Improvement Act of 2019” (S. 734) a week after the House Oversight and Reform Committee acted on the “Internet of Things Cybersecurity Improvement Act of 2019” (H.R. 1668), adopting an amendment in the nature of a substitute that narrowed the bill’s scope. In general, these bills seek to leverage the federal government’s ability to set standards through its acquisition processes and, ideally, drive the development of more secure IoT across the U.S. Stakeholders are responding to the security risks presented by weak or nonexistent IoT security, as seen in a number of major malware attacks. The legislation would require NIST, OMB, and the Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) to work together to institute standards for IoT owned or controlled by most federal agencies. These standards would need to focus on secure development, identity management, patching, and configuration management and would be incorporated into the Federal Acquisition Regulation (FAR), making them part of the federal government’s approach to buying and utilizing IoT. Thereafter, federal agencies and contractors would need to use and buy IoT that meets the new security standards.

Finally, House Democrats have made rolling back the Federal Communications Commission’s (FCC) repeal of the Obama Administration’s Open Internet Order (aka net neutrality) a priority. On April 3, the House Energy and Commerce Committee marked up and reported out the “Save the Internet Act of 2019” (H.R. 1644), which would undo the FCC’s repeal of the Obama Administration’s 2015 net neutrality order and reclassify internet service providers (ISPs) as common carriers under Title II of the Communications Act. The House subsequently passed the bill by a 232-190 vote, but the Senate has not taken it up and likely will not.