For this week, let’s examine a House bill, the “Information Transparency & Personal Data Control Act” (H.R. 2013), which is sponsored by Suzan DelBene (D-WA) and cosponsored by 22 other House Democrats. DelBene worked in Washington state’s technology sector before transitioning to public service, including a stint at Microsoft. At present, this is not a bipartisan bill and consequently may be viewed as one of the House Democratic privacy bills released this Congress.
This bill’s profile was raised a bit last week when the New Democrat Coalition, “the largest ideological House caucus…more than forty percent of the Democratic Caucus” according to their website, formally endorsed H.R. 2013. The group says of itself: “[t]he New Democrat Coalition is made up of 104 forward-thinking Democrats who are committed to pro-economic growth, pro-innovation, and fiscally responsible policies.” In their press release, the New Democrat Coalition summarized H.R. 2013 as follows:
This bill will give people control over their most sensitive information and improve enforceability. This legislation requires the Federal Trade Commission (FTC) to mandate disclosure from companies on what information they are collecting and why, especially if it is being shared with another party.
The primary sponsor of the “Information Transparency & Personal Data Control Act,” Suzan DelBene, serves as the Vice Chair for Policy Coordination for the New Democrat Coalition.
And, while the New Democrat Coalition may be the largest single group among House Democrats, their endorsement does not necessarily mean H.R. 2013 will now become the party’s de facto bill. First, Speaker Nancy Pelosi (D-CA) has said she will oppose any bill that would weaken strong state laws like those in California under the soon-to-take-effect “California Consumer Privacy Act” (CCPA) (A.B. 375). This is a position shared by a number of Democrats in the House and Senate. H.R. 2013 is not nearly as stringent a bill as the CCPA even though it does not entirely preempt state laws, so for this bill to pass the House, either the bill itself would need to change or Members like Pelosi would need to soften their position. Also, 23 of the New Democrats are from California and would likely feel pressure from some California stakeholders, and quite possibly from the Speaker herself, to oppose any bill that would weaken the CCPA. Moreover, DelBene does not sit on House Energy and Commerce, the primary committee of jurisdiction, and it is more likely that any bill the House considers will be drafted by the Democrats on that committee, such as Chair Frank Pallone Jr. (D-NJ) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL).
However, let’s turn to the substance of H.R. 2013. Generally, this bill would require that all data “controllers” secure opt-in consent from consumers to collect, use, share, or sell their “sensitive personal information,” subject to significant exceptions. Controllers would need to draft and publish their data usage, security, and privacy plans, and then be audited annually by independent third parties. The FTC would implement and oversee this new regime, with state attorneys general being able to bring enforcement actions if the FTC does not act. Controllers who violate the new standards would be subject to enforcement, including fines in the first instance and injunctive and equitable remedies under the FTC Act.
In terms of who would be part of the new privacy regulation scheme, the bill sweeps fairly wide. A “controller” is defined as “a person that, on its own or jointly with other entities, determines the purposes and means of processing sensitive personal information.” The bill would explicitly pull “common carriers” (i.e. telecommunications companies) into the FTC’s jurisdiction. Common carriers are normally subject to the jurisdiction of the Federal Communications Commission with regard to privacy. However, because common carriers are explicitly named as being part of the FTC’s jurisdiction, that would suggest that other entities not usually under the agency’s jurisdiction would not be subject to this bill (e.g. non-profits). Would entities all over the world that qualify as controllers or processors be subject to the FTC’s enforcement powers the way U.S. firms are subject to the General Data Protection Regulation (GDPR)? It would seem so.
Also, the FTC would have jurisdiction over “processors” who are people “that process data on behalf of the controller,” meaning that data brokers may get swept into the new privacy protection regulatory regime. However, it is not immediately clear if a data broker would be considered a controller or a processor. And finally, unlike some proposed data security bills, there is no carve out for entities subject to and in compliance with existing federal data security and privacy regimes like HIPAA and Gramm-Leach-Bliley.
In terms of implementation, as in many other privacy bills, the FTC would be required to promulgate regulations within one year under the Administrative Procedure Act (i.e. notice-and-comment rulemaking) instead of the lengthier Magnuson-Moss procedures the agency usually must use. These regulations would put in place the requirements that controllers and processors of data would need to meet, including obtaining opt-in consent from consumers before their data could be collected and shared. As a general matter, consumers would need to opt into the use and sharing of their “sensitive personal information,” but they would need to opt out of such practices if they pertain to “non-sensitive personal information.” The dividing line between the two types of information would be crucial, and the bill provides broad categories of information that would qualify as “sensitive personal information.” The FTC will undoubtedly need to flesh out some of the categories of “sensitive personal information” such as “health information,” “genetic information,” “biometric information,” and other terms.
Likewise, the FTC will need to grapple with the term “information related to employment,” which is one of the categories of non-sensitive personal information controllers would not need opt-in consent to collect, share, and use. It is easy to see how this term may overlap with some categories of sensitive personal information such as health information, Social Security number, financial account information, genetic information, and/or biometric information, amongst others. This discussion of non-sensitive personal information also must mention another significant exception: “de-identified information (or the process of transforming personal data so that it is not directly relatable to an identified or identifiable consumer).” This provision seems to provide an incentive for controllers to de-identify sensitive personal information to the extent possible, so that it is protected in the event of unauthorized access or acquisition, but also so that it may be subject to the lesser requirements for handling and using non-sensitive personal information. Presumably encrypting sensitive personal information would result in it being de-identified, for properly encrypted data could not be traced back to an identified or identifiable consumer. The bill is not entirely clear, and the FTC may well see the need to fill this gap when it promulgates regulations to effectuate this provision if it is enacted.
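To make the de-identification concept concrete, here is a minimal sketch, in Python, of one common technique: pseudonymizing direct identifiers with a salted hash while leaving other fields untouched. The field names and the choice of identifiers are hypothetical, and whether a transformation like this would satisfy the bill’s definition of “de-identified information” would ultimately turn on the FTC’s regulations; this is illustrative only.

```python
import hashlib
import secrets

# Hypothetical set of fields treated as direct identifiers.
DIRECT_IDENTIFIERS = {"name", "email", "ssn"}

def pseudonymize(record: dict, salt: bytes) -> dict:
    """Replace direct identifiers with salted hash tokens (illustrative only).

    Without the salt, the tokens cannot be traced back to an
    "identified or identifiable consumer" -- loosely analogous to
    the encryption intuition discussed above.
    """
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            digest = hashlib.sha256(salt + str(value).encode()).hexdigest()
            out[field] = digest[:16]  # truncated token, not reversible without the salt
        else:
            out[field] = value  # non-identifying fields pass through
    return out

# The salt is kept separate from the data, much like an encryption key.
salt = secrets.token_bytes(16)
raw = {"name": "Jane Doe", "email": "jane@example.com", "zip": "98101"}
deidentified = pseudonymize(raw, salt)
assert deidentified["name"] != "Jane Doe"
assert deidentified["zip"] == "98101"
```

As with encryption, the protection here depends entirely on keeping the salt (or key) separate from the de-identified data set, which is exactly the sort of operational detail the FTC would likely need to address in rulemaking.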
The FTC would also be charged with enforcing the new regime, but state attorneys general would also be empowered to bring enforcement actions in certain situations. Notably, state attorneys general could bring actions in the event the FTC does not act regarding alleged violations. However, state attorneys general would not be able to seek the full range of remedies available to the FTC and would instead only be able “to obtain appropriate injunctive relief,” which may include temporary and permanent injunctions, disgorgement, restitution, rescission, and other such relief. But a recent Seventh Circuit case (see article below) may cause the sponsors to broaden this term to all equitable relief to ensure that all such remedies may be sought.
In addition to controllers needing to get consumers to opt-in for some types of data collection and sharing, they would also need to “[p]rovide users with an up-to-date, transparent privacy, security, and data use policy that meets general requirements” including being “concise and intelligible,” “clear and prominent in appearance,” and “uses clear and plain language.” This policy would also need to include the following, among other information:
- The “[i]dentity and contact information of the entity collecting the sensitive personal information.”
- [T]he purpose or use for collecting, storing, processing, selling, sharing, or otherwise using the sensitive personal information.
- Third parties with whom the sensitive personal information will be shared and for what purposes.
- How consent to collecting, storing, processing, selling, sharing, or otherwise using the sensitive personal information, including sharing with third parties, may be withdrawn.
- What kind of sensitive personal information is collected and shared.
- Whether the sensitive personal information will be used to create profiles about users and whether they will be integrated across platforms.
- How sensitive personal information is protected from unauthorized access or acquisition.
Presumably the failure of a controller to comply with its own privacy, security, and data use policy could result in the FTC or a state attorney general bringing an action for unfair or deceptive practices under the FTC Act.
The exceptions are significant, and how the FTC construes them in regulation could determine how stringent or permissive the new data privacy regime would be. Despite the seemingly robust opt-in and transparency requirements, there are notable exceptions to the general rule that consumers must opt in before controllers may collect and share their sensitive personal information, namely:
- Preventing or detecting fraud, identity theft, or criminal activity.
- The use of such information to identify errors that impair functionality or otherwise enhancing or maintaining the availability of the services or information systems of the controller for authorized access and use.
- Protecting the vital interests of the consumer or another natural person.
- Responding in good faith to valid legal process or providing information as otherwise required or authorized by law.
- Protecting the property, services, or information systems of the controller against unauthorized access or use.
- Advancing a substantial public interest, including archival purposes, scientific or historical research, and public health, if such processing does not create a significant risk of harm to consumers.
Yet, the most significant exception may be in section (b)(2), which I’ll quote in full: “[t]he [FTC] regulations promulgated pursuant to subsection (a) with respect to the requirement to provide opt-in consent shall not apply to the processing, storage, and collection of sensitive personal information or behavioral data in which such processing does not deviate from purposes consistent with a controller’s relationship with users as understood by the reasonable user.” Consequently, for the consumer using their Gmail account, any of Google’s processing of sensitive personal information may not be considered a deviation “from purposes consistent with a controller’s relationship with users as understood by the reasonable user.” The same may also apply to the current practices of Apple, Yahoo!, Microsoft, Amazon, etc. Not only would this represent a huge carve out to the requirement that consumers must opt in after receiving clear and easy-to-understand notice of what data is being collected, shared, and processed, with whom, and for what purposes, it would seem to advantage those controllers already operating in the marketplace, for they would not need to give consumers the choice of whether to opt in.
Controllers of sensitive personal data would need a “qualified, objective, independent third-party” to conduct an annual “privacy audit,” and then the controller would need to reveal publicly whether it is in compliance. There may be issues related to the incentive structure in that these third parties will be competing for the business of data controllers and may be inclined to slant their audits toward compliance for the sake of client management. Perhaps the bill would benefit from some of the measures enacted under Sarbanes-Oxley to weaken the incentives for auditors to water down their audits. Another issue may be that these audits do not need to be submitted to the FTC or state attorneys general until one of these regulatory officials makes known to the controller “allegations that a violation of this Act or any regulation issued under this Act has been committed by the controller.” From a compliance standpoint, submitting all audits to the FTC in the same way companies must submit financial information to the Securities and Exchange Commission (SEC) would allow the FTC to have a better sense of compliance with its regulations, flag early any industry-wide trends or problems, or, yes, take enforcement action against non-compliant controllers. Of course, such a system would be generally less attractive to data controllers. Finally, audits would not be necessary for small businesses, i.e. controllers with the sensitive personal information of fewer than 5,000 people, and no audits would be necessary for non-sensitive information.
In terms of preempting state laws like the CCPA, this bill takes a seeming middle path. H.R. 2013 would preempt state laws “to the degree the law is focused on the reduction of privacy risk through the regulation of the collection of sensitive personal information and the collection, storage, processing, sale, sharing with third parties, or other use of such information.” However, this preemption applies only to controllers subject to this bill. In what may prove important language, any controllers outside the scope of this bill would find themselves subject to state laws on privacy. Moreover, any state laws on processors would not be preempted by H.R. 2013, meaning entities like data brokers may still be subject to the CCPA, for example.
And, yet, this bill would seem to create some sunlight for states to add privacy and data security requirements above the federal floor created by this bill. To wit, the bill provides that “[a]ny private contract based on a State law that requires a party to provide additional or greater privacy for sensitive personal information or data security protections to an individual than this Act” would not be preempted. Therefore, a state could, in statute, make reference to H.R. 2013 as enacted and then require controllers and processors operating in that state to provide additional privacy or data security measures above and beyond those in FTC regulations.
The FTC would be directed to hire “50 new full-time employees to focus on privacy and data security, 15 of which shall have technology expertise,” and appropriations of $35 million would be authorized for the FTC “for issues related to privacy and data security.” Of course, appropriators would then have to actually appropriate these funds before the FTC ever saw an additional dollar. And, to contextualize this funding increase, the House’s FY 2020 bill that funds the FTC would provide the agency with $349.7 million, so the “Information Transparency & Personal Data Control Act” would increase the agency’s funding by roughly 10% above the House’s preferred FY 2020 funding level and by a slightly higher percentage compared to FY 2019 funding for the FTC.
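The roughly 10% figure can be checked with quick arithmetic, using the $35 million authorization and the $349.7 million House FY 2020 level cited above:

```python
# Figures from the bill and the House FY 2020 funding bill, in millions.
authorized_increase = 35.0
house_fy2020_level = 349.7

pct = 100 * authorized_increase / house_fy2020_level
print(f"{pct:.1f}%")  # → 10.0%
```

Since the FY 2019 appropriation for the FTC was lower than the House’s preferred FY 2020 level, the same $35 million would be a slightly larger percentage of the FY 2019 base.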
© Michael Kans and Michael Kans Blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans and Michael Kans Blog with appropriate and specific direction to the original content.