Oklahoma Committee Releases Strong Privacy Bill

Oklahoma legislators have unveiled a strong bill, but its chances of enactment are unclear.

Oklahoma has followed other states in beginning consideration of data privacy legislation in the absence of a comprehensive federal privacy statute. The “Oklahoma Computer Data Privacy Act” (HB 1602) is a strong data privacy bill, one that would put teeth into Oklahoma’s regulation of the collection, use, processing, sale, and disclosure of personal information. Thus far, the bill has only been reported out of the House of Representatives’ Technology Committee, and its prospects for enactment are unclear.

It is somewhat curious that deeply Republican Oklahoma would produce a data privacy and protection bill much stronger than those of deeply Democratic Virginia or Washington state. However, this apparent discrepancy may come down to the prevalence of big technology companies in those two states and their influence in Richmond and Olympia.

Be that as it may, the Oklahoma Computer Data Privacy Act would create a mostly opt-in regime, and businesses would have to garner the consent of Oklahoma residents to conduct many of the activities they do now. People would gain many of the same rights promised in other privacy bills regarding access to their personal information, and businesses would have many of the same exemptions to the obligations and responsibilities placed upon them that one may find in other privacy bills. However, this bill allows Oklahoma residents to sue for violations, a feature not always found in other privacy bills.

Starting, as usual, with definitions: “personal information” is broadly defined, for it would encompass a number of categories of information that could be linked to either a “particular consumer or household.” Specifically, the definition includes “information that identifies, relates to, describes, can be associated with or can reasonably be linked to, directly or indirectly, a particular consumer or household” including the following categories:

  • an identifier, including a real name, alias, mailing address, account name, date of birth, driver license number, unique identifier, Social Security number, passport number, signature, telephone number or other government-issued identification number, or other similar identifier,
  • an online identifier, including an electronic mail address or Internet Protocol address, or other similar identifier,
  • a physical characteristic or description, including a characteristic of a protected classification under state or federal law,
  • commercial information, including:
    • a record of personal property,
    • a good or service purchased, obtained or considered,
    • an insurance policy number, or
    • other purchasing or consuming histories or tendencies,
  • biometric information,
  • Internet or other electronic network activity information, including:
    • browsing or search history, and
    • other information regarding a consumer’s interaction with an Internet website, application or advertisement,
  • geolocation data,
  • audio, electronic, visual, thermal, olfactory or other similar information,
  • professional or employment-related information,
  • education information that is not publicly available personally identifiable information under the Family Educational Rights and Privacy Act of 1974,
  • financial information, including a financial institution account number, credit or debit card number, or password or access code associated with a credit or debit card or bank account,
  • medical information,
  • health insurance information, or
  • inferences drawn from any of the information listed under this paragraph to create a profile about a consumer that reflects the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities or aptitudes;

Moreover, the entity charged with drafting the regulations for and enforcing HB 1602, the Oklahoma Corporation Commission (OCC), may update this definition as technology and times change. Therefore, the bill’s definition of “personal information” is among the strongest encountered in either state or federal legislation.

HB 1602 defines what constitutes a “business purpose,” a term that is key throughout the bill, for many of the rights people would receive to know how their personal information is sold or shared pertain to business purposes. Consequently, the definition is worth quoting in full:

  • the following operational purposes of a business or service provider, provided that the use of the information is reasonably necessary and proportionate to achieve the operational purpose for which the information was collected or processed or another operational purpose that is compatible with the context in which the information was collected:
    • auditing related to a current interaction with a consumer and any concurrent transactions, including counting ad impressions to unique visitors, verifying the positioning and quality of ad impressions, and auditing compliance with a specification or other standards for ad impressions,
    • detecting a security incident, protecting against malicious, deceptive, fraudulent or illegal activity, and prosecuting those responsible for any illegal activity described by this division,
    • identifying and repairing or removing errors that impair the intended functionality of computer hardware or software,
    • using personal information in the short term or for a transient use, provided that the information is not:
      • disclosed to a third party, and
      • used to build a profile about a consumer or alter an individual consumer’s experience outside of a current interaction with the consumer, including the contextual customization of an advertisement displayed as part of the same interaction,
    • performing a service on behalf of the business or service provider, including:
      • maintaining or servicing an account, providing customer service, processing or fulfilling an order or transaction, verifying customer information, processing a payment, providing financing, providing advertising or marketing services, or providing analytic services, or
      • performing a service similar to a service described by subdivision (a) of this division on behalf of the business or service provider,
    • undertaking internal research for technological development and demonstration, or
    • undertaking an activity to:
      • verify or maintain the quality or safety of a service or device that is owned by, manufactured by, manufactured for or controlled by the business, or
      • improve, upgrade or enhance a service or device described by subdivision (a) of this division, or
  • another operational purpose for which notice is given under this act, but specifically excepting cross-context targeted advertising, unless the customer has opted in to the same;

HB 1602 uses the term “commercial purpose” for data collection, processing, use, sale, and disclosure, and it means:

…a purpose that is intended to result in a profit or other tangible benefit or the advancement of a person’s commercial or economic interests, such as by inducing another person to buy, rent, lease, subscribe to, provide or exchange products, goods, property, information or services or by enabling or effecting, directly or indirectly, a commercial transaction. 

However, “[t]he term does not include the purpose of engaging in speech recognized by state or federal courts as noncommercial speech, including political speech and journalism.” Hence, speech protected by the U.S. and Oklahoma Constitutions would be exempted from the term.

The definition of consent is strong and seeks to address consent gained through the use of so-called dark patterns: “an act that clearly and conspicuously communicates the individual’s authorization of an act or practice that is made in the absence of any mechanism in the user interface that has the purpose or substantial effect of obscuring, subverting or impairing decision-making or choice to obtain consent.”

Notably, the bill does not have explicit definitions of sell, share, or disclose. However, in the body of the bill, functional definitions of these terms are found. In Section 3, one finds the definition of what constitutes selling a person’s personal information, which occurs if a business

sells, rents, discloses, disseminates, makes available, transfers or otherwise communicates, orally, in writing, or by electronic or other means, the information to the other business or third party for monetary or other valuable consideration.

This is a fairly tight definition that encompasses much of the personal data market. One wonders if the requirement of monetary or other valuable consideration might be ripe for exploitation if a large company, say Twitter, is freely trading personal data and receiving only personal data in return. Does that qualify as valuable consideration? I would think so, but I expect this part of the definition to get tested.

Things get interesting with how the carve outs to selling personal information are phrased. For example, one such exemption is when a person “directs the business to intentionally disclose the information or uses the business to intentionally interact with a third party, provided that the third party does not sell the information, unless that disclosure is consistent with this act” (emphasis added). It would appear possible that a third party could sell a person’s personal information without consent so long as notice is provided. And yet, that reading appears to be directly at odds with other provisions. Moreover, elsewhere in the section, this provision is clarified through language noting a person must intentionally interact with the third party through affirmatively consenting. However, it appears one need merely consent to the intentional interaction and not the sale of personal information. This is either a significant, intentional loophole or drafting that needs to be tightened to conform to the rest of the bill.

Large companies doing business in Oklahoma are swept into the bill except for “internet service providers” acting in that capacity. To qualify (a short sketch of the test follows the list), a business must:

  • Do business in the state;
  • Collect personal information;
  • Control the processing of the collected personal information; and
  • Meet one or more of the following:
    • Have annual gross revenue of $10 million or more;
    • Buy, sell, receive, or share the personal information of 50,000 or more Oklahoma residents a year; or
    • Earn 25% of annual revenue from selling personal information.
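
Expressed as code, the test is conjunctive on the first three conditions and disjunctive on the size thresholds. Here is a minimal sketch of my own (the function and parameter names are illustrative, not statutory terms):

```python
# A minimal sketch (my own illustration, not text from HB 1602) of the
# bill's applicability test: all three baseline conditions must hold,
# plus at least one of the three size thresholds.

def covered_by_hb1602(does_business_in_ok: bool,
                      collects_personal_info: bool,
                      controls_processing: bool,
                      annual_gross_revenue: float,
                      ok_residents_data_traded_per_year: int,
                      share_of_revenue_from_selling_pi: float) -> bool:
    baseline = (does_business_in_ok
                and collects_personal_info
                and controls_processing)
    size_threshold = (annual_gross_revenue >= 10_000_000
                      or ok_residents_data_traded_per_year >= 50_000
                      or share_of_revenue_from_selling_pi >= 0.25)
    return baseline and size_threshold

# Example: a business in the state with $5 million in revenue is still
# covered if it buys or sells the personal information of 60,000 Oklahomans.
print(covered_by_hb1602(True, True, True, 5e6, 60_000, 0.10))  # True
```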

There are a number of carve outs for entities already subject to other privacy and data security regulations such as:

  • The “Health Insurance Portability and Accountability Act of 1996” (P.L. 104–191) (HIPAA) and the “Health Information Technology for Economic and Clinical Health Act” (HITECH Act) (P.L. 111-5)
  • “state health privacy laws”
  • The “Fair Credit Reporting Act” for the sale of personal information to credit reporting agencies to be used to generate a consumer report
  • The “Financial Services Modernization Act of 1999” (P.L. 106-102) (aka Gramm-Leach-Bliley)

Moreover, publicly available information is outside the scope of the bill, but this bill does not, as many other bills do, include information one posts on a social media platform in this category. “[D]e-identified or aggregate consumer information” is likewise outside the bill’s scope.

Additionally, the non-commercial activities of the media would be exempted; hence, any data collection, processing, or disclosing the media undertake for commercial purposes would be covered by HB 1602.

One also finds many of the same carve outs customary in privacy bills in the U.S., including compliance with federal, state, or local law or civil, criminal, and regulatory inquiries and investigations. Moreover, the preservation of and adherence to evidentiary privileges would allow an otherwise covered entity not to comply with HB 1602.

There is interesting language specifying that in the event HB 1602 and another state law conflict, whichever has the stronger provision prevails. And yet, the bill would preempt any county, city, town, or other municipal laws on privacy and data security in Oklahoma.

The OCC must promulgate regulations to implement, administer, and enforce the Oklahoma Computer Data Privacy Act on the following:

  • Procedures related to verifying requests
  • Opting in or opting out of the sale of one’s personal data
  • A universal opt-in button for people to consent to the sale of their personal information
  • Intelligible and easily understood notices and information

The OCC has discretion on whether to implement other regulations, including

  • Expanding the definition of personal information to keep it current and relevant
  • Revising the definition of identifier, which the bill defines as “data elements or other information that alone or in conjunction with other information can be used to identify a particular consumer, household or device that is linked to a particular consumer or household”
  • Updating the methods by which one may submit a request to exercise a right; and
  • Establishing exceptions for businesses to comply with federal or state law.

HB 1602 would allow research for non-commercial purposes using personal information collected from people in the course of their use of a business’ service or product, so long as the research is compatible with the business purpose that led to the initial collection. The bill details a number of restrictions, requirements, and limitations on research conducted under these auspices.

Among the rights residents of Oklahoma would gain under the bill, they could ask for and receive the categories and specific pieces of personal information a business has collected or amassed on them. This disclosure would need to include the categories of sources from which personal information was collected, the business or commercial purposes for collecting or selling personal information, and the categories of third parties to whom personal information is disclosed. Any information received must be in an electronic format that can be readily transmitted to another person.

Like most other bills, people could ask that businesses delete the personal information they have on them; however, there are a number of exceptions under which a business does not need to comply, such as needing to:

  • Complete a transaction;
  • Provide a good or service the person requested;
  • “Detect a security incident; protect against malicious, deceptive, fraudulent or illegal activity; or prosecute those responsible for any illegal activity described by this paragraph;”
  • “Identify and repair or remove errors from computer hardware or software that impair its intended functionality;” or
  • “Exercise free speech or ensure the right of another consumer to exercise the right of free speech or another right afforded by law.”

And yet, unlike other bills, Oklahoma residents would not receive the right to correct or amend incorrect personal information a business or third party may be holding. This seems like a strange omission given how consumer-friendly the bill otherwise is.

A business receiving requests must take steps to reasonably verify the identity of the requester, and if it does, it has 45 days to comply. If it cannot reasonably verify the identity, then it does not have to comply with the request. This could prove to be another choke point for people trying to exercise their rights, as has happened in California. Businesses can take an additional 45 days to comply with notice to the requester, and if the business decides not to honor the request, it must inform the person of the reasons why and of the right to appeal the decision. Businesses may not charge a fee unless requests are baseless, excessive, or repetitive.

The Oklahoma Computer Data Privacy Act seems to establish a regime under which businesses may sell one’s personal information only after the person has opted in. It is important to keep in mind that the expansive definition of what constitutes selling personal information seems intended to cover every conceivable transfer of personal information from a business to a third party. However, the bill would grandfather in current arrangements under which businesses are selling personal information to third parties, for the language requiring opt-in consent applies only to sales begun after enactment. Consequently, it appears the drafters are envisioning a bifurcated system of consent depending on when the sale of personal information began: if before the bill became law, a person needs to opt out; after enactment, it becomes an opt-in regime for all new selling of personal information. A person’s consent is also needed when a third party to whom a business has sold one’s personal information wants to sell the data.

Additionally, businesses must disclose in publicly available privacy policies that they collect, sell, or disclose one’s personal information for business purposes. These policies must include people’s rights, a list of categories of personal information collected, a separate list of categories of personal information sold, and another list of categories of personal information disclosed for a business purpose. There must also be a list of all the categories of sources from which the business acquires personal information, as well as a list of categories of third parties to whom it sells personal information.

Moreover, businesses must provide notice about each category of personal information they collect, the purposes for which each category will be used, and obtain consent before collection can occur. In the event a business is sold, and its data collection practices materially change, it must notify people and then obtain consent again.

Businesses and their service providers must implement security procedures, and the bill’s language is nearly verbatim what appears in almost every other bill:

A business or service provider shall implement and maintain reasonable security procedures and practices, including administrative, physical and technical safeguards appropriate to the nature of the information and the purposes for which the personal information will be used, to protect consumers’ personal information from unauthorized use, disclosure, access, destruction or modification, irrespective of whether a customer has opted in or out of a sale of data.

All agreements and contracts that would abridge or nullify the rights bestowed by the Oklahoma Computer Data Privacy Act would be null and void as they are made contrary to public policy under the bill.

Businesses using de-identified personal information could not re-identify such data without consent. Moreover, these businesses must have safeguards and processes in place to prevent re-identification and processes to stop unauthorized disclosures of de-identified personal information.

It would be illegal to discriminate against Oklahoma residents who exercise their rights under the bill by denying them goods or services, charging a different price, or providing a different level of quality. And yet, this prohibition may well be consumed by the exception allowing just such a thing to happen if the difference in price or quality is “reasonably related to the value provided to the consumer by the consumer’s data.” It is unclear how this determination will be made, and it seems likely this will be the fulcrum businesses use to get people to surrender their personal information. In the same vein, businesses can offer loyalty or rewards programs that allow the business to collect, sell, or disclose personal information in return for payment or other financial incentives. The bill bars the use of “financial incentive practices that are unjust, unreasonable, coercive or usurious in nature,” however. How these terms are defined will be crucial in determining what is permissible.

The OCC would enforce the bill and could seek monetary damages, injunctive relief, and reasonable attorney’s fees in cases against violators. There is a two-tiered penalty system, with businesses liable for up to $2,500 per violation and up to $7,500 per intentional violation. Moreover, the bill establishes a private right of action with the same two-tiered penalty system in addition to actual damages and injunctive relief. Businesses can be liable for the violations of third parties and service providers if they have actual knowledge of, or a reasonable belief that, a violation will occur. Nonetheless, a service provider cannot be held liable for a business’ violation.
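
For a rough sense of scale, the two tiers compound quickly across large numbers of affected people. A back-of-the-envelope sketch of my own follows (the bill does not settle how violations are counted, so treating each affected consumer as one violation is an assumption):

```python
# Back-of-the-envelope exposure under HB 1602's two penalty tiers.
# Assumption: each affected consumer counts as one violation; the bill
# leaves how violations are counted to enforcement and the courts.

PER_VIOLATION = 2_500              # up to $2,500 per violation
PER_INTENTIONAL_VIOLATION = 7_500  # up to $7,500 per intentional violation

def max_exposure(violations: int, intentional: bool) -> int:
    rate = PER_INTENTIONAL_VIOLATION if intentional else PER_VIOLATION
    return violations * rate

# Example: conduct touching 10,000 Oklahomans could mean up to $25 million
# in penalties, or $75 million if the conduct were intentional.
print(max_exposure(10_000, False))  # 25000000
print(max_exposure(10_000, True))   # 75000000
```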

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Debra Gauthier on Unsplash

Further Reading, Other Developments, and Coming Events (16 February 2021)

Further Reading

  • “India cuts internet around New Delhi as protesting farmers clash with police” By Esha Mitra and Julia Hollingsworth — CNN; “Twitter Temporarily Blocked Accounts Critical Of The Indian Government” By Pranav Dixit — BuzzFeed News. Prime Minister Narendra Modi’s government again shut down the internet as a way of managing unrest or discontent with government policies. The parties out of power have registered their opposition, but the majority seems intent on using this tactic time and again. One advocacy organization named India as the nation with the most shutdowns in 2019, by far. The government in New Delhi also pressed Twitter to take down tweets and accounts critical of the proposed changes in agricultural law. Twitter complied per its own policies and Indian law and then later restored the accounts and tweets.
  • “Lacking a Lifeline: How a federal effort to help low-income Americans pay their phone bills failed amid the pandemic” By Tony Romm — The Washington Post. An excellent overview of this Federal Communications Commission (FCC) program and its shortcomings. The Trump era FCC blunted and undid Obama era FCC reforms designed to make the eligibility of potential users easier to discern, among other changes. At the end of the day, many enrollees are left with a fixed number of minutes for phone calls and 4GB of data a month, or roughly what my daughter often uses in a day.
  • “She exposed tech’s impact on people of color. Now, she’s on Biden’s team.” By Emily Birnbaum — Protocol. The new Deputy Director for Science and Society in the Office of Science and Technology Policy (OSTP) is a former academic and researcher who often focused her studies on the intersection of race and technology, usually how the latter failed minorities. This is part of the Biden Administration’s fulfillment of its campaign pledges to establish a more inclusive White House. It remains to be seen how the administration will balance the views of those critical of big technology with those hailing from big technology as a number of former high ranking employees have already joined or are rumored to be joining the Biden team.
  • “Vaccine scheduling sites are terrible. Can a new plan help Chicago fix them?” By Issie Lapowsky — Protocol. As should not be shocking, many jurisdictions across the country have problematic interfaces for signing up for vaccination against COVID-19. It sounds reminiscent of the problems that plagued the Obamacare exchanges rollout in that potentially well thought out policy was marred by a barely thought out public face.
  • “Google launches News Showcase in Australia in sign of compromise over media code” By Josh Taylor — The Guardian; “Cracks in media code opposition as Microsoft outflanks Google and Facebook” By Lisa Visentin — The Sydney Morning Herald. Both Google and Canberra seem to be softening their positions as the company signed up a number of major media outlets for its News Showcase, a feature that will be made available in Australia that will compensate the news organizations at an undisclosed level. However, a few major players, Nine, News Corp., and the Australian Broadcasting Corporation, have not joined, with Nine saying it will not. Google’s de-escalation of rhetoric and tactics will likely allow Prime Minister Scott Morrison’s government to relax the proposed legislation that would mandate Google and Facebook compensate Australian news media (i.e., the News Media and Digital Platforms Mandatory Bargaining Code.) Microsoft’s theoretical entrance into the Australian market through Bing if Google and Facebook actually leave or limit their presence seems to be arguing against the latter two companies’ position that the new code is unworkable. It is not clear if Microsoft is acting earnestly or floating a possible scenario in order to cast the other companies in a bad light. In any event, critics of the platforms say the fight is not about the technical feasibility of compensating news media but rather about establishing a precedent of paying for content the platforms now get essentially for free. Other content creators and entities could start demanding payment, too. An interesting tidbit from the second article: Canada may soon join Australia and the European Union in enacting legislation requiring Big Tech to pay its media companies for using their content (i.e., “a more equitable digital regulatory framework across platforms and news media” according to a minister.)

Other Developments

  • The Maryland legislature overrode Governor Larry Hogan’s (R) veto, making the “Taxation – Tobacco Tax, Sales and Use Tax, and Digital Advertising Gross Revenues Tax” (HB0732) the first tax on digital advertising enacted in the United States. The act imposes a tax on digital advertising in the state and may be outside a federal bar on certain taxes on internet services. Now that the veto has been overridden, there will inevitably be challenges, and quite likely a push in Congress to enact a federal law preempting such digital taxes. Additionally, the primary sponsor of the legislation has introduced another bill barring companies from passing along the costs of the tax to Maryland businesses and consumers.
    • In a bill analysis, the legislature asserted about HB0732:
      • The bill imposes a tax on the annual gross revenues of a person derived from digital advertising services in the State. The bill provides for the filing of the tax returns and making tax payments. The part of the annual gross revenues of a person derived from digital advertising services in the State are to be determined using an apportionment fraction based on the annual gross revenues of a person derived from digital advertising services in the State and the annual gross revenues of a person derived from digital advertising services in the United States. The Comptroller must adopt regulations that determine the state from which revenues from digital advertising services are derived.
      • The digital advertising gross revenues tax is imposed at the following rates:
        • 2.5% of the assessable base for a person with global annual gross revenues of $100.0 million through $1.0 billion;
        • 5% of the assessable base for a person with global annual gross revenues of $1.0 billion through $5.0 billion;
        • 7.5% of the assessable base for a person with global annual gross revenues of $5.0 billion through $15.0 billion; and
        • 10% of the assessable base for a person with global annual gross revenues exceeding $15.0 billion.
    • In his analysis, Maryland’s Attorney General explained:
      • House Bill 732 would enact a new “digital advertising gross revenues tax.” The tax would be “imposed on annual gross revenues of a person derived from digital advertising services in the State.” Digital advertising services are defined in the bill to include “advertisement services on a digital interface, including advertisements in the form of banner advertising, search engine advertising, interstitial advertising, and other comparable advertising services.” The annual gross revenues derived from digital advertising services is set out in a formula in the bill.
      • Attorney General Brian Frosh conceded there will be legal challenges to the new Maryland tax: there are “three grounds on which there is some risk that a reviewing court would find that the tax is unconstitutional: (1) preemption under the federal Internet Tax Freedom Act; (2) the Commerce Clause; and, (3) the First Amendment.”
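
To make the rate schedule quoted above concrete, here is a minimal sketch of the computation as I read it (my own illustration: the overlapping “through” boundaries in the bill are resolved here with exclusive upper bounds, and the rate is applied to the whole Maryland assessable base rather than marginally):

```python
# A sketch of HB0732's tiered digital advertising tax. Assumptions: the
# rate is selected by global annual gross revenue, applied to the entire
# Maryland assessable base, and each tier's upper bound is exclusive.

def digital_ad_tax(global_revenue: float, assessable_base: float) -> float:
    if global_revenue < 100_000_000:
        return 0.0  # below $100 million, the tax does not apply
    if global_revenue < 1_000_000_000:
        rate = 0.025
    elif global_revenue < 5_000_000_000:
        rate = 0.05
    elif global_revenue < 15_000_000_000:
        rate = 0.075
    else:
        rate = 0.10
    return rate * assessable_base

# Example: $20 billion in global revenue and $50 million of Maryland-derived
# digital ad revenue would owe 10% of $50 million, or $5 million.
print(digital_ad_tax(20e9, 50e6))  # 5000000.0
```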
  • Democratic Members introduced the “Secure Data and Privacy for Contact Tracing Act” (H.R.778/S.199) in both the House and Senate, legislation that “would provide grants to states that choose to use technology as part of contact tracing efforts for COVID-19 if they agree to adopt strong privacy protections for users” per their press release. Representatives Jackie Speier (D-CA) and Debbie Dingell (D-MI) introduced the House bill and Senators Brian Schatz (D-HI) and Tammy Baldwin (D-WI) the Senate version. Speier, Dingell, Schatz, and Baldwin contended “[t]he Secure Data and Privacy for Contact Tracing Act provides grant funding for states to responsibly develop digital contact tracing technologies consistent with the following key privacy protections:
    • Digital contact tracing tech must be strictly voluntary and provide clear information on intended use.
    • Data requested must be minimized and proportionate to what is required to achieve contact tracing objectives.
    • Data must be deleted after contact tracing processing is complete, or at the end of the declaration of emergency.
    • States must develop a plan for how their digital contact tracing technology complements more traditional contact tracing efforts and describe efforts to ensure their technology will be interoperable with other states.
    • States must establish procedures for independent security assessments of digital contact tracing infrastructure and remediate vulnerabilities. 
    • Information gathered must be used strictly for public health functions authorized by the state and cannot be used for punitive measures, such as criminal prosecution or immigration enforcement.
    • Digital contact tracing tech must have robust detection capabilities consistent with CDC guidance on exposure. 
    • Digital contact tracing technology must ensure anonymity, allowing only authorized public health authorities or other authorized parties to have access to personally identifiable information.
  • The chair and ranking member of the Senate Intelligence Committee wrote the heads of the agencies leading the response to the Russian hack of the United States (U.S.) government and private sector entities through SolarWinds, taking them to task for their thus far cloistered, siloed approach. In an unusually blunt letter, Chair Mark Warner (D-VA) and Ranking Member Marco Rubio (R-FL) asked the agencies to name a leader of the response triggered when former President Donald Trump activated the system established in Presidential Policy Directive-41, because “[t]he federal government’s response so far has lacked the leadership and coordination warranted by a significant cyber event, and we have little confidence that we are on the shortest path to recovery.” Warner and Rubio directed this request to Director of National Intelligence Avril Haines, National Security Agency and Cyber Command head General Paul Nakasone, Federal Bureau of Investigation (FBI) Director Christopher Wray, and Cybersecurity and Infrastructure Security Agency (CISA) Acting Director Brandon Wales. Warner and Rubio further asserted:
    • The briefings we have received convey a disjointed and disorganized response to confronting the breach. Taking a federated rather than a unified approach means that critical tasks that are outside the central roles of your respective agencies are likely to fall through the cracks. The threat our country still faces from this incident needs clear leadership to develop and guide a unified strategy for recovery, in particular a leader who has the authority to coordinate the response, set priorities, and direct resources to where they are needed. The handling of this incident is too critical for us to continue operating the way we have been.
  • Huawei filed suit against the Federal Communications Commission’s (FCC) decision to “designate Huawei, as well as its parents, affiliates, and subsidiaries, as companies posing a national security threat to the integrity of our nation’s communications networks and the communications supply chain” through “In the Matter of Protecting Against National Security Threats to the Communications Supply Chain Through FCC Programs – Huawei Designation.” In the petition filed with the United States Court of Appeals for the Fifth Circuit, Huawei said it is “seek[ing] review of the Final Designation Order on the grounds that it exceeds the FCC’s statutory authority; violates federal law and the Constitution; is arbitrary, capricious, and an abuse of discretion, and not supported by substantial evidence, within the meaning of the Administrative Procedure Act, 5 U.S.C. § 701 et seq.; was adopted through a process that failed to provide Petitioners with the procedural protections afforded by the Constitution and the Administrative Procedure Act; and is otherwise contrary to law.”
  • According to unnamed sources, the Biden Administration has decided to postpone indefinitely the Trump Administration’s effort to force ByteDance to sell TikTok as required by a Trump Administration executive order. Last September, it appeared that Oracle and Walmart had reached a deal in principle with ByteDance that quickly raised more questions than it settled (see here for more details and analysis.) There are reports of ByteDance working with the Committee on Foreign Investment in the United States (CFIUS), the inter-agency review group that ordered ByteDance to spin off TikTok. TikTok and CFIUS are reportedly talking about what an acceptable divestment would look like, but of course, under recently implemented measures, the People’s Republic of China (PRC) would also have to sign off. Nonetheless, White House Press Secretary Jen Psaki remarked at a press conference “[t]here is a rigorous CFIUS process that is ongoing.”
  • The Biden Administration has asked two federal appeals courts to pause lawsuits brought to stop the United States (U.S.) government from enforcing the Trump Administration executive order banning TikTok from the United States (see here for more analysis.)
    • In the status report filed with the United States Court of Appeal for the District of Columbia, TikTok and the Department of Justice (DOJ) explained:
      • Defendants’ counsel informed Plaintiffs’ counsel regarding the following developments: As the Biden Administration has taken office, the Department of Commerce has begun a review of certain recently issued agency actions, including the Secretary’s prohibitions regarding the TikTok mobile application at issue in this case. In relation to those prohibitions, the Department plans to conduct an evaluation of the underlying record justifying those prohibitions. The government will then be better positioned to determine whether the national security threat described in the President’s August 6, 2020 Executive Order, and the regulatory purpose of protecting the security of Americans and their data, continue to warrant the identified prohibitions. The Department of Commerce remains committed to a robust defense of national security as well as ensuring the viability of our economy and preserving individual rights and data privacy.
    • In its unopposed motion, the DOJ asked the United States Court of Appeals for the Third Circuit to “hold this case in abeyance, with status reports due at 60-day intervals.” The DOJ used exactly the same language as in the filing in the D.C. Circuit.
  • The Trump Administration’s President’s Council of Advisors on Science and Technology (PCAST) issued a report at the tail end of the administration, “Industries of the Future Institutes: A New Model for American Science and Technology Leadership,” that “follows up on a recommendation from PCAST’s report, released June 30, 2020, involving the formation of a new type of multi-sector research and development organization: Industries of the Future Institutes (IotFIs)…[and] provides a framework to inform the design of IotFIs and thus should be used as preliminary guidance by funders and as a starting point for discussion among those considering participation.”
    • PCAST “propose[d] a revolutionary new paradigm for multi-sector collaboration—Industries of the Future Institutes (IotFIs)—to address some of the greatest societal challenges of our time and to ensure American science and technology (S&T) leadership for decades to come.” PCAST stated “[b]y driving research and development (R&D) at the intersection of two or more IotF areas, these Institutes not only will advance knowledge in the individual IotF topics, but they also will spur new research questions and domains of inquiry at their confluence.” PCAST added:
      • By engaging multiple disciplines and each sector of the U.S. R&D ecosystem—all within the same agile organizational framework—IotFIs will span the spectrum from discovery research to the development of new products and services at scale. Flexible intellectual property terms will incentivize participation of all sectors, and reduced administrative and regulatory burdens will optimize researcher time for creativity and productivity while maintaining appropriate safety, transparency, integrity, and accountability. IotFIs also will serve as a proving ground for new, creative approaches to organizational structure and function; broadening participation; workforce development; science, technology, engineering, and math education; and methods for engaging all sectors of the American research ecosystem. Ultimately, the fruits of IotFIs will sustain American global leadership in S&T, improve quality of life, and help ensure national and economic security for the future.
  • Per the European Commission’s (EC) request, the European Data Protection Board (EDPB) issued clarifications on the consistent application of the General Data Protection Regulation (GDPR) with a focus on health research. The EDPB explained:
    • The following response of the EDPB to the questions of the European Commission should be considered as a first attempt to take away some of the misunderstandings and misinterpretations as to the application of the GDPR to the domain of scientific health research. Generally speaking, most of these questions call for more time for in-depth analysis and/or a search for examples and best practices and can as yet not be completely answered.
    • In its guidelines (currently in preparation and due in 2021) on the processing of personal data for scientific research purposes, the EDPB will elaborate further on these issues while also aiming to provide a more comprehensive interpretation of the various provisions in the GDPR that are relevant for the processing of personal data for scientific research purposes.
    • This will also entail a clarification of the extent and scope of the ‘special derogatory regime’ for the processing of personal data for scientific research purposes in the GDPR. It is important that this regime is not perceived as to imply a general exemption to all requirements in the GDPR in case of processing data for scientific research purposes. It should be taken into account that this regime only aims to provide for exceptions to specific requirements in specific situations and that the use of such exceptions is made dependent on ‘additional safeguards’ (Article 89(1) GDPR) to be in place.
  • The Government Accountability Office (GAO) has assessed how well the Federal Communications Commission (FCC) has rolled out and implemented its Lifeline National Verifier (referred to as Verifier by the GAO) to aid low income people in accessing telecommunications benefits. The Verifier was established in 2016 to address claims that allowing telecommunications carriers to make eligibility determinations for participation in the program to help people obtain lower cost communications had led to waste, fraud, and abuse. House Energy and Commerce Committee Chair Frank Pallone Jr. (D-NJ), Communications and Technology Subcommittee Chair Mike Doyle (D-PA), and six Democratic colleagues on the committee asked the GAO “to review FCC’s implementation of the Verifier.” The GAO explained “[t]his report examines (1) the status of the Verifier; (2) the extent to which FCC coordinated with state and federal stakeholders, educated consumers, and facilitated involvement of tribal stakeholders; and (3) the extent to which the Verifier is meeting its goals.” The GAO concluded:
    • The Lifeline program is an important tool that helps low-income Americans afford vital voice and broadband services. In creating the Lifeline National Verifier, FCC sought to facilitate eligible Americans’ access to Lifeline support while protecting the program from waste, fraud, and abuse. Although USAC, under FCC’s oversight, has made progress to implement the Verifier, many eligible consumers are unaware of it and may be unable to use it. Additionally, tribal governments and organizations do not have the information they need from FCC to effectively assist residents of tribal lands in using the Verifier to enroll in Lifeline, even though Lifeline support is critical to increasing access to affordable telecommunications services on tribal lands. Without FCC developing a plan to educate consumers about the Verifier and empowering tribal governments to assist residents of tribal lands with the Verifier, eligible consumers, especially those on tribal lands, will continue to lack awareness of the Verifier and the ability to use it.
    • Further, without measures and information to assess progress toward some of its goals, FCC lacks information it needs to refine and improve the Verifier. While it is too soon to determine if the Verifier is protecting against fraud, FCC has measures in place to monitor fraud moving forward. However, FCC lacks measures to track the Verifier’s progress toward the intent of its second goal of delivering value to Lifeline consumers. FCC also lacks information to help it assess and improve its efforts to meet the third goal of improving the consumer experience. Additionally, consumers may experience challenges with the Verifier’s online application, such as difficulty identifying the Verifier as a government service, and may be uncomfortable providing sensitive information to a website that does not use a “.gov” domain. Unless FCC identifies and addresses challenges with the Verifier’s manual review process and its online application, it will be limited in its ability to improve the consumer experience. As a result, some eligible consumers may abandon their applications and go without the support they need to access crucial telecommunications services. Given that a majority of Lifeline subscribers live in states without state database connections and therefore must undergo manual review more frequently, ensuring that challenges with the manual review process are resolved is particularly important.
    • The GAO recommended:
      • The Chairman of FCC should develop and implement a plan to educate eligible consumers about the Lifeline program and Verifier requirements that aligns with key practices for consumer education planning. (Recommendation 1)
      • The Chairman of FCC should provide tribal organizations with targeted information and tools, such as access to the Verifier, that equip them to assist residents of tribal lands with their Verifier applications. (Recommendation 2)
      • The Chairman of FCC should identify and use performance measures to track the Verifier’s progress in delivering value to consumers. (Recommendation 3)
      • The Chairman of FCC should ensure that it has quality information on consumers’ experience with the Verifier’s manual review process, and should use that information to improve the consumer experience to meet the Verifier’s goals. (Recommendation 4)
      • The Chairman of FCC should ensure that the Verifier’s online application and support website align with characteristics for leading federal website design, including that they are accurate, clear, understandable, easy to use, and contain a mechanism for users to provide feedback. (Recommendation 5)
      • The Chairman of FCC should convert the Verifier’s online application, checklifeline.org, to a “.gov” domain. (Recommendation 6)

Coming Events

  • The House Appropriations Committee’s Financial Services and General Government Subcommittee will hold an oversight hearing on the Election Assistance Commission (EAC) on 16 February with EAC Chair Benjamin Hovland.
  • On 17 February, the House Energy and Commerce Committee’s Communications and Technology Subcommittee will hold a hearing titled “Connecting America: Broadband Solutions to Pandemic Problems” with these witnesses:
    • Free Press Action Vice President of Policy and General Counsel Matthew F. Wood
    • Topeka Public Schools Superintendent Dr. Tiffany Anderson
    • Communications Workers of America President Christopher M. Shelton
    • Wireless Infrastructure Association President and CEO Jonathan Adelstein
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 18 February, the House Financial Services Committee will hold a hearing titled “Game Stopped? Who Wins and Loses When Short Sellers, Social Media, and Retail Investors Collide” with Reddit Co-Founder and Chief Executive Officer Steve Huffman testifying along with other witnesses.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.


Photo by Zachary Peterson on Unsplash

States Other Than CA and VA Consider Privacy Bills

The North Dakota legislature spikes two tech bills: one that would have regulated app stores and another that would have barred sales of personal data without consent.

With the recent enactment by ballot of the “California Privacy Rights Act” (aka Proposition 24) and the pending enactment of the “Consumer Data Protection Act” in Virginia, other states are considering legislation to regulate privacy and other aspects of the technology industry’s businesses.

In North Dakota, an effort to take on Apple and Google’s 30% fees related to their application stores was defeated in the state Senate. Apparently, a lobbyist for Epic Games and the Coalition for App Fairness drafted the initial bill and convinced a state Senator to introduce the legislation and champion the issue. The Coalition for App Fairness, of which Epic Games is a member, “advocate[s] for enforcement and reforms, including legal and regulatory changes, to preserve consumer choice and a level playing field for app and game developers that rely on app stores and the most popular gatekeeper platforms” according to the press release announcing its establishment. And, of course, Epic Games is suing both Apple and Google in United States (U.S.) federal court over the 30% fee both companies take off the top of all in-app purchases, and so this legislative push in a state is likely one of the strategies app developers will be pursuing. It is almost certain that similar legislation will crop up in other legislatures and maybe even in Congress.

SB 2333 would bar companies like Apple and Google from requiring application developers to use their application stores exclusively. Hence, residents of North Dakota would be able to locate and download applications directly from developers or other non-Apple and non-Google sources. Apple, Google, and similarly situated companies could not mandate, as they presently do, that all in-application purchases be conducted through their payment platforms. This provision would deny the companies their stranglehold on payments, which allows them to extract a 30% fee from such purchases. It was this very point that started the Epic Games litigation, for the company began offering Apple and Google platform users of its popular game Fortnite the option of buying directly from Epic Games at a price 30% lower than the one offered through the application stores. Apple and Google responded by kicking Epic Games out of their stores, sparking the current litigation. In this vein, an application store could not retaliate against companies that opt to use a separate payment system or block an application developer for doing so. Developers could bring suit for violations, seeking to enjoin Apple and Google and requesting restitution, reasonable attorney’s fees, and other costs. SB 2333 would also bar any contract or agreement contrary to these provisions.

And yet, “special-purpose digital application distribution platforms” would be exempted from this bill. The definition provides examples of what these may be, including “a gaming console, music player, and other special-purpose devices connected to the internet.” These sound very much like Microsoft’s Xbox, Sony’s PlayStation, Apple’s iPods, and virtual reality headsets like those made by Facebook-owned Oculus. Moreover, any “digital application distribution platform for which cumulative gross receipts from sales on the digital application distribution platform to residents of this state” are less than $10 million a year would be exempted. So, the bill seems to target Apple and Google.

Not surprisingly, Apple and Google fought against this bill. Apple’s Chief Privacy Engineer Erik Neuenschwander testified before a Senate committee that “Senate Bill 2333 threatens to destroy iPhone as you know it.” For its part, a representative of the Coalition for App Fairness argued “SB 2333 will benefit consumers and app developers in North Dakota by limiting the ability of dominant platforms to impose onerous and anticompetitive restrictions on app developers.” Nonetheless, the state Senate rejected a weakened version of SB 2333 by an 11-36 vote, killing the legislation.

Last fall, a federal court denied Epic Games’ request for a preliminary injunction requiring Apple to put Fortnite back into the App Store. The judge assigned to the case had signaled this request would likely fail, as Epic Games’ request for a temporary restraining order had also been rejected. A May 2021 trial date has been set. The United States District Court for the Northern District of California summarized Epic’s motion:

In this motion for preliminary injunction, Epic Games asks the Court to force Apple to reinstate Fortnite to the Apple App Store, despite its acknowledged breach of its licensing agreements and operating guidelines, and to stop Apple from terminating its affiliates’ access to developer tools for other applications, including Unreal Engine, while Epic Games litigates its claims.

The court stated:

Epic Games bears the burden in asking for such extraordinary relief. Given the novelty and the magnitude of the issues, as well as the debate in both the academic community and society at large, the Court is unwilling to tilt the playing field in favor of one party or the other with an early ruling of likelihood of success on the merits. Epic Games has strong arguments regarding Apple’s exclusive distribution through the iOS App Store, and the in-app purchase (“IAP”) system through which Apple takes 30% of certain IAP payments. However, given the limited record, Epic Games has not sufficiently addressed Apple’s counter arguments. The equities, addressed in the temporary restraining order, remain the same.

The court held:

Apple and all persons in active concert or participation with Apple, are preliminarily enjoined from taking adverse action against the Epic Affiliates with respect to restricting, suspending or terminating the Epic Affiliates from Apple’s Developer Program, on the basis that Epic Games enabled IAP direct processing in Fortnite through means other than the Apple IAP system, or on the basis of the steps Epic Games took to do so. This preliminary injunction shall remain in effect during the pendency of this litigation unless the Epic Affiliates breach: (1) any of their governing agreements with Apple, or (2) the operative App Store guidelines. This preliminary injunction supersedes the prior temporary restraining order.

In its complaint, Epic Games is arguing that Apple’s practices violate federal and California antitrust and anti-competition laws. Epic Games argued:

  • This case concerns Apple’s use of a series of anti-competitive restraints and monopolistic practices in markets for (i) the distribution of software applications (“apps”) to users of mobile computing devices like smartphones and tablets, and (ii) the processing of consumers’ payments for digital content used within iOS mobile apps (“in-app content”).
  • Apple imposes unreasonable and unlawful restraints to completely monopolize both markets and prevent software developers from reaching the over one billion users of its mobile devices (e.g., iPhone and iPad) unless they go through a single store controlled by Apple, the App Store, where Apple exacts an oppressive 30% tax on the sale of every app. Apple also requires software developers who wish to sell digital in-app content to those consumers to use a single payment processing option offered by Apple, In-App Purchase, which likewise carries a 30% tax.
  • In contrast, software developers can make their products available to users of an Apple personal computer (e.g., Mac or MacBook) in an open market, through a variety of stores or even through direct downloads from a developer’s website, with a variety of payment options and competitive processing fees that average 3%, a full ten times lower than the exorbitant 30% fees Apple applies to its mobile device in-app purchases.
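
To put Epic Games’ fee comparison into concrete numbers, here is a small worked example (the price is hypothetical; the 30% commission and roughly 3% processing fee are the figures cited in the complaint above):

```python
# Developer proceeds on a $9.99 in-app purchase under the two fee models
# cited in Epic Games' complaint. The price itself is hypothetical.

PRICE = 9.99

app_store_proceeds = PRICE * (1 - 0.30)    # Apple keeps 30%
direct_sale_proceeds = PRICE * (1 - 0.03)  # typical card processing, ~3%

print(f"Via App Store:   ${app_store_proceeds:.2f}")   # $6.99
print(f"Via direct sale: ${direct_sale_proceeds:.2f}") # $9.69
```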

In its late August denial of Epic Games’ request for a temporary restraining order, the court decided the plaintiff does not necessarily have an antitrust case strong enough to succeed on the merits, has not demonstrated irreparable harm because the “current predicament appears to be of its own making,” would unjustifiably be enriched if Fortnite were reinstated to the App Store without having to pay 30% of in-app purchases to Apple, and does not present a public interest strong enough to overcome the expectation that private parties will honor their contracts or resolve disputes through normal means.

Another North Dakota technology bill appears to have died in committee. The North Dakota House Industry, Business and Labor Committee voted not to pass HB 1330, a bill that would ban the sale of one’s personal data without opt-in consent. The penalties for violating this proposed law are stiff and would depend entirely on private lawsuits, with class actions explicitly allowed. A company that violates this proscription would be liable for at least $10,000 and reasonable attorney’s fees, while companies that knowingly violate the law would face at least $100,000 in damages, reasonable attorney’s fees, and punitive damages. As mentioned, the bill explicitly states class actions may be filed, which would likely result in massive cases arguing for millions of dollars in damages if a multinational were to knowingly violate this bill.

HB 1330 provides simple parameters for how entities may sell personal data:

A covered entity may not sell a user’s protected data to another person unless the user opts-in to allow the sale. To opt-in, the covered entity shall provide the user with the opportunity to affirmatively click or select approval of the sale. The user must be given the opportunity to opt-in to the sale of each type of protected data by individual selection. Protected data collected and sold by the covered entity must be described clearly in plain language to the user.
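
To illustrate how a per-category opt-in of this sort might be implemented, here is a minimal sketch in Python. The category names, the `ConsentRecord` structure, and the guard function are all hypothetical; HB 1330 mandates only the behavior (affirmative, per-type approval), not any particular implementation.

```python
from dataclasses import dataclass, field

# Hypothetical categories drawn from HB 1330's list of protected data.
PROTECTED_DATA_TYPES = {"location", "internet_browsing_history", "shopping_habits"}

@dataclass
class ConsentRecord:
    """Tracks which types of protected data a user has affirmatively approved for sale."""
    user_id: str
    approved_types: set = field(default_factory=set)

    def opt_in(self, data_type: str) -> None:
        # The user must affirmatively click or select approval for EACH type;
        # a single blanket checkbox would not satisfy the bill's text.
        if data_type not in PROTECTED_DATA_TYPES:
            raise ValueError(f"unknown protected data type: {data_type}")
        self.approved_types.add(data_type)

def may_sell(record: ConsentRecord, data_type: str) -> bool:
    """A covered entity may sell a type of protected data only after opt-in."""
    return data_type in record.approved_types

record = ConsentRecord(user_id="u-123")
record.opt_in("location")
assert may_sell(record, "location")                       # opted in by individual selection
assert not may_sell(record, "internet_browsing_history")  # no consent, no sale
```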

Given the bill does not include a definition of sell or sale, it is unclear if trading personal data or some other exchange short of money changing hands would qualify. If it does not, this considerable loophole would leave companies like Facebook and Google free to amass massive troves of data, process them, and then sell targeted advertising or other services based on the value of their data and profiles.

The definition of “personal data” is fairly expansive:

a user’s location; screen name; website address; interests; hometown; professional history; friends or followers; shopping habits; test scores; health conditions, insurance, or interests; internet browsing history; purchases or purchase history; the number of friends or followers of the user; socioeconomic status; religious affiliation; alcohol, tobacco, or drug usage; gambling habits; banking relationships; residence details; children’s information or household information; credit; banking and insurance policies; media usage; and relationship status.

And yet, some notable omissions include sexual orientation and political beliefs. Arguably, those types of information could be considered part of one’s “interests,” “internet browsing history,” or “household information.” If a covered entity decided to make the case that such information falls outside the definition of “personal data,” then the collecting and selling of these data could continue without consent.

It bears noting that this bill does not give residents of North Dakota any control over whether their data may be collected, processed, shared, or used; it merely bars the sale of certain data without opt-in consent.

Still, for whatever flaws this bill has, it uses an opt-in model, whereas one currently enacted state privacy law and a pending privacy law do not. The California Privacy Rights Act (CPRA) would continue the right California residents currently enjoy under the “California Consumer Privacy Act” (CCPA) (AB 375) to opt out of the sale of their personal data (see here for more analysis). Likewise, Virginia’s “Consumer Data Protection Act” (SB 1392/HB 2307) allows for opting out of the sale of personal data (see here for more analysis).



Don’t Look Now; Second State After CA On The Verge Of Enacting Privacy Legislation…and It Isn’t WA

Acting on bills introduced in January, the VA House and Senate have passed identical privacy bills.

There are times where it seems there are far too many technology policy developments to stay on top of, and that is just in the United States (U.S.). And while I have written at some length about the Washington legislature making yet another run at enacting privacy legislation for the third straight year, I apparently should have been paying attention to the state of my residence, Virginia. Last month, the legislature started working on privacy bills, and over the last week or so, both chambers have passed bills with identical text, meaning enactment is all but assured. And so, as this will be only the second comprehensive privacy regime passed by a state, and with no sign that 2021 is the year Congress and the White House agree on federal legislation, this may be the most significant development on privacy this year.

In mid-January, the “Consumer Data Protection Act” (SB 1392/HB 2307) was introduced and quickly made its way through both chambers of the Virginia legislature. In the last week, identical bills were passed by the Senate and the House of Delegates with only the formality remaining of reconciling the two bills before it is sent to the Governor. If it is enacted, as seems very likely, the bill becomes effective on 1 January 2023, giving entities covered by the bill just shy of two full years to prepare.

Big picture, this bill is one of the weaker privacy bills within sight of enactment. It would permit many of the same data collection and processing activities currently occurring in Virginia to continue largely in the same fashion in 2023. The bill uses the opt-out consent model, but only in limited circumstances, for so long as entities disclose how they propose to process personal information, there are only limited cases in which people could opt out. There is no private right of action, and the attorney general would have to give entities a 30-day window to cure any potential violations and would be barred from proceeding if his office receives an express written statement that the violations have been cured. Given how much weaker this bill is than others, it is little wonder it is sailing through the Virginia legislature.

Those entities subject to the act are:

  • An entity that controls or processes the personal data of 100,000 or more residents; or
  • An entity that controls or processes the personal data of 25,000 or more residents and earned more than 50% of its gross revenue in the previous year from selling personal data
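
To make this applicability test concrete, here is a minimal sketch under the assumption that an entity knows how many Virginia residents’ personal data it processes and what share of its gross revenue came from selling personal data; the function name and inputs are illustrative, not anything the bill prescribes.

```python
def covered_by_cdpa(va_residents_processed: int,
                    share_revenue_from_data_sales: float) -> bool:
    """Sketch of the Consumer Data Protection Act's two applicability prongs."""
    # Prong 1: controls or processes personal data of 100,000 or more residents.
    if va_residents_processed >= 100_000:
        return True
    # Prong 2: 25,000 or more residents AND more than half of gross revenue
    # in the previous year from selling personal data.
    return (va_residents_processed >= 25_000
            and share_revenue_from_data_sales > 0.5)

print(covered_by_cdpa(120_000, 0.0))  # True: volume prong
print(covered_by_cdpa(30_000, 0.6))   # True: data-broker prong
print(covered_by_cdpa(30_000, 0.2))   # False: below both thresholds
```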

However, the bill has the carveouts characteristic of a number of privacy bills introduced over the last few years, including exemptions for entities covered by the following federal privacy regimes, among others:

  • Health Insurance Portability and Accountability Act of 1996 (HIPAA)/Health Information Technology for Economic and Clinical Health (HITECH) Act
  • Financial Services Modernization Act of 1999 (aka Gramm-Leach-Bliley)
  • Fair Credit Reporting Act (FCRA)
  • Family Educational Rights and Privacy Act (FERPA)
  • Children’s Online Privacy Protection Act (COPPA)

A key difference between this bill and others with similar language is that an entity merely needs to be covered by one of these laws and not necessarily compliant. Most other privacy bills require compliance with these and other federal regimes in order to be exempted.

The Consumer Data Protection Act uses the same terminology as the European Union’s (EU) General Data Protection Regulation (GDPR) regarding entities that determine how personal data will be processed and those that do the processing: controllers and processors respectively.

A number of definitions are crucial in the bill. Personal data excludes publicly available data and de-identified data, the latter of which creates a safe harbor incentive for entities to de-identify the personal data they collect, maintain, and process, since many of the new obligations entities face under this bill pertain to personal data. The definition of personal data is fairly broad, as it includes “any information that is linked or reasonably linkable to an identified or identifiable natural person.” There is a subset of these data subject to more stringent protection: “sensitive data,” which includes:

  • Personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status;
  • The processing of genetic or biometric data for the purpose of uniquely identifying a natural person;
  • The personal data collected from a known child; or
  • Precise geolocation data.

The definition of “Sale of personal data” may be so narrow that some common practices in the data world would fall outside it, and this matters because people are given the right to opt out of the sale of their personal data, not necessarily the sharing of it. Companies like Facebook have gone before Congress and stated they do not sell the personal data of their users, and this seems to be accurate. Instead, they trade and share personal data, activities which would seem to fall outside the definition in this bill, which involves “the exchange of personal data for monetary consideration by the controller to a third party.” Had it been just “consideration,” then activities like Facebook’s would have been subject to the limitation people can use to opt out. On the other hand, a fair reading of monetary consideration would seem to be cash or its equivalent, and it is arguable whether a controller trading personal data with a third party qualifies. This may get sorted out by a Virginia court.

There are the by-now-expected exceptions to the strictures against collecting and processing data without the consent of residents, some of which controllers and processors may stretch out of all recognition.

The Consumer Data Protection Act would create the same sorts of rights for people that other privacy bills would. And as with almost all the other privacy bills, a person could submit a request to a controller that must be responded to within 45 days, which is not to say that action must occur within that timeframe. If the request is complex or there is some other reason why 45 days is not enough, the controller may alert the person and then take another 45 days. If the controller denies the request, the person may use the appeal system each controller must have, and if they are still denied, they may file a complaint with the state attorney general’s office.
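
As a rough illustration of how these deadlines stack, consider the sketch below; the 45-day windows come from the bill, while the function and the dates are invented for illustration.

```python
from datetime import date, timedelta

RESPONSE_WINDOW = timedelta(days=45)

def response_deadline(received: date, extended: bool = False) -> date:
    """Deadline for a controller to respond to a consumer request.

    Controllers get 45 days and may take one additional 45-day extension
    for complex requests, provided they alert the person.
    """
    deadline = received + RESPONSE_WINDOW
    if extended:
        deadline += RESPONSE_WINDOW
    return deadline

print(response_deadline(date(2023, 1, 15)))                 # 2023-03-01
print(response_deadline(date(2023, 1, 15), extended=True))  # 2023-04-15
```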

Among the rights people would get vis-à-vis controllers under the Consumer Data Protection Act are:

  • Requesting whether a controller is processing their personal data, and if so, obtaining access to such personal data
  • Correcting inaccuracies in personal data, depending on the nature of the information and the purposes of the processing, suggesting that for lower-stakes processing and information of lesser importance, controllers may be free to deny such requests
  • Asking that personal data be deleted
  • Receiving one’s data in portable format
  • Opting out of processing:
    • for the purpose of targeted advertising
    • the sale of personal data; and
    • “profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer”

Taking the last right first, it appears people could not opt out of most processing of their personal data. There are some other circumstances under which people in Virginia would be able to opt out, but these are limited. Consequently, it appears the default would be that controllers are able to collect and process within certain limits, to be discussed below. The rights to delete, correct, and port are fairly standard.

Controllers must “[l]imit the collection of personal data to what is adequate, relevant, and reasonably necessary in relation to the purposes for which such data is processed, as disclosed to the consumer.” Moreover, it is made clear that processing without consent is permissible so long as it is reasonably necessary or compatible “with the disclosed purposes for which such personal data is processed.” Processing for purposes beyond those reasonably necessary or compatible is permitted, but only with the consent of the person. And so, there will be fights about the purposes exempted from the consent requirement, as controllers will almost certainly seek to push the boundaries of what is “reasonably necessary” or “compatible.” Of course, a controller may also write a disclosure notifying people of very broad processing of personal data, and so people would be on notice about this processing.

The Consumer Data Protection Act uses boilerplate language about security requirements. Controllers must “[e]stablish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data…[and] [s]uch data security practices shall be appropriate to the volume and nature of the personal data at issue.” The implicit sliding scale in this formulation is elegant in style, but how difficult will it be for controllers, processors, the attorney general, people, and courts to determine where the lines lie for certain classes of information?

The bill bars processing of personal data in violation of federal and state anti-discrimination laws. Controllers cannot retaliate against people who exercise the rights established by the act, with some important caveats. This provision states that nothing “prohibit[s] a controller from offering a different price, rate, level, quality, or selection of goods or services to a consumer, including offering goods or services for no fee, if the consumer has exercised his right to opt out pursuant to § 59.1-573” (i.e., opting out of targeted advertising, the sale of one’s personal data, or profiling to make decisions with legal effects). Hence, exercising the opt-out right could get costly, as controllers would be free to offer different tiers of services or products. There is also a carveout for loyalty and rewards programs. And yet, sensitive data may not be processed without consent.

There is a provision nullifying contractual language ostensibly forcing people to forgo any of the rights bestowed by the bill.

Controllers must provide privacy policies that identify the categories of personal data being processed and the purposes of the processing, inform people how they can exercise their rights, and name the categories of personal data shared with third parties and the categories of third parties with whom personal data are shared. Controllers who process for targeted advertising or sell data must make these facts conspicuous in their privacy policies. There is no language on the complexity, comprehensibility or length of such policies. Given the dense and impenetrable privacy policies currently available to people, it stands to reason that this will continue to be the norm in Virginia.

Processors are bound to follow the direction of the controllers that share personal data with them, and this and other obligations must be set down in a contract between the parties. Processors will also need to help controllers in a number of ways, including helping them respond to requests and assisting them in the event of a data breach. Processors will be required to assist controllers that perform audits. Moreover, processors must return personal data to the controller or delete it upon request and will have a duty of confidentiality.

For certain classes of processing, controllers will need to conduct data protection assessments:

  • Selling data
  • Targeted advertising
  • Profiling but only if there are “reasonably foreseeable risks” of
    • unfair or deceptive treatment of, or unlawful disparate impact on, consumers; (ii) financial, physical, or reputational injury to consumers; (iii) a physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers, where such intrusion would be offensive to a reasonable person; or (iv) other substantial injury to consumers;
  • Sensitive data; and
  • “Any processing activities involving personal data that present a heightened risk of harm to consumers”

Controllers must conduct these assessments according to a number of factors and considerations:

Data protection assessments…shall identify and weigh the benefits that may flow, directly and indirectly, from the processing to the controller, the consumer, other stakeholders, and the public against the potential risks to the rights of the consumer associated with such processing, as mitigated by safeguards that can be employed by the controller to reduce such risks. The use of de-identified data and the reasonable expectations of consumers, as well as the context of the processing and the relationship between the controller and the consumer whose personal data will be processed, shall be factored into this assessment by the controller.

The Attorney General may request and receive these data protection assessments in the course of an investigation, but they must be kept confidential and would not be subject to a freedom of information request.

Regarding de-identified data, controllers holding this type of data must commit to not re-identifying it and make reasonable efforts to ensure these data cannot be associated with people. Additionally, if a controller holds personal data in pseudonymous form with “any information necessary to identify the consumer” being held safely and securely separate from the pseudonymous data, then the controller does not need to respond to a number of consumer requests.

Naturally, this privacy bill contains a long list of exceptions, including compliance with federal and state law and court orders and warrants. Many of these are fairly standard, but there are some that may lend themselves to creative, expansive interpretations by controllers and processors looking to get out of complying with the act such as:

  • Prevent, detect, protect against, or respond to security incidents, identity theft, fraud, harassment, malicious or deceptive activities, or any illegal activity; preserve the integrity or security of systems; or investigate, report, or prosecute those responsible for any such action;
  • Conduct internal research to develop, improve, or repair products, services, or technology;
  • Effectuate a product recall;
  • Identify and repair technical errors that impair existing or intended functionality; or
  • Perform internal operations that are reasonably aligned with the expectations of the consumer or reasonably anticipated based on the consumer’s existing relationship with the controller or are otherwise compatible with processing data in furtherance of the provision of a product or service specifically requested by a consumer or the performance of a contract to which the consumer is a party.

The Virginia Attorney General would be able to enforce the act. However, before bringing an action, the Attorney General must “provide a controller or processor 30 days’ written notice identifying the specific provisions of this chapter the Attorney General, on behalf of a consumer, alleges have been or are being violated.” And amazingly, if the controller or processor provides “an express written statement that the alleged violations have been cured and that no further violations shall occur,” the Attorney General cannot bring an action for statutory damages unless there are further violations. In this case, the Attorney General could seek $7500 per violation.

There was a private right of action in the House’s version of the bill last year. It would have utilized the right of action currently available under the Virginia Consumer Protection Act, which would have been available to residents in the event the act was violated.



Further Reading, Other Developments, and Coming Events (1 February 2021)

Further Reading

  • “Facebook and Apple Are Beefing Over the Future of the Internet” By Gilad Edelman — WIRED. The battle over coming changes to Apple’s iOS continues to escalate. Apple CEO Tim Cook said the changes, which will move iPhone users to an opt-in system for tracking people across the internet, would help protect both privacy and democracy. This latter claim is a shot at Facebook and its role in the rise of extremist groups in the United States and elsewhere. Facebook CEO Mark Zuckerberg claimed this change was of a piece with Apple’s long-term interest in driving the app market from a free to a paid model that would benefit the Cupertino giant through its 30% fees on all in-app purchases. Zuckerberg also reiterated Facebook’s argument that such a change by Apple will harm small businesses that will have a harder time advertising. Facebook is also making noise about suing Apple in the same way Epic Games has for its allegedly anti-competitive app store practices. Experts expect Apple’s change will take as much as 10% off of Facebook’s bottom line until it and other advertising players adjust their tactics. These will not be the last shots fired between the two tech giants.
  • “Democratic Congress Prepares to Take On Big Tech” By Cecilia Kang — The New York Times. Senator Amy Klobuchar (D-MN) is vowing to introduce antitrust legislation this spring that could rein in big technology companies in the future. Klobuchar’s proposal will receive serious consideration because she now chairs the Senate Judiciary Committee’s subcommittee with jurisdiction over antitrust and competition policy. Klobuchar also plans to release a book this spring with her views on antitrust. Any proposal to reform antitrust law faces a steep uphill battle to 60 votes in the Senate.
  • “Pressure builds on Biden, Democrats to revive net neutrality rules” By Tony Romm — The Washington Post. Until the Federal Communications Commission (FCC) has a third Democratic vote, pressure from the left will focus on whom the Biden Administration chooses to nominate. Once a Democratic majority is in place, the pressure to re-promulgate the Obama Administration’s net neutrality order will be substantial.
  • “Why Google’s Internet-Beaming Balloons Ran Out of Air” By Aaron Mak — Slate. The reasons Alphabet pulled the plug on Loon, its attempt to provide internet service in areas without it, include the costs, the lack of revenue since areas without service tend to be poorer, the price barriers to people getting 4G devices, and resistance or indifference from governments and regulators.
  • “A big hurdle for older Americans trying to get vaccinated: Using the internet” By Rebecca Heilweil — recode. Not surprisingly, the digital divide and basic digital literacy are barriers to the elderly, especially the poorer and minority segments of that demographic, in securing online appointments for COVID-19 vaccination.

Other Developments

  • A group of House and Senate Democrats have reintroduced the “Public Health Emergency Privacy Act,” a bill that follows legislation of the same title introduced last spring to address gaps in United States (U.S.) privacy law turned up by the promise of widespread use of COVID-19 tracking apps. And while adoption and usage of these apps have largely underperformed expectations, the gaps and issues have not. And so, Representatives Suzan DelBene (D-WA), Anna Eshoo (D-CA), and Jan Schakowsky (D-IL) and Senators Richard Blumenthal (D-CT) and Mark Warner (D-VA) have introduced the “Public Health Emergency Privacy Act” (S.81) but have not made the bill text available, so it is not possible at this point to determine how closely it matches last year’s bill, the “Public Health Emergency Privacy Act” (S.3749/H.R.6866) (see here for my analysis of last year’s bill). However, in a sign that the bills may be identical or very close in their wording, the summary provided in May 2020 and the one provided last week are exactly the same:
    • Ensure that data collected for public health is strictly limited for use in public health;
    • Explicitly prohibit the use of health data for discriminatory, unrelated, or intrusive purposes, including commercial advertising, e-commerce, or efforts to gate access to employment, finance, insurance, housing, or education opportunities;
    • Prevent the potential misuse of health data by government agencies with no role in public health;
    • Require meaningful data security and data integrity protections – including data minimization and accuracy – and mandate deletion by tech firms after the public health emergency;
    • Protect voting rights by prohibiting conditioning the right to vote based on a medical condition or use of contact tracing apps;
    • Require regular reports on the impact of digital collection tools on civil rights;
    • Give the public control over their participation in these efforts by mandating meaningful transparency and requiring opt-in consent; and
    • Provide for robust private and public enforcement, with rulemaking from an expert agency while recognizing the continuing role of states in legislation and enforcement.
  • The United States Department of Justice (DOJ) filed charges against a United States (U.S.) national for “conspiring with others in advance of the 2016 U.S. Presidential Election to use various social media platforms to disseminate misinformation designed to deprive individuals of their constitutional right to vote.” While the DOJ goes out of its way in its complaint not to mention which candidate in the presidential election the accused was working to elect, contemporaneous reporting on the individual made clear he supported Donald Trump and sought to depress the vote for former Secretary of State Hillary Clinton. In its press release, the DOJ asserted:
    • The complaint alleges that in 2016, Mackey established an audience on Twitter with approximately 58,000 followers. A February 2016 analysis by the MIT Media Lab ranked Mackey as the 107th most important influencer of the then-upcoming Election, ranking his account above outlets and individuals such as NBC News (#114), Stephen Colbert (#119) and Newt Gingrich (#141).
    • As alleged in the complaint, between September 2016 and November 2016, in the lead up to the Nov. 8, 2016, U.S. Presidential Election, Mackey conspired with others to use social media platforms, including Twitter, to disseminate fraudulent messages designed to encourage supporters of one of the presidential candidates (the “Candidate”) to “vote” via text message or social media, a legally invalid method of voting.
    • For example, on Nov. 1, 2016, Mackey allegedly tweeted an image that featured an African American woman standing in front of an “African Americans for [the Candidate]” sign.  The image included the following text: “Avoid the Line. Vote from Home. Text ‘[Candidate’s first name]’ to 59925[.] Vote for [the Candidate] and be a part of history.”  The fine print at the bottom of the image stated: “Must be 18 or older to vote. One vote per person. Must be a legal citizen of the United States. Voting by text not available in Guam, Puerto Rico, Alaska or Hawaii. Paid for by [Candidate] for President 2016.”
    • The tweet included the typed hashtags “#Go [Candidate]” and another slogan frequently used by the Candidate. On or about and before Election Day 2016, at least 4,900 unique telephone numbers texted “[Candidate’s first name]” or some derivative to the 59925 text number, which was used in multiple deceptive campaign images tweeted by the defendant and his co-conspirators.
  • Six European and two North American nations worked in coordinated fashion to take down a botnet. Europol announced that “[l]aw enforcement and judicial authorities worldwide have this week disrupted one of [the] most significant botnets of the past decade: EMOTET…[and] [i]nvestigators have now taken control of its infrastructure in an international coordinated action” per their press release. Europol added:
    • EMOTET has been one of the most professional and long lasting cybercrime services out there. First discovered as a banking Trojan in 2014, the malware evolved into the go-to solution for cybercriminals over the years. The EMOTET infrastructure essentially acted as a primary door opener for computer systems on a global scale. Once this unauthorised access was established, these were sold to other top-level criminal groups to deploy further illicit activities such data theft and extortion through ransomware.
  • On 26 January, Senator Ed Markey (D-MA) “asked Facebook why it continues to recommend political groups to users despite committing to stopping the practice” at an October 2020 hearing. Markey pressed CEO Mark Zuckerberg to “explain the apparent discrepancy between its promises to stop recommending political groups and what it has delivered.” Markey added:
    • Unfortunately, it appears that Facebook has failed to keep commitments on this topic that you made to me, other members of Congress, and your users. You and other senior Facebook officials have committed, and reiterated your commitment, to stop your platform’s practice of recommending political groups. First, on October 28, 2020, you appeared before the U.S. Senate Committee on Commerce, Science, and Transportation and stated that Facebook had stopped recommending groups with political content and social issues. When I raised concerns about Facebook’s system of recommending groups, you stated, “Senator, we have taken the step of stopping recommendations in groups for all political content or social issue groups as a precaution for this.”
    • It does not appear, however, that Facebook has kept these commitments. According to The Markup, Facebook “continued to recommend political groups to its users throughout December[of 2020]” — well after you responded to my question at the Commerce Committee hearing.
    • On 27 January, Zuckerberg announced on an earnings call that the platform would stop recommending political and civic groups to users.
  • The United States (U.S.) Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) “announced the expansion of the Automated Vehicle Transparency and Engagement for Safe Testing (AV TEST) Initiative from a pilot to a full program” according to a press release. NHTSA announced the “new web pilot of the Department initiative to improve the safety and testing transparency of automated driving systems” in June 2020 that “aligns with the Department’s leadership on automated driving system vehicles, including AV 4.0:  Ensuring American Leadership in Automated Vehicle Technologies.”
  • The United Kingdom’s (UK) House of Lords amended the government’s Trade Bill, which would allow for a trade agreement with the United States (U.S.), in a way that would block the U.S. position of essentially exporting 47 USC 230 (Section 230) to the UK. The Lords agreed to this language:
    • (1)The United Kingdom may only become a signatory to an international trade agreement if the conditions in subsection (2) are satisfied.
    • (2) International trade agreements must be consistent with—
      • (a) other international treaties to which the United Kingdom is a party, and the domestic law of England and Wales (including any changes to the law after the trade agreement is signed), regarding the protection of children and other vulnerable user groups using the internet;
      • (b) the provisions on data protection for children, as set out in the age appropriate design code under section 123 of the Data Protection Act 2018 (age-appropriate design code) and other provisions of that Act which impact children; and
      • (c) online protections provided for children in the United Kingdom that the Secretary of State considers necessary.
    • However, the House of Commons disagreed with this change, arguing “it is not an effective means of ensuring the protection of children online.”
    • In a House of Lords briefing document, it is explained:
      • The bill introduces measures to support the UK in implementing an independent trade policy, having left the European Union. It would:
        • enable the UK to implement obligations arising from acceding to the international Agreement on Government Procurement in its own right;
        • enable the UK to implement in domestic law obligations arising under international trade agreements the UK signs with countries that had an existing international trade agreement with the EU;
        • formally establish a new Trade Remedies Authority;
        • enable HM Revenue and Customs (HMRC) to collect information on the number of exporters in the UK; and
        • enable data sharing between HMRC and other private and public sector bodies to fulfil public functions relating to trade.
  • According to their press release, “a coalition of education advocates petitioned the Federal Communications Commission (FCC) to close the remote learning gap for the estimated 15 to 16 million students who lack home internet access” through the E-rate program. This petition follows an Executive Order (EO) signed by President Joe Biden on the first day of his Administration, calling on the FCC to expand broadband connectivity for children across the United States to help them with schooling and studies.
    • In their petition, the groups argued:
      • In one of his first Executive Orders, President Biden stated: “The Federal Communications Commission is encouraged, consistent with applicable law, to increase connectivity options for students lacking reliable home broadband, so that they can continue to learn if their schools are operating remotely.”
      • Consistent with [Biden’s EO], the Commission can dramatically improve circumstances for these underserved students, and for schools all over the country that are struggling to educate all of their students, by taking the temporary, limited measures requested in this Petition.
      • As shown below, these actions are well within the Commission’s authority, and in fact all of the actions requested in this Petition could be taken by the Wireline Competition Bureau on delegated authority.
      • As noted above, the Petitioners ask that the Commission issue a declaratory ruling to clarify that, for the duration of the pandemic, the off-campus use of E-rate-supported services to enable remote learning constitutes an “educational purpose” and is therefore allowed under program rules.
      • The declaratory ruling will allow schools and libraries to extend E -rate-funded broadband networks and services outside of a school or library location during Funding Years 2020 and 2021, without losing E-rate funds they are otherwise eligible to receive. Importantly, this requested action would not require the collection of any additional Universal Service funds.
      • Given the severity of our current national emergency, the Petitioners ask that the Bureau release hundreds of millions of dollars—currently not designated for use but held in the E-rate program—to support remote learning. There is little justification for keeping E-rate funds in reserve when the country is facing such an enormous educational crisis.
      • The Commission should use the program’s existing discount methodologies, which take into account socioeconomic status and rural location, in calculating the amount of funding that applicants may receive.  Applicants will have the incentive to make cost-effective purchases because they will have to pay a share of the total cost of services.  
      • To facilitate the distribution of additional funding, Petitioners ask that the Commission direct the Universal Service Administrative Company (USAC) to establish a “remote learning application window” as soon as practicable for the specific purpose of allowing applicants to submit initial or revised requests for E-rate funding for off-campus services used for educational purposes during Funding Years 2020 and 2021.  
      • The Petitioners ask the Commission to waive all rules necessary to effectuate these actions for remote learning funding applications, including the competitive bidding, eligible services, and application rules, pursuant to section 1.3 of the Commission’s rules.
      • The Petitioners respectfully request expedited review of this petition, so that schools and libraries may take action to deploy solutions as soon as possible.
  • “A group of more than 70 organizations have sent a letter to Congress and the Biden/Harris administration warning against responding to the violence in the U.S. Capitol by renewing injudicious attacks on Section 230 of the Communications Decency Act” per their press release. They further urged “lawmakers to consider impacts on marginalized communities before making changes to Section 230, and call on lawmakers to take meaningful action to hold Big Tech companies accountable, including enforcement of existing anti-trust and civil rights law, and passing Federal data privacy legislation.” The signatories characterized themselves as “racial justice, LGBTQ+, Muslim, prison justice, sex worker, free expression, immigration, HIV advocacy, child protection, gender justice, digital rights, consumer, and global human rights organizations.” In terms of the substance of their argument, they asserted:
    • Gutting Section 230 would make it more difficult for web platforms to combat the type of dangerous rhetoric that led to the attack on the Capitol. And certain carve outs to the law could threaten human rights and silence movements for social and racial justice that are needed now more than ever. 
    • Section 230 is a foundational law for free expression and human rights when it comes to digital speech. It makes it possible for websites and online forums to host the opinions, photos, videos, memes, and creativity of ordinary people, rather than just content that is backed by corporations. 
    • The danger posed by uncareful changes to Section 230 is not theoretical. The last major change to the law, the passage of SESTA/FOSTA in 2018, put lives in danger. The impacts of this law were immediate and destructive, limiting the accounts of sex workers and making it more difficult to find and help those who were being trafficked online. This was widely seen as a disaster that made vulnerable communities less safe and led to widespread removal of speech online.
    • We share lawmakers’ concerns with the growing power of Big Tech companies and their unwillingness to address the harm their products are causing. Google and Facebook are just some of the many companies that compromise the privacy and safety of the public by harvesting our data for their own corporate gain, and allowing advertisers, racists and conspiracy theorists to use that data to target us. These surveillance-based business models are pervasive and an attack on human rights. But claims that Section 230 immunizes tech companies that break the law, or disincentivizes them from removing illegal or policy-violating content, are false. In fact, Amazon has invoked Section 230 to defend itself against a lawsuit over its decision to drop Parler from Amazon Web Services due to unchecked threats of violence on Parler’s platform. Additionally, because Section 230 protects platforms’ decisions to remove objectionable content, the law played a role in enabling the removal of Donald Trump from platforms, who could act without fear of excessive litigation.

Coming Events

  • On 3 February, the Senate Commerce, Science, and Transportation Committee will consider the nomination of Rhode Island Governor Gina Raimondo to be the Secretary of Commerce.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.



Rival Privacy Bill Introduced In Washington House

A more expansive alternative to the Washington Senate’s privacy bill is introduced.

A member of the Washington state House of Representatives has unveiled her privacy bill, which may serve as the template for the House’s ultimate position on a number of privacy issues. If so, once again the House and Senate will be at odds over significant provisions, for the Senate’s bill largely tracks with the bills that chamber has previously passed. Therefore it is possible that privacy legislation will fail to reach the Governor for the third year in a row.

Representative Shelley Kloba (D) and her cosponsors introduced the “People’s Privacy Act” (HB 1433); Kloba is the Vice Chair of the Innovation, Technology, and Economic Development Committee, one of the committees privacy legislation will need to move through. The American Civil Liberties Union of Washington described its role in the drafting of the bill and its supporters:

The People’s Privacy Act was created by the ACLU of Washington with input and support from the Tech Equity Coalition, a group of civil liberties and civil rights-focused organizations and individuals working to hold technology accountable and lift the voices of historically marginalized communities in decisions about technology and its use by government and corporate interests. The bill provides a strong people-focused alternative to SB 5062 (Sen. Reuven Carlyle, D – LD 36), which allows companies to override people’s decisions about if and how their information is collected, used, and shared. Unlike SB 5062, the People’s Privacy Act is enforceable through a private right of action, meaning that individuals would be able to sue covered entities that violate the Act.

The People’s Privacy Act is among the strongest privacy bills introduced in the United States (U.S.). It requires opt-in consent for the collection and processing of personal data, allows consumers to sue for up to $10,000 in damages per violation, and allows the state attorney general to sue for damages of $25,000 per violation or 4% of annual revenue, whichever figure is higher. It bars the use of facial recognition technology and artificial intelligence-enabled profiling. And, because it is such a strong bill from a privacy perspective, it will almost certainly not get enacted as currently written.

Before we turn to the details of HB 1433, a brief recap of the bill moving through the Senate. Last September, Washington State Senator Reuven Carlyle (D-Seattle) released a discussion draft that tracked fairly closely to the last bill the Senate passed before the legislature adjourned (see here for analysis). Big picture, the bill still uses the concepts of data controllers and processors most famously enshrined in the European Union’s (EU) General Data Protection Regulation (GDPR). Like other privacy bills, generally, people in Washington State would not need to consent before an entity could collect and process their information. People would be able to opt out of some activities, but most data collection and processing could still occur as it presently does. Earlier this month, Carlyle and cosponsors revised and then introduced their bill, the Washington Privacy Act (SB 5062), which tracks closely with the two bills produced by the Washington Senate and House last year that lawmakers could not ultimately reconcile (see here for analysis). However, there are no provisions on facial recognition technology, which was largely responsible for sinking a privacy bill in Washington State two years ago. The sponsors have also taken the unusual step of appending language covering the collection and processing of personal data to combat infectious diseases like COVID-19.

The Washington State Senate Environment, Energy & Technology Committee held a mid-January hearing on SB 5062 and made available a number of materials, including an overview of SB 5062 and a comparison between it and the “California Privacy Rights Act.” A week later, the committee marked up the bill, adopted a substitute version, and then sent the bill to another committee. In the bill report, staff summarized public testimony on the bill into three groups:

  • PRO: Consumers only have rights that are granted to them by businesses. The bill provides new rights and gives consumers more control over the handling of their data. By providing a regulatory framework for the processing of data, consumers are provided data protections, businesses may advance services and operate with increased predictability, and public confidence and trust will be fostered. Contact tracing provisions are needed to build public confidence in using tools to help stop the spread of COVID-19.
  • CON: This bill does not provide meaningful consumer protection regulations. People need to be able to bring a private right of action, which this bill explicitly prohibits, in order to protect their privacy rights and hold businesses accountable. This approach protects businesses rather than consumers by providing several exemptions. Financial information should be included. This bill fails to protect sensitive data shared by children in schools. The bill should include protections for teenagers. Contact tracing provisions should be addressed in a separate bill. An opt-in framework provides better protections than the opt-out provisions of the bill. Major platforms are carved out of the bill. Local jurisdictions should be able to enact stronger privacy laws.
  • OTHER: This bill reflects all of the hard work that has gone into this issue over several years and represents a compromise amongst various stakeholders. We are concerned that the definition of targeted advertising is confusing. We recommend a couple of measures that will help consumers exercise their rights such as recognizing global opt out mechanisms and authorizing delegated authority. With regards to enforcement, we have concerns with the cure period. This bill provides tools needed for enforcement. Compliance is burdensome; nonprofits should be exempt from these requirements just as they are in California. We have concerns that the provisions regarding loyalty programs might invalidate some partnerships.

From the outset, in the legislative findings section of the People’s Privacy Act, it is clear the sponsors are viewing privacy and possible harms much more widely than other stakeholders, for among the reasons listed for the legislation are:

  • Privacy violations and misuse of personal information in the digital age can lead to a range of harms, including discrimination in employment, health care, housing, access to credit, and other areas; unfair price discrimination; domestic violence; abuse; stalking; harassment; entrapment; and financial, emotional, and reputational harms.
  • Privacy harms disproportionately affect low-income people and people of color.
  • Privacy violations not only threaten the fundamental rights and privileges of Washingtonians, but they also menace the foundation and supporting institutions of a free democratic state.

As is to be expected, many of the definitions are written broadly. In what is perhaps some wordsmithing, a new term is used extensively in the bill to describe the information covered entities are collecting and processing. This bill deems that information “captured personal information,” a phrase that embodies the view of the sponsors on the data practices of businesses. The bill defines this term as:

personal information about a Washington resident that is captured in an interaction in which a covered entity directly or indirectly makes available information, products, or services to an individual or household. Covered interactions include but are not limited to posting of information, offering of a product or service, the placement of targeted advertisements, or offering a membership or other ongoing relationship with an entity. For the purposes of this chapter, “captured personal information” includes biometric information, regardless of how captured.

“Personal information” is also defined as

any information that directly or indirectly identifies, relates to, describes, is capable of being associated with, or could reasonably be linked to a particular individual, household, or device. Information is reasonably linkable to an individual, household, or device if it can be used on its own or in combination with other information to identify an individual, household, or device.

The definition of covered entity (CE) is among the broadest yet encountered:

  • Those entities which have earned $10 million or more in 300 or more transactions in the previous year; or
  • An entity maintaining or processing the captured personal information (CPI) of 1,000 or more Washington state residents

Clearly, this would encompass almost every business in the state of Washington save for the very smallest or those that do not maintain or process CPI.

Also of note, the definition of harm is very wide, which matters because suits will be brought on the basis of harm. Therefore, it is worth quoting in full despite its length. Harm is defined as:

  • potential or realized adverse consequences to an individual or to society, including but not limited to:
    • Direct or indirect financial harm;
    • Physical harm or threats to individuals or property, including but not limited to bias-related crimes and threats, harassment, and sexual harassment;
    • Discrimination in products, services, or economic opportunity, such as housing, employment, credit, insurance, education, or health care, on the basis of an individual or class of individuals’ actual or perceived age, race, national origin, sex, sexual orientation, gender identity, disability, and/or membership in another protected class, except as specifically authorized by law;
    • Interference with or surveillance of First Amendment protected activities by state actors, except as specifically authorized by law;
    • Interference with the right to vote or with free and fair elections;
    • Violation of individuals’ rights to due process or equal protection under the law;
    • Loss of individual control over captured personal information via nonconsensual sharing of private information, data breach, or other actions that violate the rights listed in section 4 of this act;
    • The nonconsensual capture of information or communications within an individual’s home or where an individual is entitled to have a reasonable expectation of privacy or access control; and
    • Other effects on an individual that may not be reasonably foreseeable to, contemplated by, or expected by the individual to whom the captured personal information relates, that are nevertheless reasonably foreseeable, contemplated by, or expected by the covered entity that alter or limit that individual’s choices or predetermines results.

Moreover, there is room for this list of types of harm to grow, for the first clause in the definition makes clear the list is not exhaustive. How these harms would be determined is an unanswered question. Perhaps the sponsors see fit to leave this to Washington state courts.

The sponsors include an incentive to push CEs to encrypt CPI. The definition of “process” includes this clause: “a person or entity that operates on CPI that is encrypted or otherwise in a format that makes it not accessible or susceptible to being made accessible to such person or entity in any comprehensible form shall not be deemed to be processing such CPI.” This exception to the definition of process will allow CEs to sidestep some of their responsibilities if they encrypt CPI.
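
As a purely illustrative sketch of that incentive, the snippet below uses the widely available `cryptography` package to show how CPI held only as ciphertext is incomprehensible without the key; the bill names no particular cipher, library, or key-management scheme.

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # held separately from the stored data
f = Fernet(key)

# A party holding only this ciphertext has CPI in a form that is not
# "accessible ... in any comprehensible form," which under the bill's
# definition would mean it is not "processing" that CPI.
ciphertext = f.encrypt(b"captured personal information")
print(ciphertext[:20])  # opaque token, useless without the key

# Only a party holding the key can recover the underlying CPI.
print(f.decrypt(ciphertext))  # b'captured personal information'
```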

The residents of Washington state would have the following rights:

  • The right to access the personal information a CE is holding, including both the categories of information and the specific information
  • The right to correct inaccurate or incomplete information
  • The right to obtain personal information in a machine readable format
  • The right to refuse the processing of CPI above and beyond the primary transaction
  • The right to demand and obtain the deletion of CPI except if the CE is required by law to hold the data or another exception applies.
  • The right not to be subjected to surreptitious surveillance

CEs generally have 30 days to respond to verifiable requests, with some exceptions, of course. CEs may ask for more information if they are unable to verify the requester’s identity or there is reasonable doubt about it. With respect to requests to correct inaccurate or incomplete information, CEs must provide reasonable means to do so.

CEs will have to provide short-form and long-form privacy policies; the bill is oddly silent on the latter class of policies but details extensively what the former must contain. A short-form privacy policy must be presented to a person when they first happen upon a website or open an app, or, for in-person interactions, at the point of sale. These privacy policies must be understandable to the least sophisticated person, be no more than 500 words, be clear and concise, be written in plain English, and lack unrelated, confusing, or contradictory language, among other requirements. In this policy, a person must be informed of the CPI being processed, the purpose of the processing, how long the CPI is retained, whether CPI is being monetized, the third parties with whom the CPI is being shared, and all methods used to collect CPI. The Washington Department of Commerce must develop and release a standard short-form privacy policy and a button or logo for websites and apps so people will know where to look for the privacy policy.
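
A CE could sanity-check a draft short-form policy against the bill’s quantifiable requirements with something like the sketch below. The 500-word ceiling and the list of required disclosures come from the bill as described above; the checklist representation and the function itself are hypothetical simplifications (a real review would still have to judge clarity and plain English by hand).

```python
MAX_WORDS = 500

# Hypothetical checklist of the disclosures the bill requires in a short-form policy.
REQUIRED_DISCLOSURES = {
    "cpi_processed", "purpose_of_processing", "retention_period",
    "monetization", "third_party_sharing", "collection_methods",
}

def check_short_form_policy(text: str, disclosures_made: set) -> list:
    """Return a list of problems with a draft short-form privacy policy."""
    problems = []
    if len(text.split()) > MAX_WORDS:
        problems.append(f"policy exceeds {MAX_WORDS} words")
    for item in sorted(REQUIRED_DISCLOSURES - disclosures_made):
        problems.append(f"missing required disclosure: {item}")
    return problems

draft = "We collect your location to serve ads..."  # stand-in policy text
print(check_short_form_policy(draft, {"purpose_of_processing", "retention_period"}))
# ['missing required disclosure: collection_methods', ...]
```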

HB 1433 is perhaps strongest on the issue of consent. Nothing less than knowing, affirmative, unambiguous consent works, and there is even language making clear that merely visiting a website or opening an app would not suffice to function as consent. Additionally, people in Washington must opt in to processing, whereas most bills have opt-out as the default. CEs must make it easily understood, in a prominent fashion, that people may decline consent and may withdraw consent at any time after granting it. If a person decides against opting in, the CE has the responsibility to delete any data that may have been collected. Moreover, in the event a person declines to consent, CEs can collect only the data necessary to determine whether a person will consent and nothing more.

Under the People’s Privacy Act, 13 is the age at which a person may utilize and enjoy all the rights in the bill. Anyone under this age would be covered by the “Children’s Online Privacy Protection Act of 1998” (COPPA) (P.L. 105-277).

The People’s Privacy Act largely bars different pricing or diminished services or products for those who decline to let their CPI be processed. There is, however, language that would allow for loyalty and rewards programs with the sizeable caveat that any CPI collected and processed must be used only for those programs. The Washington Department of Commerce must study and report on the most effective ways of getting knowing, unambiguous consent, and it must also promulgate regulations specifying how:

  • CEs must notify individuals of their rights under this chapter and obtain individuals’ freely given, specific, informed, and unambiguous opt-in consent for each use model of captured personal information processing; and
  • CEs must notify individuals of their right to withdraw their consent at any time and how the right may be exercised.

There must also be regulations “grouping different types of processing of captured personal information by use model and permitting a covered entity to simultaneously obtain freely given, specific, informed, and unambiguous opt-in consent from an individual for multiple transactions of the same use model.”

HB 1433 borrows from the field of tort law to set some of the security standards CEs will need to meet. Some CEs must meet the reasonable standard of care in their field with respect to storing, using, and transmitting CPI. The Washington Department of Commerce is given the authority to promulgate, if it wishes, reasonable standards that would preempt any lower standards.

CEs are only permitted to disclose CPI to third parties under contracts that bind them to the same security and privacy obligations. CEs must oversee such third parties and conduct audits at least annually to ensure compliance with the contract. Likewise, CEs must enter into contracts with data processors before disclosing CPI that limit the latter’s processing to the purposes for which consent was originally granted. If a CE knows that another entity to which it disclosed information is violating HB 1433, then the CE must limit access to the data and press the other party to delete the data. Additionally, CEs must obtain consent from people before they may process personal information obtained from a third party. Finally, CEs may not remotely turn on or access a camera or microphone without opt-in consent, which will only be effective for 90 days.

Biometric information would be subject to a higher level of protection that would bind CEs and also Washington governmental entities (WGEs). For example, these entities would not be able to hold biometric information for more than a year after the last time a person interacted with the entity. Likewise, consent for the processing of biometric information is good for only one year, after which it automatically expires, requiring the entity to permanently delete this information. Written notice must be provided to a person and consent obtained before this class of personal information may be processed.

There are strong protections against the use of CPI to discriminate in a number of ways. For example, CPI cannot be used in discriminatory ways in targeted advertising for a range of services, including “employment, finance, health care, credit, insurance, housing, or education opportunities.” CPI also cannot be used to discriminate in public accommodations and sales. There is a bar on the use of facial recognition technology or artificial intelligence-enabled profiling by CEs and WGEs.

Invariably, privacy bills contain exceptions to the requirement that consent must be granted before certain information can be processed. The People’s Privacy Act is no different, except that the exceptions are narrower and fewer in number. And so, consent is not necessary if a CE or WGE is responding to an emergency posing danger to a person. However, the entity will need to contemporaneously document the rationale for skipping consent and then notify the person after the fact. There are exceptions for warrants, subpoenas, and federal and state requirements. Finally, the processing of de-identified information does not require the consent of a person. Notably, there are no exceptions for security or to improve products or services like there are in most bills.

Unlike the Washington Senate’s bill, HB 1433 contains a private right of action with a rebuttable presumption of harm if there is a violation of the act. People can seek $10,000 in damages per violation or actual damages, whichever is higher, and any other relief the court thinks appropriate. The state attorney general is also empowered to bring enforcement actions and seek up to $25,000 per violation or 4% of the previous year’s revenue, whichever is higher. The attorney general may also seek restitution and any other equitable relief a court sees fit to provide. Finally, in a new wrinkle for privacy bills, city and county prosecutors may bring actions in the same way the attorney general may, which would provide a backstop if there is a lack of enforcement at the state level. It will be interesting to see if this provision migrates to other privacy bills. Interestingly, the bill states the statute of limitations starts running when a violation or injury is discovered, not within an arbitrary, defined timeframe (e.g., five years).
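
As a worked example of the remedies described above (all figures hypothetical, and reading the attorney general’s per-violation/4% comparison in the simplest way):

    # Illustrative arithmetic for HB 1433's remedies; all figures hypothetical.
    violations = 3
    actual_damages = 12_000

    # Private plaintiff: $10,000 per violation or actual damages, whichever is higher.
    private_recovery = max(10_000 * violations, actual_damages)  # 30_000

    # Attorney general: up to $25,000 per violation or 4% of the prior year's
    # revenue, whichever is higher.
    prior_year_revenue = 50_000_000
    ag_ceiling = max(25_000 * violations, 0.04 * prior_year_revenue)  # 2_000_000.0

    print(private_recovery, ag_ceiling)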

Finally, the bill preempts city and county privacy laws only to the extent they are weaker. And, interestingly, the People’s Privacy Act lacks the by now customary provisions exempting entities in compliance with other state or federal privacy regimes (e.g., Gramm-Leach-Bliley for financial services entities) and makes clear these entities will be required to comply with any requirements in this act that are above those imposed by another privacy framework.

And it must be mentioned, two of the Congressional stakeholders on privacy and data security hail from Washington state, and consideration and possible passage of a state law may limit their latitude on a federal bill they could support. Senator Maria Cantwell (D-WA) and Representative Cathy McMorris Rodgers (R-WA), who are the chair of the Senate Commerce, Science, and Transportation Committee and ranking member of the House Energy and Commerce Committee respectively, are expected to be involved in drafting their committees’ privacy bills, and a Washington state statute may affect their positions in much the same way as the “California Consumer Privacy Act” (CCPA) (AB 375) has informed a number of California Members’ positions on privacy legislation, especially with respect to bills being seen as weaker than the CCPA.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Sponsors Take A New Run At Privacy Law in Washington State

Perhaps the third time is the charm? Legislators seek to pass a privacy law in Washington state for the third year in a row.

A group of senators in Washington state’s Senate have introduced a slightly altered version of a privacy bill they floated last summer, and a committee of jurisdiction will hold a hearing on SB 5062 on 14 January 2021. This would mark the third year in a row legislators have tried to enact the Washington Privacy Act. The new bill (SB 5062) tracks closely with the two bills produced by the Washington Senate and House last year that lawmakers could not ultimately reconcile. However, there are no provisions on facial recognition technology, which was largely responsible for sinking a privacy bill in Washington state two years ago. The sponsors have also taken the unusual step of appending language covering the collection and processing of personal data to combat infectious diseases like COVID-19.

I analyzed the discussion draft that Washington State Senator Reuven Carlyle (D-Seattle) released over the summer, so I will not recite everything about the new bill; it should suffice to highlight the differences between the discussion draft and the introduced legislation. Big picture, the bill still uses the concepts of data controllers and processors most famously enshrined in the European Union’s (EU) General Data Protection Regulation (GDPR). As under other privacy bills, people in Washington state generally would not need to consent before an entity could collect and process their information. People would be able to opt out of some activities, but most data collection and processing could still occur as it presently does.

The date on which the bill would take effect was pushed back from 120 days in the discussion draft to 31 July 2022 in the introduced bill. While SB 5062, unlike the discussion draft, would cover non-profits, institutions of higher education, airlines, and others, the effective date for covering those entities would be 31 July 2026. The right of a person to access personal data a controller is processing is narrowed slightly in that it would no longer be the personal data the controller has but rather the categories of personal data. The time controllers would have to respond to a certain class of requests would be decreased from 45 to 15 days. This class includes requests to opt out of targeted advertising, the sale of personal data, and any profiling in furtherance of decisions with legal effects. Section 106’s requirement that processors have reasonable security measures has been massaged, rephrased, and possibly weakened a bit.

One of the activities controllers and processors could undertake without meeting the requirements of the act was removed. Notably, they will no longer be able to “conduct internal research solely to improve or repair products, services, or technology.” There is also a clarification that using any of the exemptions in Section 110 does not make an entity a controller for purposes of the bill. There is a new requirement that the State Office of Privacy and Data Protection must examine current technology that allows for mass or global opt out or opt in and then report to the legislature. Finally, two of the Congressional stakeholders on privacy and data security hail from Washington state, and consideration and possible passage of a state law may limit their latitude on a federal bill they could support. Senator Maria Cantwell (D-WA) and Representative Cathy McMorris Rodgers (R-WA), who are the ranking members of the Senate Commerce, Science, and Transportation Committee and House Energy and Commerce Committee respectively, are expected to be involved in drafting their committees’ privacy bills, and a Washington state statute may affect their positions in much the same way the “California Consumer Privacy Act” (CCPA) (AB 375) has informed a number of California Members’ positions on privacy legislation, especially with respect to bills being seen as weaker than the CCPA.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Kranich17 from Pixabay

Canada Releases Privacy Bill

Canada’s newly released privacy bill shares commonalities with U.S. bills but features a stronger enforcement regime that could result in fines of up to 5% of annual worldwide revenue for the worst violations.

The government in Ottawa has introduced in Parliament the “Digital Charter Implementation Act, 2020” (Bill C-11), which would dramatically reform the nation’s privacy laws and significantly expand the power of the Office of the Privacy Commissioner (OPC). The bill consists of two main parts, the Consumer Privacy Protection Act (CPPA) and the Personal Information and Data Protection Tribunal Act, and would partially repeal Canada’s federal privacy law, the Personal Information Protection and Electronic Documents Act. Notably, the bill would allow the OPC to levy fines up to 5% of worldwide revenue or $25 million CAD (roughly $20 million USD), whichever is higher. Canadians would also get a private right of action under certain conditions.

Broadly, this bill shares many characteristics with a number of bills introduced in the United States Congress by Democratic Members. Consent would be needed in most cases where a Canadian’s personal information is collected, processed, used, shared, or disclosed although there are some notable exceptions. Canada’s federal privacy regulator would be able to seek and obtain stiff fines for non-compliance.

The bill explains its purpose:

The purpose of this Act is to establish — in an era in which data is constantly flowing across borders and geographical boundaries and significant economic activity relies on the analysis, circulation and exchange of personal information — rules to govern the protection of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.

The Department of Industry (aka Innovation, Science and Economic Development Canada) released this summary of the bill:

The Government of Canada has tabled the Digital Charter Implementation Act, 2020 to strengthen privacy protections for Canadians as they engage in commercial activities. The Act will create the Consumer Privacy Protection Act (CPPA), which will modernize Canada’s existing private sector privacy law, and will also create the new Personal information and Data Protection Tribunal Act, which will create the Personal Information and Data Tribunal, an entity that can impose administrative monetary penalties for privacy violations. Finally, the Act will repeal Part 2 of the existing Personal Information Protection and Electronic Documents Act (PIPEDA) and turn it into stand-alone legislation, the Electronic Documents Act. With each of these steps, the government is building a Canada where citizens have confidence that their data is safe and privacy is respected, while unlocking innovation that promotes a strong economy.

The Department added:

  • Changes enabled by CPPA will enhance individuals’ control over their personal information, such as by requesting its deletion, creating new data mobility rights that promote consumer choice and innovation, and by creating new transparency requirements over uses of personal information in areas such as artificial intelligence systems.
  • CPPA will also promote responsible innovation by reducing regulatory burden. A new exception to consent will address standard business practices; a new regime to clarify how organizations are to handle de-identified personal information, and another new exception to consent to allow organizations to disclose personal information for socially beneficial purposes, such as public health research, for example.
  • The new legislative changes will strengthen privacy enforcement and oversight in a manner similar to certain provinces and some of Canada’s foreign trading partners. It does so by: granting the Office of the Privacy Commissioner of Canada (OPC) order-making powers, which can compel organizations to comply with the law; force them to stop certain improper activities or uses of personal information; and order organizations to preserve information relevant to an OPC investigation. The new law will also enable administrative monetary penalties for serious contraventions of the law, subject to a maximum penalty of 3% of global revenues.
  • The introduction of the Personal Information and Data Tribunal Act will establish a new Data Tribunal, which will be responsible for determining whether to assign administrative monetary penalties that are recommended by the OPC following its investigations, determining the amount of any penalties and will also hear appeals of OPC orders and decisions. The Tribunal will provide for access to justice and contribute to the further development of privacy expertise by providing expeditious reviews of the OPC’s orders.
  • The Electronic Documents Act will take the electronic documents provisions of PIPEDA and enact them in standalone legislation. This change will simplify federal privacy laws and will better align the federal electronic documents regime to support service delivery initiatives by the Treasury Board Secretariat.

In a summary, the Department explained:

Under the CPPA, the Privacy Commissioner would have broad order-making powers, including the ability to force an organization to comply with its requirements under the CPPA and the ability to order a company to stop collecting data or using personal information. In addition, the Privacy Commissioner would also be able to recommend that the Personal Information and Data Protection Tribunal impose a fine. The legislation would provide for administrative monetary penalties of up to 3% of global revenue or $10 million [CAD] for non-compliant organizations. It also contains an expanded range of offences for certain serious contraventions of the law, subject to a maximum fine of 5% of global revenue or $25 million [CAD].
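
In arithmetic terms, each tier would take the higher of the flat cap and the revenue percentage. A short sketch with a hypothetical revenue figure (amounts in CAD):

    # Illustrative ceilings for the CPPA's two penalty tiers quoted above.
    global_revenue = 1_000_000_000  # hypothetical annual worldwide revenue, CAD

    administrative_penalty_cap = max(0.03 * global_revenue, 10_000_000)  # 30,000,000.0
    serious_offence_fine_cap = max(0.05 * global_revenue, 25_000_000)    # 50,000,000.0

    print(administrative_penalty_cap, serious_offence_fine_cap)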

The CPPA broadly defines what constitutes “personal information” and what is therefore covered and protected by the bill. It would be “information about an identifiable individual,” a much wider scope than almost all the legislation in the United States, for example. Consequently, even information derived through processing that was not directly or indirectly collected from a person would seem to be covered by the bill. And, speaking of processing, the CPPA limits how personal information may be collected and used, specifically “only for purposes that a reasonable person would consider appropriate in the circumstances.”

Moreover, an entity can only collect personal information needed for purposes disclosed before or at the time of collection, and only with the consent of the person. However, the CPPA would allow for “implied consent” if “the organization establishes that it is appropriate…taking into account the reasonable expectations of the individual and the sensitivity of the personal information that is to be collected, used or disclosed.” And, if the entity decides to collect and use personal information for any new purpose, it must obtain the consent of people in Canada before doing so. What’s more, organizations cannot condition the provision of products or services on people providing consent for collection of personal information beyond what is necessary. And, of course, consent gained under false, deceptive, or misleading pretenses is not valid, and people may withdraw consent at any time.

Regarding the disclosures an organization must make about its purposes, the CPPA would require more than most proposed U.S. federal privacy laws. For example, an entity must tell people the specific personal information to be collected, processed, used, or disclosed, the reasonable consequences of any of the aforementioned, and the names of third parties or types of third parties with whom personal information would be shared.

The CPPA is very much like U.S. privacy bills in that there are numerous exceptions as to when consent is not needed for collecting, processing, and using personal information. Principally, this would be when a reasonable person would expect or understand this could happen, so long as the collection and processing activities are not meant to influence a person’s decisions or behavior. Activities that would fall in the former category include collection, use, and processing needed to deliver a product or service, protecting the organization’s systems and information security, or the due diligence necessary to protect the organization from commercial risk. Moreover, if collection, use, and processing are in the public interest and consent cannot be readily obtained, then the organization may proceed. The same is true if there is an emergency situation that imperils the life or health of a person, so long as disclosure to the person is made in writing expeditiously afterwards. However, neither consent nor knowledge is required for transfers of personal information to service providers, in employment settings, to prevent fraud, and for a number of other enumerated purposes.

There are wide exceptions to the consent requirement relating to collection and use of personal information in the event of investigations of breaches of agreements or contravention of federal or provincial law. Likewise, consent may not be needed if an organization is disclosing personal information to government institutions. Similarly, the collection and use of public information is authorized subject to regulations.

However, the CPPA makes clear that certain exceptions to the consent and knowledge requirements are simply not operative when the personal information in question is an “electronic address” or is stored on a computer system. In these cases, consent or knowledge would be needed before such collection of personal information is legal.

Organizations must dispose of personal information when it is no longer needed for the purpose it was originally collected except for personal information collected and used for decision making. In this latter case, information must be retained in case the person about whom the decision was made wants access. Organizations must dispose of personal information about a person upon his or her request unless doing so would result in the disposal of other people’s information or there is a Canadian law barring such disposal. If the organization refuses the request to dispose, it must inform the person in writing. If the organization grants the request, it must direct service providers to do the same and confirm destruction.
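
The disposal flow described above reduces to a small decision procedure. Here is an illustrative Python sketch; every structure and name is hypothetical, and nothing below is drawn from the bill’s text:

    # Illustrative decision logic for a disposal request, mirroring the
    # description above; not drawn from the bill's text.
    def handle_disposal_request(record: dict, law_bars_disposal: bool) -> str:
        if record["contains_other_individuals"] or law_bars_disposal:
            return "refuse and inform the person in writing"
        for provider in record["service_providers"]:
            # direct each service provider to dispose and confirm destruction
            print(f"direct {provider} to dispose of the data and confirm")
        return "disposed"

    example = {"contains_other_individuals": False, "service_providers": ["VendorA"]}
    print(handle_disposal_request(example, law_bars_disposal=False))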

Organizations would have a duty to ensure personal information is accurate, and the applicability of this duty would turn on whether the information is being used to make decisions, is being shared with third parties, and if the information is being used on an ongoing basis.

The CPPA would impose security requirements for organizations collecting, using, and holding personal information. These data would need protection “through physical, organizational and technological security safeguards” appropriate to the sensitivity of the information. Specifically, these security safeguards “must protect personal information against, among other things, loss, theft and unauthorized access, disclosure, copying, use and modification.” Breaches must be reported as soon as feasible to the OPC and to affected people if there is a reasonable belief of “real risk of significant harm to an individual.” Significant harm is defined as “bodily harm, humiliation, damage to reputation or relationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property.” Real risk of significant harm is determined on the basis of

  • the sensitivity of the personal information involved in the breach;
  • the probability that the personal information has been, is being or will be misused; and
  • any other prescribed factor.

Organizations will also have a duty to explain their policies and practices under this act in plain language, including:

  • a description of the type of personal information under the organization’s control;
  • a general account of how the organization makes use of personal information, including how the organization applies the exceptions to the requirement to obtain consent under this Act;
  • a general account of the organization’s use of any automated decision system to make predictions, recommendations or decisions about individuals that could have significant impacts on them;
  • whether or not the organization carries out any international or interprovincial transfer or disclosure of personal information that may have reasonably foreseeable privacy implications;
  • how an individual may make a request for disposal under section 55 or access under section 63; and
  • the business contact information of the individual to whom complaints or requests for information may be made.

Canadian nationals and residents would be able to access their personal information. Notably, “[o]n request by an individual, an organization must inform them of whether it has any personal information about them, how it uses the information and whether it has disclosed the information.” Access must also be granted to the requesting person. If the organization has disclosed a person’s information, when she makes a request to access, she must be told the names or types of third parties to whom her information was disclosed. Moreover, organizations using automated decision-making processes would have further responsibilities: “[i]f the organization has used an automated decision system to make a prediction, recommendation or decision about the individual, the organization must, on request by the individual, provide them with an explanation of the prediction, recommendation or decision and of how the personal information that was used to make the prediction, recommendation or decision was obtained.” Additionally, if a person has been granted access to his personal information and it “is not accurate, up-to-date or complete,” then the organization must amend it and send the corrected information to third parties that have access to the information.

There are provisions requiring data portability (termed “data mobility” by the CPPA). All organizations subject to the data mobility framework must transfer personal information upon request. People must be able to lodge complaints with organizations over compliance with the CPPA regarding their personal information. Organizations may not re-identify de-identified personal information.

Organizations would be able to draft and submit codes of conduct to the OPC for approval so long as they “provide[] for substantially the same or greater protection of personal information as some or all of the protection provided under this Act.” Likewise, an entity may apply to the OPC “for approval of a certification program that includes

(a) a code of practice that provides for substantially the same or greater protection of personal information as some or all of the protection provided under this Act;

(b) guidelines for interpreting and implementing the code of practice;

(c) a mechanism by which an entity that operates the program may certify that an organization is in compliance with the code of practice;

(d) a mechanism for the independent verification of an organization’s compliance with the code of practice;

(e) disciplinary measures for non-compliance with the code of practice by an organization, including the revocation of an organization’s certification; and

(f) anything else that is provided in the regulations.

However, complying with approved codes of conduct or a certification program does not mean an entity is complying with the CPPA.

The OPC would be granted a range of new powers to enforce the CPPA through compliance orders (which resemble administrative actions taken by the United States Federal Trade Commission) that can be appealed to a new Personal Information and Data Protection Tribunal (Tribunal) and ultimately enforced in federal court if necessary. People in Canada would also get the right to sue in the event the OPC or the new Tribunal finds an entity has contravened the CPPA.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by James Wheeler from Pixabay

Further Reading, Other Developments, and Coming Events (22 October)

Further Reading

  • “A deepfake porn Telegram bot is being used to abuse thousands of women” By Matt Burgess — WIRED UK. A bot set loose on Telegram can take pictures of women, and apparently teens too, and “take off” their clothing, rendering naked images of people who never took naked pictures. This seems to be the next iteration in deepfake porn, a problem that will surely get worse until governments legislate against it and technology companies have incentives to locate and take down such material.
  • “The Facebook-Twitter-Trump Wars Are Actually About Something Else” By Charlie Warzel — The New York Times. This piece makes the case that there are no easy fixes for American democracy or for misinformation on social media platforms.
  • “Facebook says it rejected 2.2m ads for breaking political campaigning rules” — Agence France-Presse. Facebook’s Vice President of Global Affairs and Communications Nick Clegg said the social media giant is employing artificial intelligence and humans to find and remove political advertisements that violate policy in order to avoid a repeat of 2016, when untrue information and misinformation played roles in both Brexit and the election of Donald Trump as President of the United States.
  • Huawei Fallout—Game-Changing New China Threat Strikes At Apple And Samsung” By Zak Doffman — Forbes. Smartphone manufacturers from the People’s Republic of China (PRC) appear ready to step into the projected void caused by the United States (U.S.) strangling off Huawei’s access to chips. Xiaomi and Oppo have already seen sales surge worldwide and are poised to pick up where Huawei is being forced to leave off, perhaps demonstrating the limits of U.S. power to blunt the rise of PRC technology companies.
  • “As Local News Dies, a Pay-for-Play Network Rises in Its Place” By Davey Alba and Jack Nicas — The New York Times. With the decline and demise of many local media outlets in the United States, new groups are stepping into the void, and some are politically minded but not transparent about their biases. The organization uncovered in this article is nakedly Republican and is running and planting articles at both legitimate and artificial news sites for pay. Sometimes conservative donors pay, sometimes campaigns do. Democrats are engaged in the same activity but apparently to a lesser extent. These sorts of activities will only further erode faith in the U.S. media.
  • “Forget Antitrust Laws. To Limit Tech, Some Say a New Regulator Is Needed.” By Steve Lohr — The New York Times. This piece argues that antitrust enforcement actions are plodding, tending to take years to finish. Consequently, this body of law is inadequate to the task of addressing the market dominance of big technology companies. Instead, a new regulatory body is needed along the lines of those regulating the financial services industries, one more nimble than antitrust enforcement. Given the problems in that industry with respect to regulation, this may not be the best model.
  • “‘Do Not Track’ Is Back, and This Time It Might Work” By Gilad Edelman — WIRED. Looking to utilize the requirement in the “California Consumer Privacy Act” (CCPA) (AB 375) that regulated entities respect and effectuate the use of a one-time opt-out mechanism, a group of entities have come together to build and roll out the Global Privacy Control. In theory, users could install this technical specification on their phones and computers, use it once, and then all websites would be on notice regarding that person’s privacy preferences. Such a means would address the problem turned up by Consumer Reports’ recent report on the difficulty of trying to opt out of having one’s personal information sold. (A sketch of how a site might read this signal appears after this list.)
  • “EU countries sound alarm about growing anti-5G movement” By Laurens Cerulus — Politico. Fifteen European Union (EU) nations wrote the European Commission (EC) warning that the nascent anti-5G movement, borne of conspiracy thinking and misinformation, threatens the EU’s position vis-à-vis the United States (U.S.) and the People’s Republic of China (PRC). There have been more than 200 documented arson attacks in the EU, with the most having occurred in the United Kingdom, France, and the Netherlands. These nations called for a more muscular, more forceful debunking of the lies and misinformation being spread about 5G.
  • “Security firms call Microsoft’s effort to disrupt botnet to protect against election interference ineffective” By Jay Greene — The Washington Post. Microsoft seemingly acted alongside United States (U.S.) Cyber Command to take down and impair the operation of Trickbot, but now cybersecurity experts are questioning how effective Microsoft’s efforts really were. Researchers have shown the Russian-operated Trickbot has already stood up operations and has dispersed across servers around the world, showing how difficult it is to address some cyber threats.
  • “Governments around the globe find ways to abuse Facebook” By Sara Fischer and Ashley Gold — Axios. This piece puts a different spin than a recent BuzzFeed News article on the challenges Facebook faces in countries around the world, especially those whose governments ruthlessly use the platform to spread lies and misinformation. The new article paints Facebook as a well-meaning company being taken advantage of, while the other portrays a company callous about content moderation except in nations where failures cause it political problems, such as the United States, the European Union, and other western democracies.
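
On the Global Privacy Control item above: the proposed GPC specification has participating browsers send a “Sec-GPC: 1” request header (and expose a navigator.globalPrivacyControl property to scripts). A minimal illustration of how a site might honor the signal server-side follows; the handler and all other names are hypothetical:

    # Minimal illustration of honoring the proposed Global Privacy Control
    # signal server-side; the Sec-GPC header name comes from the GPC
    # specification, while everything else here is hypothetical.
    def gpc_opt_out(headers: dict[str, str]) -> bool:
        """Treat a Sec-GPC value of 1 as a CCPA-style do-not-sell opt-out."""
        return headers.get("Sec-GPC", "").strip() == "1"

    request_headers = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
    if gpc_opt_out(request_headers):
        print("suppress sale/sharing of this visitor's personal information")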

Other Developments

  • The United States (U.S.) Department of Justice’s (DOJ) Cyber-Digital Task Force (Task Force) issued “Cryptocurrency: An Enforcement Framework,” which “provides a comprehensive overview of the emerging threats and enforcement challenges associated with the increasing prevalence and use of cryptocurrency; details the important relationships that the Department of Justice has built with regulatory and enforcement partners both within the United States government and around the world; and outlines the Department’s response strategies.” The Task Force noted “[t]his document does not contain any new binding legal requirements not otherwise already imposed by statute or regulation.” The Task Force summarized the report:
    • [I]n Part I, the Framework provides a detailed threat overview, cataloging the three categories into which most illicit uses of cryptocurrency typically fall: (1) financial transactions associated with the commission of crimes; (2) money laundering and the shielding of legitimate activity from tax, reporting, or other legal requirements; and (3) crimes, such as theft, directly implicating the cryptocurrency marketplace itself. 
    • Part II explores the various legal and regulatory tools at the government’s disposal to confront the threats posed by cryptocurrency’s illicit uses, and highlights the strong and growing partnership between the Department of Justice and the Securities and Exchange Commission, the Commodity Futures Trading Commission, and agencies within the Department of the Treasury, among others, to enforce federal law in the cryptocurrency space.
    • Finally, the Enforcement Framework concludes in Part III with a discussion of the ongoing challenges the government faces in cryptocurrency enforcement—particularly with respect to business models (employed by certain cryptocurrency exchanges, platforms, kiosks, and casinos), and to activity (like “mixing” and “tumbling,” “chain hopping,” and certain instances of jurisdictional arbitrage) that may facilitate criminal activity.    
  • The White House’s Office of Science and Technology Policy (OSTP) has launched a new website for the United States’ (U.S.) quantum initiative and released a report titled “Quantum Frontiers: Report On Community Input To The Nation’s Strategy For Quantum Information Science.” The Quantum Initiative flows from the “National Quantum Initiative Act” (P.L. 115-368) “to provide for a coordinated Federal program to accelerate quantum research and development for the economic and national security of the United States.” The OSTP explained that the report “outlines eight frontiers that contain core problems with fundamental questions confronting quantum information science (QIS) today:
    • Expanding Opportunities for Quantum Technologies to Benefit Society
    • Building the Discipline of Quantum Engineering
    • Targeting Materials Science for Quantum Technologies
    • Exploring Quantum Mechanics through Quantum Simulations
    • Harnessing Quantum Information Technology for Precision Measurements
    • Generating and Distributing Quantum Entanglement for New Applications
    • Characterizing and Mitigating Quantum Errors
    • Understanding the Universe through Quantum Information
    • OSTP asserted “[t]hese frontier areas, identified by the QIS research community, are priorities for the government, private sector, and academia to explore in order to drive breakthrough R&D.”
  • The New York Department of Financial Services (NYDFS) published its report on the July 2020 Twitter hack during which a team of hackers took over a number of high-profile accounts (e.g., Barack Obama, Kim Kardashian West, Jeff Bezos, and Elon Musk) in order to perpetrate a cryptocurrency scam. The NYDFS has jurisdiction over cryptocurrencies and the companies dealing in them in New York. The NYDFS found that the hackers used the most basic means to acquire permission to take over accounts. The NYDFS explained:
    • Given that Twitter is a publicly traded, $37 billion technology company, it was surprising how easily the Hackers were able to penetrate Twitter’s network and gain access to internal tools allowing them to take over any Twitter user’s account. Indeed, the Hackers used basic techniques more akin to those of a traditional scam artist: phone calls where they pretended to be from Twitter’s Information Technology department. The extraordinary access the Hackers obtained with this simple technique underscores Twitter’s cybersecurity vulnerability and the potential for devastating consequences. Notably, the Twitter Hack did not involve any of the high-tech or sophisticated techniques often used in cyberattacks–no malware, no exploits, and no backdoors.
    • The implications of the Twitter Hack extend far beyond this garden-variety fraud. There are well-documented examples of social media being used to manipulate markets and interfere with elections, often with the simple use of a single compromised account or a group of fake accounts. In the hands of a dangerous adversary, the same access obtained by the Hackers–the ability to take control of any Twitter users’ account–could cause even greater harm.
    • The Twitter Hack demonstrates the need for strong cybersecurity to curb the potential weaponization of major social media companies. But our public institutions have not caught up to the new challenges posed by social media. While policymakers focus on antitrust and content moderation problems with large social media companies, their cybersecurity is also critical. In other industries that are deemed critical infrastructure, such as telecommunications, utilities, and finance, we have established regulators and regulations to ensure that the public interest is protected. With respect to cybersecurity, that is what is needed for large, systemically important social media companies.
    • The NYDFS recommended cybersecurity measures that cryptocurrency companies should implement to avoid similar hacks, including its own cybersecurity regulations that already bind its regulated entities in New York. The NYDFS also called for a national regulator to address the lack of a dedicated regulator of Twitter and other massive social media platforms. The NYDFS asserted:
      • Social media companies currently have no dedicated regulator. They are subject to the same general oversight applicable to other companies. For instance, the SEC’s regulations for all public companies apply to public social media companies, and antitrust and related laws and regulations enforced by the Department of Justice and the FTC apply to social media companies as they do to all companies. Social media companies are also subject to generally applicable laws, such as the California Consumer Privacy Act and the New York SHIELD Act. The European Union’s General Data Protection Regulation, which regulates the storage and use of personal data, also applies to social media entities doing business in Europe.
      • But there are no regulators that have the authority to uniformly regulate social media platforms that operate over the internet, and to address the cybersecurity concerns identified in this Report. That regulatory vacuum must be filled.
      • A useful starting point is to create a “systemically important” designation for large social media companies, like the designation for critically important bank and non-bank financial institutions. In the wake of the 2007-08 financial crisis, Congress established a new regulatory framework for financial institutions that posed a systemic threat to the financial system of the United States. An institution could be designated as a Systemically Important Financial Institution (“SIFI”) “where the failure of or a disruption to the functioning of a financial market utility or the conduct of a payment, clearing, or settlement activity could create, or increase, the risk of significant liquidity or credit problems spreading among financial institutions or markets and thereby threaten the stability of the financial system of the United States.”
      • The risks posed by social media to our consumers, economy, and democracy are no less grave than the risks posed by large financial institutions. The scale and reach of these companies, combined with the ability of adversarial actors who can manipulate these systems, require a similarly bold and assertive regulatory approach.
      • The designation of an institution as a SIFI is made by the Financial Stability Oversight Council (“FSOC”), which Congress established to “identify risks to the financial stability of the United States” and to provide enhanced supervision of SIFIs. The FSOC also “monitors regulatory gaps and overlaps to identify emerging sources of systemic risk.” In determining whether a financial institution is systemically important, the FSOC considers numerous factors including: the effect that a failure or disruption to an institution would have on financial markets and the broader financial system; the nature of the institution’s transactions and relationships; the nature, concentration, interconnectedness, and mix of the institution’s activities; and the degree to which the institution is regulated.
      • An analogue to the FSOC should be established to identify systemically important social media companies. This new Oversight Council should evaluate the reach and impact of social media companies, as well as the society-wide consequences of a social media platform’s misuse, to determine which companies they should designate as systemically important. Once designated, those companies should be subject to enhanced regulation, such as through the provision of “stress tests” to evaluate the social media companies’ susceptibility to key threats, including cyberattacks and election interference.
      • Finally, the success of such oversight will depend on the establishment of an expert agency to oversee designated social media companies. Systemically important financial companies designated by the FSOC are overseen by the Federal Reserve Board, which has a long-established and deep expertise in banking and financial market stability. A regulator for systemically important social media would likewise need deep expertise in areas such as technology, cybersecurity, and disinformation. This expert regulator could take various forms; it could be a completely new agency or could reside within an established agency or at an existing regulator.
  • The Government Accountability Office (GAO) evaluated how well the Trump Administration has been implementing the “Open, Public, Electronic and Necessary Government Data Act of 2018” (OPEN Government Data Act) (P.L. 115-435). As the GAO explained, this statute “requires federal agencies to publish their information as open data using standardized, nonproprietary formats, making data available to the public open by default, unless otherwise exempt…[and] codifies and expands on existing federal open data policy including the Office of Management and Budget’s (OMB) memorandum M-13-13 (M-13-13), Open Data Policy—Managing Information as an Asset.”
    • The GAO stated
      • To continue moving forward with open government data, the issuance of OMB implementation guidance should help agencies develop comprehensive inventories of their data assets, prioritize data assets for publication, and decide which data assets should or should not be made available to the public.
      • Implementation of this statutory requirement is critical to agencies’ full implementation and compliance with the act. In the absence of this guidance, agencies, particularly agencies that have not previously been subject to open data policies, could fall behind in meeting their statutory timeline for implementing comprehensive data inventories.
      • It is also important for OMB to meet its statutory responsibility to biennially report on agencies’ performance and compliance with the OPEN Government Data Act and to coordinate with General Services Administration (GSA) to improve the quality and availability of agency performance data that could inform this reporting. Access to this information could inform Congress and the public on agencies’ progress in opening their data and complying with statutory requirements. This information could also help agencies assess their progress and improve compliance with the act.
    • The GAO made three recommendations:
      • The Director of OMB should comply with its statutory requirement to issue implementation guidance to agencies to develop and maintain comprehensive data inventories. (Recommendation 1)
      • The Director of OMB should comply with the statutory requirement to electronically publish a report on agencies’ performance and compliance with the OPEN Government Data Act. (Recommendation 2)
      • The Director of OMB, in collaboration with the Administrator of GSA, should establish policy to ensure the routine identification and correction of errors in electronically published performance information. (Recommendation 3)
  • The United States’ (U.S.) National Security Agency (NSA) issued a cybersecurity advisory titled “Chinese State-Sponsored Actors Exploit Publicly Known Vulnerabilities,” which “provides Common Vulnerabilities and Exposures (CVEs) known to be recently leveraged, or scanned-for, by Chinese state-sponsored cyber actors to enable successful hacking operations against a multitude of victim networks.” The NSA recommended a number of mitigations generally for U.S. entities, including:
    • Keep systems and products updated and patched as soon as possible after patches are released.
    • Expect that data stolen or modified (including credentials, accounts, and software) before the device was patched will not be alleviated by patching, making password changes and reviews of accounts a good practice.
    • Disable external management capabilities and set up an out-of-band management network.
    • Block obsolete or unused protocols at the network edge and disable them in device configurations.
    • Isolate Internet-facing services in a network Demilitarized Zone (DMZ) to reduce the exposure of the internal network.
    • Enable robust logging of Internet-facing services and monitor the logs for signs of compromise.
    • The NSA then proceeded to recommend specific fixes.
    • The NSA provided this policy backdrop:
      • One of the greatest threats to U.S. National Security Systems (NSS), the U.S. Defense Industrial Base (DIB), and Department of Defense (DOD) information networks is Chinese state-sponsored malicious cyber activity. These networks often undergo a full array of tactics and techniques used by Chinese state-sponsored cyber actors to exploit computer networks of interest that hold sensitive intellectual property, economic, political, and military information. Since these techniques include exploitation of publicly known vulnerabilities, it is critical that network defenders prioritize patching and mitigation efforts.
      • The same process for planning the exploitation of a computer network by any sophisticated cyber actor is used by Chinese state-sponsored hackers. They often first identify a target, gather technical information on the target, identify any vulnerabilities associated with the target, develop or re-use an exploit for those vulnerabilities, and then launch their exploitation operation.
  • Belgium’s data protection authority (DPA) (Autorité de protection des données in French or Gegevensbeschermingsautoriteit in Dutch) (APD-GBA) has reportedly found that the Transparency & Consent Framework (TCF) developed by the Interactive Advertising Bureau (IAB) violates the General Data Protection Regulation (GDPR). The Real-Time Bidding (RTB) system used for online behavioral advertising allegedly transmits the personal information of European Union residents without their consent even before a popup appears on their screens asking for consent. The APD-GBA is the lead DPA in the EU in investigating the RTB system and will likely circulate its findings and recommendations to other EU DPAs before any enforcement commences.
  • None Of Your Business (noyb) announced “[t]he Irish High Court has granted leave for a ‘Judicial Review’ against the Irish Data Protection Commission (DPC) today…[and] [t]he legal action by noyb aims to swiftly implement the [Court of Justice for the European Union (CJEU)] Decision” prohibiting Facebook’s transfer of personal data from the European Union to the United States (U.S.). Last month, after the DPC directed Facebook to stop transferring the personal data of EU citizens to the U.S., the company filed suit in the Irish High Court to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge.
    • noyb further asserted:
      • Instead of making a decision in the pending procedure, the DPC has started a second, new investigation into the same subject matter (“Parallel Procedure”), as widely reported (see original reporting by the WSJ). No logical reason for the Parallel Procedure was given, but the DPC has maintained that Mr Schrems will not be heard in this second case, as he is not a party in this Parallel Procedure. This Parallel Procedure was criticised by Facebook publicly (link) and instantly blocked by a Judicial Review by Facebook (see report by Reuters).
      • Today’s Judicial Review by noyb is in many ways the counterpart to Facebook’s Judicial Review: While Facebook wants to block the second procedure by the DPC, noyb wants to move the original complaints procedure towards a decision.
      • Earlier this summer, the CJEU struck down the adequacy decision for the agreement between the EU and U.S. that had provided the easiest means to transfer the personal data of EU citizens to the U.S. for processing under the General Data Protection Regulation (GDPR) (i.e., the EU-U.S. Privacy Shield). In the case known as Schrems II, the CJEU also cast doubt on whether standard contractual clauses (SCC) used to transfer personal data to the U.S. would pass muster given the grounds for finding the Privacy Shield inadequate: the U.S.’s surveillance regime and lack of meaningful redress for EU citizens. Consequently, it has appeared as if data protection authorities throughout the EU would need to revisit SCCs for transfers to the U.S., and it appears the DPC was looking to stop Facebook from using its SCCs. Facebook is apparently arguing in its suit that it will suffer “extremely significant adverse effects” if the DPC’s decision is implemented.
  • Most likely with the aim of helping British chances for an adequacy decision from the European Union (EU), the United Kingdom’s Information Commissioner’s Office (ICO) published guidance that “discusses the right of access [under the General Data Protection Regulation (GDPR)] in detail.” The ICO explained the guidance “is aimed at data protection officers (DPOs) and those with specific data protection responsibilities in larger organisations…[but] does not specifically cover the right of access under Parts 3 and 4 of the Data Protection Act 2018.”
    • The ICO explained
      • The right of access, commonly referred to as subject access, gives individuals the right to obtain a copy of their personal data from you, as well as other supplementary information.
  • The Government Accountability Office (GAO) released the report on the data security and data privacy practices of public schools that the House Education and Labor Committee’s ranking member requested. Representative Virginia Foxx (R-NC) asked the GAO “to review the security of K-12 students’ data. This report examines (1) what is known about recently reported K-12 cybersecurity incidents that compromised student data, and (2) the characteristics of school districts that experienced these incidents.” Strangely, the report did not have GAO’s customary conclusions or recommendations. Nonetheless, the GAO found:
    • Ninety-nine student data breaches reported from July 1, 2016 through May 5, 2020 compromised the data of students in 287 school districts across the country, according to our analysis of K-12 Cybersecurity Resource Center (CRC) data (see fig. 3). Some breaches involved a single school district, while others involved multiple districts. For example, an attack on a vendor system in the 2019-2020 school year affected 135 districts. While information about the number of students affected was not available for every reported breach, examples show that some breaches affected thousands of students, for instance, when a cybercriminal accessed 14,000 current and former students’ personally identifiable information (PII) in one district.
    • The 99 reported student data breaches likely understate the number of breaches that occurred, for different reasons. Reported incidents sometimes do not include sufficient information to discern whether data were breached. We identified 15 additional incidents in our analysis of CRC data in which student data might have been compromised, but the available information was not definitive. In addition, breaches can go undetected for some time. In one example, the personal information of hundreds of thousands of current and former students in one district was publicly posted for 2 years before the breach was discovered.
    • The CRC identified 28 incidents involving videoconferences from April 1, 2020 through May 5, 2020, some of which disrupted learning and exposed students to harm. In one incident, 50 elementary school students were exposed to pornography during a virtual class. In another incident in a different district, high school students were targeted with hate speech during a class, resulting in the cancellation that day of all classes using the videoconferencing software. These incidents also raise concerns about the potential for violating students’ privacy. For example, one district is reported to have instructed teachers to record their class sessions. Teachers said that students’ full names were visible to anyone viewing the recording.
    • The GAO found gaps in the protection and enforcement of student privacy by the United States government:
      • [The Department of] Education is responsible for enforcing Family Educational Rights and Privacy Act (FERPA), which addresses the privacy of PII in student education records and applies to all schools that receive funds under an applicable program administered by Education. If parents or eligible students believe that their rights under FERPA have been violated, they may file a formal complaint with Education. In response, Education is required to take appropriate actions to enforce and deal with violations of FERPA. However, because the department’s authority under FERPA is directly related to the privacy of education records, Education’s security role is limited to incidents involving potential violations under FERPA. Further, FERPA amendments have not directly addressed educational technology use.
      • The “Children’s Online Privacy Protection Act” (COPPA) requires the Federal Trade Commission (FTC) to issue and enforce regulations concerning children’s privacy. The COPPA Rule, which took effect in 2000 and was later amended in 2013, requires operators of covered websites or online services that collect personal information from children under age 13 to provide notice and obtain parental consent, among other things. COPPA generally applies to the vendors who provide educational technology, rather than to schools directly. However, according to FTC guidance, schools can consent on behalf of parents to the collection of students’ personal information if such information is used for a school-authorized educational purpose and for no other commercial purpose.
  • Upturn, an advocacy organization that “advances equity and justice in the design, governance, and use of technology,” has released a report showing that United States (U.S.) law enforcement agencies have multiple means of hacking into encrypted or protected smartphones. The means and vendors for breaking into phones have long been available in the U.S. and abroad, despite claims by a number of nations, like the Five Eyes (U.S., the United Kingdom, Australia, Canada, and New Zealand), that default end-to-end encryption is a growing problem that allows those preying on children or engaged in terrorism to go undetected. In terms of possible bias, Upturn is “supported by the Ford Foundation, the Open Society Foundations, the John D. and Catherine T. MacArthur Foundation, Luminate, the Patrick J. McGovern Foundation, and Democracy Fund.”
    • Upturn stated:
      • Every day, law enforcement agencies across the country search thousands of cellphones, typically incident to arrest. To search phones, law enforcement agencies use mobile device forensic tools (MDFTs), a powerful technology that allows police to extract a full copy of data from a cellphone — all emails, texts, photos, location, app data, and more — which can then be programmatically searched. As one expert puts it, with the amount of sensitive information stored on smartphones today, the tools provide a “window into the soul.”
      • This report documents the widespread adoption of MDFTs by law enforcement in the United States. Based on 110 public records requests to state and local law enforcement agencies across the country, our research documents more than 2,000 agencies that have purchased these tools, in all 50 states and the District of Columbia. We found that state and local law enforcement agencies have performed hundreds of thousands of cellphone extractions since 2015, often without a warrant. To our knowledge, this is the first time that such records have been widely disclosed.
    • Upturn argued:
      • Law enforcement use these tools to investigate not only cases involving major harm, but also for graffiti, shoplifting, marijuana possession, prostitution, vandalism, car crashes, parole violations, petty theft, public intoxication, and the full gamut of drug-related offenses. Given how routine these searches are today, together with racist policing policies and practices, it’s more than likely that these technologies disparately affect and are used against communities of color.
      • We believe that MDFTs are simply too powerful in the hands of law enforcement and should not be used. But recognizing that MDFTs are already in widespread use across the country, we offer a set of preliminary recommendations that we believe can, in the short-term, help reduce the use of MDFTs. These include:
        • banning the use of consent searches of mobile devices,
        • abolishing the plain view exception for digital searches,
        • requiring easy-to-understand audit logs,
        • enacting robust data deletion and sealing requirements, and
        • requiring clear public logging of law enforcement use.

Coming Events

  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on 28 October regarding 47 U.S.C. 230 titled “Does Section 230’s Sweeping Immunity Enable Big Tech Bad Behavior?” with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.
  • On 29 October, the Federal Trade Commission (FTC) will hold a workshop titled “Green Lights & Red Flags: FTC Rules of the Road for Business” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Mehmet Turgut Kirkgoz from Pixabay

Brown’s Bill Is The Most Privacy Friendly So Far

The top Democrat on the Senate Banking Committee has released the most pro-privacy bill yet.

Even though Senate Banking, Housing and Urban Affairs Ranking Member Sherrod Brown (D-OH) introduced his draft privacy bill, the “Data Accountability and Transparency Act of 2020,” in June, too much has been happening to take a proper look at it until now. Having done so, I can say this is the most privacy and consumer rights friendly bill introduced in this Congress and quite possibly any recent Congress. I wonder whether Democrats could pass such a strong, restrictive bill even with supermajorities in both chambers and a Democratic President, for industry resistance would be very fierce.

In terms of what this bill would do, most notably, it would create a new agency, the Data Accountability and Transparency Agency (DATA), that would sit outside the appropriations process like the Consumer Financial Protection Bureau (CFPB), which limits Congress’ power over the agency. It would be headed by a Director appointed by the President and confirmed by the Senate to a five-year term, assisted by a Deputy Director. Again, this uses the CFPB as the template and not the Federal Trade Commission (FTC) or Federal Communications Commission (FCC), independent agencies with five Commissioners each. Also, like the CFPB, and unlike the FTC, the agency would be charged with policing unfair, deceptive, and abusive privacy practices in violation of this new law. It appears DATA (incidentally, a terrible acronym for an agency) would work alongside existing federal agencies, so the FTC could still police privacy and data security.

Moreover, the Brown bill uses the preemption model from the “Financial Services Modernization Act of 1999” (P.L. 106-102) (aka Gramm–Leach–Bliley), under which states would be allowed to regulate privacy above the federal standard so long as a state statute is not inconsistent. State statutes would be preempted only to the degree they are contrary to the new federal law.

And, of course, Brown’s bill allows people to sue for violations, and on the most generous terms I’ve seen among the privacy bills.

Not surprisingly, the definitions are drafted in ways that are uber pro-privacy. For example, ‘‘personal data’’ is defined as “electronic data that, alone or in combination with other data—

  • could be linked or reasonably linkable to an individual, household, or device; or
  • could be used to determine that an individual or household is part of a protected class.”

This is a very broad definition of the personal information a U.S. resident would have protected under the bill, for it covers more than data like names, addresses, and Social Security numbers; it reaches all data that could be linked to a person, household, or device. That is broader than most bills, which specify the sorts of covered data. For example, some bills treat specific geolocation data as deserving more protection than other data, but they often define it as data that pinpoints a person’s location to within 1,750 feet, meaning data that locates a person only within, say, 2,000 feet (still less than half a mile) would not be protected. Brown’s definition is simpler, broader, and quite possibly much easier to implement.
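
To make that line-drawing concrete, here is a minimal sketch of how a radius-style geolocation test operates (the code, names, and threshold framing are purely illustrative, not drawn from Brown’s bill or any other):

```python
# Illustrative only: a radius-style "specific geolocation" test of the kind
# some bills use; the function and constants here are hypothetical.
HALF_MILE_FT = 2_640   # for reference: 2,000 ft is less than half a mile
THRESHOLD_FT = 1_750   # the radius figure cited above

def is_specific_geolocation(accuracy_ft: float) -> bool:
    """True if the data can pinpoint a person within the protected radius."""
    return accuracy_ft <= THRESHOLD_FT

print(is_specific_geolocation(2_000))  # False: escapes heightened protection
print(is_specific_geolocation(1_500))  # True: treated as "specific" geolocation
```

Under Brown’s approach there is no such threshold to draw; any data linkable to a person, household, or device is covered.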

Likewise, what is considered a violation under the bill is also very broadly written. A ‘‘privacy harm’’ is “an adverse consequence, or a potential adverse consequence, to an individual, a group of individuals, or society caused, or potentially caused, in whole or in part, by the collection, use, or sharing of personal data, including:

(A) direct or indirect financial loss or economic harm, including financial loss or economic harm arising from fraudulent activities or data security breaches;

(B) physical harm, harassment, or a threat to an individual or property;

(C) psychological harm, including anxiety, embarrassment, fear, other trauma, stigmatization, reputational harm, or the revealing or exposing of an individual, or a characteristic of an individual, in an unexpected way;

(D) an adverse outcome or decision, including relating to the eligibility of an individual for the rights, benefits, or privileges in credit and insurance (including the denial of an application or obtaining less favorable terms), housing, education, professional certification, employment (including hiring, firing, promotion, demotion, and compensation), or the provision of health care and related services;

(E) discrimination or the otherwise unfair or unethical differential treatment with respect to an individual, including in a manner that is prohibited under section 104;

(F) the interference with, or the surveillance of, activities that are protected by the First Amendment to the Constitution of the United States;

(G) the chilling of free expression or action of an individual, or society generally, due to perceived or actual pervasive and excessive collection, use, or sharing of personal data;

(H) the impairment of the autonomy of an individual or society generally;

(I) any harm fairly traceable to an invasion of privacy tort; and

(J) any other adverse consequence, or potential adverse consequence, consistent with the provisions of this Act, as determined by the Director.

I’ve quoted the entire definition of “privacy harm” because I think it helps one understand the full range of harms the new privacy agency would be policing. First, it goes beyond actual financial or economic harms to “psychological harm,” which may present courts with problems as they try to determine which anguish meets this standard and which does not. Second, it covers activities protected under the First Amendment, the chilling of free expression, the impairment of a person’s autonomy, and “any harm fairly traceable to an invasion of privacy tort.” This may be the widest definition of harm in any of the privacy bills introduced in this or any other recent Congress. Finally, DATA could determine that any other consequence, real or potential, qualifies as a privacy harm.

A “protected class” is defined as “the actual or perceived race, color, ethnicity, national origin, religion, sex, gender, gender identity, sexual orientation, familial status, biometric information, lawful source of income, or disability of an individual or a group of individuals.”

The bill would outright ban data collection, use, or sharing unless for a permissible purpose, which include:

  • To provide a good, service, or specific feature requested by an individual in an intentional interaction.
  • To engage in journalism, provided that the data aggregator has reasonable safeguards and processes that prevent the collection, use, or sharing of personal data for commercial purposes other than journalism.
  • To employ an individual, including for administration of wages and benefits, except that a data aggregator may not invasively collect, use, or share the employee’s personal data in carrying out this paragraph.
  • Where mandated to comply with Federal, State, or local law.
  • Consistent with due process, direct compliance with a civil, criminal, or regulatory inquiry, investigation, subpoena, or summons.
  • To bring or defend legal claims, provided that the parties or potential parties take all necessary measures, including, as applicable, obtaining a protective order, to protect against unnecessary public disclosure of personal data.
  • To detect or respond to security incidents, protect against malicious, deceptive, fraudulent, or illegal activity, or prosecute those responsible for that activity.
  • Free expression by individuals on a social network or media platform.
  • In exigent circumstances, if first responders or medical personnel, in good faith, believe danger of death or serious physical injury to an individual, or danger of serious and unlawful injury to property, requires collection, use, or sharing of personal data relating to the exigent circumstances.
  • The development and delivery of advertisements—
    • based on the content of the website, online service, or application to which the individual or device is connected; and
    • excluding advertising based on the use of any personal data collected or stored from previous interactions with the individual or device, a profile of the individual or device, or the previous online or offline behavior of the individual or device.
  • To offer discounted or free goods or services to an individual if—
    • the offering is in connection with the voluntary participation by the individual in a program that rewards individuals for patronage; and
    • personal data is only collected to track purchases for loyalty rewards under the program.

Again, I’ve quoted at length to show how restrictive the bill is. This is the list of permissible purposes, and one will not find a list of exemptions that pare back the privacy rights ostensibly granted by the bill. For the private sector, the first purpose will be the most relevant, as businesses (aka data aggregators under the bill) would be allowed to provide services, products, or goods requested by a person who has intentionally interacted with them. Use of the word “intentional” would seem to rule out accidental or questionable interactions. There is also no language making product or service development an exception, as there is in many other bills.

Moreover, with respect to the online advertising industry, behavioral advertising would seem to not be a permissible purpose, at least the variety under which a company aggregates data from different sources to form a profile on a person. Moreover, “[c]ollecting, using, or sharing personal data to generate advertising revenue to support or carry out a permissible purpose is not a permissible purpose.”

The “Data Accountability and Transparency Act of 2020” would permit loyalty or reward programs and even allow a business to offer tiered pricing. And, entities could not charge higher or different prices if a person exercises her rights under the bill.

Brown’s bill would place very strict limits on what entities could do with personal data. To wit, it is provided that “[e]xcept where strictly necessary to carry out a permissible purpose, a data aggregator shall not—

  • share personal data with affiliated entities, service providers, or third parties;
  • use personal data for any purpose other than to carry out a permissible purpose;
  • retain personal data for any time longer than strictly necessary to carry out a permissible purpose; or
  • derive or infer data from any element or set of personal data.”

There is a list of prohibited practices, including, as mentioned, a bar on charging higher prices or providing lesser service or products if one chooses to exercise his rights under the bill. Also, businesses would be prohibited from re-identifying anonymized data or from commingling personal data from different sources. Violating these prohibitions could lead to treble damages.

It also seems like the bill bans most differential pricing:

It is unlawful for a data aggregator to collect, use, or share personal data for advertising, marketing, soliciting, offering, selling, leasing, licensing, renting, or otherwise commercially contracting for housing, employment, credit, or insurance in a manner that discriminates against or otherwise makes the opportunity unavailable or offered on different terms on the basis of a protected class or otherwise materially contributes to unlawful discrimination.

I suppose if there is differential pricing not based on a protected class, then it might be acceptable. However, I’m struggling to think of what that might look like.

This section also makes illegal the use of personal data for vote suppression. This language is an obvious non-starter with Republicans like Senate Majority Leader Mitch McConnell (R-KY) and would find few fans in the White House given recent and persistent allegations of vote suppression efforts by the Trump Campaign in 2016.

Brown’s use of the disparate impact standard in proving discrimination is anathema to most conservatives who have long made the case that disparate treatment should be the measuring stick for determining if discrimination has occurred.

Moreover, if a data aggregator uses automated decision-making systems, it must continually assess whether they are producing bias against, or a disparate impact on, a protected class.

People would be able to access and port their personal information, and this right is much broader than those provided in other bills. They would be able to access the specific pieces of information collected, used, or shared about them, the permissible purposes used to collect the data, and the service providers and third parties with whom the information was shared. On this last point, other privacy bills normally provide a person with access, upon request, to the categories of such entities and not the actual entities themselves.

Brown’s privacy bill provides a right of transparency that mandates each data aggregator’s online privacy policy make the following information available:

  • A description of the personal data that the data aggregator collects, uses, or shares.
  • The specific sources from which personal data is collected.
  • A description of the sources from which personal data is collected.
  • The permissible purposes for which personal data is collected, used, or shared.
  • The affiliates, service providers, or third parties with which the data aggregator shares personal data, and the permissible purpose for such sharing.
  • A description of the length of time for which personal data is retained.
  • If personal data is collected and retained as anonymized data, a description of the techniques and methods used to create the anonymized data.

Again, this right provides more specific information than comparable rights in other privacy bills.

Data aggregators would have the affirmative duty to ensure the information they collect is correct, and people would have the “right to require that a data aggregator that retains the individual’s personal data correct any inaccurate or incomplete personal data.” Moreover, data aggregators must correct any inaccurate or incomplete information as directed by a person. Other bills have language requiring businesses to make best or reasonable efforts, but nothing like a guarantee for people or a duty for businesses.

People would be able to ask data aggregators to delete personal information, and the aggregators must comply unless the data are needed to carry out a permissible purpose.

Brown’s bill has novel language stipulating “[a]n individual has the right to object to the claimed permissible purpose for any personal data that a data aggregator has collected, used, or shared of such individual.” Consequently, a data aggregator must “produce evidence supporting the data aggregator’s claim that the collection, use, or sharing of such individual’s personal data—

  • was strictly necessary to carry out a permissible purpose;
  • was not used or shared for any other purpose; and
  • has not been retained for any time longer than strictly necessary to carry out a permissible purpose.”

Presumably, failing to produce evidence at all or sufficient evidence constitutes a violation punishable by the new agency.

People would also be allowed to request that a human review material decisions made via automated processes.

Brown puts an interesting twist on the customary language in almost all privacy bills requiring security commensurate with the type of information being collected, used, and shared. The bill creates a duty of care, which, as I seem to recall against my will from law school, makes any violation of such a duty a tort, permitting people to sue under tort law. Nonetheless, the bill provides that

A data aggregator shall implement and maintain reasonable security procedures and practices, including administrative, physical, and technical safeguards, appropriate to the nature of the personal data and the purposes for which the personal data will be collected, used, or shared…

Moreover, this duty of a data aggregator extends to service providers and the former are made explicitly liable for the violations of the latter.

If a data aggregator receives a verified request to exercise these rights, it must comply at no cost. This would not apply to frivolous or irrelevant requests, however.

This new agency would be housed in the Federal Reserve System and would be able to keep and use the proceeds from its actions to fund operations. Just like the CFPB, this would ensure independence from Congress and the Executive Branch, and just like the CFPB, this is likely a non-starter with Republicans.

The new Data Accountability and Transparency Agency, as noted, would be empowered to “take any action authorized under this Act to prevent a data aggregator or service provider from committing or engaging in any unfair, deceptive, or abusive act or practice in connection with the collection, use, or sharing of personal data.” Moreover,

The Agency may prescribe rules applicable to a data aggregator identifying unlawful, unfair, deceptive, or abusive acts or practices in connection with the collection, use, or sharing of personal data, which may include requirements for the purpose of preventing such acts or practices. Rules under this section shall not limit, or be interpreted to limit, the scope of unlawful, deceptive, or abusive acts or practices in connection with the collection, use, or sharing of personal data.

The agency’s power to punish unfair acts is drafted similarly to the FTC’s, with the caveat that any such act must be one people cannot reasonably avoid and must not be outweighed by countervailing benefits to people or competition. It bears noting that the agency would be able to punish practices merely “likely” to cause privacy harms or “other substantial harm” to people, in addition to those that actually do.

An abusive practice is one that:

  • materially interferes with the ability of an individual to understand a term or condition of a good or service; or
  • takes unreasonable advantage of—
    • a lack of understanding on the part of the individual of the material risks, costs, or conditions of the product or service;
    • the inability of the individual to protect their interests in selecting or using a product or service; or
    • the reasonable reliance by the individual on a data aggregator or service provider to act in the interests of the individual.

Deceptive practices are not defined, so it is likely the new agency’s powers would be the same as the FTC’s with respect to this type of illegal conduct. Also, the new agency would be able to punish violations of any federal privacy law, which would bring all the disparate privacy regimes in the U.S. under the roof of one entity.

The new agency would receive the authority to punish bad actors in the same bifurcated fashion as the FTC and some other agencies: either through an administrative proceeding or by going to federal court. However, regarding the latter route, the agency would not need to ask the Department of Justice (DOJ) to file suit on its behalf. This detail is salient because independent litigating authority is increasingly the de facto Democratic position on this issue.

Whatever the case, the agency would be able to seek any appropriate legal or equitable relief, the latter term encompassing injunctions, disgorgement, restitution, and other such relief. And, of course, the new agency would be able to punish violations of this new law or any federal privacy law with civil fines laid out in tiers:

  • For any violation of a law, rule, or final order or condition imposed in writing by the Agency, a civil penalty may not exceed $5,000 for each day during which such violation or failure to pay continues.
  • [F]or any person that recklessly engages in a violation of this Act or any Federal privacy law, a civil penalty may not exceed $25,000 for each day during which such violation continues.
  • [F]or any person that knowingly violates this Act or any Federal privacy law, a civil penalty may not exceed $1,000,000 for each day during which such violation continues.”

It seems these tiers would result only in a per-day total and would not be multiplied by the number of affected people. If so, $5,000 a day is a sum most large companies would barely register, and even $25,000 a day is bearable for enormous companies like Facebook or Amazon.
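
To put those per-day caps in perspective, here is a rough back-of-the-envelope sketch of the maximum annual exposure under each tier (the code, labels, and the no-multiplication assumption are mine, not the bill’s):

```python
# Illustrative only: annualized exposure under each per-day penalty tier,
# assuming fines accrue per day of violation and are NOT multiplied by the
# number of affected people (my reading, not settled text).
TIERS = {
    "violation of law, rule, or final order": 5_000,
    "reckless violation": 25_000,
    "knowing violation": 1_000_000,
}

def max_penalty(per_day: int, days: int) -> int:
    """Maximum civil penalty for a violation continuing for `days` days."""
    return per_day * days

for tier, per_day in TIERS.items():
    print(f"{tier}: ${max_penalty(per_day, 365):,} per year")
# violation of law, rule, or final order: $1,825,000 per year
# reckless violation: $9,125,000 per year
# knowing violation: $365,000,000 per year
```

Only the knowing tier produces a figure that would make a Facebook-sized company blanch.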

Nonetheless, violations arising from re-identifying personal data are punished under the last tier (i.e. $1 million per day), and any of these violations might result in criminal prosecution, for the agency may refer such violations to DOJ. CEOs and Boards of Directors could be prosecuted for knowing and intentional violations, which is a fairly high bar, and face up to ten years in prison and a $10 million fine if convicted.

Brown’s bill provides people with a right to sue entities, including government agencies under some circumstances, that violate the act. People may also sue the new agency for failing to promulgate required regulations or for adopting rules that violate the act. Plaintiffs would be able to win between $100 and $1,000 per violation per day, punitive damages, attorney’s fees and litigation costs, and any other relief the court sees fit to grant. Many Republicans and industry stakeholders are, of course, opposed to a private right of action, but Brown’s goes beyond what the other bills offer because it would allow the award of punitive damages and fees related to bringing the litigation. They would likely argue, with justification, that there would be a wave of class action lawsuits. Another non-starter with Republicans is that the act circumvents a threshold consideration that weeds out lawsuits in federal court by stating that a mere violation of the act is an injury for purposes of the lawsuit. This language sidesteps the obstacle upon which many suits are dashed, for a person whose privacy has been violated often cannot show an injury in the form of monetary or economic losses.
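
To see why a wave of class actions is the fear, consider a rough, hypothetical calculation of statutory damages alone, assuming the $100 to $1,000 figure accrues per class member per day (my assumption; punitive damages and fees would come on top):

```python
# Hypothetical statutory-damages range for a class action under the bill,
# assuming the $100-$1,000 figure accrues per plaintiff, per day of violation.
def damages_range(class_size: int, days: int) -> tuple[int, int]:
    """(low, high) statutory damages, before punitive damages and fees."""
    return 100 * class_size * days, 1_000 * class_size * days

# A breach affecting 1 million people that persisted for 30 days:
low, high = damages_range(class_size=1_000_000, days=30)
print(f"${low:,} to ${high:,}")  # $3,000,000,000 to $30,000,000,000
```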

Like other bills, it provides that any “pre-dispute arbitration agreement” or “pre-dispute joint action waiver” signed by a person shall not be valid or enforceable in court, meaning companies cannot limit legal liability by requiring people to waive their rights in the terms of service, as is now customary.

As noted previously, the bill would not preempt all state privacy laws. Rather, only those portions of state laws that conflict with the act would be preempted, and states would be free to legislate requirements more stringent than the new federal privacy regulatory structure. Moreover, the bill makes clear that common law actions would still be available.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Jan Alexander from Pixabay