
This week, the Federal Trade Commission (FTC) held its sixth PrivacyCon, at which a commissioner and a key advisor to the FTC’s chair proposed a new approach to how the agency polices privacy violations. This new course would reorient the FTC from merely pursuing entities that violate people’s privacy and data security to scrutinizing the broader personal data ecosystem. In doing so, the FTC would seek stiffer penalties in settlements and court cases, including deletion of all personal data and algorithms related to the violations and disgorgement of the profits associated with them.
Commissioner Rebecca Kelly Slaughter sketched a new, ambitious course for how the agency polices privacy under its existing powers. It must be stressed that what she proposes does not require Congress to add to the agency’s powers, but it would necessitate a conceptual shift in how the FTC uses its powers, which would undoubtedly be challenged in court if the agency started punishing entities for “data abuse.” Having said that, under Section 5 of the FTC Act, the agency has immense latitude:
Unfair methods of competition in or affecting commerce, and unfair or deceptive acts or practices in or affecting commerce, are hereby declared unlawful.
Kelly Slaughter began:
First, I know we are here today for “Privacy”Con, but I would like to challenge everyone to reject “privacy” as the animating framework for the important issues being discussed at today’s conference and among thought leaders generally with respect to our data-driven economy. Today’s agenda addresses algorithmic bias, issues around consent, misinformation during the pandemic, and special concerns related to kids and teens, as well as more conventional privacy concepts. The FTC has done work on all these fronts and has recently issued guidance on algorithmic bias and unfairness, and worked to reevaluate outdated and deceptive consent frameworks around dark patterns. These issues go way beyond “privacy” as it is traditionally conceived.
Kelly Slaughter explained why the FTC, policymakers, stakeholders, and the public should reorient their thinking and move beyond what has traditionally been considered privacy:
- The broad agenda of this PrivacyCon reflects a growing understanding that the data issues with which both the Commission and society at large are concerned have moved past the narrow framework of who has access to your personal data. This emerging understanding is why I prefer the term “data abuses” to the narrower language of “privacy.”
- Words matter, and “data abuses” reflects the fact that rampant corporate data collection, sharing, and exploitation harms consumers, workers, and competition in ways that go well beyond more traditional or libertarian privacy concerns. We must examine a wide variety of data abuses, including questions of racial bias, civil rights, and economic exclusion, considering practices that undermine personal autonomy and dignity, and reevaluating damaging and dangerous business models and market practices. In addition to examining these practices, we need to consider what to do about the problems we find in the market.
It appears that Kelly Slaughter is urging stakeholders to look beyond the immediate concerns about privacy (e.g., Apple, Google, and many, many others track my every move through my phone) to the downstream effects of data collection and processing. She is arguing that while the former category of harm is significant, the latter is much more malicious, for “data abuse” impinges on one’s civil rights, employment, housing, and financial opportunities, and, in many cases, one’s very self. Kelly Slaughter is implicitly arguing that data abuse threatens democracy and fundamental rights. Moreover, she presumably is suggesting that “data abuses” can be policed under Section 5 of the FTC Act, the proscription against unfair and deceptive practices.
Next, Kelly Slaughter proposed that the debate about privacy policy dispense with the notice-and-consent model:
And the second challenge I would like to issue today is the following: Can we move away from the outdated notice-and-consent model to govern questions surrounding personal data, and instead turn our focus to the underlying business structures and incentives that are anchored in indiscriminate collection and application of personal data to fuel data-driven business models such as behavioral advertising? It is this underlying incentive structure that has caused so many of the harms and privacy risks we’re here to discuss today.
Kelly Slaughter explained why she thinks shifting the focus of privacy policy from notice and consent to data minimization would be more effective:
Rather than focusing on opt-in versus opt-out, and whether privacy policies are clear enough, I believe we should be discussing the concept of data minimization, a principle that would ensure companies can collect only the information necessary to provide consumers with the service on offer, and use the data they collect only to provide that service. That minimization could be coupled with further use, purpose, sharing, and security requirements to ensure that the information companies can permissibly collect isn’t used to build tools or services that imperil people’s civil rights, economic opportunities, or personal autonomy. Corporate self-dealing is also a serious problem in the data ecosystem and as long as key digital markets are controlled by just a few large, data-hungry online platforms, both consumers and prospective entrants are at their mercy.
Kelly Slaughter may be viewing the collection and processing of personal data as one of those problems that individuals cannot effectively protect themselves from or change on their own (e.g., global warming, pollution, and other collective action problems). She is likely alluding to the number of choices everyone must make daily about which services, products, and websites to use. Moreover, one must accept or decline cookie settings on many websites and, depending on the jurisdiction, may be able to exercise certain rights, meaning still more choices and more action. And all these choices are made in realms where dark patterns and the flouting of laws and codes reign. Therefore, if privacy and personal data are indeed issues that one’s individual actions cannot meaningfully affect, then it makes sense to have governments step in and impose limits on the entities amassing everyone’s data.
And, according to Kelly Slaughter, she is not the only member of the FTC concerned about “data abuse.” She argued that there is “renewed energy” at the agency that has created space for “meaningful changes:”
The Commission has a shared concern about many of these practices and I’ve heard the call from members of the public at our two open meetings for us to take decisive action against these abuses. This moment of renewed energy at the FTC offers a window of time to catalyze meaningful changes in the markets and ensure that the data economy actually works for people, not just the largest corporate players.
Kelly Slaughter was careful to highlight that data collection and processing is also a competition issue as the Facebooks and Googles are sitting on troves of data that give them huge advantages:
of course, unchecked data collection is not just a consumer protection issue. It is also a competition issue, as the enormous amounts of data incumbents have collected gives them a profound advantage when competing against new entrants or seeking to enter new product markets themselves. We absolutely must look at these issues holistically, rather than myopically viewing them through the lens of either competition or consumer protection.
She may be suggesting that the FTC look into the data abuse practices from an antitrust angle, which would be a new approach regarding some of the largest technology companies.
Kelly Slaughter urged her colleagues at the FTC to move beyond the case-by-case common law approach that has characterized the agency’s enforcement of its statutory mandate. She may be proposing policy statements or rulemakings (the latter of which is very hard for the FTC):
I believe that the FTC has an obligation to use all the tools in its toolbox to address these issues. Simply challenging the application of abusive data practices on a case-by-case basis isn’t likely to bring about the systemic change we need to see in the market.
Recently appointed FTC Chief Technologist Erie Meyer echoed Kelly Slaughter. Meyer was named to her position by Chair Lina Khan, and it is unlikely Meyer would publicly endorse a position in scripted remarks at odds with Khan’s. Consequently, a fair reading of Meyer’s remarks is that Khan is in agreement. She said:
Before we kick off today’s event, I want to share a few places where the market should expect changes and how the FTC will approach its work when it comes to protecting the public from the misuse and abuse of data. The approach is not through a narrow lens of consumer protection. Data abuses don’t happen in a vacuum. They’re fed by incentives. Among them beating out competitors. So with that broader view, you can expect key changes in our work. We’re going to make sure that data abusers face consequences for the wrongdoing and provide real help for affected individuals.
Obviously, Meyer is echoing Kelly Slaughter’s argument that the FTC should dispense with a privacy focus regarding personal data collection and processing and move instead to a more holistic model that encompasses so-called “data abuses.” What is more, Meyer is calling for new types of punishments to combat data abuses. One recent example that springs to mind is a settlement the FTC reached with Everalbum, Inc.:
As part of the proposed settlement, Everalbum, Inc. must obtain consumers’ express consent before using facial recognition technology on their photos and videos. The proposed order also requires the company to delete models and algorithms it developed by using the photos and videos uploaded by its users.
This sort of approach has been percolating in the minds of policymakers at the FTC. In an October 2020 speech, Kelly Slaughter advocated for new approaches to personal data cases, including disgorgement:
In some instances, our data privacy orders lack remedies that would directly help consumer victims. If monetary relief is not possible, consumers should still receive direct notice of the law violation, its possible impact, and any mitigation options available. If refunds and notice are both impossible, the Commission should employ creative approaches to mitigate consumer harm through admissions of liability, requiring opt-in regimes for existing customers, funding of education campaigns, disgorgement of data, or other creative solutions that might vary case by case.
Consequently, it is safe to assume the three Democratic FTC Commissioners will seek settlements for data abuses that require disgorgement of data and of algorithms developed through the collection, processing, and use of such data. Also, the FTC could push entities to switch to an opt-in regime. In this vein, Meyer signaled the agency may be shifting away from pursuing only large fines and one-time admissions or disclosures of information as a means of changing practices prospectively:
When a firm breaks the law or worse, breaks the law over and over and over, regulators like the Federal Trade Commission need to design and impose remedies that fix things. Fixing things doesn’t mean making a disclosure longer or a one-time fine bigger. It means making sure that the firm cannot and will not benefit from ill-gotten data including against their competitors. It means making sure that the rest of the industry is deterred from engaging in similar wrongdoing. [It] [m]ight mean that we need to look at restructuring business incentives or corporate structure.
Meyer advocated for victims of data abuse getting “actual help,” in part, through orders and settlements making companies delete data and algorithms and disgorge money obtained through data abuses. Of course, on this last point, disgorgement, a recent Supreme Court of the United States decision struck down a commonly used method the agency has wielded against companies to recover ill-gotten funds (i.e., Section 13(b)). Absent a statutory fix (the House just passed the “Consumer Protection and Recovery Act” (H.R. 2668)), the FTC would need to use other powers to seek and obtain equitable monetary relief. Meyer floated the possibility the FTC would seek to treat data abusers the same way it currently treats abusive debt collectors: a ban from the activity altogether. It is not clear the agency could do the same to data abusers.
Meyer then expanded on the importance of fighting data abuses:
Data abuse is not just an issue of privacy. It’s a matter of civil rights and national security. People from communities whose rights and safety are constantly threatened can tell you, this isn’t just about someone knowing what you’ve looked up online. The U.S. Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act. The Department of Justice charged a Zoom executive, alleging that his actions led to people using Zoom’s data to track down and intimidate family members of people who used the platform to discuss the Tiananmen Square massacre. There’s been a 2,920% increase in reports of identity theft via government benefits this year. So what this means, for example, is that a bad actor applies for something like unemployment benefits using personal information gleaned from a data breach at one of these firms.
Meyer further argued that data abuses are systemic and not individual scandals and have real world effects on people. She continued:
A pandemic has sharpened the view of what happened to our country’s resilience because of these data disasters. We’re moving away from a legalistic approach. This means we’ll be approaching investigations with a [multi]disciplinary lens including privacy engineers and designers, financial analysts and product managers and yes, technologists. This won’t happen overnight.
In short, Meyer is spelling out an approach embraced by Commissioner Rohit Chopra who has consistently brought the view of an economist as opposed to a lawyer to the Commission’s work.
© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.
Photo by Markus Spiske on Unsplash
Photo by Marius Masalar on Unsplash