Other Developments
- The Supreme Court of the United States (SCOTUS) reversed a decision by the United States Court of Appeals for the Ninth Circuit holding that Facebook’s login notification text system, particularly its database of phone numbers, violated a federal law barring abusive telemarketing practices. The case turned on whether Facebook’s system is an “automatic telephone dialing system” as defined by the statute, and SCOTUS found that it is not, for in order to qualify, a system must be able to store or produce phone numbers using a random or sequential number generator (a brief illustrative sketch of that distinction follows the excerpt below). In relevant part, SCOTUS found:
- The Telephone Consumer Protection Act of 1991 (TCPA) proscribes abusive telemarketing practices by, among other things, imposing restrictions on making calls with an “automatic telephone dialing system.” As defined by the TCPA, an “automatic telephone dialing system” is a piece of equipment with the capacity both “to store or produce telephone numbers to be called, using a random or sequential number generator,” and to dial those numbers. 47 U. S. C. §227(a)(1). The question before the Court is whether that definition encompasses equipment that can “store” and dial telephone numbers, even if the device does not “us[e] a random or sequential number generator.” It does not. To qualify as an “automatic telephone dialing system,” a device must have the capacity either to store a telephone number using a random or sequential generator or to produce a telephone number using a random or sequential number generator.
- Petitioner Facebook, Inc., maintains a social media platform with an optional security feature that sends users “login notification” text messages when an attempt is made to access their Facebook account from an unknown device or browser. If necessary, the user can then log into Facebook and take action to secure the account. To opt into this service, the user must provide and verify a cell phone number to which Facebook can send messages.
- In 2014, respondent Noah Duguid received several login-notification text messages from Facebook, alerting him that someone had attempted to access the Facebook account associated with his phone number from an unknown browser. But Duguid has never had a Facebook account and never gave Facebook his phone number. Unable to stop the notifications, Duguid brought a putative class action against Facebook. He alleged that Facebook violated the TCPA by maintaining a database that stored phone numbers and programming its equipment to send automated text messages to those numbers each time the associated account was accessed by an unrecognized device or web browser.
- Facebook moved to dismiss the suit, arguing primarily that Duguid failed to allege that Facebook used an autodialer because he did not claim Facebook sent text messages to numbers that were randomly or sequentially generated. Rather, Facebook argued, Duguid alleged that Facebook sent targeted, individualized texts to numbers linked to specific accounts. The U. S. District Court for the Northern District of California agreed and dismissed Duguid’s amended complaint with prejudice. 2017 WL 635117, *4–*5 (Feb. 16, 2017).
- The United States Court of Appeals for the Ninth Circuit reversed. As relevant here, the Ninth Circuit held that Duguid had stated a claim under the TCPA by alleging that Facebook’s notification system automatically dialed stored numbers. An autodialer, the Court of Appeals held, need not be able to use a random or sequential generator to store numbers; it need only have the capacity to “‘store numbers to be called’” and “‘to dial such numbers automatically.’” 926 F. 3d 1146, 1151 (2019) (quoting Marks v. Crunch San Diego, LLC, 904 F. 3d 1041, 1053 (CA9 2018)).
- We granted certiorari to resolve a conflict among the Courts of Appeals regarding whether an autodialer must have the capacity to generate random or sequential phone numbers. 591 U. S. ___ (2020). We now reverse the Ninth Circuit’s judgment.
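- For readers less familiar with the technology at issue, below is a minimal, purely illustrative Python sketch of the distinction the Court drew: equipment that stores or produces numbers using a random or sequential number generator versus equipment that merely stores user-provided numbers tied to specific accounts and texts them automatically. The function names and number formats are hypothetical and are not drawn from the opinion or from Facebook’s actual system.

```python
import random

# Purely illustrative; not a description of Facebook's actual system.

def random_generator_dialer(count: int) -> list[str]:
    """Produce phone numbers with a random number generator -- the kind of
    equipment the TCPA's autodialer definition reaches."""
    return [f"+1{random.randint(2000000000, 9999999999)}" for _ in range(count)]

def sequential_generator_dialer(start: int, count: int) -> list[str]:
    """Produce phone numbers sequentially -- also covered by the definition."""
    return [f"+1{start + i}" for i in range(count)]

def login_notification_lookup(accounts: dict[str, str], account_id: str) -> str:
    """Look up the single, user-provided number tied to a specific account --
    the targeted, individualized texting of stored numbers that the Court held
    falls outside the definition."""
    return accounts[account_id]
```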
- The Supreme Court of the United States (SCOTUS) also upheld the Federal Communications Commission’s (FCC) 2017 revision of media ownership rules, which had been challenged on the grounds that the agency did not properly account for the likely effect on minority and female ownership of media. SCOTUS disagreed with the challengers and found the FCC acted lawfully in promulgating those rules. SCOTUS explained:
- Under the Communications Act of 1934, the Federal Communications Commission possesses broad authority to regulate broadcast media in the public interest. Exercising that statutory authority, the FCC has long maintained strict ownership rules. The rules limit the number of radio stations, television stations, and newspapers that a single entity may own in a given market. Under Section 202(h) of the Telecommunications Act of 1996, the FCC must review the ownership rules every four years, and must repeal or modify any ownership rules that the agency determines are no longer in the public interest.
- In a 2017 order, the FCC concluded that three of its ownership rules no longer served the public interest. The FCC therefore repealed two of those rules—the Newspaper/Broadcast Cross-Ownership Rule and the Radio/Television Cross-Ownership Rule. And the Commission modified the third—the Local Television Ownership Rule. In conducting its public interest analysis under Section 202(h), the FCC considered the effects of the rules on competition, localism, viewpoint diversity, and minority and female ownership of broadcast media outlets. The FCC concluded that the three rules were no longer necessary to promote competition, localism, and viewpoint diversity, and that changing the rules was not likely to harm minority and female ownership.
- A non-profit advocacy group known as Prometheus Radio Project, along with several other public interest and consumer advocacy groups, petitioned for review, arguing that the FCC’s decision was arbitrary and capricious under the Administrative Procedure Act. In particular, Prometheus contended that the record evidence did not support the FCC’s predictive judgment regarding minority and female ownership. Over Judge Scirica’s dissent, the U. S. Court of Appeals for the Third Circuit agreed with Prometheus and vacated the FCC’s 2017 order.
- On this record, we conclude that the FCC’s 2017 order was reasonable and reasonably explained for purposes of the APA’s deferential arbitrary-and-capricious standard. We therefore reverse the judgment of the Third Circuit.
- In a blog post, the United Kingdom’s Information Commissioner’s Office (ICO) detailed its ambitions to update its guidance on anonymisation and pseudonymisation, and “to explore the role that privacy enhancing technologies might play in enabling safe and lawful data sharing.” The agency will solicit input before drafting this guidance, however. The ICO explained (a short illustrative sketch of pseudonymisation follows the quoted list below):
- The recent ICO Data Sharing Code of Practice provides organisations with a practical guide on how to share personal data in line with data protection law. However, we recognise there are other dimensions to data sharing. The code is not a conclusion, but a milestone in this ongoing work. We will continue to provide clarity and advice in how data can be shared in line with the law.
- Building on this promise, we are now outlining our plans to update our guidance on anonymisation and pseudonymisation, and to explore the role that privacy enhancing technologies might play in enabling safe and lawful data sharing. We recognise that questions about when data is personal data or anonymous information are some of the most challenging issues organisations face.
- Our refreshed guidance will assist organisations in meeting these challenges. We will set out our views on approaches like the spectrum of identifiability, and how these can be practically applied. We will provide advice on how to assess the appropriate controls that need to be in place and we will be grounding our guidance in practical steps organisations can take.
- The key topics we will be exploring include:
- Anonymisation and the legal framework – legal, policy and governance issues around the application of anonymisation in the context of data protection law;
- Identifiability – outlining approaches such as the spectrum of identifiability and their application in data sharing scenarios, including guidance on managing re-identification risk, covering concepts such as the ‘reasonably likely’ and ‘motivated intruder’ tests;
- Guidance on pseudonymisation techniques and best practices;
- Accountability and governance requirements in the context of anonymisation and pseudonymisation, including data protection by design and DPIAs;
- Anonymisation and research – how anonymisation and pseudonymisation apply in the context of research;
- Guidance on privacy enhancing technologies (PETs) and their role in safe data sharing;
- Technological solutions – exploring possible options and best practices for implementation; and
- Data sharing options and case studies – supporting organisations to choose the right data sharing measures in a number of contexts including sharing between different organisations and open data release. Developed with key stakeholders, our case studies will demonstrate best practice.
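- As a point of reference for the pseudonymisation topic in the ICO’s list above, here is a short Python sketch of one common pseudonymisation technique (keyed hashing of a direct identifier). It is offered for illustration only and is not drawn from the ICO’s forthcoming guidance; the function name, key handling, and sample identifier are hypothetical.

```python
import hmac
import hashlib

# Illustrative only: replace a direct identifier with a keyed hash (HMAC-SHA256).
# Whoever holds the secret key can re-link pseudonyms to the original
# identifiers, which is why pseudonymised data generally remains personal data.

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Return a stable pseudonym for the identifier under the given key."""
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The key should be stored separately from the pseudonymised dataset.
key = b"example-key-held-apart-from-the-data"
print(pseudonymise("jane.doe@example.com", key))
```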
- In a statement, acting Federal Trade Commission Chair Rebecca Kelly Slaughter announced the agency would not appeal the United States Court of Appeals for the Ninth Circuit decision in favor of Qualcomm in the FTC’s antitrust action. Last summer, the Ninth Circuit reversed a lower court’s ruling that the FTC had proven that Qualcomm’s chip licensing practices violated United States antitrust law (see here for more detail and analysis). According to one account, the FTC faced unfavorable timing because its deadline to seek review arrived while the agency lacks a full complement of commissioners and the Department of Justice’s (DOJ) Office of the Solicitor General is not fully staffed. Moreover, it was reported that the DOJ did not want to appeal the case because the FTC did not have a strong position. This announcement comes a few weeks after Slaughter went before a House committee and argued the FTC should bring hard cases in an attempt to more vigorously enforce antitrust law. Slaughter argued:
- Given the significant headwinds facing the Commission in this matter, the FTC will not petition the Supreme Court to review the decision of the Court of Appeals for the Ninth Circuit in FTC v. Qualcomm. The FTC’s staff did an exceptional job presenting the case, and I continue to believe that the district court’s conclusion that Qualcomm violated the antitrust laws was entirely correct and that the court of appeals erred in concluding otherwise. Now more than ever, the FTC and other law enforcement agencies need to boldly enforce the antitrust laws to guard against abusive behavior by dominant firms, including in high-technology markets and those that involve intellectual property. I am particularly concerned about the potential for anticompetitive or unfair behavior in the context of standard setting and the FTC will closely monitor conduct in this arena.
- The European Data Protection Board (EDPB) adopted a statement on the draft ePrivacy Regulation in which it welcomed “the agreed negotiation mandate adopted by the Council on the protection of privacy and confidentiality in the use of electronic communication services (‘the Council’s position’), as a positive step towards a new ePrivacy Regulation.” However, the EDPB offered observations and critiques aimed at ensuring the European Union’s (EU) rewrite of its rules on privacy in electronic communications heeds existing EU law, especially the General Data Protection Regulation (GDPR). The EDPB pointed to recent Court of Justice of the European Union (CJEU) case law barring indiscriminate collection and retention of location data and metadata in criminal cases save for the most dangerous threats. The Board stressed that the ePrivacy Regulation must ensure the confidentiality of communications, urged that a number of exceptions to the bar on data processing be tightened, emphasized the need for strong encryption, called for better language on consent in line with the GDPR, and advocated for a better system of cooperation among national authorities in enforcing the new regime.
- The Government Accountability Office (GAO) issued its most recent assessment of the Office of Management and Budget’s (OMB) Data Center Optimization Initiative (DCOI) at the request of the Armed Services Committees, the Senate Homeland Security and Governmental Affairs Committee, and the House Oversight and Reform Committee. The GAO found that agencies reported they would achieve $1.1 billion in savings by shutting down or consolidating data centers. However, the GAO found that Trump Administration guidance allowed agencies to measure success on a key metric according to their own standards, which has undermined the metric’s usefulness. The GAO noted that 53 of its 125 recommendations made on the DCOI since 2016 remain open. The GAO called on OMB to “reexamine its DCOI guidance regarding how to measure server utilization and revise it to better and more consistently address server efficiency.” The GAO concluded:
- Agencies continue to report progress toward meeting their goals for data center closures and achieving the related savings. Specifically, almost all of the 24 DCOI agencies met their goals for data center closures in fiscal year 2019 and also planned to meet their closure goals for 2020. Additionally, in fiscal year 2019, almost all of the agencies met their savings goals and all planned to meet their 2020 cost savings goals for a total of $1.1 billion in savings over the 2 years. While agencies’ efforts in both respects have made an important contribution to achieving the overall goals of DCOI, taking action to address our prior recommendations could help those agencies that did not meet their goals to achieve even more benefits from DCOI.
- Agencies reported mixed progress against OMB’s optimization metrics for both fiscal years 2019 and 2020. While most agencies have not met all of their optimization targets, taking action to address our prior recommendations could help those agencies to realize fully the expected benefits of DCOI.
- While OMB developed an effective server utilization metric in 2016, the agency’s 2019 DCOI guidance revisions resulted in a metric that no longer reported on actual server utilization, resulting in an incomplete picture of utilization. Without better guidance on how to report on server utilization, the server-related optimization metrics will lack meaningful information about agencies’ DCOI performance. Absent complete information, OMB and Congress may be hindered in providing oversight and making appropriate decisions about budgeting for data center utilization.
- The Federal Bureau of Investigation (FBI) issued its “2020 Internet Crime Report,” in which the agency reported a record number of complaints submitted to its Internet Crime Complaint Center (IC3). The FBI explained:
- IC3 received a record number of complaints from the American public in 2020: 791,790, with reported losses exceeding $4.1 billion. This represents a 69% increase in total complaints from 2019. Business E-mail Compromise (BEC) schemes continued to be the costliest: 19,369 complaints with an adjusted loss of approximately $1.8 billion. Phishing scams were also prominent: 241,342 complaints, with adjusted losses of over $54 million. The number of ransomware incidents also continues to rise, with 2,474 incidents reported in 2020.
- Senators Josh Hawley (R-MO), Mike Lee (R-UT), and Marsha Blackburn (R-TN) wrote the acting chair of the Federal Trade Commission (FTC) and the chair of the Senate Judiciary Committee regarding Politico’s blockbuster story that the Obama Administration’s FTC overruled staff recommendations that Google be sued for antitrust violations. The FTC at that time, led by Chair Jon Leibowitz, declined to move forward with essentially the same case the United States (U.S.) Department of Justice (DOJ) is now bringing against Google. The agency’s lawyers said sue, while the agency’s economists said do not. In their letter to acting FTC Chair Rebecca Kelly Slaughter, Hawley, Lee, and Blackburn asked that she “dispatch relevant officials to Congress as soon as possible,” and they asked Senate Judiciary Committee Chair Dick Durbin (D-IL) “to call Google executives and relevant FTC officials to publicly testify, including the former Commissioners who now represent big tech firms.” The Republican Senators are not calling for any legislation to address the monopolies they decry in technology markets, however.
- A coalition of groups has banded together to seek a ban on what they call “surveillance advertising” through a variety of policy changes, ranging from “comprehensive privacy legislation to reforming our antitrust laws and liability standards.” They asserted:
- Surveillance advertising – the core profit-driver for gatekeepers like Facebook and Google, as well as adtech middlemen – is the practice of extensively tracking and profiling individuals and groups, and then microtargeting ads at them based on their behavioral history, relationships, and identity.
- These dominant firms curate the content each person sees on their platforms using those dossiers – not just the ads, but newsfeeds, recommendations, trends, and so forth – to keep each user hooked, so they can be served more ads and mined for more data.
- Big Tech platforms amplify hate, illegal activities, and conspiracism – and feed users increasingly extreme content – because that’s what generates the most engagement and profit. Their own algorithmic tools have boosted everything from white supremacist groups and Holocaust denialism to COVID-19 hoaxes, counterfeit opioids and fake cancer cures. Echo chambers, radicalization, and viral lies are features of these platforms, not bugs—central to the business model.
- And surveillance advertising is further damaging the information ecosystem by starving the traditional news industry, especially local journalism. Facebook and Google’s monopoly power and data harvesting practices have given them an unfair advantage, allowing them to dominate the digital advertising market, siphoning up revenue that once kept local newspapers afloat. So while Big Tech CEOs get richer, journalists get laid off.
Further Reading
- “China has brought its repressive surveillance tools to Hong Kong” By Dan McDevitt — Nikkei Asia. In this piece, a “grants and communications manager at GreatFire.org, a group focused on monitoring and challenging Chinese internet censorship,” argues that the People’s Republic of China (PRC) is implementing new measures to restrict the internet access and technological freedom of those living in Hong Kong. The author calls on tech multinationals not to knuckle under to Beijing’s demands as they have in the past, for without cooperation from these companies, the government will have a more difficult time cracking down on free speech and dissidents. However, tech companies with a presence in the PRC may not want to jeopardize their access to that market.
- “Ransomware Gang Fully Doxes Bank Employees in Extortion Attempt” by Lorenzo Franceschi-Bicchierai — Vice’s Motherboard. Is doxing the next form ransomware extortion will take? Or perhaps strategic leaks of embarrassing or compromising information?
- “Massive Facebook study on users’ doubt in vaccines finds a small group appears to play a big role in pushing the skepticism” By Elizabeth Dwoskin — The Washington Post. It seems that just as with COVID and other viruses, the spread of anti-vaccine misinformation can be traced to a small number of super-spreaders. Facebook is currently trying to work through the posts on its platform that may contribute to hesitancy about the vaccine, which can range from users relating symptoms after getting dosed to outright lies about the vaccines. In a perfect world, Facebook could land on a way to rejigger its algorithms to stop amplification of harmful content and promote content that would encourage people to get vaccinated.
- “China Punishes Microsoft’s LinkedIn Over Lax Censorship” By Paul Mozur, Raymond Zhong and Steve Lohr — The New York Times. The Cyberspace Administration of China, Beijing’s online censor, informed LinkedIn’s subsidiary in the People’s Republic of China (PRC) it ran afoul of the country’s laws banning certain online content. According to this article, the agency did not bother to tell LinkedIn which posts were offensive. The Microsoft-owned company is a small player in the PRC, paling in comparison to WeChat, but it remains the only big American platform operating there.
- “China emerges as quantum tech leader while Biden vows to catch up” By Akira Oikawa, Yuki Okoshi and Yuki Misumi — Nikkei Asia. By one measure, the United States and Japan lead the People’s Republic of China (PRC) in patents for quantum computing, specifically in quantum computing hardware. But in another regard, and overall, the PRC is ahead, especially in what experts foresee as the crucial field of quantum encryption and communications (i.e., the ability to both secure and read communications in a world with vastly more powerful computing). The Biden Administration and Tokyo are keen to work together to catch up and negate the PRC’s lead.
Coming Events
- The Federal Communications Commission (FCC) will hold an open meeting on 22 April. No agenda has been announced as of yet.
- The Federal Trade Commission (FTC) will hold a workshop titled “Bringing Dark Patterns to Light” on 29 April.
- On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.
© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.
Photo by Ian Hutchinson on Unsplash