Further Reading, Other Developments, and Coming Events (10 December)

Further Reading

  • “Social media superspreaders: Why Instagram, not Facebook, will be the real battleground for COVID-19 vaccine misinformation” By Isobel Asher Hamilton — Business Insider. According to one group, COVID-19 anti-vaccination lies and misinformation are proliferating on Instagram despite efforts by its parent company, Facebook, to find and remove such content. Such content has grown dramatically on Instagram, where Facebook seems to apply its COVID-19 standards more loosely. In fact, some people kicked off of Facebook for violating that platform’s standards on COVID-19 are still on Instagram spreading the same lies, misinformation, and disinformation. For example, British anti-vaccination figure David Icke was removed from Facebook for claiming that COVID-19 was caused by or related to 5G, but he retains a significant following on Instagram.
  • “‘Grey area’: China’s trolling drives home reality of social media war” By Chris Zappone — The Sydney Morning Herald. The same concept fueling aggressive cyber activity at a level below outright war has spread to diplomacy. The People’s Republic of China (PRC) has been waging “gray” social media campaigns against a number of Western nations, including Australia, mainly by propagating lies and misinformation. The most recent example is the spread of a fake image of an Australian soldier appearing to kill an Afghan child. This false material seems designed to distract from the real issues between the two nations arising from clashing policies on trade and human rights. The PRC’s activities do not appear to violate Australia’s foreign interference laws and seem to have left Canberra at a loss as to how to respond effectively.
  • “Facebook to start policing anti-Black hate speech more aggressively than anti-White comments, documents show” By Elizabeth Dwoskin, Nitasha Tiku and Heather Kelly — The Washington Post. Facebook will apparently seek to revamp its algorithms to target the types of hate speech that have traditionally targeted women and minority groups. Up until now all attacks were treated equally so that something like “white people suck” would be treated the same way as anti-Semitic content. Facebook has resisted changes for years even though experts and civil rights groups made the case that people of color, women, and LGBTI people endure far more abuse online. There is probably no connection between Facebook’s more aggressive content moderation policies and the advent of a new administration in Washington more receptive to claims that social media platforms allow the abuse of these people.
  • “How Joe Biden’s Digital Team Tamed the MAGA Internet” By Kevin Roose — The New York Times. Take this piece with a block of salt. The “why they won” articles are almost always rife with fallacies, including the rationale that if a candidate won, his or her strategy must have worked. It is not clear that the Biden Campaign’s online messaging strategy of being nice and emphasizing positive values actually beat the Trump Campaign’s “Death Star” so much as the President’s mishandling of the pandemic response and cratering of the economy did him in.
  • “Coronavirus Apps Show Promise but Prove a Tough Sell” By Jennifer Valentino-DeVries — The New York Times. It appears the intersection of concerns about private and public sector surveillance from two very different groups has worked to keep down rates of adopting smartphone COVID tracking apps in the United States. There are people wary of private sector practices to hoover up as much data as possible, and others concerned about the government’s surveillance activities. Consequently, many are shunning Google and Apple’s COVID contact tracing apps to the surprise of government, industry, and academia. A pair of studies show resistance to downloading or using such apps even if there are very strong privacy safeguards. This result may well be a foreseeable outcome from U.S. policies that have allowed companies and the security services to collect and use vast quantities of personal information.
  • “UAE target of cyber attacks after Israel deal, official says” — Reuters. A top cybersecurity official in the United Arab Emirates claimed his nation’s financial services industries were targeted for cyber attack and implied Iran and affiliated hackers were responsible.

Other Developments

  • President-elect Joe Biden announced his intention to nominate California Attorney General Xavier Becerra to serve as the next Secretary of Health and Human Services (HHS). If Becerra is confirmed by the Senate, California Governor Gavin Newsom would name his successor, who would need to continue enforcement of the “California Consumer Privacy Act” (CCPA) (AB 375) while also working towards the transition to the “California Privacy Rights Act” (Proposition 24), approved by California voters last month. The new statute establishes the California Privacy Protection Agency that will assume the Attorney General’s responsibilities regarding the enforcement of California’s privacy laws. However, Becerra’s successor may play a pivotal role in the transition between the two regulators and the creation of the new regulations needed to implement Proposition 24.
  • The Senate approved the nomination of Nathan Simington to be a Commissioner of the Federal Communications Commission (FCC) by a 49-46 vote. Once FCC Chair Ajit Pai steps down, the agency will be left with two Democratic and two Republican Commissioners, pending the Biden Administration’s nominee to fill Pai’s spot. If the Senate stays Republican, its members may calculate that a deadlocked FCC is better than a Democratic agency that could revive net neutrality rules among other Democratic and progressive policies. Consequently, Simington’s confirmation may be the first step toward an FCC unable to develop substantive policy.
  • Another federal court has broadened the injunction against the Trump Administration’s ban on TikTok to encompass the entirety of the Department of Commerce’s September order meant to stop the usage of the application in the United States (U.S.). It is unclear whether the Trump Administration will appeal, and if it does, whether a court would decide the case before the Biden Administration begins in mid-January. The United States District Court for the District of Columbia found that TikTok “established that the government likely exceeded IEEPA’s express limitations as part of an agency action that was arbitrary and capricious” and would likely suffer irreparable harm, making an injunction an appropriate remedy.
  • The United States’ National Security Agency (NSA) “released a Cybersecurity Advisory on Russian state-sponsored actors exploiting CVE-2020-4006, a command-injection vulnerability in VMware Workspace One Access, Access Connector, Identity Manager, and Identity Manager Connector” and provided “mitigation and detection guidance.”
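Command-injection bugs like CVE-2020-4006 typically involve an attacker smuggling shell metacharacters into a request parameter that the server passes to a system command. As a rough illustration of what the detection side of such guidance can look like (a generic sketch, not the NSA’s actual guidance; the log format and patterns below are hypothetical), an administrator might scan web server logs for query strings carrying such metacharacters:

```python
import re
import sys

# Hypothetical illustration: flag log lines whose query strings contain shell
# metacharacters commonly used in command-injection payloads (raw and
# URL-encoded forms). Real detection for CVE-2020-4006 should follow the
# vendor's and NSA's published guidance.
SUSPICIOUS = re.compile(r"%24%7B|%60|%3B|%7C%7C|%26%26|\$\{|`|;|\|\||&&")

def scan(log_path: str) -> None:
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for lineno, line in enumerate(log, start=1):
            # Only inspect the query-string portion of a request line, if any.
            if "?" in line and SUSPICIOUS.search(line.split("?", 1)[1]):
                print(f"{log_path}:{lineno}: possible injection attempt: {line.strip()}")

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else "access.log")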
  • The United States (U.S.) Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) issued a joint alert, warning that U.S. think tanks are being targeted by “persistent continued cyber intrusions by advanced persistent threat (APT) actors.” The agencies stated “[t]his malicious activity is often, but not exclusively, directed at individuals and organizations that focus on international affairs or national security policy.” CISA and the FBI stated their “guidance may assist U.S. think tanks in developing network defense procedures to prevent or rapidly detect these attacks.” The agencies added the following (a rough sketch of the kind of login monitoring this guidance points toward appears after the excerpt):
    • APT actors have relied on multiple avenues for initial access. These have included low-effort capabilities such as spearphishing emails and third-party message services directed at both corporate and personal accounts, as well as exploiting vulnerable web-facing devices and remote connection capabilities. Increased telework during the COVID-19 pandemic has expanded workforce reliance on remote connectivity, affording malicious actors more opportunities to exploit those connections and to blend in with increased traffic. Attackers may leverage virtual private networks (VPNs) and other remote work tools to gain initial access or persistence on a victim’s network. When successful, these low-effort, high-reward approaches allow threat actors to steal sensitive information, acquire user credentials, and gain persistent access to victim networks.
    • Given the importance that think tanks can have in shaping U.S. policy, CISA and FBI urge individuals and organizations in the international affairs and national security sectors to immediately adopt a heightened state of awareness and implement the critical steps listed in the Mitigations section of this Advisory.
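The advisory’s emphasis on remote connectivity suggests one concrete, if modest, defensive measure: watching authentication logs for connections that do not fit an account’s history. The sketch below flags the first time an account connects from a source IP address it has never used before; the CSV log format (timestamp, user, source IP) is a hypothetical stand-in for whatever a VPN concentrator or identity provider actually emits:

```python
import csv
from collections import defaultdict

# Minimal sketch of one anomaly check in the spirit of the advisory: flag the
# first time an account connects from a source IP it has never used before.
def flag_new_source_ips(login_log: str) -> list[tuple[str, str, str]]:
    seen: dict[str, set[str]] = defaultdict(set)
    alerts: list[tuple[str, str, str]] = []
    with open(login_log, newline="") as f:
        for timestamp, user, source_ip in csv.reader(f):
            if seen[user] and source_ip not in seen[user]:
                alerts.append((timestamp, user, source_ip))
            seen[user].add(source_ip)
    return alerts

if __name__ == "__main__":
    for timestamp, user, ip in flag_new_source_ips("vpn_logins.csv"):
        print(f"{timestamp}: first-time source {ip} for account {user}")
```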
  • A group of Democratic United States Senators has written the CEO of Alphabet and Google about the company’s advertising policies and how its platforms may have been used to spread misinformation and contribute to voter suppression. Thus far, most of the scrutiny of the 2020 election and content moderation policy has fallen on Facebook and Twitter even though Google-owned YouTube has been flagged as hosting as much misinformation as those platforms. Senators Amy Klobuchar (D-MN) and Mark Warner (D-VA) led the effort and expressed “serious concerns regarding recent reports that Google is profiting from the sale of ads spreading election-related disinformation” to Alphabet and Google CEO Sundar Pichai. Klobuchar, Warner, and their colleagues asserted:
    • Google is also helping organizations spreading election-related disinformation to raise revenue by placing ads on their websites. While Google has some policies in place to prevent the spread of election misinformation, they are not properly enforced and are inadequate. We urge you to immediately strengthen and improve enforcement of your policies on election-related disinformation and voter suppression, reject all ads spreading election-related disinformation, and stop providing advertising services on sites that spread election-related disinformation.
    • …a recent study by the Global Disinformation Index (GDI) found that Google serves ads on 145 out of 200 websites GDI examined that publish disinformation.
    • Similarly, a recent report from the Center for Countering Digital Hate (CCDH) found that Google has been placing ads on websites publishing disinformation designed to undermine elections. In examining just six websites publishing election-related disinformation, CCDH estimates that they receive 40 million visits a month, generating revenue for these sites of up to $3.4 million annually from displaying Google ads. In addition, Google receives $1.6 million from the advertisers’ payments annually.  These sites published stories ahead of the 2020 general election that contained disinformation alleging that voting by mail was not secure, that mail-in voting was being introduced to “steal the election,” and that election officials were “discarding mail ballots.” 
  • A bipartisan group of United States Senators on one committee are urging Congressional leadership to include funding to help telecommunications companies remove and replace Huawei and ZTE equipment and to aid the Federal Communications Commission (FCC) in drafting accurate maps of broadband service in the United States (U.S.). Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS) and a number of his colleagues wrote the leadership of both the Senate and House and argued:
    • we urge you to provide full funding for Public Law 116-124, the Secure and Trusted Communications Networks Act, and Public Law 116-130, the Broadband DATA Act.   
    • Closing the digital divide and winning the race to 5G are critical to America’s economic prosperity and global leadership in technology. However, our ability to connect all Americans and provide access to next-generation technology will depend in large part on the security of our communications infrastructure. The Secure and Trusted Communications Networks Act (“rip and replace”) created a program to help small, rural telecommunications operators remove equipment posing a security threat to domestic networks and replace it with equipment from trusted providers. This is a national security imperative. Fully funding this program is essential to protecting the integrity of our communications infrastructure and the future viability of our digital economy at large.
    • In addition to safeguarding the security of the nation’s communications systems, developing accurate broadband maps is also critically important. The United States faces a persistent digital divide, and closing this divide requires accurate maps that show where broadband is available and where it is not. Current maps overstate broadband availability, which prevents many underserved communities, particularly in rural areas, from receiving the funds needed to build or expand broadband networks to millions of unconnected Americans. Fully funding the Broadband DATA Act will ensure more accurate broadband maps and better stewardship over the millions of dollars the federal government awards each year to support broadband deployment. Without these maps, the government risks overbuilding existing networks, duplicating funding already provided, and leaving communities unserved.  
  • The Government Accountability Office (GAO) released an assessment of 5G policy options that “discusses (1) how the performance goals and expected uses are to be realized in U.S. 5G wireless networks; (2) the challenges that could affect the performance or usage of 5G wireless networks in the U.S.; and (3) policy options to address these challenges.” The report had been requested by the chairs and ranking members of the House Armed Services, Senate Armed Services, Senate Intelligence, and House Intelligence Committees along with other Members. The GAO stated “[w]hile 5G is expected to deliver significantly improved network performance and greater capabilities, challenges may hinder the performance or usage of 5G technologies in the U.S. We grouped the challenges into the following four categories:
    • availability and efficient use of spectrum
    • security of 5G networks
    • concerns over data privacy
    • concerns over possible health effects
    • The GAO presented the following policy options along with opportunities and considerations for each:
      • Spectrum-Sharing Technologies Opportunities:
        • Could allow for more efficient use of the limited spectrum available for 5G and future generations of wireless networks.
        • It may be possible to leverage existing 5G testbeds for testing the spectrum sharing technologies developed through applied research.
      • Spectrum-Sharing Technologies Considerations:
        • Research and development is costly, must be coordinated and administered, and its potential benefits are uncertain. Identifying a funding source, setting up the funding mechanism, or determining which existing funding streams to reallocate will require detailed analysis.
      • Coordinated Cybersecurity Monitoring Opportunities:
        • A coordinated monitoring program would help ensure the entire wireless ecosystem stays knowledgeable about evolving threats, in close to real time; identify cybersecurity risks; and allow stakeholders to act rapidly in response to emerging threats or actual network attacks.
      • Coordinated Cybersecurity Monitoring Considerations:
        • Carriers may not be comfortable reporting incidents or vulnerabilities, and determinations would need to be made about what information is disclosed and how the information will be used and reported.
      • Cybersecurity Requirements Opportunities
        • Taking these steps could produce a more secure network. Without a baseline set of security requirements the implementation of network security practices is likely to be piecemeal and inconsistent.
        • Using existing protocols or best practices may decrease the time and cost of developing and implementing requirements.
      • Cybersecurity Requirements Considerations
        • Adopting network security requirements would be challenging, in part because defining and implementing the requirements would have to be done on an application-specific basis rather than as a one-size-fits-all approach.
        • Designing a system to certify network components would be costly and would require a centralized entity, be it industry-led or government-led.
      • Privacy Practices Opportunities
        • Development and adoption of uniform privacy practices would benefit from existing privacy practices that have been implemented by states, other countries, or that have been developed by federal agencies or other organizations.
      • Privacy Practices Considerations
        • Privacy practices come with costs, and policymakers would need to balance the need for privacy with the direct and indirect costs of implementing privacy requirements. Imposing requirements can be burdensome, especially for smaller entities.
      • High-band Research Opportunities
        • Could result in improved statistical modeling of antenna characteristics and more accurate representation of propagation characteristics.
        • Could result in improved understanding of any possible health effects from long-term radio frequency exposure to high-band emissions.
      • High-band Research Considerations
        • Research and development is costly and must be coordinated and administered, and its potential benefits are uncertain. Policymakers will need to identify a funding source or determine which existing funding streams to reallocate.

Coming Events

  • The Senate Judiciary Committee will hold an executive session at which the “Online Content Policy Modernization Act” (S.4632), a bill to narrow the liability shield in 47 USC 230, may be marked up on 10 December.
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Tima Miroshnichenko from Pexels

Tech Election Results

A number of tech ballot initiatives were considered.

There were a number of significant technology measures put before voters in states in yesterday’s election. The most significant were in California, where voters agreed to replace the “California Consumer Privacy Act” (CCPA) (AB 375) with a new privacy law, approved a second technology-related ballot initiative, and rejected a third. In voting for Proposition 24, California voters chose to replace the recently effective CCPA with the “California Privacy Rights Act” (CPRA) (see here for my analysis) that will largely be operative on 1 January 2023, meaning the CCPA will continue to be the law of California until then unless a federal privacy law is enacted that preempts all state laws.

California voters approved Proposition 22, which allows Uber, Lyft, and other companies to “Classif[y] app-based drivers as “independent contractors,” instead of “employees,” and provide[] independent-contractor drivers other compensation, unless certain criteria are met.” This ballot initiative essentially negates AB 5, legislation that codified a court ruling creating the presumption that a person hired by an employer is an employee and not a contractor. Uber and Lyft have been fighting enforcement of AB 5 in court.

Voters also rejected Proposition 25, which would have permitted a 2018 statute to take effect that would have replaced cash bail in California with a system that determines who is released before trial on the basis of algorithms. Elsewhere, Michigan voters overwhelmingly approved Proposal 20-2, “Require Warrant for Electronic Data,” which changes state law to protect electronic data and electronic communications such that police would generally need to obtain a search warrant before accessing them. In Massachusetts, voters supported expanding the state’s right to repair cars law to require auto manufacturers to make telematic data available to third-party repair garages. This law is seen as a precursor to similar right-to-repair measures for hardware that could soon be placed on ballots throughout the United States.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by David Mark from Pixabay

Further Reading, Other Developments, and Coming Events (3 November)

Further Reading

  • “How Facebook and Twitter plan to handle election day disinformation” By Sam Dean — The Los Angeles Times; “How Big Tech is planning for election night” By Heather Kelly — The Washington Post; and “What to Expect From Facebook, Twitter and YouTube on Election Day” By Mike Isaac, Kate Conger and Daisuke Wakabayashi — The New York Times. Social media platforms have prepared for today and will use a variety of measures to combat lies, disinformation, and misinformation from sources foreign and domestic. Incidentally, my read is that these tech companies are more worried, as measured by resource allocation, about problematic domestic content.
  • “QAnon received earlier boost from Russian accounts on Twitter, archives show” By Joseph Menn — Reuters. Researchers have delved into Twitter’s archive of taken down or suspended Russian accounts and are now claiming that the Russian Federation was pushing and amplifying the QAnon conspiracy in the United States (U.S.) as early as November 2017. This revised view of Russian involvement differs from conclusions reached in August that Russia played a small but significant role in the proliferation of the conspiracy.
  • “Facial recognition used to identify Lafayette Square protester accused of assault” By Justin Jouvenal and Spencer S. Hsu — The Washington Post. When police were firing pepper balls and rolling gas canisters towards protestors in Lafayette Park in Washington, D.C. on 1 June 2020, a protestor was filmed assaulting two officers. A little-known, never publicly disclosed facial recognition platform available to many federal, state, and local law enforcement agencies in the Capital area matched the footage to a man who now stands accused of crimes during a Black Lives Matter march. Civil liberties and privacy advocates are objecting to the use of the National Capital Region Facial Recognition Investigative Leads System (NCRFRILS) on a number of grounds, including the demonstrated weakness of such systems in accurately identifying people of color, the fact that it has been used but not disclosed to a number of defendants, and the potential chilling effect it will have on people attending protests. Law enforcement officials claim there are strict privacy and process safeguards and that an identification alone cannot be used as the basis of an arrest.
  • “CISA’s political independence from Trump will be an Election Day asset” By Joseph Marks — The Washington Post. The United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) has established its bona fides with nearly all stakeholders inside and outside the U.S. government since its creation in 2018. Many credit its Director, Christopher Krebs, and many also praise the agency for conveying information and tips about cyber and other threats without incurring the ire of a White House and President antithetical to any hint of Russian hacking or interference in U.S. elections. However, today and the next few days may prove the agency’s mettle, as it will be in uncharted territory trying to tamp down fear about unfounded rumors and looking to discredit misinformation and disinformation in close to real time.
  • “Trump allies, largely unconstrained by Facebook’s rules against repeated falsehoods, cement pre-election dominance” By Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. Yet another article showing that if Facebook has a bias, it is against liberal viewpoints and content, for conservative material is allowed even if it breaks the rules of the platform. Current and former employees and an analysis support this finding. The Trump Family have been among those who have benefitted from the kid gloves used by the company regarding posts that would likely have resulted in consequences had they come from other users. However, smaller conservative outlets and less prominent conservative figures have faced consequences for content that violates Facebook’s standards.

Other Developments

  • The European Union’s (EU) Executive Vice-President Margrethe Vestager made a speech titled “Building trust in technology,” in which she previewed one long-awaited draft EU law on technology and another to address antitrust and anti-competitive practices of large technology companies. Vestager stated “in just a few weeks, we plan to publish two draft laws that will help to create a more trustworthy digital world.” Both drafts are expected on 2 December and represent key pieces of the new EU leadership’s Digital Strategy, the bloc’s initiative to update EU laws to account for changes in technology since the beginning of the century. The Digital Services Act will address and reform the legal treatment of both online commerce and online content. The draft Digital Markets Act would give the European Commission (EC) more tools to combat the same competition and market dominance issues the United States (U.S.), Australia, and other nations are starting to tackle vis-à-vis companies like Apple, Amazon, Facebook, and Google.
    • Regarding the Digital Services Act, Vestager asserted:
      • One part of those rules will be the Digital Services Act, which will update the E-Commerce Directive, and require digital services to take more responsibility for dealing with illegal content and dangerous products. 
      • Those services will have to put clear and simple procedures in place, to deal with notifications about illegal material on their platforms. They’ll have to make it harder for dodgy traders to use the platform, by checking sellers’ IDs before letting them on the platform. And they’ll also have to provide simple ways for users to complain, if they think their material should not have been removed – protecting the right of legitimate companies to do business, and the right of individuals to freedom of expression. 
      • Those new responsibilities will help to keep Europeans just as safe online as they are in the physical world. They’ll protect legitimate businesses, which follow the rules, from being undercut by others who sell cheap, dangerous products. And by applying the same standards, all over Europe, they’ll make sure every European can rely on the same protection – and that digital businesses of all sizes can easily operate throughout Europe, without having to meet the costs of complying with different rules in different EU countries. 
      • The new rules will also require digital services – especially the biggest platforms – to be open about the way they shape the digital world that we see. They’ll have to report on what they’ve done to take down illegal material. They’ll have to tell us how they decide what information and products to recommend to us, and which ones to hide – and give us the ability to influence those decisions, instead of simply having them made for us. And they’ll have to tell us who’s paying for the ads that we see, and why we’ve been targeted by a certain ad.  
      • But to really give people trust in the digital world, having the right rules in place isn’t enough. People also need to know that those rules really work – that even the biggest companies will actually do what they’re supposed to. And to make sure that happens, there’s no substitute for effective enforcement.  
      • And effective enforcement is a vital part of the draft laws that we’ll propose in December. For instance, the Digital Services Act will improve the way national authorities cooperate, to make sure the rules are properly enforced, throughout the EU. Our proposal won’t change the fundamental principle, that digital services should be regulated by their home country. But it will set up a permanent system of cooperation that will help those regulators work more effectively, to protect consumers all across Europe. And it will give the EU power to step in, when we need to, to enforce the rules against some very large platforms.
    • Vestager also discussed the competition law the EC hopes to enact:
      • So, to keep our markets fair and open to competition, it’s vital that we have the right toolkit in place. And that’s what the second set of rules we’re proposing – what we call the Digital Markets Act – is for. 
      • That proposal will have two pillars. The first of those pillars will be a clear list of dos and don’ts for big digital gatekeepers, based on our experience with the sorts of behaviour that can stop markets working well. 
      • For instance, the decisions that gatekeepers take, about how to rank different companies in search results, can make or break businesses in dozens of markets that depend on the platform. And if platforms also compete in those markets themselves, they can use their position as player and referee to help their own services succeed, at the expense of their rivals. For instance, gatekeepers might manipulate the way that they rank different businesses, to show their own services more visibly than their rivals’. So, the proposal that we’ll put forward in a few weeks’ time will aim to ban this particular type of unfair self-preferencing. 
      • We also know that these companies can collect a lot of data about companies that rely on their platform – data which they can then use, to compete against those very same companies in other markets. That can seriously damage fairness in these markets – which is why our proposal aims to ban big gatekeepers from misusing their business users’ data in that way. 
      • These clear dos and don’ts will allow us to act much faster and more effectively, to tackle behaviour that we know can stop markets working well. But we also need to be ready for new situations, where digitisation creates deep, structural failures in the way our markets work.  
      • Once a digital company gets to a certain size, with the big network of users and the huge collections of data that brings, it can be very hard for anyone else to compete – even if they develop a much better service. So, we face a constant risk that big companies will succeed in pushing markets to a tipping point, sending them on a rapid, unstoppable slide towards monopoly – and creating yet another powerful gatekeeper. 
      • One way to deal with risks like this would be to stop those emerging gatekeepers from locking users into their platform. That could mean, for instance, that those gatekeepers would have to make it easier for users to switch platform, or to use more than one service. That would keep the market open for competition, by making it easier for innovative rivals to compete. But right now, we don’t have the power to take this sort of action when we see these risks arising. 
      • It can also be difficult to deal with large companies which use the power of their platforms again and again, to take over one related market after another. We can deal with that issue with a series of cases – but the risk is that we’ll always find ourselves playing catch-up, while platforms move from market to market, using the same strategies to drive out their rivals. 
      • The risk, though, is that we’ll have a fragmented system, with different rules in different EU countries. That would make it hard to tackle huge platforms that operate throughout Europe, and to deal with other problems that you find in digital markets in many EU countries. And it would mean that businesses and consumers across Europe can’t all rely on the same protection. 
      • That’s why the second pillar of the Digital Markets Act would put a harmonised market investigation framework in place across the single market, giving us the power to tackle market failures like this in digital markets, and stop new ones from emerging. That would give us a harmonised set of rules that would allow us to investigate certain structural problems in digital markets. And if necessary, we could take action to make these markets contestable and competitive.
  • California Governor Gavin Newsom (D) vetoed one of the bills sent to him to amend the “California Consumer Privacy Act” (AB 375) last week. In mid-October, he signed two bills that amended the CCPA, but one will only take effect if the “California Privacy Rights Act” (CPRA) (Proposition 24) is not enacted by voters in the November election. Moreover, if the CPRA is enacted via ballot, the signed amendments would likely become dead letters, as the CCPA and its amendments will become moot once the CPRA becomes effective in 2023.
    • Newsom vetoed AB 1138, which would have amended the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtained explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.” In his veto message, Newsom explained that while he agrees with the spirit of the legislation, it would create unnecessary confusion and overlap with federal law without any increase in protection for children. He signaled an openness to working with the legislature on this issue, however.
    • Newsom signed AB 1281, which extends the CCPA’s carveout for employment-related information from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
    • Newsom also signed AB 713 that would:
      • except from the CCPA information that was deidentified in accordance with specified federal law, or was derived from medical information, protected health information, individually identifiable health information, or identifiable private information, consistent with specified federal policy, as provided.
      • except from the CCPA a business associate of a covered entity, as defined, that is governed by federal privacy, security, and data breach notification rules if the business associate maintains, uses, and discloses patient information in accordance with specified requirements.
      • except information that is collected for, used in, or disclosed in research, as defined.
      • additionally prohibit a business or other person from reidentifying information that was deidentified, unless a specified exception is met.
      • beginning January 1, 2021, require a contract for the sale or license of deidentified information to include specified provisions relating to the prohibition of reidentification, as provided.
  • The Government Accountability Office (GAO) published a report requested by the chair of a House committee on a Federal Communications Commission (FCC) program that subsidizes broadband providers serving high-cost areas, typically people in rural or hard to reach places. Even though the FCC provided nearly $5 billion in 2019 through the Universal Service Fund’s (USF) high-cost program, the agency lacks the data to determine whether the goals of the program are being met. House Energy and Commerce Committee Chair Frank Pallone Jr (D-NJ) had asked for the assessment, which could well form the basis for future changes to how the FCC funds the program and sets conditions for use of said funds for broadband.
    • The GAO noted:
      • According to stakeholders GAO interviewed, FCC faces three key challenges to accomplish its high-cost program performance goals: (1) accuracy of FCC’s broadband deployment data, (2) broadband availability on tribal lands, and (3) maintaining existing fixed-voice infrastructure and attaining universal mobile service. For example, although FCC adopted a more precise method of collecting and verifying broadband availability data, stakeholders expressed concern the revised data would remain inaccurate if carriers continue to overstate broadband coverage for marketing and competitive reasons. Overstating coverage impairs FCC’s efforts to promote universal voice and broadband since an area can become ineligible for high-cost support if a carrier reports that service already exists in that area. FCC has also taken actions to address the lack of broadband availability on tribal lands, such as making some spectrum available to tribes for wireless broadband in rural areas. However, tribal stakeholders told GAO that some tribes are unable to secure funding to deploy the infrastructure necessary to make use of spectrum for wireless broadband purposes.
    • The GAO concluded:
      • The effective use of USF resources is critical given the importance of ensuring that Americans have access to voice and broadband services. Overall, FCC’s high-cost program performance goals and measures are often not conducive to providing FCC with high-quality performance information. The performance goals lack measurable or quantifiable bases upon which numeric targets can be set. Further, while FCC’s performance measures met most of the key attributes of effective measures, the measures often lacked linkage, clarity, and objectivity. Without such clarity and specific desired outcomes, FCC lacks performance information that could help FCC make better-informed decisions about how to allocate the program’s resources to meet ongoing and emerging challenges to ensuring universal access to voice and broadband services. Furthermore, the absence of public reporting of this information leaves stakeholders, including Congress, uncertain about the program’s effectiveness to deploy these services.
    • The GAO recommended that
      • The Chairman of FCC should revise the high-cost performance goals so that they are measurable and quantifiable.
      • The Chairman of FCC should ensure high-cost performance measures align with key attributes of successful performance measures, including ensuring that measures clearly link with performance goals and have specified targets.
      • The Chairman of FCC should ensure the high-cost performance measure for the goal of minimizing the universal service contribution burden on consumers and businesses takes into account user-fee leading practices, such as equity and sustainability considerations.
      • The Chairman of FCC should publicly and periodically report on the progress it has made for its high-cost program’s performance goals, for example, by including relevant performance information in its Annual Broadband Deployment Report or the USF Monitoring Report.
  • An Irish court has ruled that the Data Protection Commission (DPC) must cover the legal fees of Maximillian Schrems for litigating and winning the case in which the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). It has been estimated that Schrems’ legal costs for pursuing his complaint against Facebook could total between €2 million and €5 million, the low end of which would represent 10% of the DPC’s budget this year. In addition, the DPC has itself spent almost €3 million litigating the case.
  • House Transportation and Infrastructure Chair Peter DeFazio (D-OR) and Ranking Member Sam Graves (R-MO) asked that the Government Accountability Office (GAO) review the implications of the Federal Communications Commission’s (FCC) 2019 proposal to allow half of the Safety Band (i.e., the 5.9 GHz band) to be used for wireless communications even though the United States (U.S.) Department of Transportation weighed in against this decision. DeFazio and Graves are asking for a study of “the safety implications of sharing more than half of the 5.9 GHz spectrum band” that is currently “reserved exclusively for transportation safety purposes.” This portion of the spectrum was set aside in 1999 for connected vehicle technologies, according to DeFazio and Graves, and given the rise of automobile technology that requires spectrum, they think the FCC’s proposed proceeding “may significantly affect the efficacy of current and future applications of vehicle safety technologies.” DeFazio and Graves asked the GAO to review the following:
    • What is the current status of 5.9 GHz wireless transportation technologies, both domestically and internationally?
    • What are the views of industry and other stakeholders on the potential uses of these technologies, including scenarios where the 5.9 GHz spectrum is shared among different applications?
    • In a scenario in which the 5.9 GHz spectrum band is shared among different applications, what effect would this have on the efficacy, including safety, of current wireless transportation technology deployments?
    • What options exist for automakers and highway management agencies to ensure the safe deployment of connected vehicle technologies in a scenario in which the 5.9 GHz spectrum band is shared among different applications?
    • How, if at all, have DOT and FCC assessed current and future needs for 5.9 GHz spectrum in transportation applications? 
    • How, if at all, have DOT and FCC coordinated to develop a federal spectrum policy for connected vehicles?
  • The Harvard Law School’s Cyberlaw Clinic and the Electronic Frontier Foundation (EFF) have published “A Researcher’s Guide to Some Legal Risks of Security Research,” which is timely given the case before the Supreme Court of the United States that will determine the scope of the “Computer Fraud and Abuse Act” (CFAA). The Cyberlaw Clinic and EFF stated:
    • Just last month, over 75 prominent security researchers signed a letter urging the Supreme Court not to interpret the CFAA, the federal anti-hacking/computer crime statute, in a way that would criminalize swaths of valuable security research.
    • In the report, the Cyberlaw Clinic and EFF explained:
      • This guide overviews broad areas of potential legal risk related to security research, and the types of security research likely implicated. We hope it will serve as a useful starting point for concerned researchers and others. While the guide covers what we see as the main areas of legal risk for security researchers, it is not exhaustive.
    • In Van Buren v. United States, the Court will consider the question of “[w]hether a person who is authorized to access information on a computer for certain purposes violates Section 1030(a)(2) of the Computer Fraud and Abuse Act if he accesses the same information for an improper purpose.” In this case, the defendant was a police officer who took money as part of a sting operation to illegally use his access to Georgia’s database of license plates to obtain information about a person. The Eleventh Circuit Court of Appeals denied his appeal of his conviction under the CFAA per a previous ruling in that circuit that “a defendant violates the CFAA not only when he obtains information that he has no “rightful[]” authorization whatsoever to acquire, but also when he obtains information “for a nonbusiness purpose.”
    • As the Cyberlaw Clinic and EFF noted in the report:
      • Currently, courts are divided about whether the statute’s prohibition on “exceed[ing] authorized access” applies to people who have authorization to access data (for some purpose), but then access it for a (different) purpose that violates a contractual terms of service or computer use policy. Some courts have found that the CFAA covers activities that do not circumvent any technical access barriers, from making fake profiles that violate Facebook’s terms of service to running a search on a government database without permission. Other courts, disagreeing with the preceding approach, have held that a verbal or contractual prohibition alone cannot render access punishable under the CFAA.
  • The United Kingdom’s Institution of Engineering and Technology (IET) and the National Cyber Security Centre (NCSC) have published “Code of Practice: Cyber Security and Safety” to help engineers develop technological and digital products with both safety and cybersecurity in mind. The IET and NCSC explained:
    • The aim of this Code is to help organizations accountable for safety-related systems manage cyber security vulnerabilities that could lead to hazards. It does this by setting out principles that when applied will improve the interaction between the disciplines of system safety and cyber security, which have historically been addressed as distinct activities.
    • The objectives for organizations that follow this Code are: (a) to manage the risk to safety from any insecurity of digital technology; (b) to be able to provide assurance that this risk is acceptable; and (c) where there is a genuine conflict between the controls used to achieve safety and security objectives, to ensure that these are properly resolved by the appropriate authority.
    • It should be noted that the focus of this Code is on the safety and security of digital technology, but it is recognized that addressing safety and cyber security risk is not just a technological issue: it involves people, process, physical and technological aspects. It should also be noted that whilst digital technology is central to the focus, other technologies can be used in pursuit of a cyber attack. For example, the study of analogue emissions (electromagnetic, audio, etc.) may give away information being digitally processed and thus analogue interfaces could be used to provide an attack surface.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Pexels from Pixabay

Further Reading, Other Developments, and Coming Events (22 October)

Further Reading

  • “A deepfake porn Telegram bot is being used to abuse thousands of women” By Matt Burgess — WIRED UK. A bot set loose on Telegram can take pictures of women, and apparently teens too, and “take off” their clothing, rendering fake nude images of people who never took naked pictures. This seems to be the next iteration of deepfake porn, a problem that will surely get worse until governments legislate against it and technology companies have incentives to locate and take down such material.
  • “The Facebook-Twitter-Trump Wars Are Actually About Something Else” By Charlie Warzel — The New York Times. This piece makes the case that there are no easy fixes for American democracy or for misinformation on social media platforms.
  • “Facebook says it rejected 2.2m ads for breaking political campaigning rules” — Agence France-Presse. Facebook’s Vice President of Global Affairs and Communications Nick Clegg said the social media giant is employing artificial intelligence and humans to find and remove political advertisements that violate policy in order to avoid a repeat of 2016 where untrue information and misinformation played roles in both Brexit and the election of Donald Trump as President of the United States.
  • “Huawei Fallout—Game-Changing New China Threat Strikes At Apple And Samsung” By Zak Doffman — Forbes. Smartphone manufacturers from the People’s Republic of China (PRC) appear ready to step into the projected void caused by the United States (U.S.) strangling off Huawei’s access to chips. Xiaomi and Oppo have already seen sales surge worldwide and are poised to pick up where Huawei is being forced to leave off, perhaps demonstrating the limits of U.S. power to blunt the rise of PRC technology companies.
  • “As Local News Dies, a Pay-for-Play Network Rises in Its Place” By Davey Alba and Jack Nicas — The New York Times. With the decline and demise of many local media outlets in the United States, new groups are stepping into the void, and some are politically minded but not transparent about their biases. The organization uncovered in this article is nakedly Republican and is running and planting articles at both legitimate and artificial news sites for pay. Sometimes conservative donors pay, sometimes campaigns do. Democrats are engaged in the same activity but apparently to a lesser extent. These sorts of activities will only further erode faith in the U.S. media.
  • “Forget Antitrust Laws. To Limit Tech, Some Say a New Regulator Is Needed.” By Steve Lohr — The New York Times. This piece argues that anti-trust enforcement actions are plodding, tending to take years to finish. Consequently, this body of law is inadequate to the task of addressing the market dominance of big technology companies. Instead, a new regulatory body is needed along the lines of those regulating the financial services industries that is more nimble than anti-trust. Given the problems in that industry with respect to regulation, this may not be the best model.
  • “‘Do Not Track’ Is Back, and This Time It Might Work” By Gilad Edelman — WIRED. Looking to leverage the requirement in the “California Consumer Privacy Act” (CCPA) (AB 375) that regulated entities respect and effectuate the use of a one-time opt-out mechanism, a group of entities has come together to build and roll out the Global Privacy Control. In theory, users could install a tool implementing this technical specification on their phones and computers, use it once, and then all websites would be on notice regarding that person’s privacy preferences. Such a mechanism would address the problem turned up by Consumer Reports’ recent report on the difficulty of trying to opt out of having one’s personal information sold.
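Mechanically, the Global Privacy Control rides on ordinary web traffic: a participating browser or extension sends a Sec-GPC: 1 request header (and exposes a navigator.globalPrivacyControl property to scripts), which a site subject to the CCPA would treat as a do-not-sell opt-out. A minimal server-side sketch of honoring the header, using Flask (the handler logic is illustrative, not part of the specification):

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    # Per the draft GPC specification, a participating browser sends the
    # header "Sec-GPC: 1" with each request; its absence means no signal.
    opted_out = request.headers.get("Sec-GPC") == "1"
    if opted_out:
        # Illustrative placeholder: a real site would suppress the sale or
        # sharing of this visitor's personal information at this point.
        return "GPC signal received; treating this visit as a do-not-sell opt-out."
    return "No GPC signal received."

if __name__ == "__main__":
    app.run()
```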
  • “EU countries sound alarm about growing anti-5G movement” By Laurens Cerulus — Politico. 15 European Union (EU) nations wrote the European Commission (EC) warning that the nascent anti-5G movement, borne of conspiracy thinking and misinformation, threatens the EU’s position vis-à-vis the United States (U.S.) and the People’s Republic of China (PRC). There have been more than 200 documented arson attacks in the EU, with the most having occurred in the United Kingdom, France, and the Netherlands. These nations called for a more muscular, more forceful debunking of the lies and misinformation being spread about 5G.
  • “Security firms call Microsoft’s effort to disrupt botnet to protect against election interference ineffective” By Jay Greene — The Washington Post. Microsoft seemingly acted alongside the United States (U.S.) Cyber Command to take down and impair the operation of Trickbot, but now cybersecurity experts are questioning how effective Microsoft’s efforts really were. Researchers have shown the Russian-operated Trickbot has already stood its operations back up and has dispersed across servers around the world, showing how difficult it is to address some cyber threats.
  • “Governments around the globe find ways to abuse Facebook” By Sara Fischer and Ashley Gold — Axios. This piece puts a different spin than a recent BuzzFeed News article on the challenges Facebook faces in countries around the world, especially those whose governments ruthlessly use the platform to spread lies and misinformation. The new article paints Facebook as a well-meaning company being taken advantage of, while the other portrays a company callous about content moderation except in nations where lapses cause it political problems, such as the United States, the European Union, and other western democracies.
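
A quick technical aside on the Global Privacy Control mentioned above: the specification works by having a participating browser or extension send a Sec-GPC: 1 header with each request and expose a navigator.globalPrivacyControl flag to page scripts. Below is a minimal sketch, in TypeScript, of how a site might detect the signal client-side; treatAsOptedOut() is a hypothetical stand-in for whatever a business actually does to honor the preference.

```typescript
// Minimal sketch of detecting the Global Privacy Control (GPC) signal in the
// browser. The GPC proposal exposes navigator.globalPrivacyControl as a
// boolean; it is not yet part of TypeScript's standard DOM typings, hence
// the cast below.
function gpcEnabled(): boolean {
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

// Hypothetical stand-in for a site's own opt-out logic, e.g. suppressing
// third-party trackers and marking the session as "do not sell."
function treatAsOptedOut(): void {
  console.log("GPC detected: treating visitor as opted out of sale.");
}

if (gpcEnabled()) {
  treatAsOptedOut();
}
```

Servers can check the same preference without any script by looking for the Sec-GPC request header, which is how the signal would reach businesses that do their processing server-side.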

Other Developments

  • The United States (U.S.) Department of Justice’s (DOJ) Cyber-Digital Task Force (Task Force) issued “Cryptocurrency: An Enforcement Framework,” which “provides a comprehensive overview of the emerging threats and enforcement challenges associated with the increasing prevalence and use of cryptocurrency; details the important relationships that the Department of Justice has built with regulatory and enforcement partners both within the United States government and around the world; and outlines the Department’s response strategies.” The Task Force noted “[t]his document does not contain any new binding legal requirements not otherwise already imposed by statute or regulation.” The Task Force summarized the report:
    • [I]n Part I, the Framework provides a detailed threat overview, cataloging the three categories into which most illicit uses of cryptocurrency typically fall: (1) financial transactions associated with the commission of crimes; (2) money laundering and the shielding of legitimate activity from tax, reporting, or other legal requirements; and (3) crimes, such as theft, directly implicating the cryptocurrency marketplace itself. 
    • Part II explores the various legal and regulatory tools at the government’s disposal to confront the threats posed by cryptocurrency’s illicit uses, and highlights the strong and growing partnership between the Department of Justice and the Securities and Exchange Commission, the Commodity Futures Trading Commission, and agencies within the Department of the Treasury, among others, to enforce federal law in the cryptocurrency space.
    • Finally, the Enforcement Framework concludes in Part III with a discussion of the ongoing challenges the government faces in cryptocurrency enforcement—particularly with respect to business models (employed by certain cryptocurrency exchanges, platforms, kiosks, and casinos), and to activity (like “mixing” and “tumbling,” “chain hopping,” and certain instances of jurisdictional arbitrage) that may facilitate criminal activity.    
  • The White House’s Office of Science and Technology Policy (OSTP) has launched a new website for the United States’ (U.S.) quantum initiative and released a report titled “Quantum Frontiers: Report On Community Input To The Nation’s Strategy For Quantum Information Science.” The Quantum Initiative flows from the “National Quantum Initiative Act” (P.L. 115-368) “to provide for a coordinated Federal program to accelerate quantum research and development for the economic and national security of the United States.” The OSTP explained that the report “outlines eight frontiers that contain core problems with fundamental questions confronting quantum information science (QIS) today:
    • Expanding Opportunities for Quantum Technologies to Benefit Society
    • Building the Discipline of Quantum Engineering
    • Targeting Materials Science for Quantum Technologies
    • Exploring Quantum Mechanics through Quantum Simulations
    • Harnessing Quantum Information Technology for Precision Measurements
    • Generating and Distributing Quantum Entanglement for New Applications
    • Characterizing and Mitigating Quantum Errors
    • Understanding the Universe through Quantum Information
    • OSTP asserted “[t]hese frontier areas, identified by the QIS research community, are priorities for the government, private sector, and academia to explore in order to drive breakthrough R&D.”
  • The New York Department of Financial Services (NYDFS) published its report on the July 2020 Twitter hack during which a team of hackers took over a number of high-profile accounts (e.g. Barack Obama, Kim Kardashian West, Jeff Bezos, and Elon Musk) in order to perpetrate a cryptocurrency scam. The NYDFS has jurisdiction over cryptocurrencies and companies dealing in them in New York. The NYDFS found that the hackers used the most basic of means to gain the access needed to take over accounts. The NYDFS explained:
    • Given that Twitter is a publicly traded, $37 billion technology company, it was surprising how easily the Hackers were able to penetrate Twitter’s network and gain access to internal tools allowing them to take over any Twitter user’s account. Indeed, the Hackers used basic techniques more akin to those of a traditional scam artist: phone calls where they pretended to be from Twitter’s Information Technology department. The extraordinary access the Hackers obtained with this simple technique underscores Twitter’s cybersecurity vulnerability and the potential for devastating consequences. Notably, the Twitter Hack did not involve any of the high-tech or sophisticated techniques often used in cyberattacks–no malware, no exploits, and no backdoors.
    • The implications of the Twitter Hack extend far beyond this garden-variety fraud. There are well-documented examples of social media being used to manipulate markets and interfere with elections, often with the simple use of a single compromised account or a group of fake accounts. In the hands of a dangerous adversary, the same access obtained by the Hackers–the ability to take control of any Twitter user’s account–could cause even greater harm.
    • The Twitter Hack demonstrates the need for strong cybersecurity to curb the potential weaponization of major social media companies. But our public institutions have not caught up to the new challenges posed by social media. While policymakers focus on antitrust and content moderation problems with large social media companies, their cybersecurity is also critical. In other industries that are deemed critical infrastructure, such as telecommunications, utilities, and finance, we have established regulators and regulations to ensure that the public interest is protected. With respect to cybersecurity, that is what is needed for large, systemically important social media companies.
    • The NYDFS recommended the cybersecurity measures cryptocurrency companies in New York should implement to avoid similar hacks, including the measures in its own cybersecurity regulations that bind its regulated entities. The NYDFS also called for a national regulator to address the lack of a dedicated regulator of Twitter and other massive social media platforms. The NYDFS asserted:
      • Social media companies currently have no dedicated regulator. They are subject to the same general oversight applicable to other companies. For instance, the SEC’s regulations for all public companies apply to public social media companies, and antitrust and related laws and regulations enforced by the Department of Justice and the FTC apply to social media companies as they do to all companies. Social media companies are also subject to generally applicable laws, such as the California Consumer Privacy Act and the New York SHIELD Act. The European Union’s General Data Protection Regulation, which regulates the storage and use of personal data, also applies to social media entities doing business in Europe.
      • But there are no regulators that have the authority to uniformly regulate social media platforms that operate over the internet, and to address the cybersecurity concerns identified in this Report. That regulatory vacuum must be filled.
      • A useful starting point is to create a “systemically important” designation for large social media companies, like the designation for critically important bank and non-bank financial institutions. In the wake of the 2007-08 financial crisis, Congress established a new regulatory framework for financial institutions that posed a systemic threat to the financial system of the United States. An institution could be designated as a Systemically Important Financial Institution (“SIFI”) “where the failure of or a disruption to the functioning of a financial market utility or the conduct of a payment, clearing, or settlement activity could create, or increase, the risk of significant liquidity or credit problems spreading among financial institutions or markets and thereby threaten the stability of the financial system of the United States.”
      • The risks posed by social media to our consumers, economy, and democracy are no less grave than the risks posed by large financial institutions. The scale and reach of these companies, combined with the ability of adversarial actors who can manipulate these systems, require a similarly bold and assertive regulatory approach.
      • The designation of an institution as a SIFI is made by the Financial Stability Oversight Council (“FSOC”), which Congress established to “identify risks to the financial stability of the United States” and to provide enhanced supervision of SIFIs. The FSOC also “monitors regulatory gaps and overlaps to identify emerging sources of systemic risk.” In determining whether a financial institution is systemically important, the FSOC considers numerous factors including: the effect that a failure or disruption to an institution would have on financial markets and the broader financial system; the nature of the institution’s transactions and relationships; the nature, concentration, interconnectedness, and mix of the institution’s activities; and the degree to which the institution is regulated.
      • An analogue to the FSOC should be established to identify systemically important social media companies. This new Oversight Council should evaluate the reach and impact of social media companies, as well as the society-wide consequences of a social media platform’s misuse, to determine which companies they should designate as systemically important. Once designated, those companies should be subject to enhanced regulation, such as through the provision of “stress tests” to evaluate the social media companies’ susceptibility to key threats, including cyberattacks and election interference.
      • Finally, the success of such oversight will depend on the establishment of an expert agency to oversee designated social media companies. Systemically important financial companies designated by the FSOC are overseen by the Federal Reserve Board, which has a long-established and deep expertise in banking and financial market stability. A regulator for systemically important social media would likewise need deep expertise in areas such as technology, cybersecurity, and disinformation. This expert regulator could take various forms; it could be a completely new agency or could reside within an established agency or at an existing regulator.
  • The Government Accountability Office (GAO) evaluated how well the Trump Administration has been implementing the “Open, Public, Electronic and Necessary Government Data Act of 2018” (OPEN Government Data Act) (P.L. 115-435). As the GAO explained, this statute “requires federal agencies to publish their information as open data using standardized, nonproprietary formats, making data available to the public open by default, unless otherwise exempt…[and] codifies and expands on existing federal open data policy including the Office of Management and Budget’s (OMB) memorandum M-13-13 (M-13-13), Open Data Policy—Managing Information as an Asset.”
    • The GAO stated:
      • To continue moving forward with open government data, the issuance of OMB implementation guidance should help agencies develop comprehensive inventories of their data assets, prioritize data assets for publication, and decide which data assets should or should not be made available to the public.
      • Implementation of this statutory requirement is critical to agencies’ full implementation and compliance with the act. In the absence of this guidance, agencies, particularly agencies that have not previously been subject to open data policies, could fall behind in meeting their statutory timeline for implementing comprehensive data inventories.
      • It is also important for OMB to meet its statutory responsibility to biennially report on agencies’ performance and compliance with the OPEN Government Data Act and to coordinate with the General Services Administration (GSA) to improve the quality and availability of agency performance data that could inform this reporting. Access to this information could inform Congress and the public on agencies’ progress in opening their data and complying with statutory requirements. This information could also help agencies assess their progress and improve compliance with the act.
    • The GAO made three recommendations:
      • The Director of OMB should comply with its statutory requirement to issue implementation guidance to agencies to develop and maintain comprehensive data inventories. (Recommendation 1)
      • The Director of OMB should comply with the statutory requirement to electronically publish a report on agencies’ performance and compliance with the OPEN Government Data Act. (Recommendation 2)
      • The Director of OMB, in collaboration with the Administrator of GSA, should establish policy to ensure the routine identification and correction of errors in electronically published performance information. (Recommendation 3)
  • The United States’ (U.S.) National Security Agency (NSA) issued a cybersecurity advisory titled “Chinese State-Sponsored Actors Exploit Publicly Known Vulnerabilities,” which “provides Common Vulnerabilities and Exposures (CVEs) known to be recently leveraged, or scanned-for, by Chinese state-sponsored cyber actors to enable successful hacking operations against a multitude of victim networks.” The NSA recommended a number of general mitigations for U.S. entities, including the following (a sketch of one way to act on the logging recommendation appears after this item):
    • Keep systems and products updated and patched as soon as possible after patches are released.
    • Expect that data stolen or modified (including credentials, accounts, and software) before the device was patched will not be alleviated by patching, making password changes and reviews of accounts a good practice.
    • Disable external management capabilities and set up an out-of-band management network.
    • Block obsolete or unused protocols at the network edge and disable them in device configurations.
    • Isolate Internet-facing services in a network Demilitarized Zone (DMZ) to reduce the exposure of the internal network.
    • Enable robust logging of Internet-facing services and monitor the logs for signs of compromise.
    • The NSA then proceeded to recommend specific fixes.
    • The NSA provided this policy backdrop:
      • One of the greatest threats to U.S. National Security Systems (NSS), the U.S. Defense Industrial Base (DIB), and Department of Defense (DOD) information networks is Chinese state-sponsored malicious cyber activity. These networks often undergo a full array of tactics and techniques used by Chinese state-sponsored cyber actors to exploit computer networks of interest that hold sensitive intellectual property, economic, political, and military information. Since these techniques include exploitation of publicly known vulnerabilities, it is critical that network defenders prioritize patching and mitigation efforts.
      • The same process for planning the exploitation of a computer network by any sophisticated cyber actor is used by Chinese state-sponsored hackers. They often first identify a target, gather technical information on the target, identify any vulnerabilities associated with the target, develop or re-use an exploit for those vulnerabilities, and then launch their exploitation operation.
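
To make the logging mitigation above concrete, here is a minimal sketch, assuming a Node.js environment, a TypeScript toolchain, and a plain-text web server access log at a hypothetical path. The indicator patterns are illustrative stand-ins drawn from publicly known exploit URIs (e.g., the path traversal used against Citrix CVE-2019-19781); they are not the NSA’s detection logic, and a real deployment would rely on curated, current indicators.

```typescript
import { createReadStream } from "fs";
import { createInterface } from "readline";

// Illustrative URI patterns associated with publicly known, widely exploited
// vulnerabilities. These are examples only; use curated, current indicators
// in practice.
const indicators: RegExp[] = [
  /\/vpn\/\.\.\/vpns\//i,      // Citrix ADC path traversal (CVE-2019-19781)
  /\/tmui\/login\.jsp\/\.\./i, // F5 BIG-IP TMUI traversal (CVE-2020-5902)
  /\/dana-na\/.*\.\.\//i,      // Pulse Secure file read (CVE-2019-11510)
];

// Read an access log line by line and flag any line matching an indicator.
async function scanLog(path: string): Promise<void> {
  const lines = createInterface({ input: createReadStream(path) });
  let lineNo = 0;
  for await (const line of lines) {
    lineNo++;
    if (indicators.some((re) => re.test(line))) {
      console.log(`possible exploit attempt at line ${lineNo}: ${line}`);
    }
  }
}

// Hypothetical log path; point this at whatever Internet-facing service logs
// your environment actually produces.
scanLog("/var/log/nginx/access.log").catch(console.error);
```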
  • Belgium’s data protection authority (DPA) (Autorité de protection des données in French or Gegevensbeschermingsautoriteit in Dutch) (APD-GBA) has reportedly found that the Transparency & Consent Framework (TCF) developed by the Interactive Advertising Bureau (IAB) violates the General Data Protection Regulation (GDPR). The Real-Time Bidding (RTB) system used for online behavioral advertising allegedly transmits the personal information of European Union residents without their consent even before a popup appears on their screen asking for consent. The APD-GBA is the lead DPA in the EU in investigating the RTB system and will likely now circulate its findings and recommendations to other EU DPAs before any enforcement commences.
  • None Of Your Business (noyb) announced “[t]he Irish High Court has granted leave for a “Judicial Review” against the Irish Data Protection Commission (DPC) today…[and] [t]he legal action by noyb aims to swiftly implement the [Court of Justice for the European Union (CJEU)] Decision prohibiting Facebook’s” transfer of personal data from the European Union (EU) to the United States (U.S.). Last month, after the DPC directed Facebook to stop transferring the personal data of EU citizens to the U.S., the company filed suit in the Irish High Court to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge.
    • noyb further asserted:
      • Instead of making a decision in the pending procedure, the DPC has started a second, new investigation into the same subject matter (“Parallel Procedure”), as widely reported (see original reporting by the WSJ). No logical reasons for the Parallel Procedure were given, but the DPC has maintained that Mr Schrems will not be heard in this second case, as he is not a party in this Parallel Procedure. This Parallel Procedure was criticised by Facebook publicly (link) and instantly blocked by a Judicial Review by Facebook (see report by Reuters).
      • Today’s Judicial Review by noyb is in many ways the counterpart to Facebook’s Judicial Review: While Facebook wants to block the second procedure by the DPC, noyb wants to move the original complaints procedure towards a decision.
      • Earlier this summer, the CJEU struck down the adequacy decision for the agreement between the EU and U.S. that had provided the easiest means to transfer the personal data of EU citizens to the U.S. for processing under the General Data Protection Regulation (GDPR) (i.e. the EU-U.S. Privacy Shield). In the case known as Schrems II, the CJEU also cast doubt on whether standard contractual clauses (SCC) used to transfer personal data to the U.S. would pass muster given the grounds for finding the Privacy Shield inadequate: the U.S.’s surveillance regime and lack of meaningful redress for EU citizens. Consequently, it has appeared as if data protection authorities throughout the EU would need to revisit SCCs for transfers to the U.S., and it appears the DPC was looking to stop Facebook from using its SCCs. Facebook is apparently arguing in its suit that it will suffer “extremely significant adverse effects” if the DPC’s decision is implemented.
  • Most likely with the aim of helping British chances for an adequacy decision from the European Union (EU), the United Kingdom’s Information Commissioner’s Office (ICO) published guidance that “discusses the right of access [under the General Data Protection Regulation (GDPR)] in detail.” The ICO explained the guidance “is aimed at data protection officers (DPOs) and those with specific data protection responsibilities in larger organisations…[but] does not specifically cover the right of access under Parts 3 and 4 of the Data Protection Act 2018.”
    • The ICO explained:
      • The right of access, commonly referred to as subject access, gives individuals the right to obtain a copy of their personal data from you, as well as other supplementary information.
  • The Government Accountability Office (GAO) released the report on the data security and data privacy practices of public schools that the House Education and Labor Committee’s Ranking Member requested. Representative Virginia Foxx (R-NC) asked the GAO “to review the security of K-12 students’ data. This report examines (1) what is known about recently reported K-12 cybersecurity incidents that compromised student data, and (2) the characteristics of school districts that experienced these incidents.” Strangely, the report did not have the GAO’s customary conclusions or recommendations. Nonetheless, the GAO found:
    • Ninety-nine student data breaches reported from July 1, 2016 through May 5, 2020 compromised the data of students in 287 school districts across the country, according to our analysis of K-12 Cybersecurity Resource Center (CRC) data (see fig. 3). Some breaches involved a single school district, while others involved multiple districts. For example, an attack on a vendor system in the 2019-2020 school year affected 135 districts. While information about the number of students affected was not available for every reported breach, examples show that some breaches affected thousands of students, for instance, when a cybercriminal accessed 14,000 current and former students’ personally identifiable information (PII) in one district.
    • The 99 reported student data breaches likely understate the number of breaches that occurred, for different reasons. Reported incidents sometimes do not include sufficient information to discern whether data were breached. We identified 15 additional incidents in our analysis of CRC data in which student data might have been compromised, but the available information was not definitive. In addition, breaches can go undetected for some time. In one example, the personal information of hundreds of thousands of current and former students in one district was publicly posted for 2 years before the breach was discovered.
    • The CRC identified 28 incidents involving videoconferences from April 1, 2020 through May 5, 2020, some of which disrupted learning and exposed students to harm. In one incident, 50 elementary school students were exposed to pornography during a virtual class. In another incident in a different district, high school students were targeted with hate speech during a class, resulting in the cancellation that day of all classes using the videoconferencing software. These incidents also raise concerns about the potential for violating students’ privacy. For example, one district is reported to have instructed teachers to record their class sessions. Teachers said that students’ full names were visible to anyone viewing the recording.
    • The GAO found gaps in the protection and enforcement of student privacy by the United States government:
      • [The Department of] Education is responsible for enforcing the Family Educational Rights and Privacy Act (FERPA), which addresses the privacy of PII in student education records and applies to all schools that receive funds under an applicable program administered by Education. If parents or eligible students believe that their rights under FERPA have been violated, they may file a formal complaint with Education. In response, Education is required to take appropriate actions to enforce and deal with violations of FERPA. However, because the department’s authority under FERPA is directly related to the privacy of education records, Education’s security role is limited to incidents involving potential violations under FERPA. Further, FERPA amendments have not directly addressed educational technology use.
      • The “Children’s Online Privacy Protection Act” (COPPA) requires the Federal Trade Commission (FTC) to issue and enforce regulations concerning children’s privacy. The COPPA Rule, which took effect in 2000 and was later amended in 2013, requires operators of covered websites or online services that collect personal information from children under age 13 to provide notice and obtain parental consent, among other things. COPPA generally applies to the vendors who provide educational technology, rather than to schools directly. However, according to FTC guidance, schools can consent on behalf of parents to the collection of students’ personal information if such information is used for a school-authorized educational purpose and for no other commercial purpose.
  • Upturn, an advocacy organization that “advances equity and justice in the design, governance, and use of technology,” has released a report showing that United States (U.S.) law enforcement agencies have multiple means of hacking into encrypted or protected smartphones. The means and vendors for breaking into phones have long been available in the U.S. and abroad despite the claims of a number of nations, like the Five Eyes (U.S., the United Kingdom, Australia, Canada, and New Zealand), that default end-to-end encryption is a growing problem that allows those preying on children and engaged in terrorism to go undetected. In terms of possible bias, Upturn “is supported by the Ford Foundation, the Open Society Foundations, the John D. and Catherine T. MacArthur Foundation, Luminate, the Patrick J. McGovern Foundation, and Democracy Fund.”
    • Upturn stated:
      • Every day, law enforcement agencies across the country search thousands of cellphones, typically incident to arrest. To search phones, law enforcement agencies use mobile device forensic tools (MDFTs), a powerful technology that allows police to extract a full copy of data from a cellphone — all emails, texts, photos, location, app data, and more — which can then be programmatically searched. As one expert puts it, with the amount of sensitive information stored on smartphones today, the tools provide a “window into the soul.”
      • This report documents the widespread adoption of MDFTs by law enforcement in the United States. Based on 110 public records requests to state and local law enforcement agencies across the country, our research documents more than 2,000 agencies that have purchased these tools, in all 50 states and the District of Columbia. We found that state and local law enforcement agencies have performed hundreds of thousands of cellphone extractions since 2015, often without a warrant. To our knowledge, this is the first time that such records have been widely disclosed.
    • Upturn argued:
      • Law enforcement use these tools to investigate not only cases involving major harm, but also for graffiti, shoplifting, marijuana possession, prostitution, vandalism, car crashes, parole violations, petty theft, public intoxication, and the full gamut of drug-related offenses. Given how routine these searches are today, together with racist policing policies and practices, it’s more than likely that these technologies disparately affect and are used against communities of color.
      • We believe that MDFTs are simply too powerful in the hands of law enforcement and should not be used. But recognizing that MDFTs are already in widespread use across the country, we offer a set of preliminary recommendations that we believe can, in the short-term, help reduce the use of MDFTs. These include:
        • banning the use of consent searches of mobile devices,
        • abolishing the plain view exception for digital searches,
        • requiring easy-to-understand audit logs,
        • enacting robust data deletion and sealing requirements, and
        • requiring clear public logging of law enforcement use.

Coming Events

  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on 28 October regarding 47 U.S.C. 230 titled “Does Section 230’s Sweeping Immunity Enable Big Tech Bad Behavior?” with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.
  • On 29 October, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s nomination to be a Member of the Federal Communications Commission.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (14 October)

Further Reading

  • “The Man Who Speaks Softly—and Commands a Big Cyber Army” By Garrett Graff — WIRED. A profile of General Paul Nakasone, the leader of both the United States’ National Security Agency (NSA) and Cyber Command, who has operated mostly in the background during the tumultuous Trump Administration. He has likely set the template for both organizations going forward for some time. A fascinating read chock-full of insider details.
  • “Facebook Bans Anti-Vaccination Ads, Clamping Down Again” By Mike Isaac — The New York Times. In another sign of the social media platform responding to pressure in the United States and Europe, it was announced that anti-vaccination advertisements would no longer be accepted. This follows bans on Holocaust denial and QAnon material. Of course, this newest announcement is a classic Facebook half-step. Only paid advertisements will be banned, but users can continue to post about their opposition to vaccination.
  • “To Mend a Broken Internet, Create Online Parks” By Eli Pariser — WIRED. An interesting argument that a public online space maintained by the government, much like parks or public libraries, may be just what democracies across the globe need to roll back the tide of extremism and division.
  • “QAnon is tearing families apart” By Travis Andrews — The Washington Post. This is a terrifying tour through the fallout of the QAnon conspiracy, which sucks some people in so deeply that they are only marginally connected to reality.
  • “AT&T has trouble figuring out where it offers government-funded Internet” By John Brodkin — Ars Technica. So, yeah, about all that government cash given to big telecom companies that was supposed to bring more broadband coverage. Turns out, they definitely took the cash. The broadband service has been a much more elusive thing to verify. In one example, AT&T may or may not have provided service to 133,000 households in Mississippi after receiving funds from the Federal Communications Commission (FCC). Mississippi state authorities are arguing most of the service is non-existent. AT&T is basically saying it’s all a misunderstanding.

Other Developments

  • The California Attorney General’s Office (AG) has released yet another revision of the regulations necessary to implement the “California Consumer Privacy Act” (CCPA) (AB 375), and comments are due by 28 October. Of course, if Proposition 24 passes next month, the “California Privacy Rights Act” will largely replace the CCPA, requiring the drafting of even more regulations. Nonetheless, what everyone thought was the final set of CCPA regulations took effect on 14 August, but the notice from the Office of Administrative Law revealed that the AG had withdrawn four portions of the proposed regulations. In the new draft regulations, the AG explained:
    • Proposed section 999.306, subd. (b)(3), provides examples of how businesses that collect personal information in the course of interacting with consumers offline can provide the notice of right to opt-out of the sale of personal information through an offline method.
    • Proposed section 999.315, subd. (h), provides guidance on how a business’s methods for submitting requests to opt-out should be easy and require minimal steps. It provides illustrative examples of methods designed with the purpose or substantial effect of subverting or impairing a consumer’s choice to opt-out.
    • Proposed section 999.326, subd. (a), clarifies the proof that a business may require an authorized agent to provide, as well as what the business may require a consumer to do to verify their request.
    • Proposed section 999.332, subd. (a), clarifies that businesses subject to either section 999.330, section 999.331, or both of these sections are required to include a description of the processes set forth in those sections in their privacy policies.
  • Facebook announced an update to its “hate speech policy to prohibit any content that denies or distorts the Holocaust.” Facebook claimed:
    • Following a year of consultation with external experts, we recently banned anti-Semitic stereotypes about the collective power of Jews that often depicts them running the world or its major institutions.  
    • Today’s announcement marks another step in our effort to fight hate on our services. Our decision is supported by the well-documented rise in anti-Semitism globally and the alarming level of ignorance about the Holocaust, especially among young people. According to a recent survey of adults in the US aged 18-39, almost a quarter said they believed the Holocaust was a myth, that it had been exaggerated or they weren’t sure.
  • In a 2018 interview, Facebook CEO Mark Zuckerberg asserted:
    • I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong…
    • What we will do is we’ll say, “Okay, you have your page, and if you’re not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.” But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed.
    • He clarified in a follow up email:
      • I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.
      • Our goal with fake news is not to prevent anyone from saying something untrue — but to stop fake news and misinformation spreading across our services. If something is spreading and is rated false by fact checkers, it would lose the vast majority of its distribution in News Feed. And of course if a post crossed line into advocating for violence or hate against a particular group, it would be removed. These issues are very challenging but I believe that often the best way to fight offensive bad speech is with good speech.
  • The Government Accountability Office (GAO) issued an evaluation of the Trump Administration’s 5G Strategy and found more processes and actions are needed if this plan to vault the United States (U.S.) ahead of other nations is to come to fruition. Specifically, the “report examines the extent to which the Administration has developed a national strategy on 5G that address our six desirable characteristics of an effective national strategy.” The GAO identified the six desirable characteristics: (1) purpose, scope, and methodology; (2) problem definition and risk assessment; (3) goals, subordinate objectives, activities, and performance measures; (4) resources, investments, and risk management; (5) organizational roles, responsibilities, and coordination; and (6) integration and implementation. However, this assessment is necessarily limited, for National Security Council staff took the highly unusual approach of not engaging with the GAO, which may be another norm broken by the Trump Administration. The GAO stated “[t]he March 2020 5G national strategy partially addresses five of our desirable characteristics of an effective national strategy and does not address one,” as summarized in table 1 of the report.
    • The GAO explained:
      • According to National Telecommunications and Information Administration (NTIA) and Office of Science and Technology Policy (OSTP) officials, the 5G national strategy was intentionally written to be at a high level and as a result, it may not include all elements of our six desirable characteristics of national strategies. These officials stated that the 5G implementation plan required by the Secure 5G and Beyond Act of 2020 is expected to include specific details, not covered in the 5G national strategy, on the U.S. government’s response to 5G risks and challenges. The implementation plan is expected to align and correspond to the lines of effort in the 5G national strategy. NTIA officials told us that the implementation plan to the 5G national strategy would be finalized by the end of October 2020. However, the officials we spoke to were unable to provide details on the final content of the implementation plan such as whether the plan would include all elements of our six desirable characteristics of national strategies given that it was not final. National strategies and their implementation plans should include all elements of the six desirable characteristics to enhance their usefulness as guidance and to ensure accountability and coordinate investments. Until the administration ensures that the implementation plan includes all elements of the six desirable characteristics, the guidance the plan provides decision makers in allocating resources to address 5G risks and challenges will likely be limited.
  • The Irish Council for Civil Liberties (ICCL) wrote the European Commission (EC) to make the case that the United Kingdom (UK) does not deserve an adequacy decision after Brexit because of institutional and cultural weaknesses at the Information Commissioner’s Office (ICO). The ICCL argued the ICO has been one of the most ineffectual enforcers of the General Data Protection Regulation (GDPR), especially with respect to what the ICCL called the largest data infringement under the GDPR and the largest data breach of all time: Real-Time Bidding. The ICCL took the ICO to task for not following through on fining companies for GDPR violations and for having a tiny staff dedicated to data protection and technology issues. The ICCL invoked Article 45 of the GDPR to encourage the EC to deny the UK the adequacy decision it would need for the personal data of EU residents to be transferred to the UK.
  • In an unrelated development, the Information Commissioner’s Office (ICO) wrapped up its investigation into Facebook and Cambridge Analytica and detailed its additional findings in a letter to the Digital, Culture, Media and Sport Select Committee in the House of Commons. ICO head Elizabeth Denham asserted:
    • [w]e concluded that SCL Elections Ltd and Cambridge Analytica (SCL/CA) were purchasing significant volumes of commercially available personal data (at one estimate over 130 billion data points), in the main about millions of US voters, to combine it with the Facebook derived insight information they had obtained from an academic at Cambridge University, Dr Aleksandr Kogan, and elsewhere. In the main their models were also built from ‘off the shelf’ analytical tools and there was evidence that their own staff were concerned about some of the public statements the leadership of the company were making about their impact and influence.
    • From my review of the materials recovered by the investigation I have found no further evidence to change my earlier view that SCL/CA were not involved in the EU referendum campaign in the UK, beyond some initial enquiries made by SCL/CA in relation to UKIP data in the early stages of the referendum process. This strand of work does not appear to have then been taken forward by SCL/CA.
    • I have concluded my wider investigations of several organisations on both the remain and the leave side of the UK’s referendum about membership of the EU. I identified no significant breaches of the privacy and electronic marketing regulations and data protection legislation that met the threshold for formal regulatory action. Where the organisation continued in operation, I have provided advice and guidance to support better future compliance with the rules.
    • During the investigation concerns about possible Russian interference in elections globally came to the fore. As I explained to the sub-committee in April 2019, I referred details of reported possible Russia-located activity to access data linked to the investigation to the National Crime Agency. These matters fall outside the remit of the ICO. We did not find any additional evidence of Russian involvement in our analysis of material contained in the SCL / CA servers we obtained.
  • The United States Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) issued a joint cybersecurity advisory regarding “recently observed advanced persistent threat (APT) actors exploiting multiple legacy vulnerabilities in combination with a newer privilege escalation vulnerability.” CISA and the FBI revealed that these tactics have penetrated systems related to elections but claimed there has been no degrading of the integrity of electoral systems.
  • The agencies stated:
    • The commonly used tactic, known as vulnerability chaining, exploits multiple vulnerabilities in the course of a single intrusion to compromise a network or application. 
    • This recent malicious activity has often, but not exclusively, been directed at federal and state, local, tribal, and territorial (SLTT) government networks. Although it does not appear these targets are being selected because of their proximity to elections information, there may be some risk to elections information housed on government networks.
    • CISA is aware of some instances where this activity resulted in unauthorized access to elections support systems; however, CISA has no evidence to date that integrity of elections data has been compromised.
  • Canada’s Privacy Commissioner Daniel Therrien released the “2019-2020 Annual Report to Parliament on the Privacy Act and Personal Information Protection and Electronic Documents Act” and asserted:
    • Technologies have been very useful in halting the spread of COVID-19 by allowing essential activities to continue safely. They can and do serve the public good.
    • At the same time, however, they raise new privacy risks. For example, telemedicine creates risks to doctor-patient confidentiality when virtual platforms involve commercial enterprises. E-learning platforms can capture sensitive information about students’ learning disabilities and other behavioural issues.
    • As the pandemic speeds up digitization, basic privacy principles that would allow us to use public health measures without jeopardizing our rights are, in some cases, best practices rather than requirements under the existing legal framework.
    • We see, for instance, that the law has not properly contemplated privacy protection in the context of public-private partnerships, nor does it mandate app developers to consider Privacy by Design, or the principles of necessity and proportionality.
    • The law is simply not up to protecting our rights in a digital environment. Risks to privacy and other rights are heightened by the fact that the pandemic is fueling rapid societal and economic transformation in a context where our laws fail to provide Canadians with effective protection.
    • In our previous annual report, we shared our vision of how best to protect the privacy rights of Canadians and called on parliamentarians to adopt rights-based privacy laws.
    • We noted that privacy is a fundamental human right (the freedom to live and develop free from surveillance). It is also a precondition for exercising other human rights, such as equality rights in an age when machines and algorithms make decisions about us, and democratic rights when technologies can thwart democratic processes.
    • Regulating privacy is essential not only to support electronic commerce and digital services; it is a matter of justice.

Coming Events

  • The European Union Agency for Cybersecurity (ENISA), Europol’s European Cybercrime Centre (EC3) and the Computer Emergency Response Team for the EU Institutions, Bodies and Agencies (CERT-EU) will hold the 4th annual IoT Security Conference series “to raise awareness on the security challenges facing the Internet of Things (IoT) ecosystem across the European Union:”
    • Artificial Intelligence – 14 October at 15:00 to 16:30 CET
    • Supply Chain for IoT – 21 October at 15:00 to 16:30 CET
  • The House Intelligence Committee will conduct a virtual hearing titled “Misinformation, Conspiracy Theories, and ‘Infodemics’: Stopping the Spread Online.”
  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • On 29 October, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • The Senate Commerce, Science, and Transportation Committee will reportedly hold a hearing on 29 October regarding 47 U.S.C. 230 with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.



New Round of CCPA Regulations Released For Comment

New CCPA regulations seem tailored to address problems turned up by Consumer Reports on how difficult it can be to block businesses from using a person’s personal information.

The California Attorney General’s Office (OAG) has released yet another revision of the regulations necessary to implement the “California Consumer Privacy Act” (CCPA) (AB 375), and comments are due by 28 October. Of course, if Proposition 24 passes next month, the “California Privacy Rights Act” will largely replace the CCPA, requiring the drafting of even more regulations. Nonetheless, what everyone thought was the final set of CCPA regulations took effect on 14 August, but the notice from the Office of Administrative Law revealed that the OAG had withdrawn four portions of the proposed regulations. In the new draft regulations, the OAG explained:

The new Section 999.306, pertaining to a business’s obligations to notify people of their right to opt out of the sale of their personal information, is expanded to encompass offline data collection. Specifically, it would require any non-online businesses that collect information to provide notification that people may opt out of any selling of their information. Of course, under the CCPA, people may not merely opt out of having their information collected. The privacy-minded must, instead, request that their personal information be deleted. And it bears noting that the CCPA defines sell to include virtually any transmission of personal information:

“Sell,” “selling,” “sale,” or “sold,” means selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.

And so, people may exercise their right to stop businesses from selling or sharing their personal information even though the business itself could still collect and use it. In any event, if a recent report is accurate and foretells the future, this may become a meaningless right. Consumer Reports released a study specifically on this right. For those people (like me) who expected a significant number of businesses to make it hard for people to exercise their rights, this study confirms the suspicion. Consumer Reports noted more than 40% of data brokers had hard-to-find links or extra, complicated steps for people to tell them not to sell their personal information.

Interestingly, the next revised section of the CCPA regulations is designed to address the Consumer Reports findings, suggesting the OAG was well aware of this problem. Section 999.315, subd. (h), requires:

A business’s methods for submitting requests to opt-out shall be easy for consumers to execute and shall require minimal steps to allow the consumer to opt-out. A business shall not use a method that is designed with the purpose or has the substantial effect of subverting or impairing a consumer’s choice to opt-out.

The OAG provided a number of illustrative examples of how businesses should effectuate this requirement, most of which plainly take aim at tricks and tactics meant to fool or frustrate people.
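
As a thought experiment on what a “minimal steps” opt-out could look like in practice, here is a sketch, assuming a Node.js server, a TypeScript toolchain, and an in-memory preference store; all of it, including the x-visitor-id header, is hypothetical scaffolding rather than anything the regulations prescribe. The point is simply that one request, or the browser-sent Global Privacy Control header, suffices to record the preference, with none of the confirmation hoops the regulation targets.

```typescript
import { createServer, IncomingMessage, ServerResponse } from "http";

// Hypothetical in-memory store of opted-out visitors; a real business would
// persist this preference and propagate it to downstream data recipients.
const optedOut = new Set<string>();

// Hypothetical visitor identifier carried in a request header.
function visitorId(req: IncomingMessage): string | undefined {
  const id = req.headers["x-visitor-id"];
  return typeof id === "string" ? id : undefined;
}

createServer((req: IncomingMessage, res: ServerResponse) => {
  const id = visitorId(req);
  // One step, no confirmation screens: a single POST to /opt-out, or the
  // Global Privacy Control header on any request, records the opt-out.
  const gpc = req.headers["sec-gpc"] === "1";
  const explicit = req.method === "POST" && req.url === "/opt-out";
  if (id && (gpc || explicit)) {
    optedOut.add(id);
  }
  res.end(id && optedOut.has(id) ? "opted out of sale" : "ok");
}).listen(8080);
```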

The revised Section 999.326 provides more detail on how businesses may respond to a person’s use of an authorized agent in submitting requests to know or delete. A business may demand proof of representation. I suspect the use of a service or product to communicate a person’s privacy preferences with businesses may fall into the category of authorized agents. It seems quite likely that some businesses will always require such verification as a means of adding friction to the process. But still, I’m guessing the OAG may have had concerns about unauthorized agents trying to access people’s personal information.

Finally, revised Section 999.332 makes clear that any businesses collecting the personal information of children under the age of 13 or teenagers between the ages of 13 and 15 must include in its privacy policies the provisions of the regulations effectuating the heightened privacy rights afforded these two groups under the CCPA.


Image from Pixabay

Further Reading, Other Developments, and Coming Events (7 October)

Coming Events

  • The European Union Agency for Cybersecurity (ENISA), Europol’s European Cybercrime Centre (EC3) and the Computer Emergency Response Team for the EU Institutions, Bodies and Agencies (CERT-EU) will hold the 4th annual IoT Security Conference series “to raise awareness on the security challenges facing the Internet of Things (IoT) ecosystem across the European Union:”
    • Artificial Intelligence – 14 October at 15:00 to 16:30 CET
    • Supply Chain for IoT – 21 October at 15:00 to 16:30 CET
  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • On 29 October, the Federal Trade Commission (FTC) will hold a workshop titled “Green Lights & Red Flags: FTC Rules of the Road for Business” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”

Other Developments

  • Consumer Reports released a study of the “California Consumer Privacy Act” (CCPA) (AB 375), specifically of the Do-Not-Sell right California residents were given under the newly effective privacy statute. For those people (like me) who expected a significant number of businesses to make it hard for people to exercise their rights, this study confirms that suspicion. Consumer Reports noted more than 40% of data brokers had hard-to-find links or extra, complicated steps for people to tell them not to sell their personal information.
    • In “CCPA: Are Consumers Digital Rights Protected?,” Consumer Reports used this methodology:
    • Consumer Reports’ Digital Lab conducted a mixed methods study to examine whether the new CCPA is working for consumers. This study focused on the Do-Not-Sell (DNS) provision in the CCPA, which gives consumers the right to opt out of the sale of their personal information to third parties through a “clear and conspicuous link” on the company’s homepage. As part of the study, 543 California residents made DNS requests to 214 data brokers listed in the California Attorney General’s data broker registry. Participants reported their experiences via survey.
    • Consumer Reports found:
      • Consumers struggled to locate the required links to opt out of the sale of their information. For 42.5% of sites tested, at least one of three testers was unable to find a DNS link. All three testers failed to find a “Do Not Sell” link on 12.6% of sites, and in several other cases one or two of three testers were unable to locate a link.
        • Follow-up research focused on the sites in which all three testers did not find the link revealed that at least 24 companies on the data broker registry do not have the required DNS link on their homepage.
        • All three testers were unable to find the DNS links for five additional companies, though follow-up research revealed that the companies did have DNS links on their homepages. This also raises concerns about compliance, since companies are required to post the link in a “clear and conspicuous” manner.
      • Many data brokers’ opt-out processes are so onerous that they have substantially impaired consumers’ ability to opt out, highlighting serious flaws in the CCPA’s opt-out model.
        • Some DNS processes involved multiple, complicated steps to opt out, including downloading third-party software.
        • Some data brokers asked consumers to submit information or documents that they were reluctant to provide, such as a government ID number, a photo of their government ID, or a selfie.
        • Some data brokers confused consumers by requiring them to accept cookies just to access the site.
        • Consumers were often forced to wade through confusing and intimidating disclosures to opt out.
        • Some consumers spent an hour or more on a request.
        • At least 14% of the time, burdensome or broken DNS processes prevented consumers from exercising their rights under the CCPA.
      • At least one data broker used information provided for a DNS request to add the user to a marketing list, in violation of the CCPA.
      • At least one data broker required the user to set up an account to opt out, in violation of the CCPA.
      • Consumers often didn’t know if their opt-out request was successful. Neither the CCPA nor the CCPA regulations require companies to notify consumers when their request has been honored. About 46% of the time, consumers were left waiting or unsure about the status of their DNS request.
      • About 52% of the time, the tester was “somewhat dissatisfied” or “very dissatisfied” with the opt-out processes.
      • On the other hand, some consumers reported that it was quick and easy to opt out, showing that companies can make it easier for consumers to exercise their rights under the CCPA. About 47% of the time, the tester was “somewhat satisfied” or “very satisfied” with the opt-out process.
    • Consumer Reports recommended:
      • The Attorney General should vigorously enforce the CCPA to address noncompliance.
      • To make it easier to exercise privacy preferences, consumers should have access to browser privacy signals that allow them to opt out of all data sales in one step.
      • The AG should more clearly prohibit dark patterns, which are user interfaces that subvert consumer intent, and design a uniform opt-out button. This will make it easier for consumers to locate the DNS link on individual sites.
      • The AG should require companies to notify consumers when their opt-out requests have been completed, so that consumers can know that their information is no longer being sold.
      • The legislature or AG should clarify the CCPA’s definitions of “sale” and “service provider” to more clearly cover data broker information sharing.
      • Privacy should be protected by default. Rather than place the burden on consumers to exercise privacy rights, the law should require reasonable data minimization, which limits the collection, sharing, retention, and use to what is reasonably necessary to operate the service.
  • Two agencies of the Department of the Treasury have issued guidance on the advisability and legality of paying ransoms to individuals or entities under United States (U.S.) sanctions at a time when ransomware attacks are on the rise. It bears note that a person or entity in the U.S. may face criminal and civil liability for paying a sanctioned ransomware entity even if it did not know the entity was sanctioned. One of the agencies reasoned that paying ransoms to such parties is contrary to U.S. national security policy and only encourages more ransomware attacks.
    • The Office of Foreign Assets Control (OFAC) issued an “advisory to highlight the sanctions risks associated with ransomware payments related to malicious cyber-enabled activities.” OFAC added:
      • Demand for ransomware payments has increased during the COVID-19 pandemic as cyber actors target online systems that U.S. persons rely on to continue conducting business. Companies that facilitate ransomware payments to cyber actors on behalf of victims, including financial institutions, cyber insurance firms, and companies involved in digital forensics and incident response, not only encourage future ransomware payment demands but also may risk violating OFAC regulations. This advisory describes these sanctions risks and provides information for contacting relevant U.S. government agencies, including OFAC, if there is a reason to believe the cyber actor demanding ransomware payment may be sanctioned or otherwise have a sanctions nexus.
    • The Financial Crimes Enforcement Network (FinCEN) published its “advisory to alert financial institutions to predominant trends, typologies, and potential indicators of ransomware and associated money laundering activities.” The advisory provides information on:
      • (1) the role of financial intermediaries in the processing of ransomware payments;
      • (2) trends and typologies of ransomware and associated payments;
      • (3) ransomware-related financial red flag indicators; and
      • (4) reporting and sharing information related to ransomware attacks.
  • The Government Accountability Office (GAO) found uneven implementation at seven federal agencies in meeting the Office of Management and Budget’s (OMB) requirements to use the category management initiative when buying information technology (IT). This report follows a long line of assessments finding that the federal government is not spending its billions of dollars invested in IT to maximum effect. The category management initiative was launched two Administrations ago as a means of driving greater efficiency and savings for the nearly $350 billion the U.S. government spends annually on goods and services, much of which could be bought in large quantities instead of piecemeal by agency, as is now the case.
    • The chair and ranking member of the House Oversight Committee and other Members had asked the GAO “to conduct a review of federal efforts to reduce IT contract duplication and/or waste” specifically “to determine the extent to which (1) selected agencies’ efforts to prevent, identify, and reduce duplicative or wasteful IT contracts were consistent with OMB’s category management initiative; and (2) these efforts were informed by spend analyses.” The GAO ended up looking at the Departments of Agriculture (USDA), Defense (DOD), Health and Human Services (HHS), Homeland Security (DHS), Justice (DOJ), State (State), and Veterans Affairs (VA).
    • The GAO found:
      • The seven agencies in our review varied in their implementation of OMB’s category management activities that contribute to identifying, preventing, and reducing duplicative IT contracts. Specifically, most of the agencies fully implemented the two activities to identify a Senior Accountable Official and develop processes and policies for implementing category management efforts, and to engage their workforces in category management training. However, only about half the agencies fully implemented the activities to reduce unaligned IT spending, including increasing the use of Best in Class contract solutions, and share prices paid, terms, and conditions for purchased IT goods and services. Agencies cited several reasons for their varied implementation, including that they were still working to define how to best integrate category management into the agency.
      • Most of the agencies used spend analyses to inform their efforts to identify and reduce duplication, and had developed and implemented strategies to address the identified duplication, which, agency officials reported, resulted in millions in actual and anticipated future savings. However, two of these agencies did not make regular use of the spend analyses.
      • Until agencies fully implement the activities in OMB’s category management initiative, and make greater use of spend analyses to inform their efforts to identify and reduce duplicative contracts, they will be at increased risk of wasteful spending. Further, agencies will miss opportunities to identify and realize savings of potentially hundreds of millions of dollars.
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) provided “specific Chinese government and affiliated cyber threat actor tactics, techniques, and procedures (TTPs) and recommended mitigations to the cybersecurity community to assist in the protection of our Nation’s critical infrastructure.” CISA took this action “[i]n light of heightened tensions between the United States and China.”
    • CISA asserted
      • According to open-source reporting, offensive cyber operations attributed to the Chinese government targeted, and continue to target, a variety of industries and organizations in the United States, including healthcare, financial services, defense industrial base, energy, government facilities, chemical, critical manufacturing (including automotive and aerospace), communications, IT, international trade, education, videogaming, faith-based organizations, and law firms.
    • CISA recommends organizations take the following actions:
      • Adopt a state of heightened awareness. Minimize gaps in personnel availability, consistently consume relevant threat intelligence, and update emergency call trees.
      • Increase organizational vigilance. Ensure security personnel monitor key internal security capabilities and can identify anomalous behavior. Flag any known Chinese indicators of compromise (IOCs) and TTPs for immediate response.
      • Confirm reporting processes. Ensure personnel know how and when to report an incident. The well-being of an organization’s workforce and cyber infrastructure depends on awareness of threat activity. Consider reporting incidents to CISA to help serve as part of CISA’s early warning system (see the Contact Information section below).
      • Exercise organizational incident response plans. Ensure personnel are familiar with the key steps they need to take during an incident. Do they have the accesses they need? Do they know the processes? Are various data sources logging as expected? Ensure personnel are positioned to act in a calm and unified manner.
  • The Supreme Court of the United States (SCOTUS) declined to hear a case on an Illinois revenge porn law that the Illinois State Supreme Court upheld, finding it did not impinge on the defendant’s First Amendment rights. Bethany Austin was charged with a felony under an Illinois law barring the nonconsensual dissemination of private sexual pictures when she printed and distributed pictures of her ex-fiancé’s lover. Because SCOTUS decided not to hear the case, the Illinois Supreme Court’s ruling stands, and the Illinois law and others like it remain in effect.
    • The Illinois State Supreme Court explained the facts of the case:
      • Defendant (aka Bethany Austin) was engaged to be married to Matthew, after the two had dated for more than seven years. Defendant and Matthew lived together along with her three children. Defendant shared an iCloud account with Matthew, and all data sent to or from Matthew’s iPhone went to their shared iCloud account, which was connected to defendant’s iPad. As a result, all text messages sent by or to Matthew’s iPhone automatically were received on defendant’s iPad. Matthew was aware of this data sharing arrangement but took no action to disable it.
      • While Matthew and defendant were engaged and living together, text messages between Matthew and the victim, who was a neighbor, appeared on defendant’s iPad. Some of the text messages included nude photographs of the victim. Both Matthew and the victim were aware that defendant had received the pictures and text messages on her iPad. Three days later, Matthew and the victim again exchanged several text messages. The victim inquired, “Is this where you don’t want to message [because] of her?” Matthew responded, “no, I’m fine. [S]omeone wants to sit and just keep watching want [sic] I’m doing I really do not care. I don’t know why someone would wanna put themselves through that.” The victim replied by texting, “I don’t either. Soooooo baby ….”
      • Defendant and Matthew cancelled their wedding plans and subsequently broke up. Thereafter, Matthew began telling family and friends that their relationship had ended because defendant was crazy and no longer cooked or did household chores.
      • In response, defendant wrote a letter detailing her version of events. As support, she attached to the letter four of the naked pictures of the victim and copies of the text messages between the victim and Matthew. When Matthew’s cousin received the letter along with the text messages and pictures, he informed Matthew.
      • Upon learning of the letter and its enclosures, Matthew contacted the police. The victim was interviewed during the ensuing investigation and stated that the pictures were private and only intended for Matthew to see. The victim acknowledged that she was aware that Matthew had shared an iCloud account with defendant, but she thought it had been deactivated when she sent him the nude photographs.
    • In her petition for SCOTUS to hear her case, Austin asserted:
      • Petitioner Bethany Austin is being prosecuted under Illinois’ revenge porn law even though she is far from the type of person such laws were intended to punish. These laws proliferated rapidly in recent years because of certain reprehensible practices, such as ex-lovers widely posting images of their former mates to inflict pain for a bad breakup, malicious stalkers seeking to damage an innocent person’s reputation, or extortionists using intimate photos to collect ransom. Austin did none of those things, yet is facing felony charges because she tried to protect her reputation from her former fiancé’s lies about the reason their relationship ended.
      • The Illinois Supreme Court rejected Petitioner’s constitutional challenge to the state revenge porn law only because it ignored well-established First Amendment rules: It subjected the law only to intermediate, rather than strict scrutiny, because it incorrectly classified a statute that applies only to sexual images as content neutral; it applied diminished scrutiny because the speech at issue was deemed not to be a matter of public concern; and it held the law need not require a showing of malicious intent to justify criminal penalties, reasoning that such intent can be inferred from the mere fact that the specified images were shared. Each of these conclusions contradicts First Amendment principles recently articulated by this Court, and also is inconsistent with decisions of various state courts, including the Vermont Supreme Court.
    • Illinois argued in its brief to SCOTUS:
      • The nonconsensual dissemination of private sexual images exposes victims to a wide variety of serious harms that affect nearly every aspect of their lives. The physical, emotional, and economic harms associated with such conduct are well-documented: many victims are exposed to physical violence, stalking, and harassment; suffer from emotional and psychological harm; and face limited professional prospects and lowered income, among other repercussions. To address this growing problem and protect its residents from these harms, Illinois enacted section 11-23.5, 720 ILCS 5/11-23.5. Petitioner—who was charged with violating section 11-23.5 after she disseminated nude photos of her fiancé’s paramour without consent—asks this Court to review the Illinois Supreme Court’s decision rejecting her First Amendment challenge.
  • Six U.S. Agency for Global Media (USAGM) whistleblowers have filed a complaint concerning “retaliatory actions” with the Office of the Inspector General (OIG) at the Department of State and the Office of Special Counsel, arguing the newly installed head of USAGM punished them for making complaints through proper channels about his actions. This is the latest development at the agency. Recently, the United States Court of Appeals for the District of Columbia Circuit enjoined USAGM from “taking any action to remove or replace any officers or directors of the Open Technology Fund (OTF),” pending the outcome of the suit, which is being expedited.
  • Additionally, USAGM CEO and Board Chair Michael Pack is being accused in two different letters of seeking to compromise the integrity and independence of two organizations he oversees. There have been media accounts of the Trump Administration’s remaking of USAGM in ways critics contend threaten the mission and effectiveness of the OTF, a U.S. government-funded non-profit designed to help dissidents and endangered populations throughout the world. The head of the OTF has been removed, drawing the ire of Members of Congress, and other changes have been implemented that are counter to the organization’s mission. Likewise, there are allegations that politically motivated policy changes seek to remake the Voice of America (VOA) into a less independent entity.
  • The whistleblowers claimed in their complaint:
    • Each of the Complainants made protected disclosures –whether in the form of OIG complaints, communications with USAGM leadership, and/or communications with appropriate Congressional committees–regarding their concerns about official actions primarily taken by Michael Pack, who has been serving as the Chief Executive Officer for USAGM since June 4, 2020. The Complainants’ concerns involve allegations that Mr. Pack has engaged in conduct that violates federal law and/or USAGM regulations, and that constitutes an abuse of authority and gross mismanagement. Moreover, each of the Complainants was targeted for retaliatory action by Mr. Pack because of his belief that they held political views opposed to his, which is a violation of the Hatch Act.
    • Each of the Complainants was informed by letter, dated August 12, 2020, that their respective accesses to classified information had been suspended pending further investigation. Moreover, they were all concurrently placed on administrative leave. In each of the letters to the Complainants, USAGM claimed that the Complainants had been improperly granted security clearances, and that the Complainants failed to take remedial actions to address personnel and security concerns prior to permitting other USAGM employees to receive security clearances. In addition, many or all of the Complainants were earlier subject to retaliatory adverse personnel actions in the form of substantial limitations on their ability to carry out their work responsibilities (i.e., a significant change in duties and responsibilities), which limitations were imposed without following appropriate personnel procedures.

Further Reading

  • “Big Tech Was Their Enemy, Until Partisanship Fractured the Battle Plans” By Cecilia Kang and David McCabe — The New York Times. There’s a bit of court intrigue in this piece about how Republicans declined to join Democrats in signing on to the antitrust report released this week, sapping its recommendations on how to address Big Tech of much of their power.
  • “Facebook Keeps Data Secret, Letting Conservative Bias Claims Persist” By Bobby Allyn — NPR. Still no evidence of an anti-conservative bias at Facebook, according to experts, and the incomplete data available seem to indicate conservative content may be more favored by users than liberal content. Facebook does not release data that would settle the question, however, and there are all sorts of definitional questions that need answers before the issue could be definitively settled. Some food for thought: a significant percentage of link sharing may be driven by bots rather than humans.
  • “News Corp. changes its tune on Big Tech” By Sara Fischer — Axios. After beating the drum for years about the effect of Big Tech on journalism, the parent company of the Wall Street Journal and other media outlets is much more conciliatory these days. It may have something to do with all the cash the Googles and Facebooks of the world are proposing to throw at some media outlets for their content. It remains to be seen how this change in tune will affect the Australian Competition and Consumer Commission’s (ACCC) proposal to ensure that media companies are compensated for articles and content online platforms use. In late July the ACCC released for public consultation a draft of “a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms, specifically Google and Facebook.”
  • “Silicon Valley Opens Its Wallet for Joe Biden” By Daniel Oberhaus — WIRED. In what will undoubtedly be adduced as evidence that Silicon Valley is a liberal haven, this article claims, according to federal elections data for this election cycle, that Alphabet, Amazon, Apple, Facebook, Microsoft, and Oracle employees have contributed $4,787,752 to former Vice President Joe Biden and $239,527 to President Donald Trump. This covers only contributions of $200 and higher, so these data are likely incomplete.
  • “Facebook bans QAnon across its platforms” By Ben Collins and Brandy Zadrozny — NBC News. The social media giant has escalated its enforcement and will remove all content related to the conspiracy group and theory known as QAnon. However, believers have been adaptable and agile in dropping certain terms and using methods to evade detection. Some experts say Facebook’s actions are too little, too late, as these beliefs are widespread and are fueling a significant amount of violence and unrest in the real world.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Katie White from Pixabay

Senate Commerce Hearing On Privacy

Senate stakeholders appear no closer to resolving the two key impasses in privacy legislation: preemption and a private right of action.

A week after the “Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act” (S.4626) was introduced, the Senate Commerce, Science, and Transportation Committee held a hearing titled “Revisiting the Need for Federal Data Privacy Legislation” with four former Federal Trade Commission (FTC) Commissioners and California’s Attorney General. Generally speaking, Members used the hearing to elicit testimony on the aspects of a privacy bill they would like to see. The chair asked the witnesses about the need for preemption and the benefits of one national privacy standard, while the ranking member focused on the need for people to be able to sue as a means of supplementing the limited capacity of the FTC and state attorneys general to police violations of a new law.

The SAFE DATA Act (see here for more analysis) was introduced last week by Committee Chair Roger Wicker (R-MS), Senate Majority Whip and Communications, Technology, Innovation, and the Internet Subcommittee Chair John Thune (R-SD), Transportation and Safety Subcommittee Chair Deb Fischer (R-NE), and Senator Marsha Blackburn (R-TN). Wicker had put out for comment a discussion draft, the “Consumer Data Privacy Act of 2019” (CDPA) (see here for analysis), in November 2019, shortly after the Ranking Member on the committee, Senator Maria Cantwell (D-WA), and other Democrats had introduced their privacy bill, the “Consumer Online Privacy Rights Act” (COPRA) (S.2968) (see here for more analysis).

Chair Roger Wicker (R-MS) stated “[d]uring this Congress, protecting consumer data privacy has been a primary focus of this Committee…[and] [w]e held one of the first hearings of my chairmanship to examine how Congress should address this issue.” He said “[a]t that time, we heard that individuals needed rigorous privacy protections to ensure that businesses do not misuse their data…[and] [w]e heard that individuals need to be able to access, control, and delete the data that companies have collected on them.” Wicker stated “[w]e heard that businesses need a consistent set of rules applied reasonably and fairly to allow for continued innovation and growth in the digital economy…[a]nd we heard that the FTC needs enhanced authority and resources in order to oversee and enforce privacy protections.”

Wicker stated “[i]n the nearly two years since, members of this Committee have done a great deal of work developing legislation to address data privacy.” He said “[w]hile we worked, the world of data privacy did not stand still…[and] [t]he state of California implemented its California Consumer Privacy Act (CCPA) and began enforcement this past summer.” Wicker contended “[l]ong-held concerns remain that the CCPA is difficult to understand and comply with and could become worse if the law is further expanded and amended through an upcoming ballot measure this fall.” He claimed “[t]he European Union has continued to enforce the General Data Protection Regulation (GDPR)…[and] [t]he EU’s main focus appears to be going after the biggest American companies rather than providing clear guidance for all businesses with European citizens as customers.”

Wicker noted

The picture in Europe is even more complex following the recent court ruling invalidating the EU-U.S. Privacy Shield framework, which governed how U.S. companies treated the data of EU citizens. Though the issues in that case were more related to national security than consumer privacy, the result was yet more uncertainty about the future of trans-Atlantic data flows. I look forward to holding a hearing before the end of the year on the now-invalidated Privacy Shield.

Wicker asserted “[t]he biggest new development that has impacted data privacy – as it has impacted so many facets of our life – is the COVID-19 pandemic, which has resulted in millions of Americans working from home.” He said “[t]he increased use of video conferencing, food delivery apps, and other online services increases the potential for privacy violations…[and] [t]he need to collect a great deal of data for contact tracing and to track the spread of the disease likewise raises privacy concerns if done improperly.”

Wicker declared that “[f]or all of these reasons and more, the need for a uniform, national privacy law is greater than ever…[and] [l]ast week I introduced the SAFE DATA Act.” He argued

The SAFE DATA Act would provide Americans with more choice and control over their data. It would require businesses to be more transparent and hold them to account for their data practices. It would strengthen the FTC’s ability to be an effective enforcer of new data privacy rules. And it would establish a nationwide standard so that businesses know how to comply no matter where their customers live, and so that consumers know their data is safe wherever the company that holds their data is located.

Wicker stated that “[t]he SAFE DATA Act is the result of nearly two years of discussions with advocacy groups, state and local governments, nonprofits, academics, and businesses of every size and from every sector of the economy – my thanks to all of those.” He claimed “[t]he diversity of voices was essential in crafting a law that would work consistently and fairly for all Americans.” Wicker contended “we have a chance to pass a strong national privacy law that achieves the goals of privacy advocates with real consensus among members of both parties and a broad array of industry members.”

Ranking Member Maria Cantwell (D-WA) stated “[p]rotecting Americans’ privacy rights is critical, and that has become even sharper in the focus of the COVID-19 crisis, where so much of our lives have moved online.” She noted “[t]he American people deserve strong privacy protections for their personal data, and Congress must work to act in establishing these protections.” Cantwell said “[l]ast year, along with Senators [Brian] Schatz (D-HI), [Amy] Klobuchar (D-MN), and [Ed] Markey (D-MA), I introduced the “Consumer Online Privacy Rights Act“ (COPRA) (S.2968).” She claimed “[t]he bill is pretty straightforward…[and] provides foundational privacy rights to consumers, creates rules to prevent abuse of consumer data, and holds companies accountable with real enforcement measures.” Cantwell said “[u]nfortunately, other legislation throughout the legislative process, I think has taken different approaches…[and] [t]hese bills allow companies to maintain the status quo, burying important disclosure information in long contracts, hiding where consumer data is sold, and changing the use of consumer data without their consent.” She concluded that “obviously, I believe these loopholes are unacceptable.”

Cantwell argued

Most strikingly, these bills would actually weaken consumer rights around the country by preempting stronger state laws. Attorney General Becerra is with us today and I appreciate him being able to join us, because this would have an impact on a broad preemption, I should say, would have an impact on 40 million Californians who are protected by your privacy law and the privacy protections in your state. So we need to resolve this issue. But we can’t do so at the expense of states who have already taken action to protect the privacy of their citizens.

Cantwell stated that “[f]inally, we also know that individuals must have the right to their day in court, when privacy is violated, even with the resources and expertise and enforcers like the FTC–many of you, I know, know these rules well, and Attorneys General–we will never be able to fully police the thousands and thousands of companies collecting consumer data if you are the only cop on the beat.” She said “I’d like to go further, but there are many issues that we’re going to address here. I want to say that the legislation also needs to cover the complex issues of, you know, health and safety standards and important issues.” Cantwell stated “[t]he Supreme Court discussion that we’re now having, I think will launch us into a very broad discussion of privacy rights and where they exist within the Constitution.” She explained “[j]ust this recent court ruling that put at risk the little known but vital important provision of the FTC Act 13b, which allows the FTC to go to court to obtain refunds and other redress for consumers–the 10 billion dollars for example in the Volkswagen case–without this provision, the core mission of the FTC would be crippled.”

Cantwell asserted “I think all of these issues, and the important issues of privacy rights, should and will have a fair discussion, if we can have time to discuss them in this process…[and] I believe the issue of how the government interferes in our own privacy rights, whether the government oversteps our privacy rights, is a major issue to be discussed by this body in the next month.” She added “I don’t believe in some of the tactics that government has used to basically invade the privacy rights of individuals.”

Cantwell stated that “next week the minority will be introducing a report that we’ve been working on about the value of local journalism…[and] I believe the famous economist who said that markets need perfect information.” She argued “[w]e’re talking about the fact that if markets are distorted by information, then that really cripples our economy…[and] I think local journalism in a COVID crisis is proving that it’s valued information with the correct information on our local communities, and I think that this is something we need to take into consideration as we consider privacy laws and we consider these issues moving forward.”

Former FTC Commissioner and Microsoft’s Corporate Vice President, Chief Privacy Officer, and Deputy General Counsel for Global Privacy and Regulatory Affairs Julie Brill explained that “Microsoft believes that comprehensive federal privacy legislation should support four key principles: consumer empowerment, transparency, corporate responsibility, and strong enforcement:

  • Consumer Empowerment. Empower consumers with the tools they need to control their personal information, including the ability to make informed choices about the data they provide to companies, to understand what data companies know about them, to obtain a copy of their data, to make sure the data is accurate and up to date, and to delete their data. Americans care deeply about having meaningful control over their data. In just the past nine months, from January 1, 2020 to September 18, 2020, Microsoft received over 14 and a half million unique global visitors to its privacy dashboard, where they were able to exercise their ability to control their data. This continued engagement with the control tools we provide included over 4 and a half million visitors from the United States, representing the greatest level of engagement from any single country.
  • Transparency. Require companies to be transparent about their data collection and use practices, by providing people with concise and understandable information about what personal information is collected from them, and how that information is used and shared.
  • Corporate Responsibility. Place direct requirements on companies to ensure that they collect and use consumers’ data in ways that are responsible, and demonstrate that they are worthy stewards of that data.
  • Strong Enforcement. Provide for strong enforcement through regulators, and ensure they have sufficient resources to enforce the legal requirements that organizations must uphold, but also to be well-grounded in the data collection and analysis technologies that are used in the modern digital economy. These are the key elements that are required to build a robust and lasting U.S. privacy law.

George Washington University Law School Professor, King’s College Visiting Professor, and United Kingdom Competition and Markets Authority Non-Executive Director and former FTC Chair William E. Kovacic said:

As Congress defines the substantive commands of a new omnibus law, I suggest a close review of the FTC’s experience in implementing the Telemarketing Sales Rule. To my mind, this experience offers several insights into the design of privacy protections:

  • In addition to unfair or deceptive acts and practices, the definition of forbidden behavior should encompass abusive conduct, as the FTC has developed that concept in the elaboration of the Telemarketing Sales Rule (TSR). I single out 2003 TSR amendments, which established the National Do Not Call Registry, popularly known as the Do Not Call Rule (DNC Rule). In applying the concept of abusive conduct, the DNC Rule used a definition of harm that reached beyond quantifiable economic costs of the challenged practice (i.e., the time lost and inconvenience associated with responding to unwanted telephone calls to the home). The DNC Rule’s theory of harm focused on the fact that, to many citizens, telemarketing calls were annoying, irritating intrusions into the privacy of the home. A new privacy regime could build on this experience and allow privacy regulators, by rulemaking and by law enforcement, to address comparable harms and to create standards that map onto common expectations for data protection and security.
  • The coverage of the omnibus statute should be comprehensive. Privacy authorities should have power to apply the law to all commercial actors (i.e., with no exclusions for specific economic sectors) and to not-for-profit institutions such as charitable bodies and universities.
  • The omnibus law should clarify that its restrictions on the accumulation and use of data about individuals apply to their status as consumers and employees. Since the late 1990s, the FTC at times has engaged in debatable interpretations of its authority under Section 5 of the Federal Trade Commission Act to assure foreign jurisdictions that it has authority to enforce promises regarding the collection and transfer by firms of information about their employees.

Kovacic stated “[w]ith this general framework in mind, my testimony proposes that an omnibus privacy law should enhance the institutional arrangements for administering a new substantive privacy framework.” This statement:

  • Sets out criteria to assess the performance of the entities implementing U.S. privacy policy, and to determine how to allocate tasks to institutions responsible for policy development and law enforcement.
  • Suggests approaches to increase the coherence and effectiveness of the US privacy system and to make the United States a more effective participant in the development of international privacy policy.
  • Considers whether the FTC, with an enhanced mandate, should serve as the national privacy regulator, or whether the FTC’s privacy operations should be spun off to provide the core of a new privacy institution.

Kovacic explained

This statement concludes that the best solution is to take steps that would enhance the FTC’s role by (a) eliminating gaps in its jurisdiction, (b) expanding its capacity to promote cooperation among agencies with privacy portfolios and to encourage convergence upon superior policy norms, and (c) providing resources necessary to fulfill these duties. The proposal for an enlarged FTC role considers two dimensions of privacy regulation. The first is what might be called the “consumer-facing” elements of a privacy regime. My testimony deals mainly with the relationship between consumers and enterprises (for-profit firms and not-for-profit institutions, such as universities) that provide them with goods and services. My testimony does not address the legal mechanisms that protect privacy where the actors are government institutions. Thus, I do not examine the appropriate framework for devising and implementing policies that govern data collection and record-keeping responsibilities of federal agencies, such as bodies that conduct surveillance for national security purposes.

21st Century Privacy Coalition Co-Chair and Former FTC Chair Jon Leibowitz asserted:

  • Congress does not need to reinvent the wheel. Many of the elements I would propose are consistent with recommendations made by my former agency in its 2012 Privacy Report, drafted after years of work and engagement with stakeholders of all kinds. Technology will continue to change, but the basic principles enshrined in the Report remain the most effective way to give consumers the protections they deserve.
  • My view, and that of the Report, is that national privacy legislation must give consumers statutory rights to control how their personal information is used and shared, and provide increased visibility into companies’ practices when it comes to managing consumer data. Such an approach should provide consumers with easy-to-understand privacy choices based upon the nature of the information itself—its sensitivity, the risk of consumer harm if such information is the subject of an unauthorized disclosure—and the context in which it is collected. For example, consumers expect sensitive information—including health and financial data, precise geolocation, Social Security numbers, and children’s information—to receive heightened protection to ensure confidentiality.
  • Therefore, a muscular privacy law should require affirmative express consent for the use and sharing of consumers’ sensitive personally identifiable information, and opt-out rights for non-sensitive information. But consumers do not expect to consistently provide affirmative consent to ensure that companies fulfill their online orders or protect them from fraud; thus, inferred consent for certain types of operational uses of information by companies makes sense. Consumers should also have rights of access and deletion where appropriate, and deserve civil rights protections thoughtfully built for the Internet age.
  • Another key tenet of the FTC Report is that privacy should not be about who collects an individual’s personal information, but rather should be about what information is collected and how it is protected and used. That is why federal privacy legislation should be technology- and industry-neutral. Companies that collect, use, or share the same type of covered personal information should not be subject to different privacy requirements based on how they classify themselves in the marketplace.
  • Rigorous standards should be backed up with tough enforcement. To that end, Congress should provide the FTC with the ability to impose civil penalties on violators for first-time offenses, something all of the current Commissioners—and I believe all the former Commissioners testifying here today—support. Otherwise, malefactors will continue to get two bites at the apple of the unsuspecting consumer. And there is no question in my mind that the FTC should have the primary authority to administer the national privacy law. The FTC has the unparalleled institutional knowledge and experience gained from bringing more than 500 cases to protect the privacy and security of consumer information, including those against large companies like Google, Twitter, Facebook, Uber, Dish Network, and others. Congress should not stop there.
  • The way to achieve enhanced enforcement is by giving the FTC, an agency that already punches above its weight, the resources and authority to carry out its mandate effectively. As of 2019, there were fewer employees (“FTEs”) at the agency now than there were in 1980, and the American population has grown by more than 100 million people since then. The number of FTEs has actually decreased since I left the agency in 2013 until this year.
  • Moreover, the FTC clearly has a role to play in developing rules to address details that Congress may not want to tackle in the legislation itself as well as new developments in technology that could overwhelm (or circumvent) enforcement. For that reason, you should give the agency some APA rulemaking authority to effectively implement your law. Having said that, Congress should not overwhelm the FTC with mandated rulemaking after rulemaking, which would only bog the agency down instead of permitting it to focus on enforcing the new law.

California Attorney General Xavier Becerra argued:

  • In the data privacy space, the optimal federal legal framework recognizes that privacy protections must keep pace with innovation, the hallmark of our data-driven economy. State law is the backbone of consumer privacy in the United States. Federal law serves as the glue that ties our communities together. To keep pace, we must all work from the same baseline playbook, but be nimble enough to adapt to real-world circumstances on the field where we meet them. I urge this committee to proceed in your work in a manner that respects—and does not preempt—more rigorous state laws, including those we have in California.
  • Like any law, the CCPA is not perfect, but it is an excellent first step. Consumers deserve more privacy and easier tools. For example, in the regulations implementing the CCPA, the California Department of Justice tried to address the frustration of consumers who must proceed website-by-website, browser-by-browser in order to opt out of the sale of their personal information. One provision of our regulations intended to facilitate the submission of a request to opt-out of sale by requiring businesses to comply when a consumer has enabled a global privacy control at the device or browser level, which should be less time-consuming and burdensome. I urge the technology community to develop consumer-friendly controls to make exercise of the right to opt out of the sale of information meaningful and frictionless. Making technology work for consumers is just as important as the benefits businesses receive in innovating.
  • There are also ways in which CCPA could go further and require refinement of its compliance measures. For example, the CCPA currently only requires disclosure of “categories of sources” from which personal information is collected and “categories of third parties” to whom personal information is sold. More specific disclosures, including the names of businesses that were the source or recipient of the information, should be required so that consumers can know the extent to which their information has been shared, bartered, and sold. If I receive junk mail from a company, I should be able to find out how it got my address and to whom it shared the information so I can stop the downstream purchase of my personal data. For now, businesses are not legally required to share that granularity of information. Consumers should also have the ability to correct the personal information collected about them, so as to prevent the spreading of misinformation.
  • On a broader level, if businesses want to use consumers’ data, they should have a duty to protect and secure it, and wherever feasible, minimize data collection. Businesses should no longer approach consumer data with the mindset, “collect now, monetize later.” There should be a duty imposed to use a consumer’s personal information in accordance with the purposes for which the consumer allowed its collection, and in the consumer’s interest, especially with the collection and storage of sensitive information, like precise geolocation. Although CCPA requires transparent notice at collection, moving beyond a notice-and-consent framework to contemplate use limitations would make our privacy rights more robust and balanced.
  • We need clear lines on what is illegal data use from the context of civil rights protections. Indirect inferences based on personal information should not be used against us in healthcare decisions, insurance coverage or employment determinations. We need greater transparency on how algorithms impact people’s fundamental rights of healthcare, housing and employment, and how they may be perpetuating systemic racism and bias. Predatory online practices, such as increased cross-site tracking after a user browses healthcare websites, must be addressed.
  • Finally, new laws should include a private right of action to complement and fortify the work of state enforcers. While my office is working hard to protect consumer privacy rights in California, and our sister states do the same in their jurisdictions, we cannot do this work alone. While we endeavor to hold companies accountable for violations of privacy laws, trying to defend the privacy rights of 40 million people in California alone is a massive undertaking. Violators know this. They know our scope and reach are limited to remedying larger and more consequential breaches of privacy. Consumers need the authority to pursue remedies themselves for violations of their rights. Private rights of action provide a critical adjunct to government enforcement, and enable consumers to assert their rights and seek appropriate remedies. Consumer privacy must be real, it deserves its day in court.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by KaraSuva from Pixabay

A Washington State Privacy Bill…Rises From The Dead

One of the sponsors of a privacy bill that died earlier this year has reintroduced a modified version with new language in the hopes of passing the bill next year.

Washington State Senator Reuven Carlyle (D-Seattle) has floated a new draft of privacy legislation in the hopes it will pass after forerunner bills died in the last two legislative sessions. Carlyle has made a number of changes in the “Washington Privacy Act 2021,” documented in this chart showing the differences between the new bill, the last version of the bill passed by the Washington State Senate last year, the “California Consumer Privacy Act” (CCPA) (AB 375), and the “California Privacy Rights Act” (CPRA) (aka Proposition 24) on this year’s ballot. But in the main, the bill tracks closely with the two bills produced by the Washington Senate and House last year that lawmakers could not ultimately reconcile. However, there are no provisions on facial recognition technology, which was largely responsible for sinking a privacy bill in Washington State two years ago. Carlyle has taken the unusual step of appending language covering the collection and processing of personal data to combat infectious diseases like COVID-19.

Big picture, the bill still uses the concepts of data controllers and processors most famously enshrined in the European Union’s (EU) General Data Protection Regulation (GDPR). Like other privacy bills, generally, people in Washington State would not need to consent before an entity could collect and process their information. People would be able to opt out of some activities, but most data collection and processing could still occur as it presently does.

Washingtonians would be able to access, correct, delete, and port their personal data. Moreover, people would be able to opt out of certain data processing: “for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal effects concerning a consumer or similarly significant effects concerning a consumer.” Controllers must provide at least two secure and reliable means by which people could exercise these rights and may not require the creation of a new account. Rather, a controller can require a person to use an existing account to exercise her rights.

Controllers must act on a request within 45 days and are allowed one 45-day extension “where reasonably necessary, taking into account the complexity and number of the requests.” It is not clear what would justify a 45-day extension other than numerous, complex requests, but in any event, the requester must be informed of an extension. Moreover, if a controller decides not to comply with a request, it must inform the person within 45 days of the reasons for noncompliance and how an appeal may be filed. People would be permitted two free requests a year (although nothing stops a controller from meeting additional requests for free), and controllers may charge thereafter to cover reasonable costs and to deal with repetitive requests. Controllers may also simply deny repetitive requests, and they may deny requests they cannot authenticate. In the latter case, a controller may ask for more information so the person can prove his or her identity but is not required to.

Each controller would need to establish an internal appeals process for people to use if their request to exercise a right is denied. There is a specified timeline, and, at the end of this process, if a person is unhappy with the decision, the controller must offer to turn the matter over to the Office of the Attorney General of Washington for adjudication.

Like last year’s bills, this draft makes clear the differentiated roles of controllers and processors in the data ecosystem Washington State would regulate. Processors must follow a controller’s instructions and have an obligation to help the controller comply with the act. These obligations must be set out in a contract between each controller and processor “that sets out the processing instructions to which the processor is bound, including the nature and purpose of the processing, the type of personal data subject to the processing, the duration of the processing, and the obligations and rights of both parties.” Additionally, who is a controller and who is a processor will necessarily be a fact-driven analysis, and it is possible for one entity to be both depending on the circumstances.

Notably, processors must help controllers respond to requests from people exercising their rights, secure personal data, and assist in complying with Washington State’s data breach protocol if a breach occurs. Processors must implement and use security commensurate with the personal data they are holding and processing.

Controllers are obligated to furnish privacy policies that must include the categories of personal data processed, the purposes of any processing, the categories of personal data shared with third parties, and the categories of third parties with whom sharing occurs. Moreover, if a controller sells personal data for targeted advertising, it must make people aware of that processing, and of their right to opt out, on a continuing basis. Data collection is limited to what is reasonably necessary for the disclosed purposes of the processing. And yet, a controller may ask for and obtain consent to process for purposes beyond those reasonably necessary to effectuate the original, disclosed purposes. Controllers would also need to minimize the personal data they keep on hand.

Controllers must “establish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data…[that] shall be appropriate to the volume and nature of the personal data at issue.” Controllers would not be allowed to process personal data in a way that would violate discrimination laws. And so, controllers may not “process personal data on the basis of a consumer’s or a class of consumers’ actual or perceived race, color, ethnicity, religion, national origin, sex, gender, gender identity, sexual orientation, familial status, lawful source of income, or disability, in a manner that unlawfully discriminates against the consumer or class of consumers with respect to the offering or provision of (a) housing, (b) employment, (c) credit, (d) education, or (e) the goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation.” Controllers could not retaliate against people who exercise their rights to access, correct, delete, or port their personal data by offering products or services at different prices or of different quality. And yet, controllers may offer different prices and services as part of a voluntary loyalty program and may share personal data with third parties for purposes limited to that program.

Regarding another subset of personal data, consent will be needed before processing can occur. This subset is “sensitive data,” which is defined as “(a) personal data revealing racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sexual orientation, or citizenship or immigration status; (b) the processing of genetic or biometric data for the purpose of uniquely identifying a natural person; (c) the personal data from a known child; or (d) specific geolocation data.”

The bill also bars people from waiving their rights through any type of agreement; any such provision would be null and void as against public policy.

Controllers would not need to reidentify deidentified personal data to respond to a request from a person. However, the way this section is written raises questions about the drafters’ intentions. The section would not require controllers to respond to requests to access, correct, delete, or port personal data if the “controller is not reasonably capable of associating the request with the personal data, or…it would be unreasonably burdensome for the controller to associate the request with the personal data,” provided other conditions are also true. Although this provision comes right after the language on reidentifying deidentified data, it reads as if it would apply to other personal data as well. And so, some controllers could respond to a request by arguing they cannot associate the request with the data or that it would be unduly burdensome to do so. Perhaps this is not what the drafters intend, but this could become a route by which controllers deny legitimate requests.

This section of the bill also makes clear that people will not be able to exercise their rights of access, correction, deletion, or porting if the personal data are “pseudonymous data.” This term is defined as “personal data that cannot be attributed to a specific natural person without the use of additional information, provided that such additional information is kept separately and is subject to appropriate technical and organizational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.” This concept would seem to encourage controllers and processors to strip identifiers from personal data and store it in that state so they need not incur the cost and time of responding to requests. It bears noting that the concept and definition appear heavily influenced by the GDPR, which provides:

‘pseudonymisation’ means the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person
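To make the concept concrete, here is a minimal sketch of one common pseudonymisation technique, keyed hashing, in which the secret key is the “additional information” kept separately. The key handling and names are illustrative assumptions; neither the bill nor the GDPR prescribes any particular method.

```python
import hashlib
import hmac

# The secret key is the "additional information": held separately under its own
# access controls, it lets the controller re-link data when lawfully necessary;
# without it, the pseudonym cannot be attributed to a specific person.
SECRET_KEY = b"hypothetical-key-held-separately"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g., an email address) with a keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The stored record carries only the pseudonym, never the identifier itself.
record = {"user": pseudonymize("jane.doe@example.com"), "zip": "98101"}
print(record["user"])  # a hex digest, not attributable without the key
```

If controllers adopt designs like this at scale, the practical effect would be exactly the incentive described above: data held in pseudonymous form falls outside the access, correction, deletion, and portability rights.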

Data protection assessments will be necessary for a subset of processing activities: targeted advertising, selling personal data, processing sensitive data, any processing of personal data that presents “a heightened risk of harm to consumers,” and one more case that requires explanation. This last category covers controllers whose profiling presents a reasonably foreseeable risk of:

  • “Unfair or deceptive treatment of, or disparate impact on, consumers;
  • financial, physical, or reputational injury to consumers;
  • a physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers, where such intrusion would be offensive to a reasonable person; or
  • other substantial injury to consumers.”

These “data protection assessments must take into account the type of personal data to be processed by the controller, including the extent to which the personal data are sensitive data, and the context in which the personal data are to be processed.” Moreover, data protection assessments “must identify and weigh the benefits that may flow directly and indirectly from the processing to the controller, consumer, other stakeholders, and the public against the potential risks to the rights of the consumer associated with such processing, as mitigated by safeguards that can be employed by the controller to reduce such risks.” Further, the bill stipulates “[t]he use of deidentified data and the reasonable expectations of consumers, as well as the context of the processing and the relationship between the controller and the consumer whose personal data will be processed, must be factored into this assessment by the controller.” And, crucially, controllers must provide data protection assessments to the Washington Attorney General upon request, meaning they could inform an enforcement action or investigation.

Section 110 of the “Washington Privacy Act 2021” lays out the circumstances, familiar from most privacy bills, under which controllers and processors are not bound by the act, including but not limited to:

  • Comply with federal, state, or local laws, rules, or regulations;
  • Comply with a civil, criminal, or regulatory inquiry, investigation, subpoena, or summons by federal, state, local, or other governmental authorities;
  • Cooperate with law enforcement agencies concerning conduct or activity that the controller or processor reasonably and in good faith believes may violate federal, state, or local laws, rules, or regulations;
  • Provide a product or service specifically requested by a consumer, perform a contract to which the consumer is a party, or take steps at the request of the consumer prior to entering into a contract;
  • Take immediate steps to protect an interest that is essential for the life of the consumer or of another natural person, and where the processing cannot be manifestly based on another legal basis;
  • Prevent, detect, protect against, or respond to security incidents, identity theft, fraud, harassment, malicious or deceptive activities, or any illegal activity; preserve the integrity or security of systems; or investigate, report, or prosecute those responsible for any such action;

Moreover, the act does “not restrict a controller’s or processor’s ability to collect, use, or retain data to:

  • Conduct internal research solely to improve or repair products, services, or technology;
  • Identify and repair technical errors that impair existing or intended functionality; or
  • Perform solely internal operations that are reasonably aligned with the expectations of the consumer based on the consumer’s existing relationship with the controller, or are otherwise compatible with processing in furtherance of the provision of a product or service specifically requested by a consumer or the performance of a contract to which the consumer is a party.

It seems reasonable to expect controllers and processors to try to read these provisions as liberally as they can in order to escape or circumvent the act’s obligations. I do not level this claim as a criticism; rather, it is what will undoubtedly occur if a regulated entity has halfway decent legal counsel.

The bill also carries over legal liability language for controllers from last year’s bill. The act makes clear that controllers cannot be liable for a processor’s violation if, “at the time of disclosing the personal data, the disclosing controller or processor did not have actual knowledge that the recipient intended to commit a violation.” Consequently, even if a reasonable person could foresee that a processor would likely violate the act, the controller cannot be held liable unless it actually knows a violation is imminent. This structuring of liability will likely lead controllers to claim they did not know of processors’ violations and create a disincentive for controllers to press processors to comply with the statutory and contractual requirements binding both.

The bill reiterates:

Personal data that are processed by a controller pursuant to [any of the aforementioned carveouts in Section 110] must not be processed for any purpose other than those expressly listed in this section. Personal data that are processed by a controller pursuant to this section may be processed solely to the extent that such processing is:

(i) Necessary, reasonable, and proportionate to the purposes listed in this section; and

(ii) adequate, relevant, and limited to what is necessary in relation to the specific purpose or purposes listed in this section.

Finally, controllers bear the burden of demonstrating that an exception they invoke complies with this section. This would serve to check a regulated entity’s inclination to read terms and requirements as generously as possible in its own favor.

The bill would not create a new right for people to sue, but if a person sues and wins on existing grounds (e.g., product liability, tort, or contract law), liability would be apportioned between controller and processor according to each party’s degree of responsibility.

In terms of enforcement, violations of this act are treated as violations of the Washington State Consumer Protection Act’s ban on unfair and deceptive practices, with civil liability as high as $7,500 per violation. However, the Attorney General must first “provide a controller thirty days’ written notice identifying the specific provisions of this title the Attorney General, on behalf of a consumer, alleges have been or are being violated.” If a cure is effected, the Attorney General may not seek monetary damages; if it is not, the Attorney General may take the matter to court.

The act preempts all county, city, and local data processing laws.

There is new language in the bill pertaining to public health emergencies, privacy, and contact tracing. The provisions are divided between two titles, one controlling private sector entities and the other public sector entities. Incidentally, at the federal level, privacy bills have not tended to include provisions addressing public health emergencies; instead, standalone bills have been drafted and introduced.

In regard to private sector entities, controllers and processors would not be able to process “covered data” for a “covered purpose,” terms relating to the symptoms of infectious diseases and tracking their spread, unless certain conditions are met. For example, these entities would need to make available a privacy policy, and people must consent to such processing. Additionally, controllers and processors would not be able to disclose “any covered data processed for a covered purpose” to any law enforcement agency in the U.S., sell “any covered data processed for a covered purpose,” or “[s]hare any covered data processed for a covered purpose with another controller, processor, or third party unless such sharing is governed by contract” according to the terms laid out in this section and disclosed in the required privacy policy. However, private sector entities could disclose covered data processed for a covered purpose to federal, state, and local agencies pursuant to laws permitting such disclosure, which would likely mean public health or emergency laws.

This section of the bill defines “covered purpose” as

processing of covered data concerning a consumer for the purposes of detecting symptoms of an infectious disease, enabling the tracking of a consumer’s contacts with other consumers, or with specific locations to identify in an automated fashion whom consumers have come into contact with, or digitally notifying, in an automated manner, a consumer who may have become exposed to an infectious disease, or other similar purposes directly related to a state of emergency declared by the governor pursuant to RCW 43.06.010 and any restrictions imposed under the state of emergency declared by the governor pursuant to RCW 43.06.200 through 43.06.270.
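The automated contact tracing this definition contemplates can be illustrated with a short sketch of a decentralized, token-exchange design of the sort Google and Apple popularized. Everything here, the class names, token format, and matching logic, is an illustrative assumption; the bill does not prescribe any particular architecture.

```python
import secrets

# Phones exchange rotating random tokens over short-range radio; a diagnosed
# user publishes her own tokens; every other phone checks locally for a match.
# No central register of who met whom is ever built.

def new_token() -> str:
    return secrets.token_hex(16)  # random and rotating, not tied to identity

class Device:
    def __init__(self) -> None:
        self.my_tokens = []    # tokens this device has broadcast
        self.observed = set()  # tokens heard from nearby devices

    def broadcast(self) -> str:
        token = new_token()
        self.my_tokens.append(token)
        return token

    def hear(self, token: str) -> None:
        self.observed.add(token)

    def exposed(self, published_tokens: set) -> bool:
        # Matching happens on the device, not on a server.
        return bool(self.observed & published_tokens)

alice, bob = Device(), Device()
bob.hear(alice.broadcast())       # the two phones were near each other
published = set(alice.my_tokens)  # Alice tests positive and uploads her tokens
print(bob.exposed(published))     # True, so Bob is digitally notified
```

Even a design this privacy-protective would appear to involve processing covered data for a covered purpose, and so would trigger the consent, disclosure, and contract requirements described above.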

There is a section that seems redundant: it establishes the right of a person to opt out of the processing of her covered data for a covered purpose, but the previous section already makes clear a person’s covered data may not be processed without her consent. Nonetheless, a person may determine whether his covered data is being processed, request the correction of inaccurate information, and request the deletion of covered data. The provisions on how controllers must respond to and process such requests are virtually identical to those governing the rights to access, correct, delete, and port elsewhere in the bill.

The relationship and responsibilities between controllers and processors track very closely to those imposed for normal data processing.

Controllers would need to make available privacy policies specific to processing covered data. The bill provides:

Controllers that process covered data for a covered purpose must provide consumers with a clear and conspicuous privacy notice that includes, at a minimum:

  • How a consumer may exercise the rights contained in section 203 of this act, including how a consumer may appeal a controller’s action with regard to the consumer’s request;
  • The categories of covered data processed by the controller;
  • The purposes for which the categories of covered data are processed;
  • The categories of covered data that the controller shares with third parties, if any; and
  • The categories of third parties, if any, with whom the controller shares covered data.

Controllers would also need to limit collection of covered data to what is reasonably necessary for processing and to minimize collection. Moreover, controllers may not process covered data in ways that exceed what is reasonably necessary for covered purposes unless they obtain consent from each person. But then the bill further limits permissible processing of covered data by stating that controllers cannot “process covered data or deidentified data that was processed for a covered purpose for purposes of marketing, developing new products or services, or engaging in commercial product or market research.” Consequently, other processing purposes would be permissible provided consent has been obtained; a controller might, for example, process covered data to improve its current means of collecting covered data for the covered purpose.

There is no right to sue entities for violating this section, but controllers appear to bear more legal responsibility for the violations of their processors regarding covered data. Moreover, the enforcement language is virtually identical to the bill’s earlier provisions on how the Attorney General may punish violators.

The bill’s third section would regulate the collection and processing of covered data for covered purposes by public sector entities, and for purposes of this section controllers are defined as “local government, state agency, or institutions of higher education which, alone or jointly with others, determines the purposes and means of the processing of covered data.” This section is virtually identical to the second, with the caveat that people would not be given the right to determine whether their covered data has been collected and processed for a covered purpose, to request the correction of covered data, or to ask that such data be deleted. Also, a person could not opt out of collection.

Finally, two of the congressional stakeholders on privacy and data security hail from Washington State, and consideration and possible passage of a state law may limit their latitude on a federal bill they could support. Senator Maria Cantwell (D-WA) and Representative Cathy McMorris Rodgers (R-WA), the ranking members of the Senate Commerce, Science, and Transportation Committee and House Energy and Commerce’s Consumer Protection and Commerce Subcommittee respectively, are involved in drafting their committees’ privacy bills, and a Washington State statute may affect their positions in much the same way the “California Consumer Privacy Act” (CCPA) (AB 375) has informed a number of California members’ positions on privacy legislation, especially with respect to bills seen as weaker than the CCPA.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo Credit: Ally Laws on Pixabay

Further Reading, Other Developments, and Coming Events (4 September)

Here is today’s Further Reading, Other Developments, and Coming Events.

Coming Events

  • The United States-China Economic and Security Review Commission will hold a hearing on 9 September on “U.S.-China Relations in 2020: Enduring Problems and Emerging Challenges” to “evaluate key developments in China’s economy, military capabilities, and foreign relations, during 2020.”
  • On 10 September, the General Services Administration (GSA) will have a webinar to discuss implementation of Section 889 of the “John S. McCain National Defense Authorization Act (NDAA) for FY 2019” (P.L. 115-232), which bars the federal government and its contractors from buying equipment and services from Huawei, ZTE, and other companies from the People’s Republic of China.
  • The Federal Communications Commission (FCC) will hold a forum on 5G Open Radio Access Networks on 14 September. The FCC asserted:
    • Chairman [Ajit] Pai will host experts at the forefront of the development and deployment of open, interoperable, standards-based, virtualized radio access networks to discuss this innovative new approach to 5G network architecture. Open Radio Access Networks offer an alternative to traditional cellular network architecture and could enable a diversity in suppliers, better network security, and lower costs.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.”
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The United States (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) and the Election Assistance Commission (EAC) “released the Election Risk Profile Tool, a user-friendly assessment tool to equip election officials and federal agencies in prioritizing and managing cybersecurity risks to the Election Infrastructure Subsector.” The agencies stated “[t]he new tool is designed to help state and local election officials understand the range of risks they face and how to prioritize mitigation efforts…[and] also addresses areas of greatest risk, ensures technical cybersecurity assessments and services are meeting critical needs, and provides a sound analytic foundation for managing election security risk with partners at the federal, state and local level.”
    • CISA and the EAC explained “[t]he Election Risk Profile Tool:
      • Is a user-friendly assessment tool for state and local election officials to develop a high-level risk profile across a jurisdiction’s specific infrastructure components;
      • Provides election officials a method to gain insights into their cybersecurity risk and prioritize mitigations;
      • Accepts inputs of a jurisdiction’s specific election infrastructure configuration; and
      • Outputs a tailored risk profile for jurisdictions, which identifies specific areas of highest risk and recommends associated mitigation measures that the jurisdiction could implement to address the risk areas.
  • The cybersecurity agencies of the Five Eyes nations have released a Joint Cybersecurity Advisory: Technical Approaches to Uncovering and Remediating Malicious Activity that “highlights technical approaches to uncovering malicious activity and includes mitigation steps according to best practices.” The agencies asserted “[t]he purpose of this report is to enhance incident response among partners and network administrators along with serving as a playbook for incident investigation.”
    • The Australian Cyber Security Centre, Canada’s Communications Security Establishment, the United States’ Cybersecurity and Infrastructure Security Agency, the United Kingdom’s National Cyber Security Centre, and New Zealand’s National Cyber Security Centre and Computer Emergency Response Team summarized the key takeaways from the Joint Advisory:
      • When addressing potential incidents and applying best practice incident response procedures:
      • First, collect and remove for further analysis:
        • Relevant artifacts,
        • Logs, and
        • Data.
      • Next, implement mitigation steps that avoid tipping off the adversary that their presence in the network has been discovered.
      • Finally, consider soliciting incident response support from a third-party IT security organization to:
        • Provide subject matter expertise and technical support to the incident response,
        • Ensure that the actor is eradicated from the network, and
        • Avoid residual issues that could result in follow-up compromises once the incident is closed.
  • The United States’ (U.S.) Department of Justice (DOJ) and Federal Trade Commission (FTC) signed an Antitrust Cooperation Framework with their counterpart agencies from Australia, Canada, New Zealand, and the United Kingdom. The Multilateral Mutual Assistance and Cooperation Framework for Competition Authorities (Framework) “aims to strengthen cooperation between the signatories, and provides the basis for a series of bilateral agreements among them focused on investigative assistance, including sharing confidential information and cross-border evidence gathering.” Given that a number of large technology companies are under investigation in the U.S., the European Union (EU), and elsewhere, signaling a shift in how technology multinationals are viewed, this agreement may enable cross-border efforts to collectively address alleged abuses. However, the Framework “is not intended to be legally binding and does not give rise to legal rights or obligations under domestic or international law.” The Framework provides:
    • Recognising that the Participants can benefit by sharing their experience in developing, applying, and enforcing Competition Laws and competition policies, the Participants intend to cooperate and provide assistance, including by:
      • a) exchanging information on the development of competition issues, policies and laws;
      • b) exchanging experience on competition advocacy and outreach, including to consumers, industry, and government;
      • c) developing agency capacity and effectiveness by providing advice or training in areas of mutual interest, including through the exchange of officials and through experience-sharing events;
      • d) sharing best practices by exchanging information and experiences on matters of mutual interest, including enforcement methods and priorities; and
      • e) collaborating on projects of mutual interest, including via establishing working groups to consider specific issues.
  • Dynasplint Systems alerted the United States Department of Health and Human Services (HHS) that it suffered a breach affecting more than 100,000 people earlier this year. HHS’ Office of Civil Rights (OCR) is investigating possible violations of Health Insurance Portability and Accountability Act regulations regarding the safeguarding of patients’ health information. If Dynasplint failed to properly secure patient information or its systems, OCR could levy a multimillion-dollar fine for a breach of this size. For example, in late July, OCR fined a company over $1 million for the theft of an unencrypted laptop that exposed the personal information of a little more than 20,000 people.
    • Dynasplint, a Maryland manufacturer of range of motion splints, explained:
      • On June 4, 2020, the investigation determined that certain information was accessed without authorization during the incident.
      • The information may have included names, addresses, dates of birth, Social Security numbers, and medical information.
      • Dynasplint Systems reported this matter to the FBI and will provide whatever cooperation is necessary to hold perpetrators accountable.
  • The California Legislature has sent two bills to Governor Gavin Newsom (D) that would change how technology is regulated in the state, including one that would alter the “California Consumer Privacy Act” (AB 375) (CCPA) if the “California Privacy Rights Act” (CPRA) (Ballot Initiative 24) is not enacted by voters in the November election. The two bills are:
    • AB 1138 would amend the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtains explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.”
    •  AB 1281 would extend the carveout for employers to comply with the CCPA from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
  • Senators Shelley Moore Capito (R-WV), Amy Klobuchar (D-MN), and Jerry Moran (R-KS) have written “a letter to Federal Trade Commission (FTC) Chairman Joseph Simons urging the FTC to take action to address the troubling data collection and sharing practices of the mobile application (app) Premom” and “to request information on the steps that the FTC plans to take to address this issue.” They asserted:
    • A recent investigation from the International Digital Accountability Council (IDAC) indicated that Premom may have engaged in deceptive consumer data collection and processing, and that there may be material differences between Premom’s stated privacy policies and its actual data-sharing practices. Most troubling, the investigation found that Premom shared its users’ data without their consent.
    • Moore Capito, Klobuchar, and Moran stated “[i]n light of these concerning reports, and given the critical role that the FTC plays in enforcing federal laws that protect consumer privacy and data under Section 5 of the Federal Trade Commission Act and other sector specific laws, we respectfully ask that you respond to the following questions:
      • 1. Does the FTC treat persistent identifiers, such as the non-resettable device hardware identifiers discussed in the IDAC report, as personally identifiable information in relation to its general consumer data security and privacy enforcement authorities under Section 5 of the FTC Act?  
      • 2. Is the FTC currently investigating or does it plan to investigate Premom’s consumer data collection, transmission, and processing conduct described in the IDAC report to determine if the company has engaged in deceptive practices?
      • 3. Does the FTC plan to take any steps to educate users of the Premom app that the app may still be sharing their personal data without their permission if they have not updated the app? If not, does the FTC plan to require Premom to conduct such outreach?
      • 4. Please describe any unique or practically uncommon uses of encryption by the involved third-party companies receiving information from Premom that could be functionally interpreted to obfuscate oversight of the involved data transmissions.
      • 5. How can the FTC use its Section 5 authority to ensure that mobile apps are not deceiving consumers about their data collection and sharing practices and to preempt future potentially deceptive practices like those Premom may have engaged in?

Further Reading

  • Justice Dept. Plans to File Antitrust Charges Against Google in Coming Weeks” By Katie Benner and Cecilia Kang – The New York Times; “The Justice Department could file a lawsuit against Google this month, overriding skepticism from its own top lawyers” By Tony Romm – The Washington Post; “There’s a partisan schism over the timing of a Google antitrust lawsuit” By Timothy B. Lee – Ars Technica. The New York Times explains in its deeply sourced article that United States Department of Justice (DOJ) attorneys want more time to build a better case against Google, but that Attorney General William Barr is pressing for the filing of a suit as early as the end of this month so the Trump Administration can show voters it is taking on big tech. Additionally, a case against a tech company would help shore up the President’s right flank as he and other prominent conservatives continue to insist, in the absence of evidence, that technology companies are biased against the right. The team of DOJ attorneys has shrunk from 40 to about 20 as numerous lawyers asked off the case once it was clear what the Attorney General wanted. These articles also throw light on the split between Republican and Democratic state attorneys general in the case they have been working on: the former accuse the latter of stalling in the hopes a Biden DOJ will be harsher on the company, and the latter accuse the former of trying to file a narrow case while Donald Trump is still President, which would impair efforts to address the full range of Google’s alleged antitrust abuses.
  • Facebook Moves to Limit Election Chaos in November” By Mike Isaac – The New York Times. The social network giant unveiled measures to fight misinformation in the week before the United States election and afterwards, should people try to make factually inaccurate claims about the results. Notably, political advertisements will be banned a week before the 3 November election, but this seems like pretty weak tea considering it will be business as usual until late October. Even though the company frames these moves as “additional steps we’re taking to help secure the integrity of the U.S. elections by encouraging voting, connecting people to authoritative information, and reducing the risks of post-election confusion,” the effects of the misinformation, disinformation, and lies that proliferate on Facebook will likely already have taken root by late October. It is possible the company still wants the advertising revenue it would forgo if it immediately banned political advertising. Another proposed change is to provide accurate information about voting generally and about voting during the COVID-19 pandemic. In fact, the platform corrected a post of President Donald Trump’s that expressed doubts about mail-in voting.
  • Washington firm ran fake Facebook accounts in Venezuela, Bolivia and Mexico, report finds” By Craig Timberg and Elizabeth Dwoskin – The Washington Post. In tandem with taking down fake content posted by the Internet Research Agency, Facebook also removed accounts traced back to a Washington, D.C. public relations firm, CLS Strategies, that was running multiple accounts to support the government in Bolivia and the opposition party in Venezuela, both of which are right wing. Using information provided by Facebook, Stanford University’s Internet Observatory released a report stating that “Facebook removed a network of 55 Facebook accounts, 42 Pages and 36 Instagram accounts attributed to the US-based strategic communications firm CLS Strategies for engaging in coordinated inauthentic behavior (CIB).” Stanford asserted these key takeaways:
    • 11 Facebook pages related to Bolivia mainly supported Bolivia’s Interim President Jeanine Áñez and disparaged Bolivia’s former president Evo Morales. All had similar creation dates and manager location settings.
    • Venezuela-focused assets supported and promoted Venezuelan opposition leaders but changed in tone in 2020, reflecting factional divides in the opposition and a turn away from Juan Guaidó.
    • In addition to fake accounts, removed Facebook accounts include six profiles that match the names and photos of CLS Strategies employees listed publicly on their website and appear to be their real accounts.
    • CLS Strategies has a disclosed contract with the Bolivian government to provide strategic communications counsel for Bolivia’s 2020 elections and to strengthen democracy and human rights in Bolivia.
    • Coordinated inauthentic behavior reports from Facebook and Twitter have increasingly included assets linked to marketing and PR firms originating and acting around the world. The firms’ actions violate the platforms’ terms by operating internationally and failing to identify their origins and motivations to users.
    • In its release on the issue, Facebook explained:
      • In August, we removed three networks of accounts, Pages and Groups. Two of them — from Russia and the US — targeted people outside of their country, and another from Pakistan focused on both domestic audiences in Pakistan and also in India. We have shared information about our findings with law enforcement, policymakers and industry partners.
  • Belarusian Officials Shut Down Internet With Technology Made by U.S. Firm” By Ryan Gallagher – Bloomberg. A United States firm, Sandvine, sold deep packet inspection technology to the government of Belarus through a Russian intermediary. The technology was ostensibly to be used to fend off dangers to the nation’s networks but was instead deployed to shut down numerous social media and news sites on the day of the election. However, Belarusian activists quickly found workarounds, fueling the current unrest that threatens to topple the regime. The same company’s technology has been used elsewhere in the world to cut off access to the internet, as the University of Toronto’s Citizen Lab detailed in 2018.
  • Canada has effectively moved to block China’s Huawei from 5G, but can’t say so” – Reuters. In a move reminiscent of how the People’s Republic of China (PRC) tanked Qualcomm’s proposed purchase of NXP Semiconductors in 2018, Canada has effectively barred Huawei from its 5G networks by simply not deciding, which eventually signaled to its telecommunications companies that they should use Ericsson and Nokia instead. This way, there is no public announcement or policy statement the PRC can object to, and the country toes the line with its other Five Eyes partners, which have banned Huawei in varying degrees. Additionally, given that two Canadian nationals are being held because Huawei Chief Financial Officer Meng Wanzhou is detained in Canada awaiting extradition to the United States to face criminal charges, Ottawa needs to manage its relations with the PRC gingerly.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Simon Steinberger from Pixabay