Further Reading, Other Developments, and Coming Events (15 December)

Further Reading

  • “DHS, State and NIH join list of federal agencies — now five — hacked in major Russian cyberespionage campaign” By Ellen Nakashima and Craig Timberg — The Washington Post; “Scope of Russian Hack Becomes Clear: Multiple U.S. Agencies Were Hit” By David E. Sanger, Nicole Perlroth and Eric Schmitt — The New York Times. The list of United States (U.S.) government agencies breached by Sluzhba vneshney razvedki Rossiyskoy Federatsii (SVR), the Russian Federation’s Foreign Intelligence Service, has grown. Now the Departments of Homeland Security, Defense, and State and the National Institutes of Health are reporting they have been breached. It is unclear whether Fortune 500 companies in the U.S. and elsewhere and U.S. nuclear laboratories were also breached in this huge, sophisticated espionage exploit. It appears the Russians were selective and careful, and these hackers may have only accessed information held on U.S. government systems. And yet, the Trump Administration continues to issue equivocal statements neither denying nor acknowledging the hack, leaving the public to depend on quotes from anonymous officials. Perhaps admitting the Russians hacked U.S. government systems would throw light on Russian interference four years ago, and the President is loath to even contemplate that attack. In contrast, President Donald Trump has made all sorts of wild, untrue claims about vote totals being hacked despite no evidence supporting his assertions. It appears that the declaration of mission accomplished by some agencies of the Trump Administration over no Russian hacking of or interference with the 2020 election will be overshadowed by what may prove the most damaging hack of U.S. government systems ever.
  • “Revealed: China suspected of spying on Americans via Caribbean phone networks” By Stephanie Kirchgaessner — The Guardian. This story depends on one source, so take it for what it is worth, but allegedly the People’s Republic of China (PRC) is using vulnerabilities in mobile communications networks to hack into the phones of Americans travelling in the Caribbean. If so, the PRC may be exploiting the same Signaling System 7 (SS7) weaknesses an Israeli firm, Circles, is using to sell access to phones, at least according to a report published recently by the University of Toronto’s Citizen Lab.
  • “The Cartel Project | Revealed: The Israelis Making Millions Selling Cyberweapons to Latin America” By Amitai Ziv — Haaretz. Speaking of Israeli companies, the NSO Group, among others, is actively selling offensive cyber and surveillance capabilities to Central American nations, often through practices that may be corrupt.
  • “U.S. Schools Are Buying Phone-Hacking Tech That the FBI Uses to Investigate Terrorists” By Tom McKay and Dhruv Mehrotra — Gizmodo. Tools from Israeli firm Cellebrite and its competitors are being used by school systems across the United States (U.S.) to access communications on students’ phones. U.S. Supreme Court caselaw gives schools very wide discretion for searches, and the Fourth Amendment is largely null and void on school grounds.
  • “‘It’s Hard to Prove’: Why Antitrust Suits Against Facebook Face Hurdles” By Mike Isaac and Cecilia Kang — The New York Times. The development of antitrust law over the last few decades may have laid an uphill path for the Federal Trade Commission (FTC) and state attorneys general in securing a breakup of Facebook, something that has not happened on a large scale since the historic splintering of AT&T in the early 1980s.
  • “Exclusive: Israeli Surveillance Companies Are Siphoning Masses Of Location Data From Smartphone Apps” By Thomas Brewster — Forbes. Turns out Israeli firms are using a feature (or what many would call a bug) in the online advertising system that allows those looking to buy ads to get close to real-time location data from application developers looking to sell advertising space. By putting out a shingle as a Demand Side Platform, it is possible to access reams of location data, and two Israeli companies are doing just that, offering the service of locating and tracking people through this quirk in online advertising. And this is not just companies in Israel. There is a company under scrutiny in the United States (U.S.) that may have used these practices and then provided location data to federal agencies.
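    • The mechanism behind this quirk is easy to sketch. In the OpenRTB protocol used for real-time ad bidding, each bid request a Demand Side Platform receives can carry the requesting device’s advertising ID and coordinates, so simply listening to the bidstream yields location data. The following is a toy illustration, not any firm’s actual code; the field names follow the published OpenRTB specification, but the sample request and function are fabricated:

```python
import json

# A fabricated OpenRTB-style bid request. Real bidstream traffic carries
# the same device.geo structure, at enormous volume, to every registered DSP.
bid_request = json.dumps({
    "id": "abc123",
    "app": {"bundle": "com.example.weather"},
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # resettable ad ID
        "geo": {"lat": 40.7484, "lon": -73.9857, "type": 2},
    },
})

def extract_location(raw):
    """Pull the device ad ID and coordinates out of one bid request.

    A DSP sees many of these per second; persisting this tuple over time
    yields a movement history for each device, no hacking required.
    """
    req = json.loads(raw)
    device = req.get("device", {})
    geo = device.get("geo")
    if not geo or "lat" not in geo:
        return None
    return {"ifa": device.get("ifa"), "lat": geo["lat"], "lon": geo["lon"]}

print(extract_location(bid_request))
```

    • The point of the sketch is that no vulnerability is exploited: the location arrives as a routine, spec-compliant part of the ad auction.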

Other Developments

  • The Government Accountability Office (GAO) evaluated the United States (U.S.) Department of Defense’s (DOD) electromagnetic spectrum (EMS) operations and found gaps in the DOD’s efforts to maintain EMS superiority over the Russian Federation and the People’s Republic of China (PRC). The GAO concluded:
    • Studies have shown that adversaries of the United States, such as China and Russia, are developing capabilities and strategies that could affect DOD superiority in the information environment, including the EMS. DOD has also reported that loss of EMS superiority could result in the department losing control of the battlefield, as its Electromagnetic Spectrum Operations (EMSO) supports many warfighting functions across all domains. DOD recognizes the importance of EMSO to military operations in actual conflicts and in operations short of open conflict that involve the broad information environment. However, gaps we identified in DOD’s ability to develop and implement EMS-related strategies have impeded progress in meeting DOD’s goals. By addressing gaps we found in five areas—(1) the processes and procedures to integrate EMSO throughout the department, (2) governance reforms to correct diffuse organization, (3) responsibility by an official with appropriate authority, (4) a strategy implementation plan, and (5) activities that monitor and assess the department’s progress in implementing the strategy—DOD can capitalize on progress that it has already made and better support ensuring EMS superiority.
    • The GAO recommended:
      • The Secretary of Defense should ensure that the Vice Chairman of the Joint Chiefs of Staff, as Senior Designated Official of the Electromagnetic Spectrum Operations Cross-Functional Team (CFT), identifies the procedures and processes necessary to provide for integrated defense-wide strategy, planning, and budgeting with respect to joint electromagnetic spectrum operations, as required by the FY19 NDAA. (Recommendation 1)
      • The Secretary of Defense should ensure that the Vice Chairman of the Joint Chiefs of Staff as Senior Designated Official of the CFT proposes EMS governance, management, organizational, and operational reforms to the Secretary. (Recommendation 2)
      • The Secretary of Defense should assign clear responsibility to a senior official with authority and resources necessary to compel action for the long-term implementation of the 2020 strategy in time to oversee the execution of the 2020 strategy implementation plan. (Recommendation 3)
      • The Secretary of Defense should ensure that the designated senior official for long-term strategy implementation issues an actionable implementation plan within 180 days following issuance of the 2020 strategy. (Recommendation 4)
      • The Secretary of Defense should ensure that the designated senior official for long-term strategy implementation creates oversight processes that would facilitate the department’s implementation of the 2020 strategy. (Recommendation 5)
  • A forerunner to Apple’s App Store has sued the company, claiming it has monopolized applications on its operating system to the detriment of other parties and done the same with respect to its payment system. The company behind Cydia is arguing that it conceived of and created the first application store for the iPhone, offering a range of programs Apple did not. Cydia is claiming that once Apple understood how lucrative an app store would be, it blocked Cydia and established its own store as the exclusive means through which programs can be installed and used on iOS. Furthermore, this has enabled Apple to levy a 30% commission on all in-application purchases, allegedly a $50 billion market annually. This is the second high-profile suit this year against Apple. Epic Games, the maker of the popular game Fortnite, sued Apple earlier this year on many of the same grounds after Epic began allowing users to buy directly from it at a 30% discount. Apple responded by removing the game from the App Store, which has blocked players from downloading updated versions. That litigation has just begun. In its complaint, Cydia asserts:
    • Historically, distribution of apps for a specific operating system (“OS”) occurred in a separate and robustly competitive market. Apple, however, began coercing users to utilize no other iOS app distribution service but the App Store, coupling it closer and closer to the iPhone itself in order to crowd out all competition. But Apple did not come up with this idea initially—it only saw the economic promise that iOS app distribution represented after others, like [Cydia], demonstrated that value with their own iOS app distribution products/services. Faced with this realization, Apple then decided to take that separate market (as well as the additional iOS app payment processing market described herein) for itself.
    • Cydia became hugely popular by offering a marketplace to find and obtain third party iOS applications that greatly expanded the capabilities of the stock iPhone, including games, productivity applications, and audio/visual applications such as a video recorder (whereas the original iPhone only allowed still camera photos). Apple subsequently took many of these early third party applications’ innovations, incorporating them into the iPhone directly or through apps.
    • But far worse than simply copying others’ innovations, Apple also recognized that it could reap enormous profits if it cornered this fledgling market for iOS app distribution, because that would give Apple complete power over iOS apps, regardless of the developer. Apple therefore initiated a campaign to eliminate competition for iOS app distribution altogether. That campaign has been successful and continues to this day. Apple did (and continues to do) so by, inter alia, tying the App Store app to iPhone purchases by preinstalling it on all iOS devices and then requiring it as the default method to obtain iOS apps, regardless of user preference for other alternatives; technologically locking down the iPhone to prevent App Store competitors like Cydia from even operating on the device; and imposing contractual terms on users that coerce and prevent them from using App Store competitors. Apple has also mandated that iOS app developers use it as their sole option for app payment processing (such as in-app purchases), thus preventing other competitors, such as Cydia, from offering the same service to those developers.
    • Through these and other anticompetitive acts, Apple has wrongfully acquired and maintained monopoly power in the market (or aftermarket) for iOS app distribution, and in the market (or aftermarket) for iOS app payment processing. Apple has frozen Cydia and all other competitors out of both markets, depriving them of the ability to compete with the App Store and to offer developers and consumers better prices, better service, and more choice. This anticompetitive conduct has unsurprisingly generated massive profits and unprecedented market capitalization for Apple, as well as incredible market power.
  • California is asking to join the antitrust suit against Google filed by the United States Department of Justice (DOJ) and eleven state attorneys general. This antitrust action centers on Google’s practices of making Google the default search engine on Android devices and paying browsers and other technology entities to make Google the default search engine. However, a number of states that had initially joined the joint state investigation of Google have opted not to join this action and will instead continue to investigate, signaling a much broader case than the one filed in the United States District Court for the District of Columbia. In any event, if the suit does proceed (and a change in Administration could result in a swift change in course), it may take years to resolve. Of course, given the legion of leaks from the DOJ and state attorneys general offices about the pressure U.S. Attorney General William Barr placed on staff and attorneys to bring a case before the election, there is criticism that rushing the case may have resulted in a weaker, less comprehensive action that Google may ultimately fend off.
    • And, there is likely to be another lawsuit against Google filed by other state attorneys general. A number of attorneys general who had originally joined the effort led by Texas Attorney General Ken Paxton in investigating Google released a statement at the time the DOJ suit was filed, indicating their investigation would continue, presaging a different, possibly broader lawsuit that might also address Google’s role in other markets. The attorneys general of New York, Colorado, Iowa, Nebraska, North Carolina, Tennessee, and Utah did not join the case that was filed but may soon file a related but parallel case. They stated:
      • Over the last year, both the U.S. DOJ and state attorneys general have conducted separate but parallel investigations into Google’s anticompetitive market behavior. We appreciate the strong bipartisan cooperation among the states and the good working relationship with the DOJ on these serious issues. This is a historic time for both federal and state antitrust authorities, as we work to protect competition and innovation in our technology markets. We plan to conclude parts of our investigation of Google in the coming weeks. If we decide to file a complaint, we would file a motion to consolidate our case with the DOJ’s. We would then litigate the consolidated case cooperatively, much as we did in the Microsoft case.
  • France’s Commission nationale de l’informatique et des libertés (CNIL) handed down multimillion-euro fines on Google and Amazon for placing cookies on users’ devices without consent. CNIL fined Google a total of €100 million and Amazon €35 million because its investigation of both entities determined “when a user visited [their] website, cookies were automatically placed on his or her computer, without any action required on his or her part…[and] [s]everal of these cookies were used for advertising purposes.”
    • CNIL explained the decision against Google:
      • [CNIL] noticed three breaches of Article 82 of the French Data Protection Act:
      • Deposit of cookies without obtaining the prior consent of the user
        • When a user visited the website google.fr, several cookies used for advertising purposes were automatically placed on his or her computer, without any action required on his or her part.
        • Since this type of cookies can only be placed after the user has expressed his or her consent, the restricted committee considered that the companies had not complied with the requirement provided for in Article 82 of the French Data Protection Act regarding the collection of prior consent before placing cookies that are not essential to the service.
      • Lack of information provided to the users of the search engine google.fr
        • When a user visited the page google.fr, an information banner displayed at the bottom of the page, with the following note “Privacy reminder from Google”, in front of which were two buttons: “Remind me later” and “Access now”.
        • This banner did not provide the user with any information regarding cookies that had however already been placed on his or her computer when arriving on the site. The information was also not provided when he or she clicked on the button “Access now”.
        • Therefore, the restricted committee considered that the information provided by the companies did not enable the users living in France either to be previously and clearly informed regarding the deposit of cookies on their computer or, therefore, to be informed of the purposes of these cookies and the available means enabling to refuse them.
      • Partial failure of the « opposition » mechanism
        • When a user deactivated the ad personalization on the Google search by using the available mechanism from the button “Access now”, one of the advertising cookies was still stored on his or her computer and kept reading information aimed at the server to which it is attached.
        • Therefore, the restricted committee considered that the “opposition” mechanism set up by the companies was partially defective, breaching Article 82 of the French Data Protection Act.
    • CNIL explained the case against Amazon:
      • [CNIL] noticed two breaches of Article 82 of the French Data Protection Act:
      • Deposit of cookies without obtaining the prior consent of the user
        • The restricted committee noted that when a user visited one of the pages of the website amazon.fr, a large number of cookies used for advertising purposes was automatically placed on his or her computer, before any action required on his or her part. Yet, the restricted committee recalled that this type of cookies, which are not essential to the service, can only be placed after the user has expressed his or her consent. It considered that the deposit of cookies at the same time as arriving on the site was a practice which, by its nature, was incompatible with a prior consent.
      • Lack of information provided to the users of the website amazon.fr
        • First, the restricted committee noted that, in the case of a user visiting the website amazon.fr, the information provided was neither clear, nor complete.
        • It considered that the information banner displayed by the company, which was “By using this website, you accept our use of cookies allowing to offer and improve our services. Read More.”, only contained a general and approximate information regarding the purposes of all the cookies placed. In particular, it considered that, by reading the banner, the user could not understand that cookies placed on his or her computer were mainly used to display personalized ads. It also noted that the banner did not explain to the user that it could refuse these cookies and how to do it.
        • Then, the restricted committee noticed that the company’s failure to comply with its obligation was even more obvious regarding the case of users that visited the website amazon.fr after they had clicked on an advertisement published on another website. It underlined that in this case, the same cookies were placed but no information was provided to the users about that.
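    • The breach CNIL describes in both cases is mechanical: advertising cookies were written to the device before the user expressed any choice. A compliant flow withholds all non-essential cookies until an affirmative opt-in. The following server-side sketch is purely illustrative (the cookie names and the essential/non-essential split are hypothetical, not drawn from either company’s systems):

```python
from http.cookies import SimpleCookie

# Cookies strictly necessary for the service are exempt from prior consent
# under Article 82 of the French Data Protection Act; advertising cookies
# are not. The names here are hypothetical.
ESSENTIAL = {"session_id"}

def build_set_cookie_headers(pending, consent):
    """Emit Set-Cookie header values, withholding non-essential cookies
    (e.g. advertising trackers) until the user has affirmatively consented."""
    headers = []
    for name, value in pending.items():
        if name in ESSENTIAL or consent:
            c = SimpleCookie()
            c[name] = value
            headers.append(c.output(header="").strip())
    return headers

# First visit, no choice made yet: only the essential cookie is set.
print(build_set_cookie_headers({"session_id": "x1", "ad_id": "trk9"}, consent=False))
# After an affirmative "accept": the advertising cookie may now be set.
print(build_set_cookie_headers({"session_id": "x1", "ad_id": "trk9"}, consent=True))
```

    • The conduct CNIL sanctioned corresponds to calling the second branch unconditionally, i.e. dropping advertising cookies the moment the user arrives, which CNIL found “by its nature…incompatible with a prior consent.”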
  • Senator Amy Klobuchar (D-MN) wrote the Secretary of Health and Human Services (HHS) to express “serious concerns regarding recent reports on the data collection practices of Amazon’s health-tracking bracelet (Halo) and to request information on the actions [HHS] is taking to ensure users’ health data is secure.” Klobuchar stated:
    • The Halo is a fitness tracker that users wear on their wrists. The tracker’s smartphone application (app) provides users with a wide-ranging analysis of their health by tracking a range of biological metrics including heartbeat patterns, exercise habits, sleep patterns, and skin temperature. The fitness tracker also enters into uncharted territory by collecting body photos and voice recordings and transmitting this data for analysis. To calculate the user’s body fat percentage, the Halo requires users to take scans of their body using a smartphone app. These photos are then temporarily sent to Amazon’s servers for analysis while the app returns a three-dimensional image of the user’s body, allowing the user to adjust the image to see what they would look like with different percentages of body fat. The Halo also offers a tone analysis feature that examines the nuances of a user’s voice to indicate how the user sounds to others. To accomplish this task, the device has built-in microphones that listen and records a user’s voice by taking periodic samples of speech throughout the day if users opt-in to the feature.
    • Recent reports have raised concerns about the Halo’s access to this extensive personal and private health information. Among publicly available consumer health devices, the Halo appears to collect an unprecedented level of personal information. This raises questions about the extent to which the tracker’s transmission of biological data may reveal private information regarding the user’s health conditions and how this information can be used. Last year, a study by BMJ (formerly the British Medical Journal) found that 79 percent of health apps studied by researchers were found to share user data in a manner that failed to provide transparency about the data being shared. The study concluded that health app developers routinely share consumer data with third-parties and that little transparency exists around such data sharing.
    • Klobuchar asked the Secretary of Health and Human Services Alex Azar II to “respond to the following questions:
      • What actions is HHS taking to ensure that fitness trackers like Halo safeguard users’ private health information?
      • What authority does HHS have to ensure the security and privacy of consumer data collected and analyzed by health tracking devices like Amazon’s Halo?
      • Are additional regulations required to help strengthen privacy and security protections for consumers’ personal health data given the rise of health tracking devices? Why or why not?
      • Please describe in detail what additional authority or resources that the HHS could use to help ensure the security and protection of consumer health data obtained through health tracking devices like the Halo.

Coming Events

  • On 15 December, the Senate Judiciary Committee’s Intellectual Property Subcommittee will hold a hearing titled “The Role of Private Agreements and Existing Technology in Curbing Online Piracy” with these witnesses:
    • Panel I
      • Ms. Ruth Vitale, Chief Executive Officer, CreativeFuture
      • Mr. Probir Mehta, Head of Global Intellectual Property and Trade Policy, Facebook, Inc.
      • Mr. Mitch Glazier, Chairman and CEO, Recording Industry Association of America
      • Mr. Joshua Lamel, Executive Director, Re:Create
    • Panel II
      • Ms. Katherine Oyama, Global Director of Business Public Policy, YouTube
      • Mr. Keith Kupferschmid, Chief Executive Officer, Copyright Alliance
      • Mr. Noah Becker, President and Co-Founder, AdRev
      • Mr. Dean S. Marks, Executive Director and Legal Counsel, Coalition for Online Accountability
  • The Senate Armed Services Committee’s Cybersecurity Subcommittee will hold a closed briefing on Department of Defense Cyber Operations on 15 December with these witnesses:
    • Mr. Thomas C. Wingfield, Deputy Assistant Secretary of Defense for Cyber Policy, Office of the Under Secretary of Defense for Policy
    • Mr. Jeffrey R. Jones, Vice Director, Command, Control, Communications and Computers/Cyber, Joint Staff, J-6
    • Ms. Katherine E. Arrington, Chief Information Security Officer for the Assistant Secretary of Defense for Acquisition, Office of the Under Secretary of Defense for Acquisition and Sustainment
    • Rear Admiral Jeffrey Czerewko, United States Navy, Deputy Director, Global Operations, J39, J3, Joint Staff
  • The Senate Banking, Housing, and Urban Affairs Committee’s Economic Policy Subcommittee will conduct a hearing titled “US-China: Winning the Economic Competition, Part II” on 16 December with these witnesses:
    • The Honorable Will Hurd, Member, United States House of Representatives;
    • Derek Scissors, Resident Scholar, American Enterprise Institute;
    • Melanie M. Hart, Ph.D., Senior Fellow and Director for China Policy, Center for American Progress; and
    • Roy Houseman, Legislative Director, United Steelworkers (USW).
  • On 17 December the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency’s (CISA) Information and Communications Technology (ICT) Supply Chain Risk Management (SCRM) Task Force will convene for a virtual event, “Partnership in Action: Driving Supply Chain Security.”

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Naya Shaw from Pexels

Further Reading, Other Developments, and Coming Events (10 December)

Further Reading

  • “Social media superspreaders: Why Instagram, not Facebook, will be the real battleground for COVID-19 vaccine misinformation” By Isobel Asher Hamilton — Business Insider. According to one group, COVID-19 anti-vaccination lies and misinformation are proliferating on Instagram despite parent company Facebook’s efforts to find and remove such content. There has been dramatic growth in such content on Instagram, and Facebook seems to be applying COVID-19 standards more loosely on Instagram. In fact, some people kicked off of Facebook for violating that platform’s standards on COVID-19 are still on Instagram spreading the same lies, misinformation, and disinformation. For example, British anti-vaccination figure David Icke was removed from Facebook for making claims that COVID-19 was caused by or related to 5G, but he has a significant following on Instagram.
  • “‘Grey area’: China’s trolling drives home reality of social media war” By Chris Zappone — The Sydney Morning Herald. The same concept that is fueling aggressive cyber activity at a level below outright war has spread to diplomacy. The People’s Republic of China (PRC) has been waging “gray” social media campaigns against a number of Western nations, including Australia, mainly by propagating lies and misinformation. The most recent example is the spreading of a fake photo of an Australian soldier appearing to kill an Afghan child. This false material seems designed to distract from the real issues between the two nations arising from clashing policies on trade and human rights. The PRC’s activities do not appear to violate Australia’s foreign interference laws and seem to have left Canberra at a loss as to how to respond effectively.
  • “Facebook to start policing anti-Black hate speech more aggressively than anti-White comments, documents show” By Elizabeth Dwoskin, Nitasha Tiku and Heather Kelly — The Washington Post. Facebook will apparently seek to revamp its algorithms to target the types of hate speech that have traditionally targeted women and minority groups. Up until now all attacks were treated equally, so that something like “white people suck” would be treated the same way as anti-Semitic content. Facebook has resisted changes for years even though experts and civil rights groups made the case that people of color, women, and LGBTI people endure far more abuse online. There is probably no connection between Facebook’s more aggressive content moderation policies and the advent of a new administration in Washington more receptive to claims that social media platforms allow the abuse of these people.
  • “How Joe Biden’s Digital Team Tamed the MAGA Internet” By Kevin Roose — The New York Times. Take this piece with a block of salt. The why-they-won articles are almost always rife with fallacies, including the rationale that if a candidate won, his or her strategy must have worked. It is not clear that the Biden Campaign’s online messaging strategy of being nice and emphasizing positive values actually beat the Trump Campaign’s “Death Star” so much as the President’s mishandling of the pandemic response and cratering of the economy did him in.
  • “Coronavirus Apps Show Promise but Prove a Tough Sell” By Jennifer Valentino-DeVries — The New York Times. It appears the intersection of concerns about private and public sector surveillance from two very different groups has worked to keep down rates of adoption of smartphone COVID tracking apps in the United States. There are people wary of private sector practices to hoover up as much data as possible, and others concerned about the government’s surveillance activities. Consequently, many are shunning Google and Apple’s COVID contact tracing apps, to the surprise of government, industry, and academia. A pair of studies show resistance to downloading or using such apps even if there are very strong privacy safeguards. This result may well be a foreseeable outcome of U.S. policies that have allowed companies and the security services to collect and use vast quantities of personal information.
  • “UAE target of cyber attacks after Israel deal, official says” — Reuters. A top cybersecurity official in the United Arab Emirates claimed his nation’s financial services industries were targeted for cyber attack and implied Iran and affiliated hackers were responsible.

Other Developments

  • President-elect Joe Biden announced his intention to nominate California Attorney General Xavier Becerra to serve as the next Secretary of Health and Human Services (HHS). If confirmed by the Senate, California Governor Gavin Newsom would name Becerra’s successor who would need to continue enforcement of the “California Consumer Privacy Act” (CCPA) (AB 375) while also working towards the transition to the “California Privacy Rights Act” (Proposition 24) approved by California voters last month. The new statute establishes the California Privacy Protection Agency that will assume the Attorney General’s responsibilities regarding the enforcement of California’s privacy laws. However, Becerra’s successor may play a pivotal role in the transition between the two regulators and the creation of the new regulations needed to implement Proposition 24.
  • The Senate approved the nomination of Nathan Simington to be a Commissioner of the Federal Communications Commission (FCC) by a 49-46 vote. Once FCC Chair Ajit Pai steps down, the agency will be left with two Democratic and two Republican Commissioners, pending the Biden Administration’s nominee to fill Pai’s spot. If the Senate stays Republican, it is possible the calculation could be made that a deadlocked FCC is better than a Democratic agency that could revive net neutrality rules among other Democratic and progressive policies. Consequently, Simington’s confirmation may be the first step toward an FCC unable to develop substantive policy.
  • Another federal court has broadened the injunction against the Trump Administration’s ban on TikTok to encompass the entirety of the Department of Commerce’s September order meant to stop the usage of the application in the United States (U.S.). It is unclear whether the Trump Administration will appeal, and if it should, whether a court would decide the case before the Biden Administration begins in mid-January. The United States District Court for the District of Columbia found that TikTok “established that the government likely exceeded IEEPA’s express limitations as part of an agency action that was arbitrary and capricious” and would likely suffer irreparable harm, making an injunction an appropriate remedy.
  • The United States’ National Security Agency (NSA) “released a Cybersecurity Advisory on Russian state-sponsored actors exploiting CVE-2020-4006, a command-injection vulnerability in VMware Workspace One Access, Access Connector, Identity Manager, and Identity Manager Connector” and provided “mitigation and detection guidance.”
  • The United States (U.S.) Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) issued a joint alert, warning that U.S. think tanks are being targeted by “persistent continued cyber intrusions by advanced persistent threat (APT) actors.” The agencies stated “[t]his malicious activity is often, but not exclusively, directed at individuals and organizations that focus on international affairs or national security policy.” CISA and the FBI stated its “guidance may assist U.S. think tanks in developing network defense procedures to prevent or rapidly detect these attacks.” The agencies added:
    • APT actors have relied on multiple avenues for initial access. These have included low-effort capabilities such as spearphishing emails and third-party message services directed at both corporate and personal accounts, as well as exploiting vulnerable web-facing devices and remote connection capabilities. Increased telework during the COVID-19 pandemic has expanded workforce reliance on remote connectivity, affording malicious actors more opportunities to exploit those connections and to blend in with increased traffic. Attackers may leverage virtual private networks (VPNs) and other remote work tools to gain initial access or persistence on a victim’s network. When successful, these low-effort, high-reward approaches allow threat actors to steal sensitive information, acquire user credentials, and gain persistent access to victim networks.
    • Given the importance that think tanks can have in shaping U.S. policy, CISA and FBI urge individuals and organizations in the international affairs and national security sectors to immediately adopt a heightened state of awareness and implement the critical steps listed in the Mitigations section of this Advisory.
  • A group of Democratic United States Senators has written the CEO of Alphabet and Google about its advertising policies and how its platforms may have been used to spread misinformation and contribute to voter suppression. Thus far, most of the scrutiny about the 2020 election and content moderation policy has fallen on Facebook and Twitter even though Google-owned YouTube has been flagged as hosting comparable amounts of misinformation. Senators Amy Klobuchar (D-MN) and Mark Warner (D-VA) led the effort and expressed “serious concerns regarding recent reports that Google is profiting from the sale of ads spreading election-related disinformation” to Alphabet and Google CEO Sundar Pichai. Klobuchar, Warner, and their colleagues asserted:
    • Google is also helping organizations spreading election-related disinformation to raise revenue by placing ads on their websites. While Google has some policies in place to prevent the spread of election misinformation, they are not properly enforced and are inadequate. We urge you to immediately strengthen and improve enforcement of your policies on election-related disinformation and voter suppression, reject all ads spreading election-related disinformation, and stop providing advertising services on sites that spread election-related disinformation.
    • …a recent study by the Global Disinformation Index (GDI) found that Google serves ads on 145 out of 200 websites GDI examined that publish disinformation.
    • Similarly, a recent report from the Center for Countering Digital Hate (CCDH) found that Google has been placing ads on websites publishing disinformation designed to undermine elections. In examining just six websites publishing election-related disinformation, CCDH estimates that they receive 40 million visits a month, generating revenue for these sites of up to $3.4 million annually from displaying Google ads. In addition, Google receives $1.6 million from the advertisers’ payments annually.  These sites published stories ahead of the 2020 general election that contained disinformation alleging that voting by mail was not secure, that mail-in voting was being introduced to “steal the election,” and that election officials were “discarding mail ballots.” 
  • A bipartisan group of United States Senators on one committee are urging Congressional leadership to include funding to help telecommunications companies remove and replace Huawei and ZTE equipment and to aid the Federal Communications Commission (FCC) in drafting accurate maps of broadband service in the United States (U.S.). Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS) and a number of his colleagues wrote the leadership of both the Senate and House and argued:
    • we urge you to provide full funding for Public Law 116-124, the Secure and Trusted Communications Networks Act, and Public Law 116-130, the Broadband DATA Act.   
    • Closing the digital divide and winning the race to 5G are critical to America’s economic prosperity and global leadership in technology. However, our ability to connect all Americans and provide access to next-generation technology will depend in large part on the security of our communications infrastructure. The Secure and Trusted Communications Networks Act (“rip and replace”) created a program to help small, rural telecommunications operators remove equipment posing a security threat to domestic networks and replace it with equipment from trusted providers. This is a national security imperative. Fully funding this program is essential to protecting the integrity of our communications infrastructure and the future viability of our digital economy at large.
    • In addition to safeguarding the security of the nation’s communications systems, developing accurate broadband maps is also critically important. The United States faces a persistent digital divide, and closing this divide requires accurate maps that show where broadband is available and where it is not. Current maps overstate broadband availability, which prevents many underserved communities, particularly in rural areas, from receiving the funds needed to build or expand broadband networks to millions of unconnected Americans. Fully funding the Broadband DATA Act will ensure more accurate broadband maps and better stewardship over the millions of dollars the federal government awards each year to support broadband deployment. Without these maps, the government risks overbuilding existing networks, duplicating funding already provided, and leaving communities unserved.  
  • The Government Accountability Office (GAO) released an assessment of 5G policy options that “discusses (1) how the performance goals and expected uses are to be realized in U.S. 5G wireless networks; (2) the challenges that could affect the performance or usage of 5G wireless networks in the U.S.; and (3) policy options to address these challenges.” The report had been requested by the chairs and ranking members of the House Armed Services, Senate Armed Services, Senate Intelligence, and House Intelligence Committees along with other Members. The GAO stated “[w]hile 5G is expected to deliver significantly improved network performance and greater capabilities, challenges may hinder the performance or usage of 5G technologies in the U.S. We grouped the challenges into the following four categories:
    • availability and efficient use of spectrum
    • security of 5G networks
    • concerns over data privacy
    • concerns over possible health effects
    • The GAO presented the following policy options along with opportunities and considerations for each:
      • Spectrum-Sharing Technologies Opportunities:
        • Could allow for more efficient use of the limited spectrum available for 5G and future generations of wireless networks.
        • It may be possible to leverage existing 5G testbeds for testing the spectrum sharing technologies developed through applied research.
      • Spectrum-Sharing Technologies Considerations:
        • Research and development is costly, must be coordinated and administered, and its potential benefits are uncertain. Identifying a funding source, setting up the funding mechanism, or determining which existing funding streams to reallocate will require detailed analysis.
      • Coordinated Cybersecurity Monitoring Opportunities:
        • A coordinated monitoring program would help ensure the entire wireless ecosystem stays knowledgeable about evolving threats, in close to real time; identify cybersecurity risks; and allow stakeholders to act rapidly in response to emerging threats or actual network attacks.
      • Coordinated Cybersecurity Monitoring Considerations:
        • Carriers may not be comfortable reporting incidents or vulnerabilities, and determinations would need to be made about what information is disclosed and how the information will be used and reported.
      • Cybersecurity Requirements Opportunities
        • Taking these steps could produce a more secure network. Without a baseline set of security requirements the implementation of network security practices is likely to be piecemeal and inconsistent.
        • Using existing protocols or best practices may decrease the time and cost of developing and implementing requirements.
      • Cybersecurity Requirements Considerations
        • Adopting network security requirements would be challenging, in part because defining and implementing the requirements would have to be done on an application-specific basis rather than as a one-size-fits-all approach.
        • Designing a system to certify network components would be costly and would require a centralized entity, be it industry-led or government-led.
      • Privacy Practices Opportunities
        • Development and adoption of uniform privacy practices would benefit from existing privacy practices that have been implemented by states, other countries, or that have been developed by federal agencies or other organizations.
      • Privacy Practices Considerations
        • Privacy practices come with costs, and policymakers would need to balance the need for privacy with the direct and indirect costs of implementing privacy requirements. Imposing requirements can be burdensome, especially for smaller entities.
      • High-band Research Opportunities
        • Could result in improved statistical modeling of antenna characteristics and a more accurate representation of propagation characteristics.
        • Could result in improved understanding of any possible health effects from long-term radio frequency exposure to high-band emissions.
      • High-band Research Considerations
        • Research and development is costly and must be coordinated and administered, and its potential benefits are uncertain. Policymakers will need to identify a funding source or determine which existing funding streams to reallocate.

Coming Events

  • The Senate Judiciary Committee will hold an executive session at which the “Online Content Policy Modernization Act” (S.4632), a bill to narrow the liability shield in 47 USC 230, may be marked up on 10 December.
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Tima Miroshnichenko from Pexels

CPRA From Another View

Let’s see how the CPRA would work from the view of a Californian.

Of course, I analyzed California Ballot Proposition 24, the “California Privacy Rights Act,” at some length in a recent issue, but I think taking on the proposed rewrite of the “California Consumer Privacy Act” (AB 375) from a different angle may provide value in understanding what this law would and would not do. In this piece, I want to provide a sense of what the California resident would be presented with under the new privacy statute.

As noted in my article the other day, as under the CCPA, the CPRA would still not allow people to deny businesses the right to collect and process their personal information unlike some of the bills pending in Congress. Californians could stop the sale or sharing of personal information, but not the collection and processing of personal data short of forgoing or limiting online interactions and many in-person interactions. A person could request the deletion of personal information collected and processed subject to certain limitations and exceptions businesses are sure to read as broadly as possible.

So, businesses subject to the CPRA will have to inform people at the point of collection of “the categories of personal information to be collected and the purposes for which the categories of personal Information are collected or used and whether such Information is sold or shared.” Easy enough, as far as this goes. Say I live in Sacramento and log into Facebook; there should be notice about the categories of personal information collected (e.g., IP address, physical address, name, geolocation data, browsing history, etc.). As a citizen of California afforded privacy rights by the CPRA, I would not be able to tell Facebook not to collect and process these sorts of data. I would be able to ask that it delete these data and stop selling or sharing them, subject to significant limitations on these rights. Therefore, a baseline assumption in the CPRA, as in the CCPA, is either that data collection and processing are a net good for California, its people, and its businesses, or a concession that it is too late to stop such practices, for a strong law stopping some of these practices would cause these companies, some of which are headquartered in the state, to stop offering their free services and/or leave the state.

In the same notice described in the preceding paragraph, I would also be told whether Facebook sells or shares my personal information. I would also be alerted as to whether “sensitive personal information” is being collected and if these are being sold or shared.

Of course, with both categories of information collected from people in California, the use of the data must be compatible with the disclosed purpose for collection. And so, presumably, the notice provided to me would include the why of the data collection, but whatever the purpose, so long as it is disclosed to me, it would be legal, generally speaking, under the CPRA. The only limitation seems to be purposes incompatible with the context in which the personal information was collected.

My personal data could not be stored by a business indefinitely, for the law limits storage, for each disclosed purpose, to the time necessary to undertake and complete that purpose.

It must also be stressed that Californians will all but certainly be presented with notice in the physical world when they shop in larger grocery store chains, national or large regional retailers, airlines, car rental firms, etc. In the case of hotels, car rental firms, and airlines, just to name three categories of businesses likely to be covered by the CPRA and to be collecting data on people, the notice may be appended to the boilerplate contractual language no one I know reads. It may be written in the clearest language imaginable, but a person must be advised of what data are being collected, the purpose of the collection and use, and whether it is being sold and shared. For the privacy purist, the only way not to have one’s information collected would be to not engage with these companies. Likewise, walking into a retail establishment large enough to qualify as a business under the CPRA may entail seeing notice posted somewhere in the store, possibly alongside information indicating customers are under surveillance by camera, that personal information is being collected.

I would be able to immediately ask the business to delete my personal information, but it would be allowed to keep this on hand during the period it is completing a transaction or providing goods or services. But there is language a business may interpret broadly to keep my personal information, such as an exception to conduct a product recall or to anticipate future transactions as part of an ongoing business relationship. I would expect this to be very broadly read in favor of keeping personal data. Nonetheless, if it is a service or product used frequently, say, Amazon, then I would need to go back after every use and request my personal information be deleted. But if I placed four Amazon orders a month, the platform could reasonably deny my request because it is occurring in the course of an ongoing business transaction. There are other possible grounds on which a business might not delete a person’s personal or sensitive personal information, such as ensuring the security and integrity of the service and product, with the caveat that keeping my personal information would have to be “reasonably necessary and proportionate for those purposes.” Would the business make this determination? Subject to guidance or regulations?

However, the exception to the right to delete that is nearly opaque is “[t]o enable solely internal uses that are reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business and compatible with the context in which the consumer provided the information.” It is not clear to me the sort of “internal uses” this encapsulates. Data processing so the business can better target the person? This provision is drafted so broadly the new privacy agency must explicate it so businesses and Californians understand what this entails. Also, keep in mind, if I lived in California, I would have to repeat these deletion requests for each and every business collecting information on me.

I would be able to correct my personal information with a business but only with “commercially reasonable efforts,” suggesting that in cases where correction is difficult businesses could decline my request. For anyone who has ever tried to correct one’s personal information with a company, the frustration attendant on such endeavors can be significant. A major American automaker switched two letters in my wife’s last name, and no matter how many times we asked that her name be spelled correctly, this massive corporation could not or would not make the change. This may end up as a right that is largely without effect.

I would be able to ask for and receive my personal information, after a fashion. For example, I would be able to ask for and obtain the exact personal information the business has collected itself but only the categories of information obtained through means other than direct collection (i.e., data brokers and other businesses). To make this section even more convoluted, I would also receive the categories of personal information the business has directly collected on me. Moreover, I could learn the commercial or business purposes for collection and processing and the third parties with whom my personal information is sold or shared. However, if a business includes all this and other information on its website as part of its privacy policy, it would only have to send me the specific pieces of personal information it has collected directly from me. Whatever the case, I would generally only be able to receive information from the previous 12 months.

Separately from the aforementioned rights, I could also learn to whom a business is selling, sharing, and disclosing my information. However, if we drew a Venn Diagram between this right and the previous one, the most significant right bestowed by this section of the CPRA would be that of learning “[t]he categories of personal information that the business disclosed about the consumer for a business purpose and the categories of persons to whom It was disclosed for a business purpose.”

The CPRA would provide me the right to opt out of a business selling or sharing my personal information, and businesses would need to alert people of this right. If I were between the age of 13 and 16, I would need to opt in to selling or sharing my personal information. Moreover, for my children under the age of 13, I, or my wife, would need to opt in for their personal information to be sold or shared.

I would also be able to limit the use and disclosure of my sensitive personal information to an uncertain extent. The CPRA makes clear this is not an absolute right, and businesses would be able to use a number of exceptions to continue using this class of information. For example, a business would be able to do so “to ensure security and Integrity to the extent the use of the consumer’s personal Information is reasonably necessary and proportionate for these purposes.” Likewise, a business could use sensitive personal information for “[s]hort-term, transient use, including but not limited to non-personalized advertising shown as part of a consumer’s current Interaction with the business.” There are other exceptions, and the new California state agency established by the CPRA would be able to promulgate regulations to further define those situations in which use and disclosure may continue against my wishes.

Otherwise, a business would be unable to use or disclose my sensitive personal information once I elect to stop this practice. However, this right pertains only to the use of this type of information to infer my characteristics subject to the drafting of regulations.  

I would not be discriminated against for exercising any of the rights the CPRA grants me, with a significant catch on which I’ll say more in a moment. This right would stop businesses from denying me goods or services, charging me a different price, or providing a different level of service or quality. And yet, a business would be able to charge me a different price or rate, or give me a lesser level of service or product, “if that difference is reasonably related to the value provided to the business by the consumer’s data.” This strikes me as a situation where the exception will eat the rule. Any business with any level of resources will claim that the value of my personal information is vital to providing me a service or product for free; if I deny it the use of this information, the value proposition has changed, and I must either be charged to keep the same level of service or, without payment, accept a lesser level of service or product. My guess is that this right would be functionally null.

Moreover, this section is tied to loyalty and reward programs, which would also be exempt from this right so long as the case could be made that the value of my data justifies the difference in price or service. It is not hard to see the incentive structure here being such that businesses would likely establish new programs in order to pressure people in California not to exercise their rights under the CPRA and to continue using their personal information in the current fashion. Of course, there is this provision: “[a] business shall not use financial incentive practices that are unjust, unreasonable, coercive, or usurious in nature,” but where exactly is the line between a business offering a rewards or loyalty program purportedly tied to the value of the data it collects and processes and these sorts of practices? It may be very hard to divine and will likely require a case-by-case process to delineate the legal from the illegal.

I would generally have two ways to exercise the rights given to me under the CPRA unless the business operates only online, in which case it would be by email. The business would have 45 days after verifying my request to provide, correct, or delete my personal information, free of charge. However, this 45-day period may be extended once so long as the business informs me, so 90 days may well become the de facto norm. A business may also be able to demand “authentication of the consumer that is reasonable in light of the nature of the personal information requested.” The intent is obviously to let a business be sure someone is not maliciously or mischievously trying to change someone else’s information, in what may become an extension of doxing or other vexatious practices seen elsewhere in the online realm. However, some businesses may also read this liberally as a means of throwing up another barrier to my exercise of these rights.

I would be wise as a California resident to understand some of the global limitations of the rights bestowed by the CPRA. For example, all bets are off with respect to a business’ compliance “with federal, state, or local laws or…with a court order or subpoena to provide Information.” A business would be within its legal rights to comply, my wishes be damned. Moreover, law enforcement agencies can direct businesses not to delete my personal information for up to 90 days while a proper court order is obtained. Additionally, likely as an incentive for businesses, deidentified personal information is not subject to the obligations placed on businesses, and the same is true of “aggregate consumer information.” Obviously, a business would ideally use the safe harbor of deidentification where possible in order to render stolen data less desirable and valuable to thieves. Of course, at least one study has shown that deidentified data can be used to identify and link to people fairly easily, and another stated “numerous supposedly anonymous datasets have recently been released and re-identified.” This may be less safe a harbor for my personal information than the drafters of the CPRA intend.
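To make the re-identification concern concrete, here is a minimal sketch of the classic linkage attack those studies describe: joining a “deidentified” dataset to a public one on quasi-identifiers such as ZIP code, birth date, and sex. All names, records, and field choices here are invented for illustration; they are not drawn from any actual dataset.

```python
# Hypothetical illustration: records stripped of names can often be re-linked
# to people by matching quasi-identifiers against a public dataset.

deidentified_health = [
    {"zip": "95814", "dob": "1975-03-02", "sex": "F", "diagnosis": "asthma"},
    {"zip": "95814", "dob": "1980-07-19", "sex": "M", "diagnosis": "diabetes"},
]

public_voter_roll = [
    {"name": "Jane Roe", "zip": "95814", "dob": "1975-03-02", "sex": "F"},
    {"name": "John Doe", "zip": "95811", "dob": "1980-07-19", "sex": "M"},
]

def reidentify(deidentified, public):
    """Link any 'anonymous' record whose quasi-identifiers match exactly."""
    matches = []
    for record in deidentified:
        for person in public:
            if all(record[k] == person[k] for k in ("zip", "dob", "sex")):
                matches.append({"name": person["name"],
                                "diagnosis": record["diagnosis"]})
    return matches

# One quasi-identifier triple matches, so one record is re-linked to a name.
print(reidentify(deidentified_health, public_voter_roll))
```

The point is not the code but the asymmetry it illustrates: a business can satisfy a deidentification safe harbor while an adversary with a second dataset can still undo it.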

It also bears mention that some publicly available information shall not be considered personal information under the CPRA. The catch here is that not all of my personal information in the public sphere meets the terms of this exception, for new language in the CPRA to modify the CCPA definition makes clear the information has to be “lawfully obtained,” “truthful” and “a matter of public concern.” Additionally, businesses would be barred from using personal information made widely available that is probably not being disclosed lawfully (e.g. someone plastering my home address on social media.) And yet, the California Department of Motor Vehicles (DMV) has been selling the personal information of people to private investigators, bail bondsmen, and others, a legally sanctioned activity, but allowing this practice to funnel the personal information of Californians to businesses and data brokers would arguably not be a matter of public concern. Therefore, this exception may be written tightly enough to anticipate and forestall likely abuses.

Like the CCPA, the CPRA does not bear on the use of my personal information in areas of life already regulated, often by the federal government, such as health information or credit information. Any rights I would have in those realms would remain unaffected by the CPRA.

I would receive protection in the event of specified types of data breaches: namely, if my personal information were neither encrypted nor redacted, the CPRA’s breach provisions come into play. Under the CCPA, if my personal information were not encrypted but was redacted and stolen, a breach would occur, and the same was true if it were not redacted but encrypted. So, this seems to be a weakening of the trigger that would allow me to sue if my personal information were subject to unauthorized exfiltration or access, theft, or disclosure. Additionally, if my “email address in combination with a password or security question and answer that would permit access to the account” were exposed or stolen, I could also sue. Moreover, any unauthorized theft, access, disclosure, or exposure of my personal information must be due to a business’s “violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information” before I could sue.

Once a breach has occurred, however, I could sue for between $100 and $750 per incident, or actual damages, whichever is greater, but only after giving a business 30 days to cure the violation if possible. If there are no tangible monetary damages, as is often the case in breaches, then I would be left to weigh suing to recover the statutory damages. But for one breach or a handful of breaches, it may not be worth the time and effort it takes to litigate, meaning these are likely the circumstances in which class actions will thrive.
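The class-action point is really just arithmetic. A back-of-the-envelope sketch, using only the statutory $100-$750 per consumer per incident range (the consumer counts are invented for illustration):

```python
# Statutory damages under the private right of action run $100-$750 per
# consumer per incident (or actual damages, whichever is greater).
STATUTORY_MIN, STATUTORY_MAX = 100, 750

def exposure_range(consumers_affected, incidents=1):
    """Potential statutory-damages exposure (low, high) for a breach."""
    low = consumers_affected * incidents * STATUTORY_MIN
    high = consumers_affected * incidents * STATUTORY_MAX
    return low, high

# For one person, $100-$750 rarely justifies litigating alone...
print(exposure_range(1))            # (100, 750)

# ...but a class covering 1 million affected consumers changes the math.
print(exposure_range(1_000_000))    # (100000000, 750000000)
```

The same damages that are trivial for one plaintiff become a nine-figure exposure for a business once aggregated, which is why the private right of action matters mostly through class litigation.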

Alternatively, the California Privacy Protection Agency would be empowered to bring actions against businesses that violate the CPRA, but the bill is silent on whether I would be made whole if I did not sue and the agency recovered money from the business.

Finally, there are provisions that contemplate technological means for people to make their preferences under the CPRA known to many businesses at the same time or with minimal repetitive effort. I suppose this envisions someone designing an app that would do the hard work for you; the language seems designed to seed the ground in California for developers to create and offer CPRA-compliant products. Likewise, one could designate a person to undertake this work, which also suggests a market opportunity for an entity that can make the economics of such a business model work. In any event, I would likely be charged for using a service like either of these, leading one to the uncomfortable conclusion that these provisions may drive a greater bifurcation in the world of technology between the haves and have-nots.


Image by OpenClipart-Vectors from Pixabay

CPRA Analyzed

The CCPA follow-on bill on the ballot in California would significantly change how the state regulates privacy, which will set the de facto standard for the U.S. in the absence of federal legislation.

With the “California Privacy Rights Act” (CPRA) having been successfully added to the ballot on which Californians will vote in November, it is worth taking a deeper look at the bill. It would replace the “California Consumer Privacy Act” (CCPA) (AB 375), which just came into full effect with the publishing of final regulations on 14 August. Nonetheless, as the Office of the Attorney General was drafting regulations, the organization that pushed for passage of the CCPA, Californians for Consumer Privacy (CCP), completed the drafting of a follow-on bill. CCP Chair and Founder Alastair Mactaggart explained his reasoning for a second ballot initiative: “[f]irst, some of the world’s largest companies have actively and explicitly prioritized weakening the CCPA…[and] [s]econd, technological tools have evolved in ways that exploit a consumer’s data with potentially dangerous consequences.” Moreover, if polling released earlier this month by CCP is close to being accurate, then an overwhelming majority of Californians support enactment of the CPRA, meaning a significantly new privacy scheme will come into effect in the next few years in California.

Of course, it may be fair to assert this bill looks to solve a number of problems created by the rush in June 2018 to draft a bill all parties could accept in order to get the CCPA removed from the ballot. The CCPA package as enacted was sloppily drafted in places, with inconsistent provisions that necessitated two rounds of follow-on legislation to fix or clarify it.

Unlike some of the bills pending in Congress, the CPRA, like the CCPA before it, would still not allow people to deny businesses the right to collect and process their personal information. Californians could stop the sale or sharing of personal information, but not its collection and processing, short of forgoing or limiting online and many in-person interactions. A person could request the deletion of personal information collected and processed, subject to certain limitations and exceptions businesses are sure to read as broadly as possible. Additionally, a new agency would be created to police and enforce privacy rights, though legitimate questions may be posed about its level of resources. Nonetheless, the new statute would not come into effect until 1 January 2023, leaving the CCPA as the law of California in the short term and then requiring businesses and people to adjust to the new regime.

In the findings section, CCP explicitly cites the bills introduced to weaken or roll back the CCPA as part of the reason why the CPRA should be enacted. Changes to the California Code made by ballot initiative are much harder to modify than statutes enacted through the legislative route. Notably, the CPRA would limit future amendments to only those in furtherance of the act, which would rule out attempts to weaken or dilute the new regime. Moreover, the bill looks at privacy rights through the prism of an imbalance in information and is founded on the notion that if people in California have more information and real choice in how and when their personal data are collected, processed, and shared, then the most egregious data practices would stop. Of course, this conceptual framework differs from one that views data collection and processing as being more like pollution or air quality, situations over which any one individual will have limited impact, thus necessitating collective government action to address deleterious effects. In the view of the CCP, Californians will be on better footing to negotiate their privacy with companies like Facebook and Google. Notably, the CCP asserted:

  • In the same way that ingredient labels on foods help consumers shop more effectively, disclosure around data management practices will help consumers become more informed counterparties in the data economy, and promote competition. Additionally, if a consumer can tell a business not to sell his or her data, then that consumer will not have to scour a privacy policy to see whether the business is, in fact, selling that data, and the resulting savings in time is worth, in the aggregate, a tremendous amount of money.
  • Consumers should have the information and tools necessary to limit the use of their information to non-invasive, pro-privacy advertising, where their personal information is not sold to or shared with hundreds of businesses they’ve never heard of, if they choose to do so. Absent these tools, it will be virtually impossible for consumers to fully understand these contracts they are essentially entering into when they interact with various businesses.

The CPRA would change the notification requirements for businesses interested in collecting, processing, and sharing personal data in Section 1798.100 of the Civil Code (i.e., language added by the CCPA and some of the follow-on bills the legislature passed). This requirement would be binding on the companies that control collection and not just the entities doing the actual collecting, which suggests concern that the ultimate user of personal data could otherwise shield its identity from people. Worse still, the CCPA language may have created an incentive to use front companies or third parties to collect personal data. Moreover, the CPRA makes clear that if a company is using another company to collect personal data it will ultimately control, it may meet its notice requirements by posting all the enumerated information prominently on its website. This may be a loophole large companies use to avoid informing people who is controlling data collection.

New language tightens the information people must be provided as part of this notice, namely the purposes for which personal data are collected or used and whether the entity proposes to sell or share this information. Moreover, the CPRA would mandate that the notice also include any additional purposes for which personal data are collected and used “that are incompatible with the disclosed purpose for which the personal information was collected.”

The changes to Section 1798.100 and the underlying CCPA language that would remain will apply to a new category of information created by the CPRA: “sensitive personal information.” This term is defined to mean:

  • personal information that reveals
    • a consumer’s social security, driver’s license, state identification card, or passport number;
    • a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account;
    • a consumer’s precise geolocation;
    • a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership;
    • the contents of a consumer’s mail, email and text messages, unless the business is the intended recipient of the communication;
    • a consumer’s genetic data; and
  • the processing of biometric information for the purpose of uniquely identifying a consumer;
  • personal information collected and analyzed concerning a consumer’s health; or
  • personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.

However, should any of these data be “publicly available” as defined by the CPRA, they are no longer subject to the heightened requirements due this new class of information. For example, the new notice people must be given will list the categories of sensitive personal information collected and the purposes for which such information is collected or used. Additionally, people must be told whether this subset of personal data will be shared or sold.

The CPRA would limit collection, use, processing, and sharing of personal data to purposes “reasonably necessary and proportionate” to achieve the purpose of the information collection. Quite clearly, much will hang on what turns out to be “reasonable,” which may be construed by the new data protection agency in regulation and ultimately by courts in litigation. However, this provision also allows the “collection, use, retention, and sharing of a consumer’s personal information…for another disclosed purpose that is compatible with the context in which the personal information was collected.” This will also need fleshing out by regulation, litigation, or both. It seems to allow companies to specify another purpose for their data activities so long as it is compatible with the context of collection. And yet, it is not clear what would determine compatibility. If a person agrees to a grocery store chain’s data activities, might the company legally try to collect information regarding the person’s health?

This section also requires businesses to enter into contracts with third parties, service providers, and contractors to ensure they follow the CPRA and to specify that information sold or shared by the business is for limited and specific purposes.

Businesses are obligated to use “reasonable security procedures and practices appropriate to the nature of the personal information to protect the personal information from unauthorized or illegal access, destruction, use, modification, or disclosure.” This is a familiar construct that contemplates a sliding scale of security: lesser steps suffice for less valuable information, say deidentified data, while higher standards are needed for more sensitive personal data. The challenge in such a regime is that reasonable minds may disagree about what measures are reasonable, and it will likely fall to the caselaw construing the CPRA to point the way to how businesses should secure information.

Section 1798.105 spells out a person’s right to delete personal information and expands the obligation of businesses to direct their service providers and contractors to delete information upon receipt of a valid request. Third parties would be notified of deletion requests and expected to also delete unless doing so would be impossible or “involves disproportionate effort,” a term likely to be given as expansive a reading as possible by many businesses. There still are numerous exceptions for deletion requests, many of which will also likely be read expansively by businesses reluctant to honor deletion requests, including:

  • Complete the transaction for which the personal information was collected, fulfill the terms of a written warranty or product recall conducted in accordance with federal law, provide a good or service requested by the consumer, or reasonably anticipated by the consumer within the context of a business’s ongoing business relationship with the consumer, or otherwise perform a contract between the business and the consumer.
  • Help to ensure security and integrity to the extent the use of the consumer’s personal information is reasonably necessary and proportionate for those purposes.
  • Debug to identify and repair errors that impair existing intended functionality.
  • Exercise free speech, ensure the right of another consumer to exercise that consumer’s right of free speech, or exercise another right provided for by law.
  • To enable solely internal uses that are reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business and compatible with the context in which the consumer provided the information.
  • Comply with a legal obligation.

However, the CPRA eliminated the exception that could be used to deny deletion requests in the CCPA that allowed a business to “use the consumer’s personal information, internally, in a lawful manner that is compatible with the context in which the consumer provided the information.”

The CPRA creates a new section, 1798.106, titled “Consumers’ Right to Correct Inaccurate Personal Information,” that requires businesses to correct inaccurate personal information in light of the type of information and why it is being processed. Businesses must disclose that people have this right if a person submits a verifiable request to correct inaccurate personal information. However, companies are only required to make commercially reasonable efforts to correct inaccurate personal information, and a rulemaking will apparently be necessary to flesh out what such efforts entail.

Section 1798.110 is amended by the CPRA but more or less keeps the CCPA’s right to know about and access the information being collected about oneself, with some significant changes. For example, one of the categories businesses must provide to people who exercise this right is expanded. Currently under the CCPA, requesters must be told the commercial or business purpose for which personal information is collected or sold. Under the CPRA, businesses would also need to inform people of the other entities with whom they share personal information, thus closing a significant loophole, for companies like Facebook share people’s information but do not sell it. Under the CCPA, a Facebook would not need to divulge to a person which companies it is sharing one’s information with.

Also, the CPRA would deem in compliance those companies that post on their websites the categories of personal information collected, the sources of this information, its business or commercial purposes, and the categories of third parties to whom personal information is disclosed. It seems likely many companies will go this route, meaning the only personal information they would need to furnish upon a request would be the specific pieces of information on the person making the request. And yet, the CPRA strikes the CCPA provision that businesses need not keep personal information from one-time transactions or reidentify or link to these data.

Section 1798.115 of the CCPA would also be changed by expanding the universe of data a person may request and receive regarding how their personal information is shared and sold. The CPRA keeps the basic structure of the CCPA in this regard and merely expands it to include shared as well as sold for the following:

  • The categories of personal information sold or shared and the categories of third parties to whom such information was sold or shared
  • The categories of personal information disclosed about a person for business purposes and the categories of persons to whom such information was disclosed

Third parties would be barred from selling or sharing personal information that has been sold to or shared with them unless they provide explicit notice and people have the opportunity to opt-out.

The CPRA similarly changes Section 1798.120 (aka the right to opt out of the sharing or selling of one’s personal information) while keeping the CCPA’s right to opt out of sales, and now sharing, at any time. Likewise, teenagers between 13 and 16 would need to affirmatively agree to selling or sharing, and for any child under 13, his or her parents must affirmatively agree.

A new Section 1798.121, a “Consumers’ Right to Limit Use and Disclosure of Sensitive Personal Information,” would allow people to stop businesses from collecting and using sensitive personal information in some cases. As a general matter, if a person limits collection or use of this class of information, the business would be limited to “that use which is necessary to perform the services or provide the goods reasonably expected by an average consumer who requests such goods or services,” subject to some of the exceptions embodied in the definition of “business purpose.” Businesses may, however, provide notice of additional uses of sensitive personal information, which a person must further limit if these new uses are objectionable or unwanted.

The CPRA changes the provision barring retaliation against people who opt out of certain practices or use their CCPA rights. The general prohibition on punishing people who use their rights with different prices, services, or products would be maintained, and the CPRA would expand this protection to employees, contractors, and applicants for employment. However, the CPRA keeps the CCPA exemption allowing so-called loyalty programs to offer different prices or services, but only if the difference is reasonably related to the value the person’s data provides to the business. The CCPA requires this linkage to be directly related, so the change may be seen as a subtle weakening of the connection between the value of a person’s data and the rewards or prices offered through membership in a loyalty program. This will almost certainly result in businesses in California using current programs or establishing new ones to press people to share personal information in exchange for better prices or services. After all, all they would need to do is show the value of the person’s data is reasonably related to the advantages of membership. Like other similar provisions in the bill, regulation and litigation will define the parameters of what is reasonably related. Like the CCPA, the new bill would require people to opt into such programs, and should a person refuse, the business would need to wait 12 months before making the offer again.

Many of the previously discussed changes to the CCPA necessitate alterations to a key section of the statute, Section 1798.130, which details notice, disclosure, correction, and deletion requests. Businesses with physical locations must still offer two means for people to make such requests, but the CPRA would allow online-only businesses to merely make available an email address. Anyone who has ever tried to resolve disputes and problems via email knows this process can often be frustrating, yet the new statute would allow companies like Facebook or Google to offer nothing more than an email address.

The new 1798.130 also makes clear that the 45-day window for businesses to deliver required information after receiving a verified request also covers making requested corrections and deletions. A potential hurdle is established for requests, however. In light of the type of information in question, a business may seek to authenticate a person’s identity before granting the request, but it may not force a person to create an account with the business if they do not have one. To be fair, this provision may be aimed at the mischief that could ensue if a person impersonates someone else and asks businesses to delete that person’s personal information. There are likely other situations in which a malicious person could wreak havoc.

In any event, the disclosure of information would need to cover the previous 12 months under the CPRA, and after new regulations are put in place, people would be able to ask for and receive information stretching back before the preceding 12 months. But such a request could be denied on the grounds of impossibility or disproportionate effort. Presumably the new regulations would address when these types of situations may be the case. Another limitation on this right is that businesses would not need to provide information before 1 January 2022.

If a person submits a request to a business’ contractor or service provider to learn what personal information has been collected, sold, or shared, that entity has no obligation to respond. And yet, these entities must assist a business that receives such requests.

The CPRA stipulates that businesses are required to provide the following types of information if a person asks for the data the entity has:

the categories of sources from which the consumer’s personal information was collected; the business or commercial purpose for collecting, or selling or sharing the consumer’s personal information; and the categories of third parties to whom the business discloses the consumer’s personal information.

A business is also obligated to provide the “specific pieces of personal information obtained from the consumer in a format that is easily understandable to the average consumer, and to the extent technically feasible, in a structured, commonly used, machine-readable format, which also may be transmitted to another entity at the consumer’s request without hindrance.”
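The “structured, commonly used, machine-readable format” language suggests something like a JSON export that another entity could ingest. The sketch below is a minimal illustration, assuming a business holds a consumer's records as a simple mapping; the field names and record layout are invented for the example, not drawn from the statute.

```python
import json

def export_consumer_data(record: dict) -> str:
    """Serialize one consumer's personal information to JSON so it can be
    transmitted to another entity at the consumer's request.

    The 'categories'/'data' layout is illustrative only; the CPRA does not
    prescribe a schema.
    """
    export = {
        "categories": sorted(record),  # categories of information held
        "data": record,                # the specific pieces of information
    }
    return json.dumps(export, indent=2, sort_keys=True)

# Hypothetical record a business might hold about a requester.
record = {
    "identifiers": {"email": "requester@example.com"},
    "commercial_information": {"purchases": ["books"]},
}
print(export_consumer_data(record))
```

Because JSON is both human-readable and machine-parseable, one serialization can plausibly satisfy the “easily understandable to the average consumer” and “machine-readable” requirements at once.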

Regarding the type of information a business must give to people who ask to know what, if any, information was sold or shared about them, a business must furnish two lists:

  • A list of the categories of personal information it has sold or shared about consumers in the preceding 12 months by reference to the enumerated category or categories in [the revised definition of personal information and new definition of sensitive personal information] that most closely describe the personal information sold or shared, or if the business has not sold or shared consumers’ personal information in the preceding 12 months, the business shall prominently disclose that fact in its privacy policy.
  • A list of the categories of personal information it has disclosed about consumers for a business purpose in the preceding 12 months by reference to the enumerated category in subdivision (c) that most closely describes the personal information disclosed, or if the business has not disclosed consumers’ personal information for a business purpose in the preceding 12 months, the business shall disclose that fact.

The categories of personal information a business must provide are “information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following if it identifies, relates to, describes, is reasonably capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household:

(A) Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier, Internet Protocol address, email address, account name, social security number, driver’s license number, passport number, or other similar identifiers.

(B) Any personal information described in subdivision (e) of Section 1798.80.

(C) Characteristics of protected classifications under California or federal law.

(D) Commercial information, including records of personal property, products or services purchased, obtained, or considered, or other purchasing or consuming histories or tendencies.

(E) Biometric information.

(F) Internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a consumer’s interaction with an Internet website, application, or advertisement.

(G) Geolocation data.

(H) Audio, electronic, visual, thermal, olfactory, or similar information.

(I) Professional or employment-related information.

(J) Education information, defined as information that is not publicly available personally identifiable information as defined in the Family Educational Rights and Privacy Act (20 U.S.C. section 1232g, 34 C.F.R. Part 99).

(K) Inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.”

The CPRA modifies the CCPA standards on the links on a business’ website allowing people to opt out of the sale or sharing of personal information. It also adds a requirement that a link be placed on the website allowing people to opt out of the use or disclosure of sensitive personal information. A business would now be allowed to have one link for both if it wants, and it would also be allowed to remind people of the advantages of being a member of the business’ loyalty program and any charges or fees associated with not joining. This provision would seem to allow some businesses, at least those who can show a reasonable relation between the discounts provided and the value of personal information, to pose a possibly uncomfortable dilemma: your privacy or your money. Put another way, the CPRA may well result in a price being put on one’s privacy, with those of means or those intensely dedicated to privacy being able or willing to limit these practices while everyone else acquiesces in the face of higher prices or worse services or products. Additionally, companies would not need to have links on their websites if they allow for opting out through their platform, technology, or app.

If a person opts out, companies would have to wait 12 months before asking again whether they would mind allowing the business to sell or share their personal information or use or disclose their sensitive personal information. But one should bear in mind that even if a person opts out of the sale or sharing of personal information, a business may still collect or process it subject to other requirements in the CPRA. This right is limited to the dissemination of personal information through sales or a sharing arrangement.
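Opting out “through a platform, technology, or app” suggests something like a browser-level preference signal a business's site could check on each request. The sketch below looks for the `Sec-GPC` header defined by the Global Privacy Control proposal; treating that signal as a CPRA opt-out is this sketch's assumption, since whether any particular signal satisfies the statute is a question for future regulation.

```python
def request_signals_opt_out(headers: dict) -> bool:
    """Return True when an incoming request carries an opt-out preference
    signal.

    'Sec-GPC: 1' is the header used by the Global Privacy Control proposal;
    whether honoring it discharges a business's CPRA obligations is a
    regulatory question, not something this check can settle.
    """
    return headers.get("Sec-GPC", "").strip() == "1"

# A business honoring the signal would skip selling or sharing this
# visitor's personal information instead of showing an opt-out link.
print(request_signals_opt_out({"Sec-GPC": "1"}))  # True for a signaling browser
```

The appeal of such a signal is that it removes the per-site link-clicking burden entirely: the preference travels with every request the browser makes.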

The CPRA revises some key definitions and introduces new definitions, the most significant of which was discussed earlier: sensitive personal information. Another key change is to criteria for businesses subject to the CPRA. Each of the three thresholds for becoming a regulated business are changed:

  • First, language is changed to make clear a company must have earned $25 million in gross revenues in the preceding year to qualify on the basis of income.
  • Second, the threshold for the number of people is changed. It is raised from 50,000 to 100,000, and instead of counting people and devices, the latter is stricken and households may now be counted. Obviously, a household will likely include multiple devices, so counting by household allows for a higher threshold generally. Also, the counting is limited to the activities of businesses buying, selling, or sharing personal information; mere collection and processing is not counted, meaning a business that does not partake in any of the three enumerated activities would not qualify under this prong even if it collects and processes the personal information of, say, 1 million Californians.
  • Third, the threshold for businesses deriving 50% or more of their income from selling consumers’ personal information is broadened to include sharing, meaning more entities might qualify on the basis of this prong.

Also of note, the definition of “business purpose” was altered, and new definitions are provided for consent, contractor, cross-context behavioral advertising, dark pattern, non-personalized advertising, and others.

The section on exemptions to the CCPA’s prohibitions is rewritten and expanded by the CPRA. Businesses may disregard the obligations placed on them by this privacy statute under a number of circumstances. For example, added circumstances include complying with a subpoena or court order or responding to direction by law enforcement agencies. Moreover, government agencies would be able to make emergency requests for personal information to businesses if the agency acts in good faith, asserts a legal right to do so, and follows up with a court order within three days. There is also language adding contractors to the CCPA’s provisions on the liability of a business for violations by its service providers, which requires actual knowledge of such violations.

The CPRA keeps the CCPA’s grant of authority allowing people to sue for violations but quietly tightens the circumstances under which this may happen to those in which one’s personal information is neither encrypted nor redacted. Consequently, if a business uses either method of securing information, it cannot be sued.

As noted, the bill would establish a California Privacy Protection Agency that would take over enforcement of the revised CCPA from the Office of the Attorney General. It would consist of a five-member board, including a chair. It would gain rulemaking authority at the earlier of 1 July 2021 or six months after the new agency informs the Attorney General it is ready to begin drafting rules. Before this date, however, the Attorney General may have the authority or opportunity to begin some of the CPRA rulemakings, an interregnum that may serve to complicate implementation. Nonetheless, among other powers, the new agency would be able to investigate and punish violations with fines of up to $2,500 per violation, except for intentional violations and those involving the personal information of minor children, which could be fined at a rate of $7,500 per violation. Like the Federal Trade Commission, the California Privacy Protection Agency would be able to bring administrative actions inside the agency or go to court to sue. However, this new entity would only be provided $5 million during its first year and $10 million a year thereafter, which raises the question of whether it will be able to police privacy in California in a muscular way.


Image by David Mark from Pixabay

Further Reading, Other Developments, and Coming Events (17 August)

Here are Coming Events, Other Developments, and Further Reading.

Coming Events

  • On 18 August, the National Institute of Standards and Technology (NIST) will host the “Bias in AI Workshop, a virtual event to develop a shared understanding of bias in AI, what it is, and how to measure it.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” Until 21 August, the FTC “is seeking comment on a range of issues including:
    • How are companies currently implementing data portability? What are the different contexts in which data portability has been implemented?
    • What have been the benefits and costs of data portability? What are the benefits and costs of achieving data portability through regulation?
    • To what extent has data portability increased or decreased competition?
    • Are there research studies, surveys, or other information on the impact of data portability on consumer autonomy and trust?
    • Does data portability work better in some contexts than others (e.g., banking, health, social media)? Does it work better for particular types of information over others (e.g., information the consumer provides to the business vs. all information the business has about the consumer, information about the consumer alone vs. information that implicates others such as photos of multiple people, comment threads)?
    • Who should be responsible for the security of personal data in transit between businesses? Should there be data security standards for transmitting personal data between businesses? Who should develop these standards?
    • How do companies verify the identity of the requesting consumer before transmitting their information to another company?
    • How can interoperability among services best be achieved? What are the costs of interoperability? Who should be responsible for achieving interoperability?
    • What lessons and best practices can be learned from the implementation of the data portability requirements in the GDPR and CCPA? Has the implementation of these requirements affected competition and, if so, in what ways?”
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • On 14 August, the California Office of Administrative Law (OAL) approved the Attorney General’s proposed final regulations to implement the California Consumer Privacy Act (CCPA) (AB 375), and they took effect that day. The Office of the Attorney General (OAG) had requested expedited review so that the regulations might become effective on 1 July as required by the CCPA. With respect to the substance, the final regulations are very similar to the third round of regulations circulated for comment in March, which responded, in part, to legislation passed and signed into law last fall that modified the CCPA.
    • The OAL released an Addendum to the Final Statement of Reasons and explained
      • In addition to withdrawing certain provisions for additional consideration, the OAG has made the following non-substantive changes for accuracy, consistency, and clarity. Changes to the original text of a regulation are non-substantive if they clarify without materially altering the requirements, rights, responsibilities, conditions, or prescriptions contained in the original text.
    • For further reading on the third round of proposed CCPA regulations, see this issue of the Technology Policy Update, for the second round, see here, and for the first round, see here. Additionally, to read more on the legislation signed into law last fall, modifying the CCPA, see this issue.
    • Additionally, Californians for Consumer Privacy have succeeded in placing the “California Privacy Rights Act” (CPRA) on the November 2020 ballot. This follow-on statute to the CCPA could again force the legislature into making a deal that would revamp privacy laws in California, as happened when the CCPA was added to the ballot in 2018. It is also possible this statute remains on the ballot and is added to California’s laws. In either case, much of the CCPA and its regulations may be moot or in effect for only the few years it takes for a new privacy regulatory structure to be established as laid out in the CPRA. See here for more detail.
  • In a proposed rule issued for comment, the Federal Communications Commission (FCC) explained it is taking “further steps to protect the nation’s communications networks from potential security threats as the [FCC] integrates provisions of the recently enacted Secure and Trusted Communications Networks Act of 2019 (Secure Networks Act) (P.L. 116-124) into its existing supply chain rulemaking proceeding….[and] seeks comment on proposals to implement further Congressional direction in the Secure Networks Act.” Comments are due by 31 August.
    • The FCC explained
      • The concurrently adopted Declaratory Ruling finds that the 2019 Supply Chain Order, 85 FR 230, January 3, 2020, satisfies the Secure Networks Act’s requirement that the Commission prohibit the use of funds for covered equipment and services. The Commission now seeks comment on sections 2, 3, 5, and 7 of the Secure Networks Act, including on how these provisions interact with our ongoing efforts to secure the communications supply chain. As required by section 2, the Commission proposes several processes by which to publish a list of covered communications equipment and services. Consistent with sections 3, 5, and 7 of the Secure Networks Act, the Commission proposes to (1) ban the use of federal subsidies for any equipment or services on the new list of covered communications equipment and services; (2) require that all providers of advanced communications service report whether they use any covered communications equipment and services; and (3) establish regulations to prevent waste, fraud, and abuse in the proposed reimbursement program to remove, replace, and dispose of insecure equipment.
    • The agency added
      • The Commission also initially designated Huawei Technologies Company (Huawei) and ZTE Corporation (ZTE) as covered companies for purposes of this rule, and it established a process for designating additional covered companies in the future. Additionally, last month, the Commission’s Public Safety and Homeland Security Bureau issued final designations of Huawei and ZTE as covered companies, thereby prohibiting the use of USF funds on equipment or services produced or provided by these two suppliers.
      • The Commission takes further steps to protect the nation’s communications networks from potential security threats as it integrates provisions of the recently enacted Secure Networks Act into the Commission’s existing supply chain rulemaking proceeding. The Commission seeks comment on proposals to implement further Congressional direction in the Secure Networks Act.
  • The White House’s Office of Science & Technology Policy (OSTP) released a request for information (RFI) “[o]n behalf of the National Science and Technology Council’s (NSTC) Subcommittee on Resilience Science and Technology (SRST), OSTP requests input from all interested parties on the development of a National Research and Development Plan for Positioning, Navigation, and Timing (PNT) Resilience.” OSTP stated “[t]he plan will focus on the research and development (R&D) and pilot testing needed to develop additional PNT systems and services that are resilient to interference and manipulation and that are not dependent upon global navigation satellite systems (GNSS)…[and] will also include approaches to integrate and use multiple PNT services for enhancing resilience. The input received on these topics will assist the Subcommittee in developing recommendations for prioritization of R&D activities.”
    • Executive Order 13905, Strengthening National Resilience Through Responsible Use of Positioning, Navigation, and Timing Services, was issued on February 12, 2020, and President Donald Trump explained the policy basis for the initiative:
      • It is the policy of the United States to ensure that disruption or manipulation of PNT services does not undermine the reliable and efficient functioning of its critical infrastructure. The Federal Government must increase the Nation’s awareness of the extent to which critical infrastructure depends on, or is enhanced by, PNT services, and it must ensure critical infrastructure can withstand disruption or manipulation of PNT services. To this end, the Federal Government shall engage the public and private sectors to identify and promote the responsible use of PNT services.
    • In terms of future steps under the EO, the President directed the following:
      • The Departments of Defense, Transportation, and Homeland Security must use the PNT profiles in updates to the Federal Radionavigation Plan.
      • The Department of Homeland Security must “develop a plan to test the vulnerabilities of critical infrastructure systems, networks, and assets in the event of disruption and manipulation of PNT services. The results of the tests carried out under that plan shall be used to inform updates to the PNT profiles…”
      • The heads of Sector-Specific Agencies (SSAs) and the heads of other executive departments and agencies (agencies) coordinating with the Department of Homeland Security, must “develop contractual language for inclusion of the relevant information from the PNT profiles in the requirements for Federal contracts for products, systems, and services that integrate or utilize PNT services, with the goal of encouraging the private sector to use additional PNT services and develop new robust and secure PNT services. The heads of SSAs and the heads of other agencies, as appropriate, shall update the requirements as necessary.”
      • The Federal Acquisition Regulatory Council, in consultation with the heads of SSAs and the heads of other agencies, as appropriate, shall incorporate the [contractual language] into Federal contracts for products, systems, and services that integrate or use PNT services.
      • The Office of Science and Technology Policy (OSTP) must “coordinate the development of a national plan, which shall be informed by existing initiatives, for the R&D and pilot testing of additional, robust, and secure PNT services that are not dependent on global navigation satellite systems (GNSS).”
  • An ideologically diverse, bipartisan group of Senators wrote to the United States Department of Justice official in charge of the Antitrust Division and to the chair of the Federal Trade Commission (FTC) “regarding allegations of potentially anticompetitive practices and conduct by online platforms toward content creators and emerging competitors….[that] stemmed from a recent Wall Street Journal report that Alphabet Inc., the parent company of Google and YouTube, has designed Google Search to specifically give preference to YouTube and other Google-owned video service providers.”
    • The Members asserted
      • There is no public insight into how Google designs its algorithms, which seem to deliver up preferential search results for YouTube and other Google video products ahead of other competitive services. While a company favoring its own products, in and of itself, may not always constitute illegal anticompetitive conduct, the Journal further reports that a significant motivation behind this action was to “give YouTube more leverage in business deals with content providers seeking traffic for their videos….” This exact conduct was the topic of a Senate Antitrust Subcommittee hearing led by Senators Lee and Klobuchar in March this year.
    • Senators Thom Tillis (R-NC), Mike Lee (R-UT), Amy Klobuchar (D-MN), Richard Blumenthal (D-CT), Marsha Blackburn (R-TN), Josh Hawley (R-MO), Elizabeth Warren (D-MA), Mazie Hirono (D-HI), Cory Booker (D-NJ) and Ted Cruz (R-TX) signed the letter.
  • The National Security Agency (NSA) and the Federal Bureau of Investigation (FBI) released a “Cybersecurity Advisory” (along with a fact sheet and FAQ) about previously undisclosed Russian malware “called Drovorub, designed for Linux systems as part of its cyber espionage operations.” The NSA and FBI asserted “[t]he Russian General Staff Main Intelligence Directorate (GRU) 85th Main Special Service Center (GTsSS) military unit 26165” developed and deployed the malware. The NSA and FBI stated the GRU and GTsSS are “sometimes publicly associated with APT28, Fancy Bear, Strontium, and a variety of other identities as tracked by the private sector.”
    • The agencies contended
      • Drovorub represents a threat to National Security Systems, Department of Defense, and Defense Industrial Base customers that use Linux systems. Network defenders and system administrators can find detection strategies, mitigation techniques, and configuration recommendations in the advisory to reduce the risk of compromise.
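    • As a general illustration only (this check is not drawn from the NSA/FBI advisory, and the helper names below are invented for this sketch): one generic way defenders hunt for kernel-module rootkits like Drovorub’s kernel component is to compare the Linux kernel’s two views of loaded modules, /proc/modules and /sys/module, since a module that unlinks itself from one listing but not the other stands out.

```python
import os

def modules_from_proc(path="/proc/modules"):
    """Module names from /proc/modules (each line begins with the name)."""
    try:
        with open(path) as f:
            return {line.split()[0] for line in f if line.strip()}
    except OSError:
        return set()  # not on Linux, or /proc unavailable

def modules_from_sys(path="/sys/module"):
    """Module names from /sys/module (one directory per module, built-ins included)."""
    try:
        return set(os.listdir(path))
    except OSError:
        return set()

def cross_view_discrepancies():
    """Compare the two kernel views of loaded modules.

    A loadable module that removes itself from the kernel's module list
    disappears from /proc/modules but often leaves its kobject directory
    in /sys/module. Note that the "sys_only" set also contains built-in
    modules, so entries there need manual review, not automatic alarm.
    """
    proc, sysfs = modules_from_proc(), modules_from_sys()
    return {"proc_only": proc - sysfs, "sys_only": sysfs - proc}
```

On a non-Linux machine both helpers return empty sets, so the sketch degrades gracefully; real triage would also consult the advisory’s network-based and memory-analysis detection guidance.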
  • The United States Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) published Cybersecurity Best Practices for Operating Commercial Unmanned Aircraft Systems (UAS), “a companion piece to CISA’s Foreign Manufactured UASs Industry Alert,…[to] assist in standing up a new UAS program or securing an existing UAS program, and is intended for information technology managers and personnel involved in UAS operations.” CISA cautioned that “[s]imilar to other cybersecurity guidelines and best practices, the identified best practices can aid critical infrastructure operators to lower the cybersecurity risks associated with the use of UAS, but do not eliminate all risk.”
  • The United States Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) released the “Identity, Credential, and Access Management (ICAM) Value Proposition Suite of documents in collaboration with SAFECOM and the National Council of Statewide Interoperability Coordinators (NCSWIC), Office of the Director of National Intelligence (ODNI), and Georgia Tech Research Institute (GTRI)…[that] introduce[s] ICAM concepts, explores federated ICAM use-cases, and highlights the potential benefits for the public safety community:”
    • ICAM Value Proposition Overview
      • This document provides a high-level summary of federated ICAM benefits and introduces domain-specific scenarios covered by other documents in the suite.
    • ICAM Value Proposition Scenario: Drug Response
      • This document outlines federated ICAM use cases and information sharing benefits for large-scale drug overdose epidemic (e.g., opioid, methamphetamine, and cocaine) prevention and response.

Further Reading

  • “Trump’s Labor Chief Accused of Intervening in Oracle Pay Bias Case” By Noam Scheiber, David McCabe and Maggie Haberman – The New York Times. In the sort of conduct that is apparently the norm across the Trump Administration, there are allegations that the Secretary of Labor intervened in departmental litigation to help a large technology firm aligned with President Donald Trump. Starting in the Obama Administration and continuing into the Trump Administration, software and database giant Oracle was investigated, accused, and sued for paying non-white, non-male employees significantly less in violation of federal and state law. Estimates of Oracle’s liability ranged between $300-800 million, and litigators in the Department of Labor were seeking $400 million and had taken the case to trial. Secretary Eugene Scalia purportedly stepped in and lowered the dollar amount to $40 million, and the head litigator is being offered a transfer from Los Angeles to Chicago in a division in which she has no experience. Oracle CEO Safra Catz and Chair Larry Ellison have both supported the President more enthusiastically, and earlier, than the heads of other technology companies.
  • Pentagon wins brief waiver from government’s Huawei ban” By Joe Gould – Defense News. A Washington D.C. trade publication is reporting the Trump Administration is using flexibility granted by Congress to delay the ban on contractors using Huawei, ZTE, and other People’s Republic of China (PRC) technology for the Department of Defense. Director of National Intelligence John Ratcliffe granted the waiver at the request of Under Secretary of Defense for Acquisition and Sustainment Ellen Lord, claiming:
    • You stated that DOD’s statutory requirement to provide for the military forces needed to deter war and protect the security of our country is critically important to national security. Therefore, the procurement of goods and services in support of DOD’s statutory mission is also in the national security interests of the United States.
    • Section 889 of the “John S. McCain National Defense Authorization Act (NDAA) for FY 2019” (P.L. 115-232) requires agencies to remove this equipment and these systems and also bars them from contracting with private sector entities that use such equipment and services. It is the second part of the ban from which the DOD and its contractors are receiving a reprieve; an interim rule putting such a ban in place was issued last month.
  • “DOD’s IT supply chain has dozens of suppliers from China, report finds” By Jackson Barnett – FedScoop. A data analytics firm, Govini, analyzed a sample of prime contracts at the Department of Defense (DOD) and found a surge in the presence of firms from the People’s Republic of China (PRC) in the supply chains in the software and information technology (IT) sectors. This study has obvious relevance to the previous article on banning PRC equipment and services in DOD supply chains.
  • “Facebook algorithm found to ‘actively promote’ Holocaust denial” By Mark Townsend – The Guardian. A British counter-hate organization, the Institute for Strategic Dialogue (ISD), found that Facebook’s algorithms lead people searching for the Holocaust to denial sites and posts. The organization found the same problem on Reddit, Twitter, and YouTube, too. ISD claimed:
    • Our findings show that the actions taken by platforms can effectively reduce the volume and visibility of this type of antisemitic content. These companies therefore need to ask themselves what type of platform they would like to be: one that earns money by allowing Holocaust denial to flourish, or one that takes a principled stand against it.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Foundry Co from Pixabay

CCPA Regulations Finalized

Final CCPA regulations submitted, but it is not clear if they will be approved by 1 July as required by the statute. However, if a follow-on ballot initiative becomes law, these regulations could be moot or greatly changed.

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

The Office of California Attorney General Xavier Becerra has submitted its finalized regulations to implement the “California Consumer Privacy Act” (CCPA) (AB 375) to the Office of Administrative Law (OAL), typically the last step in the regulatory process in California. The Office of the Attorney General (OAG) is requesting expedited review so the regulations may become effective on 1 July as required by the CCPA. However, the OAL has 30 days under California law to review regulations for compliance with California’s Administrative Procedure Act (APA), and under Executive Order N-40-20, issued in response to the COVID-19 pandemic, the OAL has been given an additional 60 days beyond the 30 statutory days. It is therefore possible the CCPA regulations will not be effective on 1 July; in fact, it could be three months before they are effective, meaning early September.

With respect to the substance, the final regulations are very similar to the third round of regulations circulated for comment in March, which responded, in part, to legislation passed and signed into law last fall that modified the CCPA. The OAG released other documents along with the finalized regulations.

For further reading on the third round of proposed CCPA regulations, see this issue of the Technology Policy Update, for the second round, see here, and for the first round, see here. Additionally, to read more on the legislation signed into law last fall, modifying the CCPA, see this issue.

Moreover, Californians for Consumer Privacy have submitted the “California Privacy Rights Act” (CPRA) for the November 2020 ballot. This follow-on statute to the CCPA could again force the legislature into making a deal that would revamp privacy laws in California, as happened when the CCPA was added to the ballot in 2018. It is also possible this statute remains on the ballot and is added to California’s laws. In either case, much of the CCPA and its regulations may be moot or in effect for only the few years it takes for a new privacy regulatory structure to be established as laid out in the CPRA. See here for more detail.

Becerra Deflects Calls To Delay Implementation of CCPA

In a press release, California Attorney General Xavier Becerra answered calls to delay enforcement of the “California Consumer Privacy Act” (CCPA) (AB 375) by “reminding consumers of their data privacy rights amidst the COVID-19 public health emergency.” Last month, some industry stakeholders pressed Becerra to delay enforcement of the CCPA because companies were trying to cope with the onslaught of COVID-19, which was siphoning off resources that would otherwise be used to comply with California’s newly effective privacy law. More than 30 groups asked Becerra not to enforce the CCPA until January 2, 2021.

Regarding the CCPA, Becerra stated that “[a]s the health emergency leads more people to look online to work, shop, connect with family and friends, and be entertained, it is more important than ever for consumers to know their rights under the CCPA.” He noted that “CCPA went into effect on January 1, 2020 and offers new rights to consumers that can be used both during the emergency and afterwards.

  • Websites that collect and sell your personal information should have a “Do Not Sell My Information” link, which you can click on to opt-out of the sale of your personal information.
  • If you want to minimize or reduce the data collected by businesses during or after the emergency, you can request that the business delete personal data that it has collected from you.
  • You can also request that a business disclose to you what personal information the business collects, uses, shares, or sells. You may exercise this right twice during a 12-month period.”

Becerra also offered tips on how to avoid email scams, protecting virtual meetings and home networks, and means to ensure children are safe and protected online.