States Other Than CA and VA Consider Privacy Bills

The North Dakota legislature spikes two tech bills: one would have regulated app stores, and the other would have barred sales of personal data without consent.

With the recent enactment by ballot of the “California Privacy Rights Act” (aka Proposition 24) and the pending enactment of the “Consumer Data Protection Act” in Virginia, other states are considering legislation to regulate privacy and other aspects of the technology industry’s businesses.

In North Dakota, an effort to take on Apple’s and Google’s 30% fees related to their application stores was defeated in the state Senate. Apparently, a lobbyist for Epic Games and the Coalition for App Fairness drafted the initial bill and convinced a state Senator to introduce the legislation and champion the issue. The Coalition for App Fairness, of which Epic Games is a member, “advocate[s] for enforcement and reforms, including legal and regulatory changes, to preserve consumer choice and a level playing field for app and game developers that rely on app stores and the most popular gatekeeper platforms,” according to the press release announcing its establishment. And, of course, Epic Games is suing both Apple and Google in United States (U.S.) federal court over the 30% fee both companies take off the top of all in-app purchases, so this state-level legislative push is likely one of several strategies app developers will pursue. It is almost certain similar legislation will crop up in other legislatures and maybe even in Congress.

SB 2333 would bar companies like Apple and Google from requiring application developers to use their application stores exclusively. Hence, residents of North Dakota could locate and download applications directly from developers or other non-Apple and non-Google sources. Apple, Google, and similarly situated companies also could not mandate, as they presently do, that all in-application purchases be conducted through their payment platforms. This provision would deny the companies their stranglehold on payments, which allows them to extract a 30% fee from such purchases. It was this very point that started the Epic Games litigation, for the company began offering users of its popular game Fortnite on Apple and Google platforms the option of buying directly from Epic Games at a price 30% lower than the one offered through the application stores. Apple and Google responded by removing Fortnite from their stores, sparking the current litigation. In this vein, an application store could not retaliate against companies that opt to use a separate payment system or block an application developer for doing so. Developers could bring suit for violations, seeking to enjoin Apple and Google and asking for restitution, reasonable attorney’s fees, and other costs. SB 2333 would also bar any contract or agreement contrary to these provisions.

And yet, “special-purpose digital application distribution platforms” would be exempted from this bill. The definition provides examples of what these may be, including “a gaming console, music player, and other special-purpose devices connected to the internet.” These sound very much like Microsoft’s Xbox, Sony’s PlayStation, Apple’s iPod, and virtual reality headsets like those made by Oculus, the Facebook-owned company. Moreover, any “digital application distribution platform for which cumulative gross receipts from sales on the digital application distribution platform to residents of this state” are less than $10 million a year would be exempted. So, the bill seems aimed squarely at Apple and Google.

Not surprisingly, Apple and Google fought this bill. Apple’s Chief Privacy Engineer Erik Neuenschwander testified before a Senate committee that “Senate Bill 2333 threatens to destroy iPhone as you know it.” Just as predictably, a representative of the Coalition for App Fairness argued “SB 2333 will benefit consumers and app developers in North Dakota by limiting the ability of dominant platforms to impose onerous and anticompetitive restrictions on app developers.” Ultimately, the state Senate rejected a weakened version of SB 2333 by an 11-36 vote, killing the legislation.

Last fall, a federal court denied Epic Games’ request for a preliminary injunction requiring Apple to put Fortnite back into the App Store. The judge assigned to the case had signaled this request would likely fail, having also rejected Epic Games’ request for a temporary restraining order. A May 2021 trial date has been set. The United States District Court for the Northern District of California summarized Epic’s motion:

In this motion for preliminary injunction, Epic Games asks the Court to force Apple to reinstate Fortnite to the Apple App Store, despite its acknowledged breach of its licensing agreements and operating guidelines, and to stop Apple from terminating its affiliates’ access to developer tools for other applications, including Unreal Engine, while Epic Games litigates its claims.

The court stated:

Epic Games bears the burden in asking for such extraordinary relief. Given the novelty and the magnitude of the issues, as well as the debate in both the academic community and society at large, the Court is unwilling to tilt the playing field in favor of one party or the other with an early ruling of likelihood of success on the merits. Epic Games has strong arguments regarding Apple’s exclusive distribution through the iOS App Store, and the in-app purchase (“IAP”) system through which Apple takes 30% of certain IAP payments. However, given the limited record, Epic Games has not sufficiently addressed Apple’s counter arguments. The equities, addressed in the temporary restraining order, remain the same.

The court held:

Apple and all persons in active concert or participation with Apple, are preliminarily enjoined from taking adverse action against the Epic Affiliates with respect to restricting, suspending or terminating the Epic Affiliates from Apple’s Developer Program, on the basis that Epic Games enabled IAP direct processing in Fortnite through means other than the Apple IAP system, or on the basis of the steps Epic Games took to do so. This preliminary injunction shall remain in effect during the pendency of this litigation unless the Epic Affiliates breach: (1) any of their governing agreements with Apple, or (2) the operative App Store guidelines. This preliminary injunction supersedes the prior temporary restraining order.

In its complaint, Epic Games is arguing that Apple’s practices violate federal and California antitrust and anti-competition laws. Epic Games argued:

  • This case concerns Apple’s use of a series of anti-competitive restraints and monopolistic practices in markets for (i) the distribution of software applications (“apps”) to users of mobile computing devices like smartphones and tablets, and (ii) the processing of consumers’ payments for digital content used within iOS mobile apps (“in-app content”).
  • Apple imposes unreasonable and unlawful restraints to completely monopolize both markets and prevent software developers from reaching the over one billion users of its mobile devices (e.g., iPhone and iPad) unless they go through a single store controlled by Apple, the App Store, where Apple exacts an oppressive 30% tax on the sale of every app. Apple also requires software developers who wish to sell digital in-app content to those consumers to use a single payment processing option offered by Apple, In-App Purchase, which likewise carries a 30% tax.
  • In contrast, software developers can make their products available to users of an Apple personal computer (e.g., Mac or MacBook) in an open market, through a variety of stores or even through direct downloads from a developer’s website, with a variety of payment options and competitive processing fees that average 3%, a full ten times lower than the exorbitant 30% fees Apple applies to its mobile device in-app purchases.

In its late August denial of Epic Games’ request for a temporary restraining order, the court decided the plaintiff does not necessarily have an antitrust case strong enough to succeed on the merits, has not demonstrated irreparable harm because its “current predicament appears to be of its own making,” would be unjustly enriched if Fortnite were reinstated to the App Store without having to pay Apple 30% of in-app purchases, and is not serving a public interest strong enough to overcome the expectation that private parties will honor their contracts or resolve disputes through normal means.

Another North Dakota technology bill appears to have died in committee. The North Dakota House Industry, Business and Labor Committee voted not to pass HB 1330, a bill that would have banned the sale of a person’s personal data without opt-in consent. The penalties for violating this proposed law would be stiff and would depend entirely on private lawsuits, with class actions explicitly allowed. A company that violates this proscription would be liable for at least $10,000 and reasonable attorney’s fees, while a company that knowingly violates the law would face at least $100,000 in damages, reasonable attorney’s fees, and punitive damages. As mentioned, the bill explicitly permits class actions, which would likely result in massive cases seeking millions of dollars in damages if a multinational were to knowingly violate the law.

HB 1330 provides simple parameters for how entities may sell personal data:

A covered entity may not sell a user’s protected data to another person unless the user opts-in to allow the sale. To opt-in, the covered entity shall provide the user with the opportunity to affirmatively click or select approval of the sale. The user must be given the opportunity to opt-in to the sale of each type of protected data by individual selection. Protected data collected and sold by the covered entity must be described clearly in plain language to the user.
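
To make the per-category opt-in concrete, here is a minimal sketch of the kind of consent record a covered entity might keep. The category names, fields, and helper function are illustrative assumptions, not anything the bill prescribes.

```typescript
// Hypothetical per-category consent model reflecting HB 1330's requirement
// that a user opt in to the sale of each type of protected data individually.
type ProtectedDataCategory = "location" | "browsingHistory" | "shoppingHabits";

interface SaleConsent {
  userId: string;
  // A category absent from this record means the user never opted in,
  // so selling that type of protected data would be barred.
  optIns: Partial<Record<ProtectedDataCategory, { grantedAt: Date }>>;
}

// Before any sale, check the specific category rather than a blanket flag.
function maySell(consent: SaleConsent, category: ProtectedDataCategory): boolean {
  return consent.optIns[category] !== undefined;
}
```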

Given that the bill does not define “sell” or “sale,” it is unclear whether trading personal data or some other exchange short of money changing hands would qualify. If not, this considerable loophole would probably not stop companies like Facebook and Google from amassing massive troves of data, processing them, and then selling targeted advertising or other services based on the value of their data and profiles.

The definition of what constitutes “protected data” is fairly expansive:

a user’s location; screen name; website address; interests; hometown; professional history; friends or followers; shopping habits; test scores; health conditions, insurance, or interests; internet browsing history; purchases or purchase history; the number of friends or followers of the user; socioeconomic status; religious affiliation; alcohol, tobacco, or drug usage; gambling habits; banking relationships; residence details; children’s information or household information; credit; banking and insurance policies; media usage; and relationship status.

And yet, there are notable omissions, including sexual orientation and political beliefs. Arguably, those types of information could be considered part of one’s “interests,” “internet browsing history,” or “household information.” But if a covered entity decided to make the case that such information falls outside the definition of “protected data,” then the collection and sale of these data could continue without consent.

It bears noting this bill does not give residents of North Dakota any control over whether data may be collected, processed, shared, or used. It merely bars the sale of certain data without opt-in consent.

Still, for whatever flaws this bill has, it uses an opt-in model, whereas one currently enacted state privacy law and one pending privacy law do not. The California Privacy Rights Act (CPRA) would continue the right California residents currently enjoy under the “California Consumer Privacy Act” (CCPA) (AB 375) to opt out of the sale of their personal data (see here for more analysis). Likewise, Virginia’s “Consumer Data Protection Act” (SB 1392/HB 2307) allows for opting out of the sale of personal data (see here for more analysis).

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (10 February 2021)

Further Reading

  • “A Hacker Tried to Poison a Florida City’s Water Supply, Officials Say” By Andy Greenberg — WIRED. Given that most water and sewage systems are linked to the internet, even their operational systems, it is surprising these sorts of incidents do not occur more frequently.
  • “UK regulator to write to WhatsApp over Facebook data sharing” By Alex Hern — The Guardian. The United Kingdom’s (UK) Information Commissioner Elizabeth Denham said her agency will be pressing Facebook to keep the data of its subsidiary, WhatsApp, separate. Now that the UK has exited the European Union (EU), it is no longer bound by the EU system that made Ireland’s Data Protection Commission the lead regulator of Facebook and WhatsApp. And so, WhatsApp’s 2017 commitment not to hand over user data to Facebook until it was compliant with the General Data Protection Regulation (GDPR) falls to the Information Commissioner’s Office (ICO) to oversee in the UK.
  • “Telegram, Pro-Democracy Tool, Struggles Over New Fans From Far Right” By Michael Schwirtz — The New York Times. The same features that make the messaging app Telegram ideal for warding off attacks by authoritarian regimes seeking to shut down communication make the platform ideal for right-wing extremists in the United States (U.S.). Federal and state authorities may see their attempts to track and monitor domestic terrorism hit the same roadblocks that foiled Moscow’s and Tehran’s attempts to crack down on Telegram. The platform uses end-to-end encrypted communications and has servers all over the world.
  • “Exclusive: The end of the Maher era at Wikipedia” By Felix Salmon — Axios. The CEO who revitalized Wikimedia is leaving the organization stronger than she found it.
  • “After Defending Its Low-Cost Internet Offering, Comcast Agrees To Increase Speeds” By Caroline O’Donovan — BuzzFeed News. The bad publicity seems to have worked on Comcast, as the company is now meeting most of the demands of activists, students, and officials by increasing the speed of its low-cost broadband option. Comcast said the changes will take effect on 1 March.

Other Developments

  • The Federal Communications Commission (FCC) announced that it is “seeking comment on several petitions requesting permission to use E-Rate program funds to support remote learning during the pandemic.” Comments are due by 16 February and reply comments are due by 23 February. The FCC explained:
    • Today’s Public Notice from the FCC’s Wireline Competition Bureau highlights three petitions that cover the bulk of issues presented in other petitions filed with the Commission.  These include petitions filed by a coalition of E-Rate stakeholders led by the Schools, Health & Libraries Broadband (SHLB) Coalition; a petition filed on behalf of the State of Colorado; and a petition filed by the State of Nevada, Nevada Board of Education and Nevada Department of Education. 
    • The FCC noted:
      • The E-Rate program was authorized by Congress as part of the Telecommunications Act of 1996 (the Telecommunications Act), and created by the Commission in 1997 to, among other things, enhance, to the extent technically feasible and economically reasonable, access to advanced telecommunications and information services for all public and nonprofit elementary and secondary schools and libraries. Under the E-Rate program, eligible schools, libraries, and consortia (comprised of eligible schools and libraries) may request universal service discounts for eligible services and/or equipment (collectively, eligible services), including connections necessary to support broadband connectivity to eligible schools and libraries. Eligible services must be used “primarily for educational purposes.” In the case of schools, “educational purposes” is defined as “activities that are integral, immediate, and proximate to the education of students.” In the case of libraries, “educational purposes” is defined as activities that are “integral, immediate, and proximate to the provision of library services to library patrons.”
      • As the pandemic continues to force schools and libraries across the country to remain closed and rely on remote learning and virtual services, either in whole or in part, the need for broadband connections—particularly for those students, teachers, staff, and patrons that lack an adequate connection at home—is more critical than ever.  Eligible schools and libraries explain that they are hampered in their ability to address the connectivity needs brought on, and in many cases exacerbated, by COVID-19 because of the restrictions on off-campus use of E-Rate-funded services and facilities.   Last spring, as the COVID-19 pandemic forced schools and libraries to grapple with the challenges of transitioning to remote learning, the FCC began to receive requests for emergency relief aimed at ensuring that all students have sufficient connectivity at home.
  • The European Commission’s (EC) President appealed to the United States (U.S.) to join the European Union (EU) in jointly regulating technology. At the Davos Agenda, EC President Ursula von der Leyen made remarks, a significant portion of which focused on technological issues and the EU’s proposals, the Digital Services Act and the Digital Markets Act. It is unclear to what extent the new administration in Washington will be willing to work with the EU. Undoubtedly, the Biden Administration will interpret a number of EU policies and decisions as being implicitly aimed at the U.S. technology sector, but there may be common ground. Von der Leyen stated:
    • A year ago at Davos, we talked also intensively about digitalisation. The pandemic has massively accelerated the process. The European Union will dedicate 20% of NextGenerationEU to digital projects. To nurture innovative ecosystems, for example where universities, companies, innovators can access data and cooperate. To boost the vibrant start-up scene we have in cities like Sofia and Lisbon and to become a global hub for Artificial Intelligence. So that the 2020s can finally be Europe’s Digital Decade.
    • But for this to be a success, we must also address the darker sides of the digital world. Like for so many of us, the storming of the Capitol came as a shock to me. We are always quick to say: Democracy and values, they are part of our DNA. And that is true. But we must nurture our democracy every day, and defend our institutions against the corrosive power of hate speech, of disinformation, fake news and incitement to violence. In a world where polarising opinions are the loudest, it is a short step from crude conspiracy theories to the death of a police officer. Unfortunately, the storming of the Capitol Hill showed us just how true that is.
    • The business model of online platforms has an impact – and not only on free and fair competition, but also on our democracies, our security and on the quality of our information. That is why we need to contain this immense power of the big digital companies. Because we want the values we cherish in the offline world also to be respected online. At its most basic, this means that what is illegal offline should be illegal online too. And we want the platforms to be transparent about how their algorithms work. Because we cannot accept that decisions, that have a far-reaching impact on our democracy, are taken by computer programmes alone.
    • Right after von der Leyen addressed the unease she and others felt about the U.S. President’s freedom of expression being abridged because of a company’s rules outside of any controlling legal framework, she stated:
      • I want to invite our friends in the United States to join our initiatives. Together, we could create a digital economy rulebook that is valid worldwide: It goes from data protection and privacy to the security of critical infrastructure. A body of rules based on our values: Human rights and pluralism, inclusion and the protection of privacy. So Europe stands ready.
      • The challenges to our democracy, the pandemic, climate change – in his inauguration speech President Joe Biden so aptly spoke of a Cascade of Crises. And indeed, we face an outstanding set of challenges. But we can meet them – if we work together. That is what we all have to learn again after four long years. That it is not a sign of weakness, to reach out and help each other, but a signal of strength.
  • Consumer Reports tried to become an authorized agent under the “California Consumer Privacy Act” (CCPA) (AB 375) to make do-not-sell or opt-out requests on consumers’ behalf. The CCPA was designed to allow California residents to use services that would handle these preferences on a global scale. In its report on the pilot program, Consumer Reports concluded:
    • Unfortunately, too many companies have made it difficult, if not impossible, for agents and consumers to submit opt-out requests. The AG should enforce companies’ compliance with the law so that the authorized agent provisions work as intended. Moreover, the AG should promulgate additional common-sense rules to make sure that opt outs are simple and effective, even when submitted by an authorized agent.
    • Consumer Reports made these recommendations:
      • The AG should hold companies accountable when they violate the law. The AG needs to hold companies accountable for failure to comply with the CCPA’s authorized agent provisions. Without a viable authorized agent option, consumers could be left to navigate complicated processes or interfaces in order to exercise their California privacy rights themselves. Enforcement will help ensure that companies work harder to make sure that they have appropriate agent flows. The AG should also step in when customer service isn’t effective, and should consider directing enforcement resources to encourage better training in this area.
      • The AG should clarify that data shared for cross-context targeted advertising is a sale, and tighten the restrictions on service providers. Many companies have exploited ambiguities in the definition of sale and the rules surrounding service providers to ignore consumers’ requests to opt out of behavioral advertising. While the newly-passed California Privacy Rights Act will largely address these loopholes, these provisions will not go into effect until January 1, 2023. Thus, the AG should exercise its broad authority to issue rules to clarify that the transfer of data between unrelated companies for any commercial purpose falls under the definition of sale. Another common way for companies to avoid honoring consumers’ right to opt out of behavioral advertising is by claiming a service provider exemption. For example, the Interactive Advertising Bureau (IAB), a trade group that represents the ad tech industry, developed a framework for companies to evade the opt out by abusing a provision in the CCPA meant to permit a company to perform certain limited services on its behalf. To address this problem, the AG should clarify that companies cannot transfer data to service providers for behavioral advertising if the consumer has opted out of sale.
      • The AG should prohibit dark patterns as outlined in the Third Set of Proposed Modifications. We appreciate that the AG has proposed to “require minimal steps to allow the consumer to opt-out” and to prohibit dark patterns, “a method that is designed with the purpose or has the substantial effect of subverting or impairing a consumer’s choice to opt-out[,]” in the Third Set of Proposed Modifications to the CCPA Regulations. This proposal should be finalized as quickly as possible. This is essential, given the difficulties that authorized agents and consumers have experienced in attempting to stop the sale of their information, as demonstrated in the study.
      • The AG should require companies to notify agents when the opt-out request has been received and when it has been honored. Too often, the company provided no information on whether or not the opt-out request had been honored. While the CCPA rules require companies to notify consumers if an opt-out request has been rejected, there is no requirement to provide notice of receipt, or notice of confirmation—nor is there guidance on how to respond to opt-out requests when the company does not possess the consumer’s data. The authorized agent was, in some cases, unable to explain to the consumer whether or not the opt-out process had been completed. To ensure that the authorized agent service is effective, companies must be required to provide notification upon receipt and completion of the opt-out request. Required notification is also important for compliance purposes. For example, the regulations require companies to comply with opt outs within 15 business days. Without providing adequate notification, there’s no way to judge whether or not the company has honored the law and to hold them accountable if not. Further, if the company does sell consumers’ personal information, but does not have personal information about the consumer who is the subject of the request, the company should be required to notify the agent that the request has been received, and that the company will honor the opt out if and when they do collect the consumer’s data. In the case of an agent opt out, the notification should go to the agent. Otherwise, the consumer could end up getting emails from hundreds, if not thousands, of different companies. (A hypothetical sketch of such a request record follows this list.)
      • The AG should clarify that if an agent inadvertently submits a request incorrectly, the company should either accept it or inform the agent how to submit it appropriately. The regulations provide helpful guidance with respect to consumer access and deletion requests, which ensures that even if a consumer inadvertently submits a request incorrectly, there is a process in place to help them submit it properly. If a consumer submits a request in a manner that is not one of the designated methods of submission, or is deficient in some manner unrelated to the verification process, the business shall either: (1) Treat the request as if it had been submitted in accordance with the business’s designated manner, or (2) Provide the consumer with information on how to submit the request or remedy any deficiencies with the request, if applicable. The AG should clarify that this guidance applies to all authorized agent-submitted requests as well.
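
To make the recommended notification points concrete, here is a hypothetical sketch of how a company might track an agent-submitted opt-out request and its compliance deadline. The record fields and helper are assumptions for illustration; the CCPA regulations specify the 15-business-day deadline but not any particular data model.

```typescript
// Hypothetical tracking record for an opt-out request submitted by an
// authorized agent, with the two notification points Consumer Reports
// recommends: acknowledge receipt, then confirm completion or rejection.
type OptOutStatus = "received" | "completed" | "rejected";

interface OptOutRequest {
  agentId: string;          // the authorized agent who submitted the request
  consumerId: string;
  submittedAt: Date;
  status: OptOutStatus;
  notifiedAgentAt?: Date;   // receipt notification (recommended)
  resolvedAt?: Date;        // completion/rejection notification
}

// The CCPA regulations require compliance within 15 business days; this
// simple helper computes the deadline, skipping weekends (holidays ignored).
function complianceDeadline(submitted: Date, businessDays = 15): Date {
  const d = new Date(submitted);
  let remaining = businessDays;
  while (remaining > 0) {
    d.setDate(d.getDate() + 1);
    const day = d.getDay();
    if (day !== 0 && day !== 6) remaining--; // skip Sunday (0) and Saturday (6)
  }
  return d;
}
```
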
  • The Government Accountability Office (GAO) assessed the Department of Defense’s (DOD) efforts to transition to a more secure version of the Global Positioning System (GPS), an initiative that dates back to the administration of former President George W. Bush. The GAO stated that “due to the complexity of the technology, M-code remains years away from being widely fielded across DOD.” M-code-capable receiver equipment includes different components, and the development and manufacture of each is key to the modernization effort. These include:
    • special M-code application-specific integrated circuit chips,
    • special M-code receiver cards, being developed under the Air Force Military GPS User Equipment (MGUE) programs, and
    • the next generation of GPS receivers capable of using M-code signals from GPS satellites.
    • The GAO added:
      • DOD will need to integrate all of these components into different types of weapon systems… Integration across DOD will be a considerable effort involving hundreds of different weapon systems, including some with complex and unique integration needs or configurations.
    • The GAO further asserted:
      • The Air Force is almost finished—approximately one year behind schedule—developing and testing one M-code card for testing on the Marine Corps Joint Light Tactical Vehicle and the Army Stryker vehicle. However, one card intended for use in aircraft and ships is significantly delayed and missed key program deadlines. The Air Force is revising its schedule for testing this card.
      • The M-code card development delays have had ripple effects on GPS receiver modernization efforts and the weapon systems that intend to use them.
  • The advocate who brought the cases that brought down both the Safe Harbor and Privacy Shield agreements between the United States (U.S.) and European Union (EU) announced that Ireland’s Data Protection Commission (DPC) has agreed to finally decide on the legality of Facebook’s data transfers to the U.S. that gave rise to both lawsuits. The announcement came in a press release from none of your business (noyb), the organization Max Schrems founded. Last fall, noyb announced “[t]he Irish High Court has granted leave for a “Judicial Review” against the Irish DPC today…[and] [t]he legal action by noyb aims to swiftly implement the [Court of Justice of the European Union (CJEU)] Decision prohibiting Facebook’s” transfers of personal data from the European Union to the U.S. In September 2020, after the DPC directed Facebook to stop transferring the personal data of European Union citizens to the U.S., the company filed suit in Ireland’s courts to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge.
    • In explaining the most recent development, noyb further asserted:
      • The DPC has agreed with Max Schrems’ demand to swiftly end a 7.5 year battle over EU-US data transfers by Facebook and come to a decision on Facebook’s EU-US data flows. This only came after a Judicial Review against the DPC was filed by Mr Schrems. The case would have been heard by the Irish High Court today.
      • New “own volition” procedure blocked pending complaint from 2013. The Irish DPC oversees the European operations of Facebook. In Summer 2020 the European Court of Justice (CJEU) ruled on a complaint by Mr Schrems that had been pending since 2013 and came before the CJEU for the second time (“Schrems II”): Under the CJEU judgment the DPC must stop Facebook’s EU-US data flows over extreme US Surveillance Laws (like FISA 702). Instead of implementing this ruling, the DPC started a new “own volition” case and paused the original procedure for an indefinite time. Mr Schrems and Facebook brought two Judicial Review procedures against the DPC: While Facebook argued in December that the “own volition” procedure should not go ahead, Mr Schrems argued that his complaints procedure should be heard independently of the “own volition” case.
      • Walls are closing in on Facebook’s EU-US data transfers. The DPC has now settled the second Judicial Review with Mr Schrems just a day before the hearing was to take place, and pledged to finalize his complaints procedure swiftly.
      • As part of the settlement, Mr Schrems will also be heard in the “own volition” procedure and get access to all submissions made by Facebook, should the Court allow the “own volition” investigation to go ahead. Mr Schrems and the DPC further agreed that the case will be dealt with under the GDPR, not the Irish Data Protection Act that was applicable before 2018. The DPC may await the High Court judgement in Facebook’s Judicial Review before investigating the original complaint.
      • This agreement could in essence make the original complaints procedure from 2013 the case that ultimately determines the destiny of Facebook’s EU-US transfers in the wake of the Snowden disclosures. Under the GDPR the DPC has every liberty to issue fines of up to 4% of Facebook’s global turnover and transfer prohibitions, even on the basis of this individual case.
  • The Information Technology Industry Council (ITI), BSA | The Software Alliance, Internet Association, Computer and Communications Industry Association, and the National Foreign Trade Council made recommendations to the Biden Administration on technology policy and asserted in their press release:
    • Prioritize strategic engagement with U.S. trading partners by ensuring continued protected transatlantic data flows, establishing a U.S.-EU Trade & Technology Council, engaging China through prioritization of digital and technology issues, broadening U.S. engagement and leadership in the Asia-Pacific region, addressing key barriers to digital trade with India, and providing capacity building assistance to the African Union;
    • Promote U.S. competitiveness through leadership on digital trade by countering unilateral, targeted digital taxes, building acceptance of state-of-the-art digital trade commitments, promoting workforce development initiatives globally, and more; and
    • Reassert U.S. multilateral leadership by strengthening and leveraging engagement in global fora such as the WTO, OECD, United Nations, G20, G7, APEC, and others, and by expanding existing plurilateral trade agreements.
  • A group of civil rights organizations and public interest organizations issued “Civil Rights, Privacy, and Technology: Recommended 2021 Oversight Priorities for the 117th Congress” that builds upon the October 2020 Civil Rights Principles for the Era of Big Data. These groups stated:
    • The 117th Congress must take action to ensure that technology serves all people in the United States, rather than facilitating discrimination or reinforcing existing inequities.
    • They cited the following areas of policy that need to be addressed:
      • Broadband Internet
      • Democracy: Voting, the Census, and Hateful Content Online
      • Policing and Justice
      • Immigration Surveillance Technology
      • Commercial Data Practices and Privacy
      • Workers, Labor, and Hiring
  • The United Kingdom’s (UK) Information Commissioner Elizabeth Denham sketched out how she is approaching her final year in office in a blog post. Denham stated:
    • The ICO’s immediate focus remains supporting organisations through the impacts of COVID 19. We have prioritised providing advice and support on data protection related aspects of the pandemic since the start, and will continue to do so, adjusting and responding to the new challenges the country will face until, well, ‘all this is finished’. That work includes protecting people’s rights, and making sure data protection is considered at the earliest stage of any innovations.
    • The Age Appropriate Design Code will start to have a real impact, as the transition period around its introduction comes to an end, and we will be working hard to support organisations to make the necessary changes to comply with the law.
    • We’ll also be focused on supporting organisations around data sharing, following the publication of our guidance last month. The guidance is accompanied by practical resources to help organisations share data in line with the law. As I discussed with the House of Lords Public Services Committee this month, data sharing is an important area of focus, and we will also be supporting broader work to encourage the necessary culture change to remove obstacles to data sharing.
    • Other support for organisations planned for this year includes guidance on political campaigning, facial recognition, and codes of conduct and certification schemes, as well as a digital version of our Data Protection Practitioners’ Conference in April. We’ll also have the latest phases of our grants scheme and sandbox programme. Both are an effective way of the ICO supporting original thinking around privacy, illustrated by the innovative data sharing projects we’ve recently worked with.
    • Our operational work will also continue, including the latest phases of our work looking at data broking, the use of sexual crime victims’ personal information, and adtech, including audits focused on digital marketing platforms.

Coming Events

  • On 10 February, the House Homeland Security Committee will hold a hearing titled “Homeland Cybersecurity: Assessing Cyber Threats and Building Resilience” with these witnesses:
    • Mr. Chris Krebs, Former Director, Cybersecurity and Infrastructure Security Agency, U.S. Department of Homeland Security
    • Ms. Sue Gordon, Former Principal Deputy Director of National Intelligence, Office of the Director of National Intelligence
    • Mr. Michael Daniel, President & CEO, Cyber Threat Alliance
    • Mr. Dmitri Alperovitch, Executive Chairman, Silverado Policy Accelerator
  • The House Judiciary Committee’s Antitrust, Commercial, and Administrative Law Subcommittee will hold a hearing titled “Justice Restored: Ending Forced Arbitration and Protecting Fundamental Rights” on 11 February.
  • The Federal Communications Commission’s (FCC) acting Chair Jessica Rosenworcel will hold a virtual Roundtable on the Emergency Broadband Benefit Program on 12 February, “a new program that would enable eligible households to receive a discount on the cost of broadband service and certain connected devices during the COVID-19 pandemic.” The FCC also noted “[i]n the Consolidated Appropriations Act of 2021, Congress appropriated $3.2 billion” for the program.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.



Further Reading, Other Developments, and Coming Events (4 February 2021)

Further Reading

  • “Global Privacy Control wants to succeed where Do Not Track failed” By Russell Brandom — The Verge. A new effort to block the tracking of people across the internet and the selling of their information has launched: the Global Privacy Control (GPC). This initiative looks to leverage a provision currently effective in the “California Consumer Privacy Act” (CCPA) (AB 375), and also found in the recently enacted “California Privacy Rights Act” (CPRA) (aka Proposition 24), that requires covered entities to honor opt-out preferences expressed on a global basis. This browser add-on transmits the message to websites and other entities that the user does not want her data sold, which will have to be honored under California law (see the sketch after this list for how a site might read the signal). The piece cites a Tweet from outgoing California Attorney General Xavier Becerra (D) endorsing the notion generally. Of course, much remains to unfold on this front, but it may prove an easy, effective way for people to guard their privacy.
  • “A Former Comcast Employee Explains Why Low-Income WiFi Packages Aren’t Helping Students” By Caroline O’Donovan — BuzzFeed News. Comcast’s Internet Essentials seems insufficient for low-income families with multiple children needing to use videoconferencing for school. A group of students in Baltimore tried working with the company to increase the speed of this low-cost package, but the company did nothing more than offer to help the students doing the advocacy. Other stakeholders in government and other sectors also think Comcast’s efforts are not enough in the midst of a pandemic.
  • “Facebook Ad Services Let Anyone Target US Military Personnel” By Lily Hay Newman — WIRED. Researchers have turned up evidence that United States (U.S.) military personnel could easily be targeted with misinformation as part of attempts to radicalize them or run psychological operations on them. Facebook, naturally, denies there is any such capability in its targeted advertising system, and this new type of threat seems outside the scope of what most experts consider the main threats from social media.
  • “Nextdoor Is Quietly Replacing the Small-Town Paper” By Will Oremus — OneZero. There is another social media platform on which misinformation may be flourishing, perhaps at the cost of local media losing revenue. Nextdoor allows neighbors (but only those with mailing addresses, screening out the homeless) to share information, data, rumors, biases, paranoia, etc. And while the platform fences off each community (e.g., members of the Savannah, Georgia cohort cannot get access to the Jacksonville, Florida group), there is no seemingly effective mechanism to fight lies and misinformation. So it sounds much like the neighborhood WhatsApp group I’m on, where one gentleman is forever spamming everyone with anti-vaccine claims and news about how well Sweden was handling COVID-19 by doing nothing, at least until the government in Stockholm disavowed that approach. I find the WhatsApp group a breeding ground for racial and class biases, and a number of Nextdoor users are reporting the same. Moreover, the platform is competing with local media for some of the same advertisers, exacerbating the trend of reduced revenue for media since Facebook and Google came to dominate the advertising market.
  • “Google switches ad tracking tech ahead of Apple privacy update” By Rae Hodge — c/net. Google is taking a quieter path than Facebook in pushing back against the forthcoming change to Apple’s iOS that will prompt iPhone users to agree to letting apps track them (i.e., the App Tracking Transparency (ATT) policy). Google is switching from the IDFA to another Apple tool, SKAdNetwork, which is considered not as good as the IDFA.
  • “Facebook strikes back against Apple privacy change, prompts users to accept tracking to get ‘better ads experience’” By Salvador Rodriguez — CNBC. Speaking of Apple’s pending change, Facebook seems to be moving preemptively to start offering iPhone and iPad users a choice on letting the social media giant use their information to show them personalized ads. The Facebook popup will appear before Apple’s popup. We should probably expect an Apple countermove soon.
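
As referenced in the first item above, the GPC specification delivers the preference as an HTTP request header, “Sec-GPC: 1”, and exposes it to scripts as navigator.globalPrivacyControl. Below is a minimal sketch of how a site subject to the CCPA might detect the header server-side; the Express setup and the doNotSell flag are illustrative assumptions, not part of the specification or of California law.

```typescript
import express from "express";

const app = express();

// Detect the Global Privacy Control signal on every request. Under the
// GPC proposal, a participating browser or extension sends "Sec-GPC: 1".
app.use((req, res, next) => {
  if (req.header("Sec-GPC") === "1") {
    // Treat the request as a do-not-sell preference for this user/session;
    // downstream handlers can consult this flag before sharing any data.
    res.locals.doNotSell = true;
  }
  next();
});

app.get("/", (req, res) => {
  res.send(res.locals.doNotSell
    ? "GPC signal received; sale of personal data opted out."
    : "No GPC signal on this request.");
});

app.listen(3000);
```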

Other Developments

  • The Biden White House issued a “Memorandum on Restoring Trust in Government Through Scientific Integrity and Evidence-Based Policymaking” that will change how the United States (U.S.) government uses and deploys data and evidence. It directs a range of actions for agencies inside the White House and across the Administration to neutralize and remove procedures put in place during the Trump Administration that disregarded science.
    • In relevant part, the memorandum says:
      • Scientific findings should never be distorted or influenced by political considerations.  When scientific or technological information is considered in policy decisions, it should be subjected to well-established scientific processes, including peer review where feasible and appropriate, with appropriate protections for privacy.  Improper political interference in the work of Federal scientists or other scientists who support the work of the Federal Government and in the communication of scientific facts undermines the welfare of the Nation, contributes to systemic inequities and injustices, and violates the trust that the public places in government to best serve its collective interests.
  • The Facebook Oversight Board issued its first decisions, overturning Facebook in four of the five cases. Facebook has committed itself to being bound by these decisions. The panel also made “nine policy recommendations to the company” in the decisions. The Oversight Board explained:
    • Facebook now has seven days to restore content in line with the Board’s decisions. The company will also examine whether identical content with parallel context associated with the Board’s decisions should remain on its platform. In addition, Facebook must publicly respond to any policy recommendations the Board has made in its decisions within 30 days.
    • The Oversight Board made the following decisions:
      • Overturned Facebook’s decision on case 2020-002-FB-UA to remove a post under its Community Standard on Hate Speech. The post commented on the supposed lack of reaction to the treatment of Uyghur Muslims in China, compared to the violent reaction to cartoons in France. Click here for more information.
      • Upheld Facebook’s decision on case 2020-003-FB-UA to remove a post under its Community Standard on Hate Speech. The post used the Russian word “тазики” (“taziks”) to describe Azerbaijanis, who the user claimed have no history compared to Armenians. Click here for more information.
      • Overturned Facebook’s original decision on case 2020-004-IG-UA to remove a post under its Community Standard on Adult Nudity and Sexual Activity. The post included photos of breast cancer symptoms which, in some cases, showed uncovered female nipples. Click here for more information.
      • Overturned Facebook’s decision on case 2020-005-FB-UA to remove a post under its Community Standard on Dangerous Individuals and Organizations. The post included an alleged quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany. Click here for more information.
      • Overturned Facebook’s decision on case 2020-006-FB-FBR to remove a post under its Community Standard on Violence and Incitement. The post criticized the lack of a health strategy in France and included claims that a cure for COVID-19 exists. Click here for more information.
  • The House Armed Services Committee announced the creation of a new cyber-focused subcommittee that will split off from the existing Intelligence and Emerging Threats and Capabilities Subcommittee. The former chair of that subcommittee, Representative James Langevin (D-RI), will chair the Cyber, Innovative Technologies, and Information Systems (CITI) Subcommittee with jurisdiction over the following:
    • Cyber Security, Operations, and Forces
    • Information Technology, Systems, and Operations
    • Science and Technology Programs and Policy
    • Defense-Wide Research and Development (except Missile Defense and Space)
    • Artificial Intelligence Policy and Programs
    • Electromagnetic Spectrum Policy
    • Electronic Warfare Policy
    • Computer Software Acquisition Policy
    • Now the House Armed Services Committee will match the Senate Armed Services Committee, which has a Cybersecurity Subcommittee established when the late Senator John McCain (R-AZ) chaired the full committee.
  • The European Union Agency for Cybersecurity (ENISA) published a report on pseudonymisation for personal data protection, “Data Pseudonymisation: Advanced Techniques and Use Cases,” which provides a technical analysis of cybersecurity measures in personal data protection and privacy (a brief sketch of one common pseudonymisation technique follows this item). ENISA stated:
    • As there is no one-size-fits-all pseudonymisation technique, a high level of competence is needed to reduce threats and maintain efficiency in processing pseudonymised data across different scenarios. The ENISA report aims to support data controllers and processors in implementing pseudonymisation by providing possible techniques and use cases that could fit different scenarios.
    • The report underlines the need to take steps that include the following:
      • Each case of personal data processing needs to be analysed to determine the most suitable technical option in relation to pseudonymisation;
      • An in-depth look into the context of personal data processing before data pseudonymisation is applied;
      • Continuous analysis of state-of-the-art in the field of data pseudonymisation, as new research and business models break new ground;
      • Developing advanced pseudonymisation scenarios for more complex cases, for example when the risks of personal data processing are deemed to be high;
      • Further discussion on the broader adoption of data pseudonymisation at EU and Member States levels alike.
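
For a flavor of what pseudonymisation looks like in practice, here is a minimal sketch of one widely used technique, keyed hashing (HMAC) of an identifier. This is a generic illustration of the approach rather than code from the ENISA report, and the key handling is deliberately simplified.

```typescript
import { createHmac, randomBytes } from "crypto";

// Pseudonymise an identifier with HMAC-SHA256. Unlike plain hashing,
// re-identification or linkage requires the secret key, which the data
// controller must store separately from the pseudonymised dataset.
function pseudonymise(identifier: string, secretKey: Buffer): string {
  return createHmac("sha256", secretKey).update(identifier).digest("hex");
}

// Illustrative key; in practice it would come from a managed secret store.
const key = randomBytes(32);

// The same key maps the same identifier to the same pseudonym, preserving
// linkability across records without exposing the underlying identity.
console.log(pseudonymise("user@example.com", key));
console.log(pseudonymise("user@example.com", key)); // identical output
```
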
  • The United States (U.S.) Chamber of Commerce’s Center for Capital Markets Competitiveness (CCMC) released a new report, “Digital Assets: A Framework for Regulation to Maintain the United States’ Status as an Innovation Leader,” “providing recommendations to help guide policymakers in developing a more closely coordinated response to the regulation of digital assets.” In its press release, the CCMC explained the “report has a focus on financial services regulatory systems due to their significant impact on digital assets and related blockchain innovation, and outlines several recommendations for promoting innovation in the digital assets space, including:
    • Implement technology-neutral regulation
    • Implement principles-based regulation
    • Avoid regulation by enforcement
    • Ensure good faith compliance
    • Establish regulatory flexibility
    • Create digital asset categorization
    • Establish a White House Task Force focused on digital assets
  • The Australian Securities and Investments Commission (ASIC) revealed that “an unidentified threat actor accessed an ASIC server containing attachments to Australian credit licence applications submitted to ASIC between 1 July 2020 and 28 December 2020.” ASIC added:
    • The cyber incident occurred due to a vulnerability in a file transfer appliance (FTA) provided by California-based Accellion and used by ASIC to receive attachments to Australian credit licence applications.
    • ASIC has determined that the credit licence application forms held within the server were not accessed. Analysis by ASIC’s independent forensic investigators shows no evidence that attachments were opened or downloaded.
    • However, the filenames of attachments for credit licence applications that were submitted to ASIC between 1 July 2020 and 28 December 2020 may have been viewed by the threat actor. For example, the credit licence applicant’s name or the name of an individual responsible manager, if these were used in the filename of the attachment (e.g. police check, CV) may have been viewed by the threat actor.
  • In a blog posting, the United Kingdom’s (UK) Information Commissioner’s Office (ICO) weighed in on “the recently agreed UK and EU Trade and Cooperation Agreement (TCA).” Information Commissioner Elizabeth Denham explained her view on data protection in the UK during the period when data transfers to the UK will be treated as if the European Union (EU) had granted an adequacy decision on UK law:
    • High standards and co-operation 
      • I must begin by welcoming the commitment by both the EU and UK to ensuring a high level of personal data protection, and to working together to promote high international standards.
      • As envisaged by the TCA, I look forward to developing a new regulatory relationship with European data protection authorities, sharing ideas and data protection expertise and co-operating on enforcement actions where appropriate. As evidenced by our work globally, regulatory cooperation remains key to ensuring we can protect the public’s personal data wherever it resides. The ICO will also continue to develop its international strategy.
    • Data flows: short term bridging provisions and adequacy
      • The TCA contains an important safety net, allowing transfers of data from the EU to UK to continue without restriction for four months whilst the EU considers the UK’s application for adequacy. This is the usual mechanism used by the EU to allow for continued data flow with third countries. This is very welcome news and was the best possible outcome for UK organisations given the risks and impacts of no adequacy. This bridge contained within the TCA will provide a legally robust mechanism that can give UK organisations confidence to continue digital trade in the coming months.
      • The EU has committed (in a Declaration alongside the TCA) to consider promptly the UK’s adequacy application. The Government is taking the lead on that process, with the ICO providing independent regulatory advice when appropriate. We’ll publish more details in due course as the outcome of the adequacy process becomes clear.
      • Whilst we wait for an adequacy decision, for the bridge to continue any new UK adequacy regulations, standard contractual clauses or ICO approvals of international transfer mechanisms, must be put before the TCA’s oversight mechanisms.
    • Data flows: keeping us safe
      • Our police and other law enforcement authorities, in the UK and EU, rely on sharing information with each other to prevent, investigate and prosecute crimes, and ultimately to keep us all safe.
      • Part three of the TCA sets out detailed provisions allowing data sharing for law enforcement. It includes arrangements for the transfer of DNA data, fingerprints, vehicle registrations and Passenger Name Record (PNR) data. It also allows for the UK to access data from EUROPOL and EUROJUST. Part three also contains important commitments to key elements of data protection and for the ICO to be consulted about data protection assessments related to PNR data.
      • I welcome the provisions in the TCA which bake-in the importance of high standards of data protection and international data flows for UK citizens and for the UK economy – they keep us safe, they support our economy, they keep us connected. In our ever-innovating, inter-connected world, my role is to make sure that data flows continue, and continue to protect UK citizens, so they can continue to enjoy digital services underpinned by a seamless flow of data.

Coming Events

  • The Federal Communications Commission’s (FCC) acting Chair Jessica Rosenworcel will hold a virtual Roundtable on the Emergency Broadband Benefit Program on 12 February, “a new program that would enable eligible households to receive a discount on the cost of broadband service and certain connected devices during the COVID-19 pandemic.” The FCC also noted “[i]n the Consolidated Appropriations Act of 2021, Congress appropriated $3.2 billion” for the program.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by michelmondadori from Pixabay

Further Reading, Other Developments, and Coming Events (2 February 2021)

Further Reading

  • “I checked Apple’s new privacy ‘nutrition labels.’ Many were false.” By Geoffrey Fowler — The Washington Post. It turns out the blue check mark in Apple’s App Store signifying that an app does not collect personal data is based on the honor system. As the Post’s technology columnist learned, Apple tells users this in very small print: “This information has not been verified by Apple.” And so, as Fowler explains, this would seem contrary to the company’s claims of making user privacy a core value. Also, Apple’s definition of tracking is narrow, suggesting the company may be defining its way to being a champion of privacy. Finally, Apple’s own practices, in light of the coming iOS changes meant to defeat Facebook’s and others’ tracking of people across digital space, seem to belie the company’s PR and branding. It would seem the Federal Trade Commission (FTC) and its overseas counterparts would be interested in such deceptive and unfair practices.
  • “Lawmakers Take Aim at Insidious Digital ‘Dark Patterns’” By Tom Simonite — WIRED. Language in the “California Privacy Rights Act” (CPRA) makes consent gained through the use of “dark patterns” (i.e., all those cognitive tricks online and real-life entities use to slant the playing field against consumers) invalid. However, lest one celebrate that policymakers are addressing these underhanded means of gaining consent or selling things, the to-be-established California Privacy Protection Agency will need to define what dark patterns are and write the regulations barring whatever those turn out to be. In Washington state, the sponsors of the Washington Privacy Act (SB 5062) copied the CPRA language, setting up the possibility Washington state could follow California. It remains to be seen how, or even if, federal privacy legislation proposals deal with dark patterns. And they well may, considering that Senators Mark Warner (D-VA) and Deb Fischer (R-NE) introduced the “Deceptive Experiences To Online Users Reduction (DETOUR) Act” (S.1084) in 2019. Moreover, again, as in the previous article, one might think the Federal Trade Commission (FTC) and its overseas counterparts might be interested in policing dark patterns.
  • “A PR screwup draws unwanted attention to Google’s Saudi data centers” By Issie Lapowsky — Protocol. The best-case scenario is that Google and Snap misstated what cloud infrastructure and content are in the Kingdom of Saudi Arabia, and in this case, privacy and civil liberties groups are unfairly pouncing on the companies over essentially garbling the truth. On the other hand, it may turn out that the companies are routing traffic and content through the repressive regime, allowing a government with an abysmal human rights record to access people’s data. Time may tell what is actually happening, but the two companies are furiously telling the world that there is nothing to see here.
  • “China’s Leader Attacks His Greatest Threat” By John Pomfret — The Atlantic. Xi Jinping, President of the People’s Republic of China (PRC) and Chairman of the Chinese Communist Party (CCP), has accelerated a crackdown on entrepreneurs and technology companies started by his predecessors. This would ultimately impair the PRC’s ambitions of becoming the world’s dominant power through technological superiority.
  • “Why Is Big Tech Policing Speech? Because the Government Isn’t” By Emily Bazelon — The New York Times. The First Amendment to the United States (U.S.) Constitution is invariably cited in the online speech debate both as a reason why people cannot be silenced and as a reason why social media platforms can silence whom they like. This is an interesting survey of this right in the U.S. and how democracies in Europe have a different understanding of permissible speech.

Other Developments

  • In a recent press conference, White House Press Secretary Jen Psaki shed light on how the Biden Administration will change United States (U.S.) policy towards the People’s Republic of China (PRC). In response to a question about how the U.S. government will deal with TikTok and the PRC generally, Psaki stated:
    • I think our approach to China remains what it has been since — for the last months, if not longer.  We’re in a serious competition with China.  Strategic competition with China is a defining feature of the 21st century.  China is engaged in conduct that hurts American workers, blunts our technological edge, and threatens our alliances and our influence in international organizations.
    • What we’ve seen over the last few years is that China is growing more authoritarian at home and more assertive abroad.  And Beijing is now challenging our security, prosperity, and values in significant ways that require a new U.S. approach. 
    • And this is one of the reasons, as we were talking about a little bit earlier, that we want to approach this with some strategic patience, and we want to conduct reviews internally, through our interagency….We wanted to engage more with Republicans and Democrats in Congress to discuss the path forward.  And most importantly, we want to discuss this with our allies. 
    • We believe that this moment requires a strategic and a new approach forward.
    • [T]echnology, as I just noted, is, of course, at the center of the U.S.-China competition.  China has been willing to do whatever it takes to gain a technological advantage — stealing intellectual property, engaging in industrial espionage, and forcing technology transfer.
    • Our view — the President’s view is we need to play a better defense, which must include holding China accountable for its unfair and illegal practices and making sure that American technologies aren’t facilitating China’s military buildup.
    • So he’s firmly committed to making sure that Chinese companies cannot misappropriate and misuse American data.  And we need a comprehensive strategy, as I’ve said, and a more systematic approach that actually addresses the full range of these issues.
    • So there is, again, an ongoing review of a range of these issues.  We want to look at them carefully, and we’ll be committed to approaching them through the lens of ensuring we’re protecting U.S. data and America’s technological edge. 
  • The top Republican on the House Foreign Affairs Committee is calling on Senate Republicans to block Governor Gina Raimondo’s nomination to be the Secretary of Commerce until the White House indicates whether it will keep Huawei on a list of entities to which the United States (U.S.) restricts exports. Ranking Member Michael McCaul (R-TX) asserted:
    • It is incredibly alarming the Biden Administration has refused to commit to keeping Huawei on the Department of Commerce’s Entity List. Huawei is not a normal telecommunications company – it is a CCP military company that threatens 5G security in our country, steals U.S. intellectual property, and supports the Chinese Communist Party’s genocide in Xinjiang and their human rights abuses across the country. We need a Commerce Department with strong national security credentials and a Secretary with a clear understanding of the CCP threat. Saying people should not use Huawei and actually keeping them on the Entity List are two very different things that result in very different outcomes. I again strongly urge the Biden Administration to reconsider this dangerous position. Until they make their intentions clear on whether they will keep Huawei on the Entity List, I urge my Senate colleagues to hold Ms. Raimondo’s confirmation.
    • McCaul added this background:
      • After the Biden Administration’s nominee for Commerce Secretary, Gina Raimondo, caused heads to turn by refusing to commit to keeping Huawei on the Entity List, White House Press Secretary Jen Psaki seemed to double down by declining on two separate occasions when directly asked to say where President Biden stood on the issue.
      • Huawei was placed on the Commerce Department’s Entity List in August of 2019. Their addition to the Entity List was also one of the recommendations of the [House Republican’s] China Task Force Report.
  • The National Highway Traffic Safety Administration (NHTSA), an agency of the United States (U.S.) Department of Transportation (DOT), is asking for comment “on the Agency’s updated draft cybersecurity best practices document titled Cybersecurity Best Practices for the Safety of Modern Vehicles” according to the notice published in the Federal Register. Comments are due by 15 March 2021. NHTSA explained:
    • In October 2016, NHTSA issued its first best practices document focusing on the cybersecurity of motor vehicles and motor vehicle equipment. Cybersecurity Best Practices for Modern Vehicles (“2016 Best Practices”) was the culmination of years of extensive engagement with public and private stakeholders and NHTSA research on vehicle cybersecurity and methods of enhancing vehicle cybersecurity industry-wide. As explained in the accompanying Federal Register document, NHTSA’s 2016 Best Practices was released with the goal of supporting industry-led efforts to improve the industry’s cybersecurity posture and provide the Agency’s views on how the automotive industry could develop and apply sound risk-based cybersecurity management processes during the vehicle’s entire lifecycle.
    • The 2016 Best Practices leveraged existing automotive domain research as well as non-automotive and IT-focused standards such as the National Institute of Standards and Technology (NIST) Cybersecurity Framework and the Center for Internet Security’s Critical Security Controls framework. NHTSA considered these sources to be reasonably applicable and appropriate to augment the limited industry-specific guidance that was available at the time. At publication, NHTSA noted that the 2016 Best Practices were intended to be updated with new information, research, and other cybersecurity best practices related to the automotive industry. NHTSA invited comments from stakeholders and interested parties in response to the document.
    • NHTSA is docketing a draft update to the agency’s 2016 Best Practices, titled Cybersecurity Best Practices for the Safety of Modern Vehicles (2020 Best Practices) for public comments. This update builds upon agency research and industry progress since 2016, including emerging voluntary industry standards, such as the ISO/SAE Draft International Standard (DIS) 21434, “Road Vehicles—Cybersecurity Engineering.” In addition, the draft update references a series of industry best practice guides developed by the Auto-ISAC through its members.
    • The 2020 Best Practices also reflect findings from NHTSA’s continued research in motor vehicle cybersecurity, including over-the-air updates, encryption methods, and building our capability in cybersecurity penetration testing and diagnostics, and the new learnings obtained through researcher and stakeholder engagement. Finally, the updates included in the 2020 Best Practices incorporate insights gained from public comments received in response to the 2016 guidance and from information obtained during the annual SAE/NHTSA Vehicle Cybersecurity Workshops.
  • Ireland’s Data Protection Commission (DPC) has released a draft Fundamentals for a Child-Oriented Approach to Data Processing Draft Version for Consultation (Fundamentals) for consultation until 31 March 2021. The DPC asserted the
    • Fundamentals have been drawn up by the Data Protection Commission (DPC) to drive improvements in standards of data processing. They introduce child-specific data protection interpretative principles and recommended measures that will enhance the level of protection afforded to children against the data processing risks posed to them by their use of/ access to services in both an online and offline world. In tandem, the Fundamentals will assist organisations that process children’s data by clarifying the principles, arising from the high-level obligations under the GDPR, to which the DPC expects such organisations to adhere.
    • The DPC “identified the following 14 Fundamentals that organisations should follow to enhance protections for children in the processing of their personal data:
      • 1. FLOOR OF PROTECTION: Online service providers should provide a “floor” of protection for all users, unless they take a risk-based approach to verifying the age of their users so that the protections set out in these Fundamentals are applied to all processing of children’s data (Section 1.4 “Complying with the Fundamentals”).
      • 2. CLEAR-CUT CONSENT: When a child has given consent for their data to be processed, that consent must be freely given, specific, informed and unambiguous, made by way of a clear statement or affirmative action (Section 2.4 “Legal bases for processing children’s data”).
      • 3. ZERO INTERFERENCE: Online service providers processing children’s data should ensure that the pursuit of legitimate interests do not interfere with, conflict with or negatively impact, at any level, the best interests of the child (Section 2.4 “Legal bases for processing children’s data”).
      • 4. KNOW YOUR AUDIENCE: Online service providers should take steps to identify their users and ensure that services directed at/ intended for or likely to be accessed by children have child-specific data protection measures in place (Section 3.1 “Knowing your audience”)
      • 5. INFORMATION IN EVERY INSTANCE: Children are entitled to receive information about the processing of their own personal data irrespective of the legal basis relied on and even if consent was given by a parent on their behalf to the processing of their personal data (Section 3 “Transparency and children”).
      • 6. CHILD-ORIENTED TRANSPARENCY: Privacy information about how personal data is used must be provided in a concise, transparent, intelligible and accessible way, using clear and plain language that is comprehensible and suited to the age of the child (Section 3 “Transparency and children”).
      • 7. LET CHILDREN HAVE THEIR SAY: Online service providers shouldn’t forget that children are data subjects in their own right and have rights in relation to their personal data at any age. The DPC considers that a child may exercise these rights at any time, as long as they have the capacity to do so and it is in their best interests. (Section 4.1 “The position of children as rights holders”)
      • 8. CONSENT DOESN’T CHANGE CHILDHOOD: Consent obtained from children or from the guardians/ parents should not be used as a justification to treat children of all ages as if they were adults (Section 5.1 “Age of digital consent”).
      • 9. YOUR PLATFORM, YOUR RESPONSIBILITY: Companies who derive revenue from providing or selling services through digital and online technologies pose particular risks to the rights and freedoms of children. Where such a company uses age verification and/ or relies on parental consent for processing, the DPC will expect it to go the extra mile in proving that its measures around age verification and verification of parental consent are effective. (Section 5.2 “Verification of parental consent”)
      • 10. DON’T SHUT OUT CHILD USERS OR DOWNGRADE THEIR EXPERIENCE: If your service is directed at, intended for, or likely to be accessed by children, you can’t bypass your obligations simply by shutting them out or depriving them of a rich service experience. (Section 5.4 “Age verification and the child’s user experience”)
      • 11. MINIMUM USER AGES AREN’T AN EXCUSE: Theoretical user age thresholds for accessing services don’t displace the obligations of organisations to comply with the controller obligations under the GDPR and the standards and expectations set out in these Fundamentals where “underage” users are concerned. (Section 5.5 “Minimum user ages”)
      • 12. PROHIBITION ON PROFILING: Online service providers should not profile children and/ or carry out automated decision making in relation to children, or otherwise use their personal data, for marketing/advertising purposes due to their particular vulnerability and susceptibility to behavioural advertising, unless they can clearly demonstrate how and why it is in the best interests of the child to do so (Section 6.2 “Profiling and automated decision making”).
      • 13. DO A DPIA: Online service providers should undertake data protection impact assessments to minimise the data protection risks of their services, and in particular the specific risks to children which arise from the processing of their personal data. The principle of the best interests of the child must be a key criterion in any DPIA and must prevail over the commercial interests of an organisation in the event of a conflict between the two sets of interests (Section 7.1 “Data Protection Impact Assessments”).
      • 14. BAKE IT IN: Online service providers that routinely process children’s personal data should, by design and by default, have a consistently high level of data protection which is “baked in” across their services (Section 7.2 “Data Protection by Design and Default”)
  • The United Kingdom’s (UK) Competition and Markets Authority (CMA) “is now seeking evidence from academics and industry experts on the potential harms to competition and consumers caused by the deliberate or unintended misuse of algorithms…[and] is also looking for intelligence on specific issues with particular firms that the CMA could examine and consider for future action.” CMA stated “[t]he research and feedback will inform the CMA’s future work in digital markets, including its programme on analysing algorithms and the operation of the new Digital Markets Unit (DMU), and the brand-new regulatory regime that the DMU will oversee.” The CMA stated:
    • Algorithms can be used to personalise services in ways that are difficult to detect, leading to search results that can be manipulated to reduce choice or artificially change consumers’ perceptions. An example of this is misleading messages which suggest a product is in short supply.
    • Companies can also use algorithms to change the way they rank products on websites, preferencing their own products and excluding competitors. More complex algorithms could aid collusion between businesses without firms directly sharing information. This could lead to sustained higher prices for products and services.
    • The majority of algorithms used by private firms online are currently subject to little or no regulatory oversight and the research concludes that more monitoring and action is required by regulators, including the CMA. The CMA has already considered the impact of algorithms on competition and consumers in previous investigations, for example monitoring the pricing practices of online travel agents.
    • In the algorithms paper, the CMA explained:
      • The publication of this paper, and the accompanying call for information mark the launch of a new CMA programme of work on analysing algorithms, which aims to develop our knowledge and help us better identify and address harms. This paper reviews the potential harms to competition and consumers from the use of algorithms, focussing on those the CMA or other national competition or consumer authorities may be best placed to address.
      • We first describe direct harms to consumers, many of which involve personalisation. Personalisation can be harmful because it is difficult to detect either by consumers or others, targets vulnerable consumers or has unfair distributive effects. These harms often occur through the manipulation of consumer choices, without the awareness of the consumer.
      • The paper then explores how the use of algorithms can exclude competitors and so reduce competition (for example, a platform preferencing its own products). We outline the most recent developments in the algorithmic collusion literature; collusion appears an increasingly significant risk if the use of more complex pricing algorithms becomes widespread. We also describe how using ineffective algorithms to oversee platform activity fails to prevent harm.
      • Next, we summarise techniques that could be used to analyse algorithmic systems. Potentially problematic systems can be identified even without access to underlying algorithms and data. However, to understand fully how an algorithmic system works and whether consumer or competition law is being breached, regulators need appropriate methods to audit the system. We finally discuss the role of regulators. Regulators can help to set standards and facilitate better accountability of algorithmic systems, including support for the development of ethical approaches, guidelines, tools and principles. They can also use their information gathering powers to identify and remedy harms on either a case-by-case basis or as part of an ex-ante regime overseen by a regulator of technology firms, such as the proposed Digital Markets Unit (DMU) in the UK.
  • The National Institute of Standards and Technology (NIST) is making available for comment a draft of NIST Special Publication (SP) 800-47 Revision 1, Managing the Security of Information Exchanges, that “provides guidance on identifying information exchanges; risk-based considerations for protecting exchanged information before, during, and after the exchange; and example agreements for managing the protection of the exchanged information.” NIST is accepting comments through 12 March 2021. The agency stated:
    • Rather than focus on any particular type of technology-based connection or information access, this draft publication has been updated to define the scope of information exchange, describe the benefits of securely managing the information exchange, identify types of information exchanges, discuss potential security risks associated with information exchange, and detail a four-phase methodology to securely manage information exchange between systems and organizations. Organizations are expected to further tailor the guidance to meet specific organizational needs and requirements.
    • NIST is specifically interested in feedback on:
      • Whether the agreements addressed in the draft publication represent a comprehensive set of agreements needed to manage the security of information exchange.
      • Whether the matrix provided to determine what types of agreements are needed is helpful in determining appropriate agreement types.
      • Whether additional agreement types are needed, as well as examples of additional agreements.
      • Additional resources to help manage the security of information exchange.

Coming Events

  • On 3 February, the Senate Commerce, Science, and Transportation Committee will consider the nomination of Rhode Island Governor Gina Raimondo to be the Secretary of Commerce.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by John Howard from Pixabay

Don’t Look Now; Second State After CA On The Verge Of Enacting Privacy Legislation…and It Isn’t WA

Acting on bills introduced in January, the VA House and Senate have passed identical privacy bills.

There are times when it seems there are far too many technology policy developments to stay on top of, and that is just in the United States (U.S.). And while I have written at some length about the Washington legislature making yet another run at enacting privacy legislation for the third straight year, I apparently should have been paying attention to the state of my residence, Virginia. Last month, the legislature started working on privacy bills, and over the last week or so, both chambers have passed bills with identical text, meaning enactment is all but assured. And so, as this would be only the second comprehensive privacy regime passed by a state, and with no sign that 2021 is the year Congress and the White House agree on federal legislation, this may be the most significant development on privacy this year.

In mid-January, the “Consumer Data Protection Act” (SB 1392/HB 2307) was introduced and quickly made its way through both chambers of the Virginia legislature. In the last week, identical bills were passed by the Senate and the House of Delegates with only the formality remaining of reconciling the two bills before it is sent to the Governor. If it is enacted, as seems very likely, the bill becomes effective on 1 January 2023, giving entities covered by the bill just shy of two full years to prepare.

Big picture, this bill is one of the weaker privacy bills within sight of enactment. It would permit many of the same data collection and processing activities currently occurring in Virginia to continue largely in the same fashion in 2023. The bill uses the opt-out consent model, but only in limited circumstances: so long as entities disclose how they propose to process personal information, there are only limited cases in which people could opt out. There is no private right of action, and the attorney general would have to give entities a 30-day window to cure any potential violations and would be barred from proceeding if his office receives an express written statement that the violations have been cured. Given how much weaker this bill is than others, it is little wonder it is sailing through the Virginia legislature.

Those entities subject to the act are (a rough sketch of this applicability test follows the list):

  • An entity that controls or processes the personal data of 100,000 or more residents; or
  • An entity that controls or processes the personal data of 25,000 or more residents and earned more than 50% of its gross revenue in the previous year from selling personal data
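
To make the thresholds concrete, here is a minimal sketch, in Python, of how that applicability test might be expressed. The function name, parameters, and structure are my own illustration, not the bill’s language:

    # A minimal sketch of the bill's two applicability thresholds.
    # Names and structure are illustrative, not statutory.
    def covered_by_the_act(residents_processed: int,
                           revenue_share_from_data_sales: float) -> bool:
        if residents_processed >= 100_000:
            return True
        return (residents_processed >= 25_000
                and revenue_share_from_data_sales > 0.50)

    # A broker processing 30,000 residents' data that earns 60% of its
    # revenue from selling personal data would be covered:
    print(covered_by_the_act(30_000, 0.60))  # True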

However, the bill has the carveouts characteristic of a number of privacy bills introduced over the last few years, including exemptions for entities and data covered by some of the following federal privacy regimes, among others:

  • Health Insurance Portability and Accountability Act of 1996 (HIPAA)/Health Information Technology for Economic and Clinical Health (HITECH) Act
  • Financial Services Modernization Act of 1999 (aka Gramm-Leach-Bliley)
  • Fair Credit Reporting Act (FCRA)
  • Family Educational Rights and Privacy Act (FERPA)
  • Children’s Online Privacy Protection Act (COPPA)

A key difference between this bill and others with similar language is that an entity merely needs to be covered by one of these laws and not necessarily compliant. Most other privacy bills require compliance with these and other federal regimes in order to be exempted.

The Consumer Data Protection Act uses the same terminology as the European Union’s (EU) General Data Protection Regulation (GDPR) regarding entities that determine how personal data will be processed and those that do the processing: controllers and processors respectively.

A number of definitions are crucial in the bill. Personal data excludes publicly available data and de-identified data, the latter of which creates a safe harbor incentive for entities to de-identify the personal data they collect, maintain, and process, since many of the new obligations the bill imposes pertain only to personal data. The definition of personal data is fairly broad, as it includes “any information that is linked or reasonably linkable to an identified or identifiable natural person.” There is a subset of these data subject to more stringent protection: sensitive data, which includes:

  • Personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status;
  • The processing of genetic or biometric data for the purpose of uniquely identifying a natural person;
  • The personal data collected from a known child; or
  • Precise geolocation data.

The definition of “sale of personal data” may be so narrow that some common practices in the data world would fall outside it, and this matters because people are given the right to opt out of the sale of data and not necessarily the sharing of their personal data. Companies like Facebook have gone before Congress and stated they do not sell the personal data of their users, and this seems to be accurate. Instead, they trade and share personal data, activities which would seem to fall outside the definition in this bill, which involves “the exchange of personal data for monetary consideration by the controller to a third party.” Had it been just “consideration,” then activities like Facebook’s would have been subject to the limitation people can use to opt out. On the other hand, a fair reading of monetary consideration would seem to be cash or its equivalent, and it is arguable whether a controller trading personal data with a third party qualifies. This may get sorted out by a Virginia court.
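
A minimal sketch, assuming my reading of the definition is right, makes the narrowness plain; the function and its parameters are hypothetical illustrations, not statutory text:

    # "Sale of personal data" requires an exchange for monetary
    # consideration; this reflects one reading of the bill, not settled law.
    def is_sale(exchanged_with_third_party: bool,
                monetary_consideration: bool) -> bool:
        return exchanged_with_third_party and monetary_consideration

    print(is_sale(True, True))   # True: personal data sold for cash
    print(is_sale(True, False))  # False: data traded or shared for
                                 # non-monetary value, Facebook-style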

There are the by-now-expected exceptions to the strictures against collecting and processing data without the consent of residents, some of which controllers and processors may bend out of shape.

The Consumer Data Protection Act would create the same sorts of rights for people that other privacy bills would. And as with almost all the other privacy bills, a person could submit a request to a controller that must be responded to within 45 days, which is not to say that action must occur within that timeframe. If the request is complex or there is some other reason why 45 days is not enough, the controller may alert the person and then take another 45 days. If the controller denies the request, the person may use the appeal system each controller must have, and if they are still denied, they may file a complaint with the state attorney general’s office.
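
Assuming the two periods run in calendar days, the timing works out as in this minimal sketch; the function and dates are illustrative only:

    # 45 days to respond, extendable once by another 45 days for
    # complex requests (the controller must notify the person).
    from datetime import date, timedelta

    def response_deadline(received: date, extended: bool = False) -> date:
        return received + timedelta(days=45 + (45 if extended else 0))

    print(response_deadline(date(2023, 1, 2)))                 # 2023-02-16
    print(response_deadline(date(2023, 1, 2), extended=True))  # 2023-04-02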

Among the rights people would get vis-à-vis controllers under the Consumer Data Protection Act are:

  • Learning whether a controller is processing their personal data, and if so, obtaining access to such personal data
  • Correcting inaccuracies in personal data, depending on the nature of the information and the purposes of the processing, suggesting that, for lower-stakes processing and information of lesser importance, controllers may be free to deny such requests
  • Asking that personal data be deleted
  • Receiving one’s data in portable format
  • Opting out of processing:
    • for the purpose of targeted advertising;
    • for the sale of personal data; and
    • “profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer”

Taking the last right first, it appears people could not opt out of most processing of their personal data. There are some other circumstances under which people in Virginia would be able to opt out, but these are limited. Consequently, it appears the default would be that controllers are able to collect and process within certain limits to be discussed below. The rights to delete, correct, and port are fairly standard.

Controllers must “[l]imit the collection of personal data to what is adequate, relevant, and reasonably necessary in relation to the purposes for which such data is processed, as disclosed to the consumer.” Moreover, it is made clear that processing without consent is permissible so long as it is reasonably necessary or compatible “with the disclosed purposes for which such personal data is processed.” Processing for purposes beyond those reasonably necessary or compatible is permitted but only with the consent of the person. And so, there will be fights about the purposes that would be exempted from the consent requirement, as controllers will almost certainly seek to push the boundaries of what is “reasonably necessary” or “compatible.” Of course, a controller may also write a disclosure notifying people of very broad processing of personal data, and so people would be on notice about this processing.

The Consumer Data Protection Act uses boilerplate language about security requirements. Controllers must “[e]stablish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data…[and] [s]uch data security practices shall be appropriate to the volume and nature of the personal data at issue.” The implicit sliding scale in this formulation is elegant in style, but how difficult will it be for controllers, processors, the attorney general, people, and courts to determine where the lines lie for certain classes of information?

The bill bars processing of personal data in violation of federal and state anti-discrimination laws. Controllers cannot retaliate against people who exercise the rights established by the act, with some important caveats. This provision states that nothing “prohibit[s] a controller from offering a different price, rate, level, quality, or selection of goods or services to a consumer, including offering goods or services for no fee, if the consumer has exercised his right to opt out pursuant to § 59.1-573” (i.e., opting out of targeted advertising, the sale of one’s personal data, or profiling to make decisions with legal effects). Hence, exercising the opt-out right could get costly, as controllers would be free to offer different tiers of services or products. There is also a carveout for loyalty and rewards programs. And yet, sensitive data may not be processed without consent.

There is a provision nullifying contractual language ostensibly forcing people to forgo any of the rights bestowed by the bill.

Controllers must provide privacy policies that identify the categories of personal data being processed and the purposes of the processing, inform people how they can exercise their rights, and name the categories of personal data shared with third parties and the categories of third parties with whom personal data are shared. Controllers who process for targeted advertising or sell data must make these facts conspicuous in their privacy policies. There is no language on the complexity, comprehensibility or length of such policies. Given the dense and impenetrable privacy policies currently available to people, it stands to reason that this will continue to be the norm in Virginia.

Processors are bound to follow the direction of the controllers that share personal data with them, and this and other obligations must be set down in a contract between the parties. Processors will also need to help controllers in a number of ways, including helping them respond to requests and assisting them in the event of a data breach. Processors will be required to assist controllers that perform audits. Moreover, processors must return personal data to the controller or delete it upon request and will have a duty of confidentiality.

For certain classes of processing, controllers will need to conduct data protection assessments:

  • Selling data
  • Targeted advertising
  • Profiling but only if there are “reasonably foreseeable risks” of
      • (i) unfair or deceptive treatment of, or unlawful disparate impact on, consumers; (ii) financial, physical, or reputational injury to consumers; (iii) a physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers, where such intrusion would be offensive to a reasonable person; or (iv) other substantial injury to consumers;
  • Sensitive data; and
  • “Any processing activities involving personal data that present a heightened risk of harm to consumers”

Controllers must conduct these assessments according to a number of factors and considerations:

Data protection assessments…shall identify and weigh the benefits that may flow, directly and indirectly, from the processing to the controller, the consumer, other stakeholders, and the public against the potential risks to the rights of the consumer associated with such processing, as mitigated by safeguards that can be employed by the controller to reduce such risks. The use of de-identified data and the reasonable expectations of consumers, as well as the context of the processing and the relationship between the controller and the consumer whose personal data will be processed, shall be factored into this assessment by the controller.

The Attorney General may request and receive these data protection assessments in the course of an investigation, but they must be kept confidential and would not be subject to a freedom of information request.

Regarding de-identified data, controllers holding this type of data must commit to not re-identifying it and make reasonable efforts to ensure these data cannot be associated with people. Additionally, if a controller holds personal data in pseudonymous form with “any information necessary to identify the consumer” being held safely and securely separate from the pseudonymous data, then the controller does not need to respond to a number of consumer requests.
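
As a minimal sketch of what holding data in pseudonymous form might look like in practice, assuming a keyed hash with the key held in a separate system (my illustration; the bill does not prescribe any particular method):

    import hashlib
    import hmac

    # Hypothetical key held "safely and securely" apart from the dataset.
    SEPARATELY_STORED_KEY = b"kept-in-a-different-system"

    def pseudonymize(identifier: str) -> str:
        # Replace the identifier with a keyed hash (HMAC-SHA256).
        return hmac.new(SEPARATELY_STORED_KEY, identifier.encode(),
                        hashlib.sha256).hexdigest()

    record = {"user": pseudonymize("jane.doe@example.com"), "zip": "22180"}
    # Without the separately stored key, the record cannot readily be tied
    # back to a person; with it, a known identifier can be re-matched.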

Naturally, this privacy bill contains a long list of exceptions, including compliance with federal and state law and court orders and warrants. Many of these are fairly standard, but there are some that may lend themselves to creative, expansive interpretations by controllers and processors looking to get out of complying with the act such as:

  • Prevent, detect, protect against, or respond to security incidents, identity theft, fraud, harassment, malicious or deceptive activities, or any illegal activity; preserve the integrity or security of systems; or investigate, report, or prosecute those responsible for any such action;
  • Conduct internal research to develop, improve, or repair products, services, or technology;
  • Effectuate a product recall;
  • Identify and repair technical errors that impair existing or intended functionality; or
  • Perform internal operations that are reasonably aligned with the expectations of the consumer or reasonably anticipated based on the consumer’s existing relationship with the controller or are otherwise compatible with processing data in furtherance of the provision of a product or service specifically requested by a consumer or the performance of a contract to which the consumer is a party.

The Virginia Attorney General would be able to enforce the act. However, before bringing an action, the Attorney General must “provide a controller or processor 30 days’ written notice identifying the specific provisions of this chapter the Attorney General, on behalf of a consumer, alleges have been or are being violated.” And amazingly, if the controller or processor provides “an express written statement that the alleged violations have been cured and that no further violations shall occur,” the Attorney General cannot bring an action for statutory damages unless there are further violations. In this case, the Attorney General could seek $7500 per violation.

There was a private right of action in the House’s version of the bill last year. It would have made the right of action currently available under the Virginia Consumer Protection Act available to residents in the event the act was violated.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by David Mark from Pixabay

Further Reading, Other Developments, and Coming Events (10 December)

Further Reading

  • “Social media superspreaders: Why Instagram, not Facebook, will be the real battleground for COVID-19 vaccine misinformation” By Isobel Asher Hamilton — Business Insider. According to one group, COVID-19 anti-vaccination lies and misinformation are proliferating on Instagram despite efforts by its parent company, Facebook, to find and remove such content. There has been dramatic growth in such content on Instagram, and Facebook seems to be applying COVID-19 standards more loosely there. In fact, some people kicked off of Facebook for violating that platform’s standards on COVID-19 are still on Instagram spreading the same lies, misinformation, and disinformation. For example, British anti-vaccination figure David Icke was removed from Facebook for making claims that COVID-19 was caused by or related to 5G, but he has a significant following on Instagram.
  • “‘Grey area’: China’s trolling drives home reality of social media war” By Chris Zappone — The Sydney Morning Herald. The same concept that is fueling aggressive cyber activity at a level below outright war has spread to diplomacy. The People’s Republic of China (PRC) has been waging “gray” social media campaigns against a number of Western nations, including Australia, mainly by propagating lies and misinformation. The most recent example is the spreading of a fake photo of an Australian soldier appearing to kill an Afghan child. This false material seems designed to distract from the real issues between the two nations arising from clashing policies on trade and human rights. The PRC’s activities do not appear to violate Australia’s foreign interference laws and seem to have left Canberra at a loss as to how to respond effectively.
  • “Facebook to start policing anti-Black hate speech more aggressively than anti-White comments, documents show” By Elizabeth Dwoskin, Nitasha Tiku and Heather Kelly — The Washington Post. Facebook will apparently seek to revamp its algorithms to target the types of hate speech that have traditionally targeted women and minority groups. Up until now, all attacks were treated equally, so that something like “white people suck” would be treated the same way as anti-Semitic content. Facebook has resisted changes for years even though experts and civil rights groups made the case that people of color, women, and LGBTI people endure far more abuse online. There is probably no connection between Facebook’s more aggressive content moderation policies and the advent of a new administration in Washington more receptive to claims that social media platforms allow the abuse of these people.
  • “How Joe Biden’s Digital Team Tamed the MAGA Internet” By Kevin Roose — The New York Times. Take this piece with a block of salt. The why-they-won articles are almost always rife with fallacies, including the rationale that if a candidate won, his or her strategy must have worked. It is not clear that the Biden Campaign’s online messaging strategy of being nice and emphasizing positive values actually beat the Trump Campaign’s “Death Star” so much as the President’s mishandling of the pandemic response and cratering of the economy did him in.
  • “Coronavirus Apps Show Promise but Prove a Tough Sell” By Jennifer Valentino-DeVries — The New York Times. It appears the intersection of concerns about private and public sector surveillance from two very different groups has worked to keep down rates of adoption of smartphone COVID tracking apps in the United States. There are people wary of private sector practices that hoover up as much data as possible, and others concerned about the government’s surveillance activities. Consequently, many are shunning Google and Apple’s COVID contact tracing apps, to the surprise of government, industry, and academia. A pair of studies show resistance to downloading or using such apps even if there are very strong privacy safeguards. This result may well be a foreseeable outcome of U.S. policies that have allowed companies and the security services to collect and use vast quantities of personal information.
  • “UAE target of cyber attacks after Israel deal, official says” — Reuters. A top cybersecurity official in the United Arab Emirates claimed his nation’s financial services industries were targeted for cyber attack and implied Iran and affiliated hackers were responsible.

Other Developments

  • President-elect Joe Biden announced his intention to nominate California Attorney General Xavier Becerra to serve as the next Secretary of Health and Human Services (HHS). If Becerra is confirmed by the Senate, California Governor Gavin Newsom would name his successor, who would need to continue enforcement of the “California Consumer Privacy Act” (CCPA) (AB 375) while also working towards the transition to the “California Privacy Rights Act” (Proposition 24) approved by California voters last month. The new statute establishes the California Privacy Protection Agency that will assume the Attorney General’s responsibilities regarding the enforcement of California’s privacy laws. However, Becerra’s successor may play a pivotal role in the transition between the two regulators and the creation of the new regulations needed to implement Proposition 24.
  • The Senate approved the nomination of Nathan Simington to be a Commissioner of the Federal Communications Commission (FCC) by a 49-46 vote. Once FCC Chair Ajit Pai steps down, the agency will be left with two Democratic and two Republican Commissioners, pending the Biden Administration’s nominee to fill Pai’s spot. If the Senate stays Republican, it is possible the calculation could be made that a deadlocked FCC is better than a Democratic agency that could revive net neutrality rules among other Democratic and progressive policies. Consequently, Simington’s confirmation may be the first step toward an FCC unable to develop substantive policy.
  • Another federal court has broadened the injunction against the Trump Administration’s ban on TikTok to encompass the entirety of the Department of Commerce’s September order meant to stop the usage of the application in the United States (U.S.). It is unclear whether the Trump Administration will appeal, and if it should, whether a court would decide the case before the Biden Administration begins in mid-January. The United States District Court for the District of Columbia found that TikTok “established that the government likely exceeded IEEPA’s express limitations as part of an agency action that was arbitrary and capricious” and would likely suffer irreparable harm, making an injunction an appropriate remedy.
  • The United States’ National Security Agency (NSA) “released a Cybersecurity Advisory on Russian state-sponsored actors exploiting CVE-2020-4006, a command-injection vulnerability in VMware Workspace One Access, Access Connector, Identity Manager, and Identity Manager Connector” and provided “mitigation and detection guidance.”
  • The United States (U.S.) Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) issued a joint alert, warning that U.S. think tanks are being targeted by “persistent continued cyber intrusions by advanced persistent threat (APT) actors.” The agencies stated “[t]his malicious activity is often, but not exclusively, directed at individuals and organizations that focus on international affairs or national security policy.” CISA and the FBI stated their “guidance may assist U.S. think tanks in developing network defense procedures to prevent or rapidly detect these attacks.” The agencies added:
    • APT actors have relied on multiple avenues for initial access. These have included low-effort capabilities such as spearphishing emails and third-party message services directed at both corporate and personal accounts, as well as exploiting vulnerable web-facing devices and remote connection capabilities. Increased telework during the COVID-19 pandemic has expanded workforce reliance on remote connectivity, affording malicious actors more opportunities to exploit those connections and to blend in with increased traffic. Attackers may leverage virtual private networks (VPNs) and other remote work tools to gain initial access or persistence on a victim’s network. When successful, these low-effort, high-reward approaches allow threat actors to steal sensitive information, acquire user credentials, and gain persistent access to victim networks.
    • Given the importance that think tanks can have in shaping U.S. policy, CISA and FBI urge individuals and organizations in the international affairs and national security sectors to immediately adopt a heightened state of awareness and implement the critical steps listed in the Mitigations section of this Advisory.
  • A group of Democratic United States Senators have written the CEO of Alphabet and Google about its advertising policies and how its platforms may have been used to spread misinformation and contribute to voter suppression. Thus far, most of the scrutiny about the 2020 election and content moderation policy has fallen on Facebook and Twitter even though Google-owned YouTube has been flagged as containing the same amount of misinformation. Senators Amy Klobuchar (D-MN) and Mark Warner (D-VA) led the effort and expressed “serious concerns regarding recent reports that Google is profiting from the sale of ads spreading election-related disinformation” to Alphabet and Google CEO Sundar Pichai. Klobuchar, Warner, and their colleagues asserted:
    • Google is also helping organizations spreading election-related disinformation to raise revenue by placing ads on their websites. While Google has some policies in place to prevent the spread of election misinformation, they are not properly enforced and are inadequate. We urge you to immediately strengthen and improve enforcement of your policies on election-related disinformation and voter suppression, reject all ads spreading election-related disinformation, and stop providing advertising services on sites that spread election-related disinformation.
    • …a recent study by the Global Disinformation Index (GDI) found that Google serves ads on 145 out of 200 websites GDI examined that publish disinformation. 
    • Similarly, a recent report from the Center for Countering Digital Hate (CCDH) found that Google has been placing ads on websites publishing disinformation designed to undermine elections. In examining just six websites publishing election-related disinformation, CCDH estimates that they receive 40 million visits a month, generating revenue for these sites of up to $3.4 million annually from displaying Google ads. In addition, Google receives $1.6 million from the advertisers’ payments annually.  These sites published stories ahead of the 2020 general election that contained disinformation alleging that voting by mail was not secure, that mail-in voting was being introduced to “steal the election,” and that election officials were “discarding mail ballots.” 
  • A bipartisan group of United States Senators on one committee are urging Congressional leadership to include funding to help telecommunications companies remove and replace Huawei and ZTE equipment and to aid the Federal Communications Commission (FCC) in drafting accurate maps of broadband service in the United States (U.S.). Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS) and a number of his colleagues wrote the leadership of both the Senate and House and argued:
    • we urge you to provide full funding for Public Law 116-124, the Secure and Trusted Communications Networks Act, and Public Law 116-130, the Broadband DATA Act.   
    • Closing the digital divide and winning the race to 5G are critical to America’s economic prosperity and global leadership in technology. However, our ability to connect all Americans and provide access to next-generation technology will depend in large part on the security of our communications infrastructure. The Secure and Trusted Communications Networks Act (“rip and replace”) created a program to help small, rural telecommunications operators remove equipment posing a security threat to domestic networks and replace it with equipment from trusted providers. This is a national security imperative. Fully funding this program is essential to protecting the integrity of our communications infrastructure and the future viability of our digital economy at large.
    • In addition to safeguarding the security of the nation’s communications systems, developing accurate broadband maps is also critically important. The United States faces a persistent digital divide, and closing this divide requires accurate maps that show where broadband is available and where it is not. Current maps overstate broadband availability, which prevents many underserved communities, particularly in rural areas, from receiving the funds needed to build or expand broadband networks to millions of unconnected Americans. Fully funding the Broadband DATA Act will ensure more accurate broadband maps and better stewardship over the millions of dollars the federal government awards each year to support broadband deployment. Without these maps, the government risks overbuilding existing networks, duplicating funding already provided, and leaving communities unserved.  
  • The Government Accountability Office (GAO) released an assessment of 5G policy options that “discusses (1) how the performance goals and expected uses are to be realized in U.S. 5G wireless networks; (2) the challenges that could affect the performance or usage of 5G wireless networks in the U.S.; and (3) policy options to address these challenges.” The report had been requested by the chairs and ranking members of the House Armed Services, Senate Armed Services, Senate Intelligence, and House Intelligence Committees along with other Members. The GAO stated “[w]hile 5G is expected to deliver significantly improved network performance and greater capabilities, challenges may hinder the performance or usage of 5G technologies in the U.S. We grouped the challenges into the following four categories:
    • availability and efficient use of spectrum
    • security of 5G networks
    • concerns over data privacy
    • concerns over possible health effects
    • The GAO presented the following policy options along with opportunities and considerations for each:
      • Spectrum-Sharing Technologies Opportunities:
        • Could allow for more efficient use of the limited spectrum available for 5G and future generations of wireless networks.
        • It may be possible to leverage existing 5G testbeds for testing the spectrum sharing technologies developed through applied research.
      • Spectrum-Sharing Technologies Considerations:
        • Research and development is costly, must be coordinated and administered, and its potential benefits are uncertain. Identifying a funding source, setting up the funding mechanism, or determining which existing funding streams to reallocate will require detailed analysis.
      • Coordinated Cybersecurity Monitoring Opportunities:
        • A coordinated monitoring program would help ensure the entire wireless ecosystem stays knowledgeable about evolving threats, in close to real time; identify cybersecurity risks; and allow stakeholders to act rapidly in response to emerging threats or actual network attacks.
      • Coordinated Cybersecurity Monitoring Considerations:
        • Carriers may not be comfortable reporting incidents or vulnerabilities, and determinations would need to be made about what information is disclosed and how the information will be used and reported.
      • Cybersecurity Requirements Opportunities:
        • Taking these steps could produce a more secure network. Without a baseline set of security requirements, the implementation of network security practices is likely to be piecemeal and inconsistent.
        • Using existing protocols or best practices may decrease the time and cost of developing and implementing requirements.
      • Cybersecurity Requirements Considerations:
        • Adopting network security requirements would be challenging, in part because defining and implementing the requirements would have to be done on an application-specific basis rather than as a one-size-fits-all approach.
        • Designing a system to certify network components would be costly and would require a centralized entity, be it industry-led or government-led.
      • Privacy Practices Opportunities:
        • Development and adoption of uniform privacy practices would benefit from existing privacy practices that have been implemented by states, other countries, or that have been developed by federal agencies or other organizations.
      • Privacy Practices Considerations:
        • Privacy practices come with costs, and policymakers would need to balance the need for privacy with the direct and indirect costs of implementing privacy requirements. Imposing requirements can be burdensome, especially for smaller entities.
      • High-band Research Opportunities:
        • Could result in improved statistical modeling of antenna characteristics and more accurate representation of propagation characteristics.
        • Could result in improved understanding of any possible health effects from long-term exposure to high-band radio frequency emissions.
      • High-band Research Considerations:
        • Research and development is costly and must be coordinated and administered, and its potential benefits are uncertain. Policymakers will need to identify a funding source or determine which existing funding streams to reallocate.

Coming Events

  • The Senate Judiciary Committee will hold an executive session at which the “Online Content Policy Modernization Act” (S.4632), a bill to narrow the liability shield in 47 USC 230, may be marked up on 10 December.
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Tima Miroshnichenko from Pexels

Tech Election Results

A number of tech ballot initiatives were considered.

There were a number of significant technology measures put before voters in states in yesterday’s election. The most significant were in California, where voters agreed to replace the “California Consumer Privacy Act” (CCPA) (AB 375) with a new privacy regime, approved a second technology-related ballot initiative, and rejected a third. In voting for Proposition 24, California voters chose to replace the recently effective CCPA with the “California Privacy Rights Act” (CPRA) (see here for my analysis), which will largely become operative on 1 January 2023, meaning the CCPA will remain the law of California until then unless a federal privacy law is enacted that preempts all state laws.

California voters also approved Proposition 22, which allows Uber, Lyft, and other companies to “Classif[y] app-based drivers as ‘independent contractors,’ instead of ‘employees,’ and provide[] independent-contractor drivers other compensation, unless certain criteria are met.” This ballot initiative essentially negates AB 5, legislation that codified a court ruling creating the presumption that a person hired by an employer is an employee and not a contractor. Uber and Lyft have been fighting enforcement of AB 5 in court.

Voters rejected Proposition 25, which would have permitted a 2018 statute to take effect replacing cash bail in California with a system that uses algorithms to determine pretrial release. Elsewhere, Michigan voters overwhelmingly supported Proposal 20-2, “Require Warrant for Electronic Data,” which changes state law so that electronic data and communications are protected to the extent that police must obtain a search warrant before accessing them. In Massachusetts, voters supported expanding the state’s right-to-repair law for cars, requiring auto manufacturers to make telematic data available to third-party repair garages. This law is seen as a precursor of similar right-to-repair measures for hardware that could soon be placed on ballots throughout the United States.

Image by David Mark from Pixabay

Further Reading, Other Developments, and Coming Events (3 November)

Further Reading

  • “How Facebook and Twitter plan to handle election day disinformation” By Sam Dean — The Los Angeles Times; “How Big Tech is planning for election night” By Heather Kelly — The Washington Post; and “What to Expect From Facebook, Twitter and YouTube on Election Day” By Mike Isaac, Kate Conger and Daisuke Wakabayashi — The New York Times. Social media platforms have prepared for today and will use a variety of measures to combat lies, disinformation, and misinformation from sources foreign and domestic. Incidentally, my read is that these tech companies are more worried about problematic domestic content, as measured by resource allocation.
  • “QAnon received earlier boost from Russian accounts on Twitter, archives show” By Joseph Menn — Reuters. Researchers have delved into Twitter’s archive of taken-down and suspended Russian accounts and now claim that the Russian Federation was pushing and amplifying the QAnon conspiracy in the United States (U.S.) as early as November 2017. This revised view of Russian involvement differs from conclusions reached in August that Russia played a small but significant role in the proliferation of the conspiracy.
  • “Facial recognition used to identify Lafayette Square protester accused of assault” By Justin Jouvenal and Spencer S. Hsu — The Washington Post. When police were firing pepper balls and rolling gas canisters toward protestors in Lafayette Square in Washington, D.C. on 1 June 2020, a protestor was filmed assaulting two officers. A little-known, publicly undisclosed facial recognition platform available to many federal, state, and local law enforcement agencies in the capital area matched the footage to a man who now stands accused of crimes during a Black Lives Matter march. Civil liberties and privacy advocates are objecting to the use of the National Capital Region Facial Recognition Investigative Leads System (NCRFRILS) on a number of grounds, including the demonstrated weakness of these systems in accurately identifying people of color, the fact that its use has not been disclosed to a number of defendants, and the potential chilling effect it will have on people attending protests. Law enforcement officials claim there are strict privacy and process safeguards and that an identification alone cannot be used as the basis of an arrest.
  • “CISA’s political independence from Trump will be an Election Day asset” By Joseph Marks — The Washington Post. The United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) has established its bona fides with nearly all stakeholders inside and outside the U.S. government since its creation in 2018. Many credit its Director, Christopher Krebs, and many also praise the agency for conveying information and tips about cyber and other threats without incurring the ire of a White House and President antithetical to any hint of Russian hacking or interference in U.S. elections. However, today and the next few days may prove the agency’s mettle, as it will be in uncharted territory trying to tamp down fear about unfounded rumors and looking to discredit misinformation and disinformation in close to real time.
  • “Trump allies, largely unconstrained by Facebook’s rules against repeated falsehoods, cement pre-election dominance” By Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. Yet another article showing that if Facebook has a bias, it is against liberal viewpoints and content, for conservative material is allowed even when it breaks the rules of the platform. Current and former employees and an analysis support this finding. The Trump family has been among those benefitting from the kid gloves the company applies to posts that would likely have brought consequences for other users. Smaller conservative outlets and less prominent conservative figures, however, have faced consequences for content that violates Facebook’s standards.

Other Developments

  • The European Union’s (EU) Executive Vice-President Margrethe Vestager gave a speech titled “Building trust in technology,” in which she previewed one long-awaited draft EU law on technology and another addressing the antitrust and anti-competitive practices of large technology companies. Vestager stated “in just a few weeks, we plan to publish two draft laws that will help to create a more trustworthy digital world.” Both drafts are expected on 2 December and represent key pieces of the new EU leadership’s Digital Strategy, the bloc’s initiative to update EU laws to account for changes in technology since the beginning of the century. The Digital Services Act will address and reform the legal treatment of both online commerce and online content. The draft Digital Markets Act would give the European Commission (EC) more tools to combat the same competition and market dominance issues that the United States (U.S.), Australia, and other nations are starting to tackle vis-à-vis companies like Apple, Amazon, Facebook, and Google.
    • Regarding the Digital Services Act, Vestager asserted:
      • One part of those rules will be the Digital Services Act, which will update the E-Commerce Directive, and require digital services to take more responsibility for dealing with illegal content and dangerous products. 
      • Those services will have to put clear and simple procedures in place, to deal with notifications about illegal material on their platforms. They’ll have to make it harder for dodgy traders to use the platform, by checking sellers’ IDs before letting them on the platform. And they’ll also have to provide simple ways for users to complain, if they think their material should not have been removed – protecting the right of legitimate companies to do business, and the right of individuals to freedom of expression. 
      • Those new responsibilities will help to keep Europeans just as safe online as they are in the physical world. They’ll protect legitimate businesses, which follow the rules, from being undercut by others who sell cheap, dangerous products. And by applying the same standards, all over Europe, they’ll make sure every European can rely on the same protection – and that digital businesses of all sizes can easily operate throughout Europe, without having to meet the costs of complying with different rules in different EU countries. 
      • The new rules will also require digital services – especially the biggest platforms – to be open about the way they shape the digital world that we see. They’ll have to report on what they’ve done to take down illegal material. They’ll have to tell us how they decide what information and products to recommend to us, and which ones to hide – and give us the ability to influence those decisions, instead of simply having them made for us. And they’ll have to tell us who’s paying for the ads that we see, and why we’ve been targeted by a certain ad.  
      • But to really give people trust in the digital world, having the right rules in place isn’t enough. People also need to know that those rules really work – that even the biggest companies will actually do what they’re supposed to. And to make sure that happens, there’s no substitute for effective enforcement.  
      • And effective enforcement is a vital part of the draft laws that we’ll propose in December. For instance, the Digital Services Act will improve the way national authorities cooperate, to make sure the rules are properly enforced, throughout the EU. Our proposal won’t change the fundamental principle, that digital services should be regulated by their home country. But it will set up a permanent system of cooperation that will help those regulators work more effectively, to protect consumers all across Europe. And it will give the EU power to step in, when we need to, to enforce the rules against some very large platforms.
    • Vestager also discussed the competition law the EC hopes to enact:
      • So, to keep our markets fair and open to competition, it’s vital that we have the right toolkit in place. And that’s what the second set of rules we’re proposing – what we call the Digital Markets Act – is for. 
      • That proposal will have two pillars. The first of those pillars will be a clear list of dos and don’ts for big digital gatekeepers, based on our experience with the sorts of behaviour that can stop markets working well. 
      • For instance, the decisions that gatekeepers take, about how to rank different companies in search results, can make or break businesses in dozens of markets that depend on the platform. And if platforms also compete in those markets themselves, they can use their position as player and referee to help their own services succeed, at the expense of their rivals. For instance, gatekeepers might manipulate the way that they rank different businesses, to show their own services more visibly than their rivals’. So, the proposal that we’ll put forward in a few weeks’ time will aim to ban this particular type of unfair self-preferencing. 
      • We also know that these companies can collect a lot of data about companies that rely on their platform – data which they can then use, to compete against those very same companies in other markets. That can seriously damage fairness in these markets – which is why our proposal aims to ban big gatekeepers from misusing their business users’ data in that way. 
      • These clear dos and don’ts will allow us to act much faster and more effectively, to tackle behaviour that we know can stop markets working well. But we also need to be ready for new situations, where digitisation creates deep, structural failures in the way our markets work.  
      • Once a digital company gets to a certain size, with the big network of users and the huge collections of data that brings, it can be very hard for anyone else to compete – even if they develop a much better service. So, we face a constant risk that big companies will succeed in pushing markets to a tipping point, sending them on a rapid, unstoppable slide towards monopoly – and creating yet another powerful gatekeeper. 
      • One way to deal with risks like this would be to stop those emerging gatekeepers from locking users into their platform. That could mean, for instance, that those gatekeepers would have to make it easier for users to switch platform, or to use more than one service. That would keep the market open for competition, by making it easier for innovative rivals to compete. But right now, we don’t have the power to take this sort of action when we see these risks arising. 
      • It can also be difficult to deal with large companies which use the power of their platforms again and again, to take over one related market after another. We can deal with that issue with a series of cases – but the risk is that we’ll always find ourselves playing catch-up, while platforms move from market to market, using the same strategies to drive out their rivals. 
      • The risk, though, is that we’ll have a fragmented system, with different rules in different EU countries. That would make it hard to tackle huge platforms that operate throughout Europe, and to deal with other problems that you find in digital markets in many EU countries. And it would mean that businesses and consumers across Europe can’t all rely on the same protection. 
      • That’s why the second pillar of the Digital Markets Act would put a harmonised market investigation framework in place across the single market, giving us the power to tackle market failures like this in digital markets, and stop new ones from emerging. That would give us a harmonised set of rules that would allow us to investigate certain structural problems in digital markets. And if necessary, we could take action to make these markets contestable and competitive.
  • California Governor Gavin Newsom (D) vetoed one of the bills sent to him to amend the “California Consumer Privacy Act” (AB 375) last week. In mid-October, he signed two bills amending the CCPA, one of which will take effect only if the “California Privacy Rights Act” (CPRA) (Proposition 24) is not enacted by voters in the November election. Moreover, if the CPRA is enacted via ballot, both statutes would likely become dead letters, as the CCPA and its amendments will be moot once the CPRA becomes operative in 2023.
    • Newsom vetoed AB 1138, which would have amended the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtained explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.” In his veto message, Newsom explained that while he agrees with the spirit of the legislation, it would create unnecessary confusion and overlap with federal law without any increase in protection for children. He signaled an openness to working with the legislature on this issue, however.
    • Newsom signed AB 1281, which extends the CCPA’s carveout for employment-related information from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
    • Newsom also signed AB 713 that would:
      • except from the CCPA information that was deidentified in accordance with specified federal law, or was derived from medical information, protected health information, individually identifiable health information, or identifiable private information, consistent with specified federal policy, as provided.
      • except from the CCPA a business associate of a covered entity, as defined, that is governed by federal privacy, security, and data breach notification rules if the business associate maintains, uses, and discloses patient information in accordance with specified requirements.
      • except information that is collected for, used in, or disclosed in research, as defined.
      • additionally prohibit a business or other person from reidentifying information that was deidentified, unless a specified exception is met.
      • beginning January 1, 2021, require a contract for the sale or license of deidentified information to include specified provisions relating to the prohibition of reidentification, as provided.
  • The Government Accountability Office (GAO) published a report, requested by the chair of a House committee, on a Federal Communications Commission (FCC) program that subsidizes broadband providers serving high-cost areas, typically rural or hard-to-reach places. Even though the FCC provided nearly $5 billion in 2019 through the Universal Service Fund’s (USF) high-cost program, the agency lacks data to determine whether the goals of the program are being met. House Energy and Commerce Committee Chair Frank Pallone Jr (D-NJ) had asked for the assessment, which could well form the basis for future changes to how the FCC funds broadband and sets conditions for the use of those funds.
    • The GAO noted:
      • According to stakeholders GAO interviewed, FCC faces three key challenges to accomplish its high-cost program performance goals: (1) accuracy of FCC’s broadband deployment data, (2) broadband availability on tribal lands, and (3) maintaining existing fixed-voice infrastructure and attaining universal mobile service. For example, although FCC adopted a more precise method of collecting and verifying broadband availability data, stakeholders expressed concern the revised data would remain inaccurate if carriers continue to overstate broadband coverage for marketing and competitive reasons. Overstating coverage impairs FCC’s efforts to promote universal voice and broadband since an area can become ineligible for high-cost support if a carrier reports that service already exists in that area. FCC has also taken actions to address the lack of broadband availability on tribal lands, such as making some spectrum available to tribes for wireless broadband in rural areas. However, tribal stakeholders told GAO that some tribes are unable to secure funding to deploy the infrastructure necessary to make use of spectrum for wireless broadband purposes.
    • The GAO concluded:
      • The effective use of USF resources is critical given the importance of ensuring that Americans have access to voice and broadband services. Overall, FCC’s high-cost program performance goals and measures are often not conducive to providing FCC with high-quality performance information. The performance goals lack measurable or quantifiable bases upon which numeric targets can be set. Further, while FCC’s performance measures met most of the key attributes of effective measures, the measures often lacked linkage, clarity, and objectivity. Without such clarity and specific desired outcomes, FCC lacks performance information that could help FCC make better-informed decisions about how to allocate the program’s resources to meet ongoing and emerging challenges to ensuring universal access to voice and broadband services. Furthermore, the absence of public reporting of this information leaves stakeholders, including Congress, uncertain about the program’s effectiveness to deploy these services.
    • The GAO made four recommendations:
      • The Chairman of FCC should revise the high-cost performance goals so that they are measurable and quantifiable.
      • The Chairman of FCC should ensure high-cost performance measures align with key attributes of successful performance measures, including ensuring that measures clearly link with performance goals and have specified targets.
      • The Chairman of FCC should ensure the high-cost performance measure for the goal of minimizing the universal service contribution burden on consumers and businesses takes into account user-fee leading practices, such as equity and sustainability considerations.
      • The Chairman of FCC should publicly and periodically report on the progress it has made for its high-cost program’s performance goals, for example, by including relevant performance information in its Annual Broadband Deployment Report or the USF Monitoring Report.
  • An Irish court has ruled that the Data Protection Commission (DPC) must cover the legal fees of Maximilian Schrems for litigating and winning the case in which the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). Schrems’ legal costs from his complaint against Facebook have been estimated at €2-5 million, the low end of which would represent 10% of the DPC’s budget this year. In addition, the DPC has itself spent almost €3 million litigating the case.
  • House Transportation and Infrastructure Chair Peter DeFazio (D-OR) and Ranking Member Sam Graves (R-MO) asked that the Government Accountability Office (GAO) review the implications of the Federal Communications Commission’s (FCC) 2019 proposal to allow half of the Safety Band (i.e. 5.9 GHz spectrum) to be used for wireless communications even though the United States (U.S.) Department of Transportation weighed in against this decision. DeFazio and Graves are asking for a study of “the safety implications of sharing more than half of the 5.9 GHz spectrum band” that is currently “reserved exclusively for transportation safety purposes.” This portion of the spectrum was set aside in 1999 for connected vehicle technologies, according to DeFazio and Graves, and given the rise of automobile technology that requires spectrum, they think the FCC’s proposed proceeding “may significantly affect the efficacy of current and future applications of vehicle safety technologies.” DeFazio and Graves asked the GAO to review the following:
    • What is the current status of 5.9 GHz wireless transportation technologies, both domestically and internationally?
    • What are the views of industry and other stakeholders on the potential uses of these technologies, including scenarios where the 5.9 GHz spectrum is shared among different applications?
    • In a scenario in which the 5.9 GHz spectrum band is shared among different applications, what effect would this have on the efficacy, including safety, of current wireless transportation technology deployments?
    • What options exist for automakers and highway management agencies to ensure the safe deployment of connected vehicle technologies in a scenario in which the 5.9 GHz spectrum band is shared among different applications?
    • How, if at all, have DOT and FCC assessed current and future needs for 5.9 GHz spectrum in transportation applications? 
    • How, if at all, have DOT and FCC coordinated to develop a federal spectrum policy for connected vehicles?
  • The Harvard Law School’s Cyberlaw Clinic and the Electronic Frontier Foundation (EFF) have published “A Researcher’s Guide to Some Legal Risks of Security Research,” which is timely given the case pending before the Supreme Court of the United States that will determine the scope of the “Computer Fraud and Abuse Act” (CFAA). The Cyberlaw Clinic and EFF stated:
    • Just last month, over 75 prominent security researchers signed a letter urging the Supreme Court not to interpret the CFAA, the federal anti-hacking/computer crime statute, in a way that would criminalize swaths of valuable security research.
    • In the report, the Cyberlaw Clinic and EFF explained:
      • This guide overviews broad areas of potential legal risk related to security research, and the types of security research likely implicated. We hope it will serve as a useful starting point for concerned researchers and others. While the guide covers what we see as the main areas of legal risk for security researchers, it is not exhaustive.
    • In Van Buren v. United States, the Court will consider the question of “[w]hether a person who is authorized to access information on a computer for certain purposes violates Section 1030(a)(2) of the Computer Fraud and Abuse Act if he accesses the same information for an improper purpose.” In this case, the defendant was a police officer who took money as part of a sting operation to illegally use his access to Georgia’s database of license plates to obtain information about a person. The Eleventh Circuit Court of Appeals denied his appeal of his conviction under the CFAA per a previous ruling in that circuit that “a defendant violates the CFAA not only when he obtains information that he has no “rightful[]” authorization whatsoever to acquire, but also when he obtains information “for a nonbusiness purpose.”
    • As the Cyberlaw Clinic and EFF noted in the report:
      • Currently, courts are divided about whether the statute’s prohibition on “exceed[ing] authorized access” applies to people who have authorization to access data (for some purpose), but then access it for a (different) purpose that violates a contractual terms of service or computer use policy. Some courts have found that the CFAA covers activities that do not circumvent any technical access barriers, from making fake profiles that violate Facebook’s terms of service to running a search on a government database without permission. Other courts, disagreeing with the preceding approach, have held that a verbal or contractual prohibition alone cannot render access punishable under the CFAA.
  • The United Kingdom’s Institution of Engineering and Technology (IET) and the National Cyber Security Centre (NCSC) have published “Code of Practice: Cyber Security and Safety” to help engineers develop technological and digital products with both safety and cybersecurity in mind. The IET and NCSC explained:
    • The aim of this Code is to help organizations accountable for safety-related systems manage cyber security vulnerabilities that could lead to hazards. It does this by setting out principles that when applied will improve the interaction between the disciplines of system safety and cyber security, which have historically been addressed as distinct activities.
    • The objectives for organizations that follow this Code are: (a) to manage the risk to safety from any insecurity of digital technology; (b) to be able to provide assurance that this risk is acceptable; and (c) where there is a genuine conflict between the controls used to achieve safety and security objectives, to ensure that these are properly resolved by the appropriate authority.
    • It should be noted that the focus of this Code is on the safety and security of digital technology, but it is recognized that addressing safety and cyber security risk is not just a technological issue: it involves people, process, physical and technological aspects. It should also be noted that whilst digital technology is central to the focus, other technologies can be used in pursuit of a cyber attack. For example, the study of analogue emissions (electromagnetic, audio, etc.) may give away information being digitally processed and thus analogue interfaces could be used to provide an attack surface.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

Image by Pexels from Pixabay

Further Reading, Other Developments, and Coming Events (22 October)

Further Reading

  • “A deepfake porn Telegram bot is being used to abuse thousands of women” By Matt Burgess — WIRED UK. A bot set loose on Telegram can take pictures of women, and apparently teens too, and “take off” their clothing, rendering fake nude images of people who never took nude photos. This seems to be the next iteration of deepfake porn, a problem that will surely get worse until governments legislate against it and technology companies have incentives to locate and take down such material.
  • “The Facebook-Twitter-Trump Wars Are Actually About Something Else” By Charlie Warzel — The New York Times. This piece makes the case that there are no easy fixes for American democracy or for misinformation on social media platforms.
  • “Facebook says it rejected 2.2m ads for breaking political campaigning rules” — Agence France-Presse. Facebook’s Vice President of Global Affairs and Communications Nick Clegg said the social media giant is employing artificial intelligence and humans to find and remove political advertisements that violate policy, in order to avoid a repeat of 2016, when untrue information and misinformation played roles in both Brexit and the election of Donald Trump as President of the United States.
  • “Huawei Fallout—Game-Changing New China Threat Strikes At Apple And Samsung” By Zak Doffman — Forbes. Smartphone manufacturers from the People’s Republic of China (PRC) appear ready to step into the projected void caused by the United States (U.S.) strangling off Huawei’s access to chips. Xiaomi and Oppo have already seen sales surge worldwide and are poised to pick up where Huawei is being forced to leave off, perhaps demonstrating the limits of U.S. power to blunt the rise of PRC technology companies.
  • “As Local News Dies, a Pay-for-Play Network Rises in Its Place” By Davey Alba and Jack Nicas — The New York Times. With the decline and demise of many local media outlets in the United States, new groups are stepping into the void, some politically minded but not transparent about their biases. The organization uncovered in this article is nakedly Republican and is running and planting articles at both legitimate and artificial news sites for pay; sometimes conservative donors pay, sometimes campaigns do. Democrats are engaged in the same activity but apparently to a lesser extent. These sorts of activities will only further erode faith in the U.S. media.
  • “Forget Antitrust Laws. To Limit Tech, Some Say a New Regulator Is Needed.” By Steve Lohr — The New York Times. This piece argues that antitrust enforcement actions are plodding, tending to take years to finish, and that this body of law is consequently inadequate to the task of addressing the market dominance of big technology companies. Instead, a new regulatory body is needed, one more nimble than antitrust enforcement, along the lines of those regulating the financial services industries. Given the regulatory problems in that industry, this may not be the best model.
  • “‘Do Not Track’ Is Back, and This Time It Might Work” By Gilad Edelman — WIRED. Looking to utilize the “California Consumer Privacy Act” (CCPA) (AB 375) requirement that regulated entities respect and effectuate the use of a one-time opt-out mechanism, a group of entities has come together to build and roll out the Global Privacy Control. In theory, users could install this technical specification on their phones and computers, turn it on once, and all websites would then be on notice regarding their privacy preferences (a minimal sketch of how a site might honor the signal follows this list). Such a mechanism would address the problem identified in Consumer Reports’ recent study of how difficult it is to opt out of having one’s personal information sold.
  • “EU countries sound alarm about growing anti-5G movement” By Laurens Cerulus — Politico. 15 European Union (EU) nations wrote the European Commission (EC) warning that the nascent anti-5G movement, borne of conspiracy thinking and misinformation, threatens the EU’s position vis-à-vis the United States (U.S.) and the People’s Republic of China (PRC). There have been more than 200 documented arson attacks in the EU, with the most having occurred in the United Kingdom, France, and the Netherlands. These nations called for a more muscular, more forceful debunking of the lies and misinformation being spread about 5G.
  • “Security firms call Microsoft’s effort to disrupt botnet to protect against election interference ineffective” By Jay Greene — The Washington Post. Microsoft seemingly acted alongside the United States (U.S.) Cyber Command to take down and impair the operation of Trickbot, but now cybersecurity experts are questioning how effective Microsoft’s efforts really were. Researchers have shown the Russian-operated Trickbot has already stood up operations and dispersed across servers around the world, showing how difficult it is to address some cyber threats.
  • “Governments around the globe find ways to abuse Facebook” By Sara Fischer and Ashley Gold — Axios. This piece puts a different spin than a recent BuzzFeed News article on the challenges Facebook faces in countries whose governments ruthlessly use the platform to spread lies and misinformation. The new article paints Facebook as a well-meaning company being taken advantage of, while the BuzzFeed piece portrays a company indifferent to content moderation except in nations where it causes political problems, such as the United States, the European Union, and other western democracies.
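
Since the Global Privacy Control mentioned above is, at bottom, an HTTP signal, a site can honor it with very little code. Below is a minimal, hypothetical sketch in Python assuming the draft specification’s “Sec-GPC: 1” request header; the handler and the record_opt_out() helper are illustrative assumptions, not any site’s actual implementation:

    # Minimal sketch: honoring a Global Privacy Control signal server-side.
    # Assumes the draft spec's "Sec-GPC: 1" request header; the handler and
    # record_opt_out() are hypothetical illustrations.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def record_opt_out(client_ip: str) -> None:
        # Hypothetical helper: persist a "do not sell" preference for this visitor.
        print(f"Recorded do-not-sell preference for {client_ip}")

    class GPCHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # A value of "1" signals the visitor's opt-out under the draft spec.
            if self.headers.get("Sec-GPC") == "1":
                record_opt_out(self.client_address[0])
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"ok")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), GPCHandler).serve_forever()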

Other Developments

  • The United States (U.S.) Department of Justice’s (DOJ) Cyber-Digital Task Force (Task Force) issued “Cryptocurrency: An Enforcement Framework,” which “provides a comprehensive overview of the emerging threats and enforcement challenges associated with the increasing prevalence and use of cryptocurrency; details the important relationships that the Department of Justice has built with regulatory and enforcement partners both within the United States government and around the world; and outlines the Department’s response strategies.” The Task Force noted “[t]his document does not contain any new binding legal requirements not otherwise already imposed by statute or regulation.” The Task Force summarized the report:
    • [I]n Part I, the Framework provides a detailed threat overview, cataloging the three categories into which most illicit uses of cryptocurrency typically fall: (1) financial transactions associated with the commission of crimes; (2) money laundering and the shielding of legitimate activity from tax, reporting, or other legal requirements; and (3) crimes, such as theft, directly implicating the cryptocurrency marketplace itself. 
    • Part II explores the various legal and regulatory tools at the government’s disposal to confront the threats posed by cryptocurrency’s illicit uses, and highlights the strong and growing partnership between the Department of Justice and the Securities and Exchange Commission, the Commodity Futures Trading Commission, and agencies within the Department of the Treasury, among others, to enforce federal law in the cryptocurrency space.
    • Finally, the Enforcement Framework concludes in Part III with a discussion of the ongoing challenges the government faces in cryptocurrency enforcement—particularly with respect to business models (employed by certain cryptocurrency exchanges, platforms, kiosks, and casinos), and to activity (like “mixing” and “tumbling,” “chain hopping,” and certain instances of jurisdictional arbitrage) that may facilitate criminal activity.    
  • The White House’s Office of Science and Technology Policy (OSTP) has launched a new website for the United States’ (U.S.) quantum initiative and released a report titled “Quantum Frontiers: Report On Community Input To The Nation’s Strategy For Quantum Information Science.” The Quantum Initiative flows from the “National Quantum Initiative Act” (P.L. 115-368) “to provide for a coordinated Federal program to accelerate quantum research and development for the economic and national security of the United States.” The OSTP explained that the report “outlines eight frontiers that contain core problems with fundamental questions confronting quantum information science (QIS) today:
    • Expanding Opportunities for Quantum Technologies to Benefit Society
    • Building the Discipline of Quantum Engineering
    • Targeting Materials Science for Quantum Technologies
    • Exploring Quantum Mechanics through Quantum Simulations
    • Harnessing Quantum Information Technology for Precision Measurements
    • Generating and Distributing Quantum Entanglement for New Applications
    • Characterizing and Mitigating Quantum Errors
    • Understanding the Universe through Quantum Information
    • OSTP asserted “[t]hese frontier areas, identified by the QIS research community, are priorities for the government, private sector, and academia to explore in order to drive breakthrough R&D.”
  • The New York Department of Financial Services (NYDFS) published its report on the July 2020 Twitter hack, during which a team of hackers took over a number of high-profile accounts (e.g. Barack Obama, Kim Kardashian West, Jeff Bezos, and Elon Musk) in order to perpetrate a cryptocurrency scam. The NYDFS has jurisdiction over cryptocurrencies and the companies dealing in them in New York. The NYDFS found that the hackers used the most basic of means to acquire the ability to take over accounts. The NYDFS explained:
    • Given that Twitter is a publicly traded, $37 billion technology company, it was surprising how easily the Hackers were able to penetrate Twitter’s network and gain access to internal tools allowing them to take over any Twitter user’s account. Indeed, the Hackers used basic techniques more akin to those of a traditional scam artist: phone calls where they pretended to be from Twitter’s Information Technology department. The extraordinary access the Hackers obtained with this simple technique underscores Twitter’s cybersecurity vulnerability and the potential for devastating consequences. Notably, the Twitter Hack did not involve any of the high-tech or sophisticated techniques often used in cyberattacks–no malware, no exploits, and no backdoors.
    • The implications of the Twitter Hack extend far beyond this garden-variety fraud. There are well-documented examples of social media being used to manipulate markets and interfere with elections, often with the simple use of a single compromised account or a group of fake accounts. In the hands of a dangerous adversary, the same access obtained by the Hackers–the ability to take control of any Twitter users’ account–could cause even greater harm.
    • The Twitter Hack demonstrates the need for strong cybersecurity to curb the potential weaponization of major social media companies. But our public institutions have not caught up to the new challenges posed by social media. While policymakers focus on antitrust and content moderation problems with large social media companies, their cybersecurity is also critical. In other industries that are deemed critical infrastructure, such as telecommunications, utilities, and finance, we have established regulators and regulations to ensure that the public interest is protected. With respect to cybersecurity, that is what is needed for large, systemically important social media companies.
    • The NYDFS recommended cybersecurity measures that cryptocurrency companies should implement to avoid similar hacks, including those in its own cybersecurity regulations that bind its regulated entities in New York. The NYDFS also called for a national regulator to address the lack of a dedicated regulator of Twitter and other massive social media platforms. The NYDFS asserted:
      • Social media companies currently have no dedicated regulator. They are subject to the same general oversight applicable to other companies. For instance, the SEC’s regulations for all public companies apply to public social media companies, and antitrust and related laws and regulations enforced by the Department of Justice and the FTC apply to social media companies as they do to all companies. Social media companies are also subject to generally applicable laws, such as the California Consumer Privacy Act and the New York SHIELD Act. The European Union’s General Data Protection Regulation, which regulates the storage and use of personal data, also applies to social media entities doing business in Europe.
      • But there are no regulators that have the authority to uniformly regulate social media platforms that operate over the internet, and to address the cybersecurity concerns identified in this Report. That regulatory vacuum must be filled.
      • A useful starting point is to create a “systemically important” designation for large social media companies, like the designation for critically important bank and non-bank financial institutions. In the wake of the 2007-08 financial crisis, Congress established a new regulatory framework for financial institutions that posed a systemic threat to the financial system of the United States. An institution could be designated as a Systemically Important Financial Institution (“SIFI”) “where the failure of or a disruption to the functioning of a financial market utility or the conduct of a payment, clearing, or settlement activity could create, or increase, the risk of significant liquidity or credit problems spreading among financial institutions or markets and thereby threaten the stability of the financial system of the United States.”
      • The risks posed by social media to our consumers, economy, and democracy are no less grave than the risks posed by large financial institutions. The scale and reach of these companies, combined with the ability of adversarial actors who can manipulate these systems, require a similarly bold and assertive regulatory approach.
      • The designation of an institution as a SIFI is made by the Financial Stability Oversight Council (“FSOC”), which Congress established to “identify risks to the financial stability of the United States” and to provide enhanced supervision of SIFIs. The FSOC also “monitors regulatory gaps and overlaps to identify emerging sources of systemic risk.” In determining whether a financial institution is systemically important, the FSOC considers numerous factors including: the effect that a failure or disruption to an institution would have on financial markets and the broader financial system; the nature of the institution’s transactions and relationships; the nature, concentration, interconnectedness, and mix of the institution’s activities; and the degree to which the institution is regulated.
      • An analogue to the FSOC should be established to identify systemically important social media companies. This new Oversight Council should evaluate the reach and impact of social media companies, as well as the society-wide consequences of a social media platform’s misuse, to determine which companies they should designate as systemically important. Once designated, those companies should be subject to enhanced regulation, such as through the provision of “stress tests” to evaluate the social media companies’ susceptibility to key threats, including cyberattacks and election interference.
      • Finally, the success of such oversight will depend on the establishment of an expert agency to oversee designated social media companies. Systemically important financial companies designated by the FSOC are overseen by the Federal Reserve Board, which has a long-established and deep expertise in banking and financial market stability. A regulator for systemically important social media would likewise need deep expertise in areas such as technology, cybersecurity, and disinformation. This expert regulator could take various forms; it could be a completely new agency or could reside within an established agency or at an existing regulator.
  • The Government Accountability Office (GAO) evaluated how well the Trump Administration has been implementing the “Open, Public, Electronic and Necessary Government Data Act of 2018” (OPEN Government Data Act) (P.L. 115-435). As the GAO explained, this statute “requires federal agencies to publish their information as open data using standardized, nonproprietary formats, making data available to the public open by default, unless otherwise exempt…[and] codifies and expands on existing federal open data policy including the Office of Management and Budget’s (OMB) memorandum M-13-13 (M-13-13), Open Data Policy—Managing Information as an Asset.”
    • The GAO stated
      • To continue moving forward with open government data, the issuance of OMB implementation guidance should help agencies develop comprehensive inventories of their data assets, prioritize data assets for publication, and decide which data assets should or should not be made available to the public.
      • Implementation of this statutory requirement is critical to agencies’ full implementation and compliance with the act. In the absence of this guidance, agencies, particularly agencies that have not previously been subject to open data policies, could fall behind in meeting their statutory timeline for implementing comprehensive data inventories.
      • It is also important for OMB to meet its statutory responsibility to biennially report on agencies’ performance and compliance with the OPEN Government Data Act and to coordinate with General Services Administration (GSA) to improve the quality and availability of agency performance data that could inform this reporting. Access to this information could inform Congress and the public on agencies’ progress in opening their data and complying with statutory requirements. This information could also help agencies assess their progress and improve compliance with the act.
    • The GAO made three recommendations:
      • The Director of OMB should comply with its statutory requirement to issue implementation guidance to agencies to develop and maintain comprehensive data inventories. (Recommendation 1)
      • The Director of OMB should comply with the statutory requirement to electronically publish a report on agencies’ performance and compliance with the OPEN Government Data Act. (Recommendation 2)
      • The Director of OMB, in collaboration with the Administrator of GSA, should establish policy to ensure the routine identification and correction of errors in electronically published performance information. (Recommendation 3)
  • The United States’ (U.S.) National Security Agency (NSA) issued a cybersecurity advisory titled “Chinese State-Sponsored Actors Exploit Publicly Known Vulnerabilities,” which “provides Common Vulnerabilities and Exposures (CVEs) known to be recently leveraged, or scanned-for, by Chinese state-sponsored cyber actors to enable successful hacking operations against a multitude of victim networks.” The NSA recommended a number of mitigations generally for U.S. entities, including:
    • Keep systems and products updated and patched as soon as possible after patches are released.
    • Expect that data stolen or modified (including credentials, accounts, and software) before the device was patched will not be alleviated by patching, making password changes and reviews of accounts a good practice.
    • Disable external management capabilities and set up an out-of-band management network.
    • Block obsolete or unused protocols at the network edge and disable them in device configurations.
    • Isolate Internet-facing services in a network Demilitarized Zone (DMZ) to reduce the exposure of the internal network.
    • Enable robust logging of Internet-facing services and monitor the logs for signs of compromise.
    • The NSA then proceeded to recommend specific fixes; a hedged sketch illustrating the logging mitigation appears at the end of this item.
    • The NSA provided this policy backdrop:
      • One of the greatest threats to U.S. National Security Systems (NSS), the U.S. Defense Industrial Base (DIB), and Department of Defense (DOD) information networks is Chinese state-sponsored malicious cyber activity. These networks often undergo a full array of tactics and techniques used by Chinese state-sponsored cyber actors to exploit computer networks of interest that hold sensitive intellectual property, economic, political, and military information. Since these techniques include exploitation of publicly known vulnerabilities, it is critical that network defenders prioritize patching and mitigation efforts.
      • The same process for planning the exploitation of a computer network by any sophisticated cyber actor is used by Chinese state-sponsored hackers. They often first identify a target, gather technical information on the target, identify any vulnerabilities associated with the target, develop or re-use an exploit for those vulnerabilities, and then launch their exploitation operation.
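    • As a hedged illustration of the advisory’s logging mitigation (“[e]nable robust logging of Internet-facing services and monitor the logs for signs of compromise”), the sketch below scans a web server access log for request patterns of the sort associated with publicly known, exploited vulnerabilities; the signatures are illustrative placeholders, not indicators taken from the NSA advisory.
```typescript
// Minimal sketch: scan a web server access log for request paths that
// match known exploit-style patterns. The two signatures below are
// illustrative only; a real deployment would rely on a maintained,
// vendor- or government-supplied indicator feed.
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

const signatures: Array<{ label: string; pattern: RegExp }> = [
  // Path traversal of the kind seen against unpatched edge appliances
  { label: "directory traversal", pattern: /\/\.\.\/|%2e%2e%2f/i },
  // Requests probing for configuration files exposed by known flaws
  { label: "config file probe", pattern: /\/(smb|httpd)\.conf/i },
];

async function scanAccessLog(path: string): Promise<void> {
  const lines = createInterface({ input: createReadStream(path) });
  let lineNumber = 0;
  for await (const line of lines) {
    lineNumber += 1;
    for (const { label, pattern } of signatures) {
      if (pattern.test(line)) {
        console.log(`line ${lineNumber}: possible ${label}: ${line}`);
      }
    }
  }
}

scanAccessLog(process.argv[2] ?? "access.log").catch(console.error);
```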
  • Belgium’s data protection authority (DPA) (Autorité de protection des données in French or Gegevensbeschermingsautoriteit in Dutch) (APD-GBA) has reportedly found that the Transparency & Consent Framework (TCF) developed by the Interactive Advertising Bureau (IAB) violates the General Data Protection Regulation (GDPR). The Real-Time Bidding (RTB) system used for online behavioral advertising allegedly transmits the personal information of European Union residents without their consent even before a popup appears on their screen asking for consent. The APD-GBA is the lead DPA in the EU in investigating RTB and will likely now circulate its findings and recommendations to other EU DPAs before any enforcement commences.
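    • For context, the TCF defines a standard JavaScript entry point, __tcfapi, through which a consent management platform (CMP) exposes the user’s consent state. The sketch below is a hedged illustration of consent-first ad loading, not a description of how any actual RTB stack behaves; startRealTimeBidding is a hypothetical placeholder.
```typescript
// Minimal sketch: only start bidding/ad calls after the CMP reports a
// settled TCF v2 consent state, rather than firing before the popup
// resolves. startRealTimeBidding() is a hypothetical placeholder.
declare global {
  interface Window {
    __tcfapi?: (
      command: string,
      version: number,
      callback: (tcData: any, success: boolean) => void
    ) => void;
  }
}

function startRealTimeBidding(): void {
  // placeholder: initialize ad tags / header bidding here
}

export function loadAdsIfConsented(): void {
  const tcfapi = window.__tcfapi;
  if (!tcfapi) return; // no CMP present: make no bid requests at all

  tcfapi("addEventListener", 2, (tcData, success) => {
    if (!success) return;
    // Act only once the consent state is settled.
    if (
      tcData.eventStatus === "tcloaded" ||
      tcData.eventStatus === "useractioncomplete"
    ) {
      // Purpose 1: "Store and/or access information on a device".
      const consented =
        tcData.gdprApplies === false || tcData.purpose?.consents?.[1] === true;
      if (consented) startRealTimeBidding();
    }
  });
}
```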
  • None Of Your Business (noyb) announced “[t]he Irish High Court has granted leave for a “Judicial Review” against the Irish Data Protection Commission (DPC) today…[and] [t]he legal action by noyb aims to swiftly implement the [Court of Justice of the European Union (CJEU)] Decision prohibiting Facebook’s” transfer of personal data from the European Union to the United States (U.S.). Last month, after the DPC directed Facebook to stop transferring the personal data of EU citizens to the U.S., the company filed suit in the Irish High Court to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge.
    • noyb further asserted:
      • Instead of making a decision in the pending procedure, the DPC has started a second, new investigation into the same subject matter (“Parallel Procedure”), as widely reported (see original reporting by the WSJ). No logical reason for the Parallel Procedure was given, but the DPC has maintained that Mr Schrems will not be heard in this second case, as he is not a party in this Parallel Procedure. This Parallel Procedure was criticised by Facebook publicly (link) and instantly blocked by a Judicial Review by Facebook (see report by Reuters).
      • Today’s Judicial Review by noyb is in many ways the counterpart to Facebook’s Judicial Review: While Facebook wants to block the second procedure by the DPC, noyb wants to move the original complaints procedure towards a decision.
      • Earlier this summer, the CJEU struck down the adequacy decision for the agreement between the EU and U.S. that had provided the easiest means to transfer the personal data of EU citizens to the U.S. for processing under the General Data Protection Regulation (GDPR) (i.e. the EU-U.S. Privacy Shield). In the case known as Schrems II, the CJEU also cast doubt on whether standard contractual clauses (SCC) used to transfer personal data to the U.S. would pass muster given the grounds for finding the Privacy Shield inadequate: the U.S.’s surveillance regime and lack of meaningful redress for EU citizens. Consequently, it has appeared as if data protection authorities throughout the EU would need to revisit SCCs for transfers to the U.S., and it appears the DPC was looking to stop Facebook from using its SCC. Facebook is apparently arguing in its suit that it will suffer “extremely significant adverse effects” if the DPC’s decision is implemented.
  • Most likely with the aim of helping British chances for an adequacy decision from the European Union (EU), the United Kingdom’s Information Commissioner’s Office (ICO) published guidance that “discusses the right of access [under the General Data Protection Regulation (GDPR)] in detail.” The ICO explained the guidance “is aimed at data protection officers (DPOs) and those with specific data protection responsibilities in larger organisations…[but] does not specifically cover the right of access under Parts 3 and 4 of the Data Protection Act 2018.”
    • The ICO explained
      • The right of access, commonly referred to as subject access, gives individuals the right to obtain a copy of their personal data from you, as well as other supplementary information.
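    • As a hedged sketch of what fulfilling such a request might involve on the controller’s side, the snippet below assembles a response containing a copy of the requester’s personal data plus the kind of supplementary information the right of access contemplates; every structure, source, and value is hypothetical rather than drawn from the ICO guidance.
```typescript
// Minimal sketch of assembling a subject access request (SAR) response:
// a copy of the individual's personal data plus supplementary information.
// Every field, source, and value here is a hypothetical illustration.
interface SarResponse {
  subjectId: string;
  generatedAt: string; // ISO 8601 timestamp
  personalData: Record<string, unknown>;
  supplementaryInformation: {
    purposesOfProcessing: string[];
    categoriesOfRecipients: string[];
    retentionPeriod: string;
  };
}

function buildSarResponse(
  subjectId: string,
  recordsBySystem: Record<string, unknown>
): SarResponse {
  return {
    subjectId,
    generatedAt: new Date().toISOString(),
    personalData: recordsBySystem, // gathered from each system holding data
    supplementaryInformation: {
      purposesOfProcessing: ["order fulfilment", "service emails"],
      categoriesOfRecipients: ["payment processor", "email provider"],
      retentionPeriod: "6 years after account closure",
    },
  };
}

// Example usage with made-up records from two internal systems.
const response = buildSarResponse("subject-0042", {
  crm: { name: "Jane Doe", email: "jane@example.com" },
  orders: [{ orderId: "A-1001", date: "2020-01-15" }],
});
console.log(JSON.stringify(response, null, 2));
```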
  • The Government Accountability Office (GAO) released the report the House Education and Labor Committee Ranking Member requested on the data security and data privacy practices of public schools. Representative Virginia Foxx (R-NC) asked the GAO “to review the security of K-12 students’ data. This report examines (1) what is known about recently reported K-12 cybersecurity incidents that compromised student data, and (2) the characteristics of school districts that experienced these incidents.” Strangely, the report did not have the GAO’s customary conclusions or recommendations. Nonetheless, the GAO found:
    • Ninety-nine student data breaches reported from July 1, 2016 through May 5, 2020 compromised the data of students in 287 school districts across the country, according to our analysis of K-12 Cybersecurity Resource Center (CRC) data (see fig. 3). Some breaches involved a single school district, while others involved multiple districts. For example, an attack on a vendor system in the 2019-2020 school year affected 135 districts. While information about the number of students affected was not available for every reported breach, examples show that some breaches affected thousands of students, for instance, when a cybercriminal accessed 14,000 current and former students’ personally identifiable information (PII) in one district.
    • The 99 reported student data breaches likely understate the number of breaches that occurred, for different reasons. Reported incidents sometimes do not include sufficient information to discern whether data were breached. We identified 15 additional incidents in our analysis of CRC data in which student data might have been compromised, but the available information was not definitive. In addition, breaches can go undetected for some time. In one example, the personal information of hundreds of thousands of current and former students in one district was publicly posted for 2 years before the breach was discovered.
    • The CRC identified 28 incidents involving videoconferences from April 1, 2020 through May 5, 2020, some of which disrupted learning and exposed students to harm. In one incident, 50 elementary school students were exposed to pornography during a virtual class. In another incident in a different district, high school students were targeted with hate speech during a class, resulting in the cancellation that day of all classes using the videoconferencing software. These incidents also raise concerns about the potential for violating students’ privacy. For example, one district is reported to have instructed teachers to record their class sessions. Teachers said that students’ full names were visible to anyone viewing the recording.
    • The GAO found gaps in the protection and enforcement of student privacy by the United States government:
      • [The Department of] Education is responsible for enforcing the Family Educational Rights and Privacy Act (FERPA), which addresses the privacy of PII in student education records and applies to all schools that receive funds under an applicable program administered by Education. If parents or eligible students believe that their rights under FERPA have been violated, they may file a formal complaint with Education. In response, Education is required to take appropriate actions to enforce and deal with violations of FERPA. However, because the department’s authority under FERPA is directly related to the privacy of education records, Education’s security role is limited to incidents involving potential violations under FERPA. Further, FERPA amendments have not directly addressed educational technology use.
      • The “Children’s Online Privacy Protection Act” (COPPA) requires the Federal Trade Commission (FTC) to issue and enforce regulations concerning children’s privacy. The COPPA Rule, which took effect in 2000 and was later amended in 2013, requires operators of covered websites or online services that collect personal information from children under age 13 to provide notice and obtain parental consent, among other things. COPPA generally applies to the vendors who provide educational technology, rather than to schools directly. However, according to FTC guidance, schools can consent on behalf of parents to the collection of students’ personal information if such information is used for a school-authorized educational purpose and for no other commercial purpose.
  • Upturn, an advocacy organization that “advances equity and justice in the design, governance, and use of technology,” has released a report showing that United States (U.S.) law enforcement agencies have multiple means of hacking into encrypted or protected smartphones. The means and vendors for breaking into phones have long been available in the U.S. and abroad, despite the claims of a number of nations like the Five Eyes (U.S., the United Kingdom, Australia, Canada, and New Zealand) that default end-to-end encryption was a growing problem that allowed those preying on children and engaged in terrorism to go undetected. In terms of possible bias, Upturn “is supported by the Ford Foundation, the Open Society Foundations, the John D. and Catherine T. MacArthur Foundation, Luminate, the Patrick J. McGovern Foundation, and Democracy Fund.”
    • Upturn stated:
      • Every day, law enforcement agencies across the country search thousands of cellphones, typically incident to arrest. To search phones, law enforcement agencies use mobile device forensic tools (MDFTs), a powerful technology that allows police to extract a full copy of data from a cellphone — all emails, texts, photos, location, app data, and more — which can then be programmatically searched. As one expert puts it, with the amount of sensitive information stored on smartphones today, the tools provide a “window into the soul.”
      • This report documents the widespread adoption of MDFTs by law enforcement in the United States. Based on 110 public records requests to state and local law enforcement agencies across the country, our research documents more than 2,000 agencies that have purchased these tools, in all 50 states and the District of Columbia. We found that state and local law enforcement agencies have performed hundreds of thousands of cellphone extractions since 2015, often without a warrant. To our knowledge, this is the first time that such records have been widely disclosed.
    • Upturn argued:
      • Law enforcement use these tools to investigate not only cases involving major harm, but also for graffiti, shoplifting, marijuana possession, prostitution, vandalism, car crashes, parole violations, petty theft, public intoxication, and the full gamut of drug-related offenses. Given how routine these searches are today, together with racist policing policies and practices, it’s more than likely that these technologies disparately affect and are used against communities of color.
      • We believe that MDFTs are simply too powerful in the hands of law enforcement and should not be used. But recognizing that MDFTs are already in widespread use across the country, we offer a set of preliminary recommendations that we believe can, in the short-term, help reduce the use of MDFTs. These include:
        • banning the use of consent searches of mobile devices,
        • abolishing the plain view exception for digital searches,
        • requiring easy-to-understand audit logs,
        • enacting robust data deletion and sealing requirements, and
        • requiring clear public logging of law enforcement use.
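    • As a hedged illustration of the audit-log recommendation above, the sketch below defines one possible record format for logging a single extraction, capturing who ran the tool, under what legal authority, and what was taken; all field names and values are hypothetical, not drawn from the Upturn report.
```typescript
// Minimal sketch of an "easy-to-understand" audit record for a single
// mobile device extraction. Field names and values are hypothetical.
interface ExtractionAuditRecord {
  timestamp: string; // ISO 8601
  officerId: string;
  caseNumber: string;
  legalAuthority: "warrant" | "consent" | "other";
  warrantReference?: string; // present when legalAuthority is "warrant"
  deviceDescription: string;
  dataCategoriesExtracted: string[]; // e.g. messages, photos, location
  retentionDeadline: string; // when the extracted copy must be deleted
}

const example: ExtractionAuditRecord = {
  timestamp: "2020-10-21T14:03:00Z",
  officerId: "officer-1234",
  caseNumber: "2020-000567",
  legalAuthority: "warrant",
  warrantReference: "SW-2020-089",
  deviceDescription: "smartphone seized incident to arrest",
  dataCategoriesExtracted: ["messages", "photos"],
  retentionDeadline: "2021-04-21",
};

console.log(JSON.stringify(example, null, 2));
```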

Coming Events

  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on 28 October regarding 47 U.S.C. 230 titled “Does Section 230’s Sweeping Immunity Enable Big Tech Bad Behavior?” with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.
  • On 29 October, the Federal Trade Commission (FTC) will hold a workshop titled “Green Lights & Red Flags: FTC Rules of the Road for Business” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including that of Nathan Simington to be a Member of the Federal Communications Commission.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Mehmet Turgut Kirkgoz from Pixabay

Further Reading, Other Developments, and Coming Events (14 October)

Further Reading

  • “The Man Who Speaks Softly—and Commands a Big Cyber Army” By Garrett Graff — WIRED. A profile of General Paul Nakasone, the leader of both the United States’ National Security Agency (NSA) and Cyber Command, who has operated mostly in the background during the tumultuous Trump Administration. He has likely set the template for both organizations going forward for some time. A fascinating read chock full of insider details.
  • “Facebook Bans Anti-Vaccination Ads, Clamping Down Again” By Mike Isaac — The New York Times. In another sign of the social media platform responding to pressure in the United States and Europe, it was announced that anti-vaccination advertisements would no longer be accepted. This follows bans on Holocaust denial and QAnon material. Of course, this newest announcement is a classic Facebook half-step. Only paid advertisements will be banned, but users can continue to post about their opposition to vaccination.
  • “To Mend a Broken Internet, Create Online Parks” By Eli Pariser — WIRED. An interesting argument that a public online space maintained by the government, much like parks or public libraries, may be just what democracies across the globe need to roll back the tide of extremism and division.
  • “QAnon is tearing families apart” By Travis Andrews — The Washington Post. This is a terrifying tour through the fallout of the QAnon conspiracy, which pulls some people in so deeply that they become only marginally connected to reality.
  • “AT&T has trouble figuring out where it offers government-funded Internet” By Jon Brodkin — Ars Technica. So, yeah, about all that government cash given to big telecom companies that was supposed to bring more broadband coverage. Turns out, they definitely took the cash. The broadband service has been a much more elusive thing to verify. In one example, AT&T may or may not have provided service to 133,000 households in Mississippi after receiving funds from the Federal Communications Commission (FCC). Mississippi state authorities are arguing most of the service is non-existent. AT&T is basically saying it’s all a misunderstanding.

Other Developments

  • The California Attorney General’s Office (AG) has released yet another revision of the regulations necessary to implement the “California Consumer Privacy Act” (CCPA) (AB 375), and comments are due by 28 October. Of course, if Proposition 24 passes next month, the “California Privacy Rights Act” will largely replace the CCPA, requiring the drafting of even more regulations. Nonetheless, what everyone thought was the final set of CCPA regulations took effect on 14 August, but the notice from the Office of Administrative Law indicated that the AG had withdrawn four portions of the proposed regulations. In the new draft regulations, the AG explained:
    • Proposed section 999.306, subd. (b)(3), provides examples of how businesses that collect personal information in the course of interacting with consumers offline can provide the notice of right to opt-out of the sale of personal information through an offline method.
    • Proposed section 999.315, subd. (h), provides guidance on how a business’s methods for submitting requests to opt-out should be easy and require minimal steps. It provides illustrative examples of methods designed with the purpose or substantial effect of subverting or impairing a consumer’s choice to opt-out.
    • Proposed section 999.326, subd. (a), clarifies the proof that a business may require an authorized agent to provide, as well as what the business may require a consumer to do to verify their request.
    • Proposed section 999.332, subd. (a), clarifies that businesses subject to either section 999.330, section 999.331, or both of these sections are required to include a description of the processes set forth in those sections in their privacy policies.
  • Facebook announced an update to its “hate speech policy to prohibit any content that denies or distorts the Holocaust.” Facebook claimed:
    • Following a year of consultation with external experts, we recently banned anti-Semitic stereotypes about the collective power of Jews that often depicts them running the world or its major institutions.  
    • Today’s announcement marks another step in our effort to fight hate on our services. Our decision is supported by the well-documented rise in anti-Semitism globally and the alarming level of ignorance about the Holocaust, especially among young people. According to a recent survey of adults in the US aged 18-39, almost a quarter said they believed the Holocaust was a myth, that it had been exaggerated or they weren’t sure.
  • In a 2018 interview, Facebook CEO Mark Zuckerberg asserted:
    • I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong…
    • What we will do is we’ll say, “Okay, you have your page, and if you’re not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.” But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed.
    • He clarified in a follow-up email:
      • I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.
      • Our goal with fake news is not to prevent anyone from saying something untrue — but to stop fake news and misinformation spreading across our services. If something is spreading and is rated false by fact checkers, it would lose the vast majority of its distribution in News Feed. And of course if a post crossed line into advocating for violence or hate against a particular group, it would be removed. These issues are very challenging but I believe that often the best way to fight offensive bad speech is with good speech.
  • The Government Accountability Office (GAO) issued an evaluation of the Trump Administration’s 5G Strategy and found more processes and actions are needed if this plan to vault the United States (U.S.) ahead of other nations is to come to fruition. Specifically, the report “examines the extent to which the Administration has developed a national strategy on 5G that address[es] our six desirable characteristics of an effective national strategy.” The GAO identified the six desirable characteristics: (1) purpose, scope, and methodology; (2) problem definition and risk assessment; (3) goals, subordinate objectives, activities, and performance measures; (4) resources, investments, and risk management; (5) organizational roles, responsibilities, and coordination; and (6) integration and implementation. However, this assessment is necessarily limited, for National Security Council staff took the highly unusual approach of not engaging with the GAO, which may be another norm broken by the Trump Administration. The GAO stated that “[t]he March 2020 5G national strategy partially addresses five of our desirable characteristics of an effective national strategy and does not address one,” as summarized in table 1 of the report.
    • The GAO explained:
      • According to National Telecommunications and Information Administration (NTIA) and Office of Science and Technology Policy (OSTP) officials, the 5G national strategy was intentionally written to be at a high level and as a result, it may not include all elements of our six desirable characteristics of national strategies. These officials stated that the 5G implementation plan required by the Secure 5G and Beyond Act of 2020 is expected to include specific details, not covered in the 5G national strategy, on the U.S. government’s response to 5G risks and challenges. The implementation plan is expected to align and correspond to the lines of effort in the 5G national strategy. NTIA officials told us that the implementation plan to the 5G national strategy would be finalized by the end of October 2020. However, the officials we spoke to were unable to provide details on the final content of the implementation plan such as whether the plan would include all elements of our six desirable characteristics of national strategies given that it was not final. National strategies and their implementation plans should include all elements of the six desirable characteristics to enhance their usefulness as guidance and to ensure accountability and coordinate investments. Until the administration ensures that the implementation plan includes all elements of the six desirable characteristics, the guidance the plan provides decision makers in allocating resources to address 5G risks and challenges will likely be limited.
  • The Irish Council for Civil Liberties (ICCL) wrote the European Commission (EC) to make the case that the United Kingdom (UK) is not deserving of an adequacy decision after Brexit because of institutional and cultural weaknesses at the Information Commissioner’s Office (ICO). The ICCL made the case that the ICO has been one of the most ineffectual enforcers of the General Data Protection Regulation (GDPR), especially with respect to what the ICCL called the largest data infringement under the GDPR and the largest data breach of all time: Real-Time Bidding. The ICCL took the ICO to task for not following through on fining companies for GDPR violations and for having a tiny staff dedicated to data protection and technology issues. The ICCL invoked Article 45 of the GDPR to encourage the EC to deny the UK the adequacy decision it would need for the personal data of EU residents to continue flowing freely to the UK.
  • In an unrelated development, the Information Commissioner’s Office (ICO) wrapped up its investigation into Facebook and Cambridge Analytica and detailed its additional findings in a letter to the Digital, Culture, Media and Sport Select Committee in the House of Commons. ICO head Elizabeth Denham asserted:
    • [w]e concluded that SCL Elections Ltd and Cambridge Analytica (SCL/CA) were purchasing significant volumes of commercially available personal data (at one estimate over 130 billion data points), in the main about millions of US voters, to combine it with the Facebook derived insight information they had obtained from an academic at Cambridge University, Dr Aleksandr Kogan, and elsewhere. In the main their models were also built from ‘off the shelf’ analytical tools and there was evidence that their own staff were concerned about some of the public statements the leadership of the company were making about their impact and influence.
    • From my review of the materials recovered by the investigation I have found no further evidence to change my earlier view that SCL/CA were not involved in the EU referendum campaign in the UK, beyond some initial enquiries made by SCL/CA in relation to UKIP data in the early stages of the referendum process. This strand of work does not appear to have then been taken forward by SCL/CA.
    • I have concluded my wider investigations of several organisations on both the remain and the leave side of the UK’s referendum about membership of the EU. I identified no significant breaches of the privacy and electronic marketing regulations and data protection legislation that met the threshold for formal regulatory action. Where the organisation continued in operation, I have provided advice and guidance to support better future compliance with the rules.
    • During the investigation concerns about possible Russian interference in elections globally came to the fore. As I explained to the sub-committee in April 2019, I referred details of reported possible Russia-located activity to access data linked to the investigation to the National Crime Agency. These matters fall outside the remit of the ICO. We did not find any additional evidence of Russian involvement in our analysis of material contained in the SCL / CA servers we obtained.
  • The United States Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) issued a joint cybersecurity advisory regarding “recently observed advanced persistent threat (APT) actors exploiting multiple legacy vulnerabilities in combination with a newer privilege escalation vulnerability.” CISA and the FBI revealed that these tactics have penetrated systems related to elections but claimed there has been no degrading of the integrity of electoral systems.
  • The agencies stated:
    • The commonly used tactic, known as vulnerability chaining, exploits multiple vulnerabilities in the course of a single intrusion to compromise a network or application. 
    • This recent malicious activity has often, but not exclusively, been directed at federal and state, local, tribal, and territorial (SLTT) government networks. Although it does not appear these targets are being selected because of their proximity to elections information, there may be some risk to elections information housed on government networks.
    • CISA is aware of some instances where this activity resulted in unauthorized access to elections support systems; however, CISA has no evidence to date that integrity of elections data has been compromised.
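  • To make the vulnerability chaining tactic concrete, the hedged sketch below flags hosts where an initial-access flaw and a privilege-escalation flaw coexist, the combination the advisory warns about; the classification scheme and CVE identifiers are placeholders, not drawn from the advisory.
```typescript
// Minimal sketch of flagging "vulnerability chaining" risk: hosts where
// an initial-access flaw coexists with a privilege-escalation flaw are
// more dangerous than either finding alone. CVE IDs are placeholders.
type VulnClass = "initial-access" | "privilege-escalation" | "other";

interface Finding {
  host: string;
  cve: string;
  vulnClass: VulnClass;
}

function chainRiskHosts(findings: Finding[]): string[] {
  const byHost = new Map<string, Set<VulnClass>>();
  for (const f of findings) {
    const classes = byHost.get(f.host) ?? new Set<VulnClass>();
    classes.add(f.vulnClass);
    byHost.set(f.host, classes);
  }
  return [...byHost.entries()]
    .filter(([, c]) => c.has("initial-access") && c.has("privilege-escalation"))
    .map(([host]) => host);
}

// Hypothetical scanner output.
const findings: Finding[] = [
  { host: "vpn-gw-1", cve: "CVE-0000-0001", vulnClass: "initial-access" },
  { host: "vpn-gw-1", cve: "CVE-0000-0002", vulnClass: "privilege-escalation" },
  { host: "web-2", cve: "CVE-0000-0003", vulnClass: "other" },
];
console.log(chainRiskHosts(findings)); // ["vpn-gw-1"]
```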
  • Canada’s Privacy Commissioner Daniel Therrien released the “2019-2020 Annual Report to Parliament on the Privacy Act and Personal Information Protection and Electronic Documents Act” and asserted:
    • Technologies have been very useful in halting the spread of COVID-19 by allowing essential activities to continue safely. They can and do serve the public good.
    • At the same time, however, they raise new privacy risks. For example, telemedicine creates risks to doctor-patient confidentiality when virtual platforms involve commercial enterprises. E-learning platforms can capture sensitive information about students’ learning disabilities and other behavioural issues.
    • As the pandemic speeds up digitization, basic privacy principles that would allow us to use public health measures without jeopardizing our rights are, in some cases, best practices rather than requirements under the existing legal framework.
    • We see, for instance, that the law has not properly contemplated privacy protection in the context of public-private partnerships, nor does it mandate app developers to consider Privacy by Design, or the principles of necessity and proportionality.
    • The law is simply not up to protecting our rights in a digital environment. Risks to privacy and other rights are heightened by the fact that the pandemic is fueling rapid societal and economic transformation in a context where our laws fail to provide Canadians with effective protection.
    • In our previous annual report, we shared our vision of how best to protect the privacy rights of Canadians and called on parliamentarians to adopt rights-based privacy laws.
    • We noted that privacy is a fundamental human right (the freedom to live and develop free from surveillance). It is also a precondition for exercising other human rights, such as equality rights in an age when machines and algorithms make decisions about us, and democratic rights when technologies can thwart democratic processes.
    • Regulating privacy is essential not only to support electronic commerce and digital services; it is a matter of justice.

Coming Events

  • The European Union Agency for Cybersecurity (ENISA), Europol’s European Cybercrime Centre (EC3) and the Computer Emergency Response Team for the EU Institutions, Bodies and Agencies (CERT-EU) will hold the 4th annual IoT Security Conference series “to raise awareness on the security challenges facing the Internet of Things (IoT) ecosystem across the European Union:”
    • Artificial Intelligence – 14 October at 15:00 to 16:30 CET
    • Supply Chain for IoT – 21 October at 15:00 to 16:30 CET
  • The House Intelligence Committee will conduct a virtual hearing titled “Misinformation, Conspiracy Theories, and ‘Infodemics’: Stopping the Spread Online.”
  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • On 29 October, the Federal Trade Commission (FTC) will hold a workshop titled “Green Lights & Red Flags: FTC Rules of the Road for Business” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • The Senate Commerce, Science, and Transportation Committee will reportedly hold a hearing on 29 October regarding 47 U.S.C. 230 with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Thanks for your Like • donations welcome from Pixabay